
Kathiresan Murugappan

Bangalore

Summary

Senior Talend Consultant with proven expertise in designing efficient ETL processes and scalable data pipelines. Extensive experience in Talend and cloud storage solutions, with a focus on mentoring teams and implementing best practices. Strong analytical skills contribute to successful project outcomes and ensure data quality across diverse systems.

Overview

12 years of professional experience
1 Certification

Work History

Senior Talend Consultant

Servier
01.2024 - 06.2025
  • Company Overview: Servier focuses its creative energy and innovative ability on three main therapeutic areas: neuroscience, oncology, and nephrology. Its aim is to help people with diseases that range from those affecting millions to those affecting relatively few.
  • Design and implement complex ETL processes for large-scale data integration, transformation, and storage, often involving multi-source data systems (databases, cloud storage, APIs).
  • Design scalable, efficient, and high-performance data pipelines using Talend, ensuring the architecture aligns with business objectives.
  • Provide leadership and direction for junior data engineers, mentoring them on best practices, debugging complex issues, and ensuring quality standards.
  • Design transformations to convert raw data into usable, structured formats for analysis.
  • Develop scalable, reliable, and efficient data pipelines using Talend.
  • Define how Talend will integrate with other systems, both within and outside the organization.
  • Apply business rules to transform data from its source format to the desired target format.

Senior Talend Application Lead

TELEFONICA
03.2023 - 12.2023
  • Company Overview: Network Data Lake (FastOss_Basis) is flexible enough to scale with fast-growing data and can onboard new data sources.
  • Created an automation framework in Talend to load huge volumes of data from various source sets (on-premise and streaming) to GCP (Google BigQuery platform).
  • Implemented error handling, pre-checks, and interruption in Talend components, capturing all details for logging and back-tracking.
  • Worked with business stakeholders to collect functional requirements.
  • Implemented data quality rules using Talend components.
  • Used Agile development methodology to meet deadlines.
  • Created a SecuPI process for data governance, quality, and security.
  • Implemented performance tuning for long-running existing and new development jobs.

Sr Software Engineer

T-Mobile
03.2022 - 10.2022
  • The revap data received from various source channels such as Samson, SAP, EIP, NFS, and YAM is ingested using the Aptitude ETL tool into the DAP layer (Data Acquisition Platform), which serves as the centralized staging area for all source-system transactional data.
  • Worked with Business Stakeholders to collect functional requirements.
  • Prepared project documentation per functional requirements.
  • Implemented the project's best practices and coding standards.
  • Processed customer data for business reports using Aptitude jobs.
  • Identified project risks and planned mitigation actions.
  • Used Agile development methodology to meet deadlines.
  • Tuned jobs for better performance.
  • Created packages and ran them on the server to load data into Teradata.

Sr Software Engineer

HP
06.2021 - 03.2022
  • IIB & Commune have various source channels such as Global Newton, Olympia, CDAX, APCDI, and ICM. These sources contain repair device information, product information, cost center, repair center, material-replacement information, and repair cost breakup information.
  • Worked with Business Stakeholders to collect functional requirements.
  • Prepared project documentation per functional requirements.
  • Implemented the project's best practices and coding standards.
  • Processed customer data for business reports using Talend jobs.
  • Identified project risks and planned mitigation actions.
  • Used Agile development methodology to meet deadlines.
  • Developed Talend job flows to read data from source systems into the Raw Zone (HDFS location).
  • Developed Talend jobs to read data from the Raw Zone, perform data cleansing and pre-processing, and push the data into the Trusted Zone.
  • Tuned jobs for better performance.
  • Worked on the ABCR framework.
  • Moved huge volumes of data from EDL Hive to Vertica per business logic.
  • Created views on top of base tables based on business logic.
  • Scheduled Talend jobs in Talend Administration Center.
  • Created an automated Talend job to publish dashboard reports for business users; the reports contain daily data flow counts.

Sr Software Engineer

HP
02.2021 - 05.2021
  • Bullseye has multiple source channels such as Global Newton, Wiz Quality, Foresight Consumer, Demand & Consumption, and ICS; huge volumes of data are moved from EDL to ADLS Gen2.
  • The master job follows the ABCR control table to call all the jobs present in HP Talend projects.
  • This job has been generalized so that no rework or rebuild is needed from the developer's end when adding a particular stream/POD/new requirement.
  • The job follows basic context parameters that can be set when deployed in TAC.
  • Once the necessary BOW and SBOW names are fed via the context values, the job dynamically calls the necessary components based on ABCR principles.

Technical Expert

LaSalle Investment Management /JLL
08.2020 - 12.2020
  • The scope is to create a single unified platform for all the global users of LaSalle.
  • The data is currently available in multiple applications, so as part of the project the data needs to be validated and moved into a centralized data warehouse in Snowflake.
  • A reporting platform is built as the front end using the data in Snowflake.
  • Understand the business logic.
  • Analyze the source and target systems.
  • Handled various flat files such as Excel (reporting format), PDF, Word, PPT, and JSON.
  • Extracted data from screenshots of complex flat files using OCR and converted them into Excel files.
  • Extracted data from various applications and APIs using Talend.
  • Loaded data into Snowflake using SCD concepts and created Snowpipes.
  • Extensively created Talend mappings using Data Preparation and Data Stewardship, including dataset input/output, tDataprepRun, tDataStewardshipTaskInput/Output, and tDataStewardshipTaskDelete components.
  • Added recipes in Talend Data Preparation to produce clean datasets.
  • Created campaigns and data models, with semantic checks, in Talend Data Stewardship.
  • Developed strong debugging and defect-fixing skills.
  • Established build and integration tools and mechanisms.
  • Deployed all Talend jobs to Git using Azure DevOps.
  • Scheduled and monitored artifact jobs using Talend Management Console.

Sr Software Engineer

Lincoln Financial Group (LFG)
06.2018 - 05.2020
  • Communication Framework focuses on storing the communication preferences of end users at party/contact level.
  • This manages users’ opt-out/opt-in details in accordance with CAN-SPAM and other Acts.
  • This comprises API programs and ETL data loads used to create, edit, and retrieve global communication preferences.
  • At the ETL level, we load source data coming from various channels such as Salesforce, Eloqua, Earth Integrate, Distribion, MDS Data, and RPS E-Delivery into a MySQL database hosted on AWS RDS.
  • Further, this data is extracted and shared with Data Discovery Platform analytics users.
  • Also, when a bulk volume of data arrives from the source, ETL is used to load the data into an Elasticsearch index.

Sr Software Engineer

Computer Association (CA)
02.2018 - 06.2018
  • Company Overview: Prudential Financial companies serve individual and institutional customers worldwide and include The Prudential Insurance Company of America, one of the largest life insurance companies in the U.S.
  • Experience using cloud components and connectors to make API calls for accessing data from cloud storage (Azure SQL, Salesforce, Amazon S3, Azure Blob) in Talend Data Fabric 6.4.
  • Experience creating Joblets in Talend for processes that can be reused across most jobs in a project, such as Start Job and Commit Job.
  • Extracted data from multiple operational sources to load staging areas, the data warehouse, and data marts using SCD (Type 1/Type 2/Type 3) loads.
  • Used Talend reusable components such as routines, context variables, and globalMap variables.
  • Extensively created mappings in Talend using tMap, tJoin, tReplicate, tConvertType, tFlowToIterate, tAggregate, tSortRow, tLogCatcher, tNormalize, tDenormalize, tJava, tJavaRow, tAggregateRow, tWarn, tFilter, tDie, tSfdcInput, tSfdcConnection, tJDBCInput, tJDBCOutput, tAzureStorageConnection, tAzureStoragePut, tAzureStorageGet, tJDBCRow, tRunJob, tSendMail, etc.
  • Implemented fast and efficient data acquisition using Big Data processing techniques and tools.
  • Created jobs to pass parameters from child jobs to parent jobs.
  • Monitored and supported Talend jobs scheduled through Talend Administration Center (TAC).

Senior Software Engineer

Toyota Financial Services
08.2015 - 01.2018
  • Company Overview: Toyota Financial Services is a leading provider of automotive financial services, offering an extensive line of financing plans and vehicle and payment protection products to Toyota customers and dealers in the U.S.
  • The scope of this Project is to capture the data from various data sources and to construct a data warehouse to maintain Dealer/Customer's history and current data.
  • Also, TFS's Marketing/Support/HR/Insurance/Employee details are stored in the data warehouse to support efficient decision-making.
  • Data from Operational Source System was extracted, transformed and loaded into data warehouse using Talend Open Studio.
  • Understanding existing business model and BI requirements.
  • Designed and developed Talend ETL jobs to load the data from file to stage tables and stage to datamarts.
  • Extensively used ETL methodology for data profiling, data migration, extraction, transformation, and loading using Talend, and designed data conversions from a wide variety of source systems including Oracle, SQL Server, Teradata, Hive, and non-relational sources such as flat files, XML, JSON, etc.
  • Used different Talend components (tOracleInput, tOracleOutput, tHiveInput, tHiveOutput, tHiveRow, tUniqRow, tAggregateRow, tRunJob, tPrejob, tPostjob, tMap, tJavaRow, tJavaFlex, tFilterRow, etc.) to develop standard jobs.
  • Extensively created mappings in Talend using tMap, tJoin, tReplicate, tParallelize, tConvertType, tFlowToIterate, tAggregateSortedRow, tSortRow, tLogCatcher, tRowGenerator, tNormalize, tDenormalize, tJava, tJavaFlex, tJavaRow, tAggregateRow, tOracleSCD, tFilter, tDie, etc.
  • Extensive experience using Talend features such as context variables and connectors for databases and flat files, including tHiveCreateTable, tHiveLoad, tHiveRow, tHiveConnection, tSqoopImportAllTables, tSqoopImport, tSqoopExport, tSqoopMerge, tOracleRow, tOracleSCD, tOracleOutput, tFileCopy, tFileInputDelimited, and tFileExists.
  • Handled importing data from multiple data sources (Oracle, SQL Server) using Sqoop, and performed cleaning, transformations, and joins using Hive.
  • Push data as delimited files into HDFS using Talend Big data studio.
  • Exported analyzed data to the relational databases using Sqoop for visualization and to generate reports for the BI team.
  • Developed Talend mappings for Type1 & Type 2 Slowly Changing Dimensions.
  • Monitored and supported the Talend jobs scheduled through Talend Admin Center (TAC).

Software Engineer

Wellmark
08.2013 - 08.2015
  • Company Overview: Wellmark provides the technology to consolidate strategic, clinical, and financial information into a single, complete picture of healthcare performance.
  • EDW sources data to retrieve Patient, Physician, Episodes, Diagnosis, and Procedure and Cost information.
  • This data is loaded in EDW staging area, transformed and then used for Hospital Episodes, Glycemic Control and Visits reporting.
  • The data used to arrive as flat files and from different databases such as SQL Server and Oracle, and was loaded into the target Teradata database using Informatica PowerCenter.
  • Created new mappings and updated old mappings according to changes in business logic.
  • Extracted data from various sources involving flat files and relational tables.
  • Created simple and complex mappings using transformations such as Source Qualifier, Lookup, Router, Stored Procedure, Aggregator, Filter, Joiner, Update Strategy, Expression, and Sequence Generator, along with Worklets, Mapplets, and reusable transformations in Informatica.
  • Experience handling Slowly Changing Dimensions to maintain complete history.
  • Developed all mappings according to the design document and mapping specifications.
  • Resolved failures encountered in daily jobs.
  • Monitored daily jobs running in production.

Education

M.Sc. - Software Engineering

Periyar Maniammai College of Technology
Thanjavur
06-2009

Skills

  • Talend ETL tools
  • Data warehousing
  • Informatica PowerCenter
  • Version control systems
  • Data governance
  • Technical documentation
  • Team leadership
  • Cloud data integration
  • DevOps methodologies
  • Windows operating system
  • Database management systems
  • Cloud storage solutions
  • Data scheduling tools

Certification

Azure Data Engineer Associate, 01/13/2024 to 01/13/2025, FD7CC868AE2E1CD8, 2FCCE4-A5ADB3

Languages

English
First Language, Proficient (C2)
French
Intermediate (B1)
