Overall experience of 5+ years in Data Warehousing, ETL, and Business Intelligence.
Regular practitioner of Informatica PowerCenter, IICS, Snowflake, and Azure Data Factory.
Extracted data from Oracle, Teradata, SQL databases, and flat files, staged it in a single location, and applied various transformations such as Filter, Router, Union, Sequence Generator, Sorter, Lookup, Joiner, and Aggregator.
Analyzed the specifications provided by clients.
Created mappings, sessions, and workflows using Informatica PowerCenter.
Participated in testing and performance tuning by identifying and resolving bottlenecks in mapping logic.
Developed and maintained Unix shell scripts to automate test data setup, job monitoring, and file handling in the ETL process.
Used shell scripting to schedule and validate ETL workflows and perform pre- and post-load validations.
Executed tests to verify that data was extracted, transformed, and loaded correctly.
Validated data accuracy, completeness, and consistency across systems.
Documented all test results and provided summary reports.
Reported daily development status.
Attended sprint planning and status calls.
Monitored jobs in Control-M.
Worked closely with the onsite team to understand business requirements and apply business rules.
Good team player.
Designed and developed ETL workflows in Informatica PowerCenter to extract data from multiple sources including flat files and SQL Server databases, transform it based on business rules, and load it into Teradata.
Created source-to-target mappings using Informatica transformations such as Expression, Lookup, Joiner, Filter, Aggregator, Sequence Generator, and Router to implement data cleansing, validation, and business logic.
Developed reusable mapplets and parameterized sessions to improve code reusability, maintainability, and adaptability across environments (DEV/QA/PROD).
Implemented data loading strategies for Teradata, including full loads, incremental loads (based on change data capture or timestamp logic), and optimized bulk loading using Teradata utilities (e.g., TPT, FastLoad).
Performed data profiling and validation to ensure source data quality and transformation accuracy before loading into Teradata, using both SQL queries and Informatica data preview.
Applied performance tuning techniques at the mapping, session, and database levels to improve ETL job efficiency — including pushdown optimization, partitioning, and indexing on Teradata tables.
Conducted unit testing and peer code reviews to ensure data accuracy, validate transformation logic, and reduce defects before system and user acceptance testing.
Handled error logging and exception management, routing rejected or invalid records to error tables and creating informative logs for easier troubleshooting.
Maintained documentation for ETL jobs, including mapping specifications, technical design documents, data flow diagrams, and scheduling details to support operational transparency.
Collaborated with DBAs, QA teams, business analysts, and data architects to understand requirements, troubleshoot issues, ensure adherence to standards, and deliver high-quality data integration solutions.
Project : MetLife
Duration : Aug 2018 to Sep 2020
Technologies : Informatica 10.4, Oracle, SQL, DBeaver
I hereby declare that the information furnished above is true to the best of my knowledge and belief.
(K S Veronika)