Passionate Data Engineer with 2.6 years of hands-on experience designing and implementing robust data pipelines. Results-oriented, with a strong background in Informatica PowerCenter development, ETL processes, and data integration. Seeking to leverage this expertise as an ETL developer, delivering robust solutions that meet and exceed business requirements. Committed to optimizing data solutions for efficiency and reliability.
• Develop and maintain ETL processes using Informatica PowerCenter and StreamSets, extracting, transforming, and loading data into Teradata, Greenplum, and Oracle databases.
• Manage and optimize the performance of Teradata, Greenplum, and Oracle databases using SQL to support data analysis tasks efficiently.
• Lead the design, development, and maintenance of data quality rules using Informatica Data Quality, ensuring high data integrity and accuracy.
• Integrate data from diverse sources, ensuring accuracy and consistency, while establishing CI/CD pipelines for automated code deployment using GitHub.
• Write SQL queries for data transformation and analysis, adhering to best practices.
• Implement data quality checks and monitoring solutions to ensure data integrity, addressing bottlenecks in data pipelines and proactively responding to failures (a minimal check is sketched after this list).
• Develop comprehensive data mapping documents to illustrate data flow and transformations during table development.
• Maintain documentation for ETL processes and data lineage, collaborating effectively with cross-functional teams and communicating technical concepts to non-technical stakeholders.
• Stay current with emerging data engineering technologies and trends; conducted a successful proof of concept (POC) on Apache Kafka, showcasing real-time data streaming expertise (see the producer sketch below).
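The data quality and monitoring work above can be illustrated with a minimal sketch. The table names, column, and threshold below are hypothetical, and the check is a generic row-count and null-rate reconciliation rather than the exact Informatica Data Quality rules used on the job; any DB-API 2.0 connection (e.g. from teradatasql, cx_Oracle, or psycopg2 for Greenplum) could supply `conn`.

```python
def run_quality_checks(conn, source_table: str, target_table: str,
                       null_check_column: str, max_null_rate: float = 0.01):
    """Reconcile row counts and flag excessive NULLs after an ETL load.

    All identifiers passed in are assumed to be trusted, illustrative names.
    Returns a list of failure messages; an empty list means the load passed.
    """
    cur = conn.cursor()
    failures = []

    # 1. Row-count reconciliation between source and target tables.
    cur.execute(f"SELECT COUNT(*) FROM {source_table}")
    source_rows = cur.fetchone()[0]
    cur.execute(f"SELECT COUNT(*) FROM {target_table}")
    target_rows = cur.fetchone()[0]
    if source_rows != target_rows:
        failures.append(
            f"row-count mismatch: source={source_rows}, target={target_rows}")

    # 2. Null-rate check on a critical column in the target table.
    cur.execute(
        f"SELECT SUM(CASE WHEN {null_check_column} IS NULL THEN 1 ELSE 0 END), "
        f"COUNT(*) FROM {target_table}")
    nulls, total = cur.fetchone()
    if total and (nulls / total) > max_null_rate:
        failures.append(
            f"{null_check_column} null rate {nulls / total:.2%} exceeds "
            f"allowed {max_null_rate:.2%}")

    return failures
```

In practice a scheduler or monitoring wrapper would alert, or fail the workflow, whenever the returned list is non-empty.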
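The Apache Kafka POC mentioned above can be reduced to a small producer sketch like the following. The broker address, topic name, and sample payload are assumptions, and kafka-python is one client library choice among several.

```python
import json
import time

from kafka import KafkaProducer

# Assumes a broker reachable at localhost:9092 and a topic named 'etl-events'.
producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    # Serialize dict payloads to JSON bytes before sending.
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

# Publish a handful of sample change events to simulate a real-time feed.
for i in range(5):
    event = {"record_id": i, "status": "loaded", "ts": time.time()}
    producer.send("etl-events", value=event)

producer.flush()  # block until all buffered messages are delivered
```

A matching consumer subscribed to the same topic would complete the end-to-end streaming demonstration.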
Informatica Data Quality