Dynamic Data Engineer Analyst with a proven track record at Accenture, specializing in ETL development and data validation. Expert in Informatica and SQL, I excel at designing scalable data pipelines and enhancing data quality. I work with tools such as BigQuery for fast, SQL-based analytics on large datasets, Teradata for enterprise-level data warehousing, Informatica PowerCenter for efficient ETL processes, Control-M for scheduling and automating data jobs, and Git/Bitbucket for version control and collaborative development. I'm passionate about turning raw data into actionable insights and constantly seek opportunities to learn, collaborate, and drive impactful, data-driven outcomes. My strong analytical skills drive innovative solutions, ensuring seamless application functionality and improved efficiency in data processes.
Involved in designing, building, and configuring applications to meet business process and application requirements. My work revolves around creating innovative solutions to address various business needs and ensuring seamless application functionality.
Roles & Responsibilities:
- Perform independently and serve as an SME.
- Actively participate and contribute in team discussions.
- Contribute solutions to work-related problems.
- Develop and execute test plans, test cases, and test scripts for data warehouse ETL testing.
- Identify and troubleshoot data quality issues in ETL processes.
- Collaborate with cross-functional teams to ensure data integrity and accuracy.
- Optimize testing processes to improve efficiency and accuracy.
- Provide insights and recommendations for enhancing ETL processes.
Professional & Technical Skills:
Must-Have Skills:
- Proficiency in Data Warehouse ETL Testing.
- Strong understanding of SQL and database concepts.
- Experience with ETL tools such as Informatica.
- Knowledge of data warehousing concepts and methodologies.
- Hands-on experience in data validation and reconciliation processes.
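The validation and reconciliation work listed above can be sketched as a simple source-to-target comparison. The table name, columns, and data below are hypothetical, and SQLite stands in for the actual source and warehouse systems:

```python
import sqlite3

# In-memory databases standing in for a hypothetical source system and target warehouse.
src = sqlite3.connect(":memory:")
tgt = sqlite3.connect(":memory:")

for db in (src, tgt):
    db.execute("CREATE TABLE orders (order_id INTEGER, amount REAL)")

src.executemany("INSERT INTO orders VALUES (?, ?)",
                [(1, 10.0), (2, 25.5), (3, 7.25)])
# Simulate a load that dropped one row.
tgt.executemany("INSERT INTO orders VALUES (?, ?)",
                [(1, 10.0), (2, 25.5)])

def reconcile(db):
    """Row count plus an amount checksum: a common lightweight reconciliation."""
    count, total = db.execute(
        "SELECT COUNT(*), COALESCE(SUM(amount), 0) FROM orders").fetchone()
    return count, round(total, 2)

src_stats, tgt_stats = reconcile(src), reconcile(tgt)
print(src_stats, tgt_stats)  # counts and checksums differ, so investigate
print("match" if src_stats == tgt_stats else "mismatch")
```

Real reconciliations usually add per-column checksums and key-level diffs, but a count/sum comparison like this is often the first gate in an ETL test plan.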
Apply data engineering principles to develop reusable workflows covering ingestion, quality, transformation, and optimization. Build scalable data pipelines using extract, transform, and load (ETL) tools. Deploy solutions to production environments. Migrate data from legacy data warehouses using cloud architecture principles. Automate the flow of data for consumption.
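A minimal ingestion → quality check → transform → load flow of the kind described can be sketched as follows. The feed, quality rule, and tax derivation are illustrative assumptions, not from any specific project, and SQLite stands in for the warehouse:

```python
import csv
import io
import sqlite3

# Ingest: a hypothetical raw feed; in practice this is extracted from a source system.
raw = io.StringIO("id,amount\n1,100\n2,-5\n3,250\n")
rows = list(csv.DictReader(raw))

# Quality: reject rows that fail a simple rule (negative amounts).
clean = [r for r in rows if float(r["amount"]) >= 0]

# Transform: cast types and derive a new column (an assumed 10% tax).
for r in clean:
    r["amount"] = float(r["amount"])
    r["amount_with_tax"] = round(r["amount"] * 1.1, 2)

# Load: write to a warehouse table (SQLite standing in for Teradata).
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE sales (id INTEGER, amount REAL, amount_with_tax REAL)")
db.executemany("INSERT INTO sales VALUES (:id, :amount, :amount_with_tax)", clean)

loaded = db.execute("SELECT COUNT(*) FROM sales").fetchone()[0]
print(loaded)  # 2 rows survive the quality gate and are loaded
```

In a production pipeline each stage would be a reusable, scheduled unit (e.g. an Informatica mapping triggered by Control-M), but the stage boundaries are the same.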
Solution Environment
Artificial Intelligence (AI), SQL, Informatica PowerCenter, Teradata BI, Jira, Jenkins, WinSCP, PuTTY, Bitbucket, Control-M
Key Responsibilities:
• Data warehouse ETL design and development using Teradata and Informatica
• Understand ETL requirements and transform them into ETL design and code
• Need to have skills in Informatica
Technical Experience:
• 1 year of experience in ETL testing using Informatica and Teradata
• Good knowledge of Informatica, plus Unix/Python knowledge
• Good working knowledge of Agile methodology
Professional Attributes :
• Strong analytical skills
• Strong communication skills
• Ability to work under pressure
Analysis and prediction of Covid-19 trends using the Auto-Correlation Function (ACF) and Partial Auto-Correlation Function (PACF) to select orders for an ARIMA model, with results presented on a Tableau dashboard. Insights from the analysis focus primarily on confirmed, death, and recovered cases of coronavirus disease.
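The ACF used above for ARIMA order selection can be computed directly from a series. This pure-Python sketch uses made-up daily counts, not real Covid-19 data; in practice libraries such as statsmodels provide `acf`/`pacf` and ARIMA fitting:

```python
# Sample autocorrelation at lag k:
#   r_k = sum_t (x_t - m)(x_{t+k} - m) / sum_t (x_t - m)^2,  m = series mean
def acf(series, max_lag):
    n = len(series)
    m = sum(series) / n
    denom = sum((x - m) ** 2 for x in series)
    return [
        sum((series[t] - m) * (series[t + k] - m) for t in range(n - k)) / denom
        for k in range(max_lag + 1)
    ]

# Illustrative daily case counts (fabricated for the example).
cases = [10, 12, 15, 14, 18, 21, 20, 25, 27, 30]
r = acf(cases, 3)
print(round(r[0], 3))  # 1.0 by definition: a series is perfectly correlated with itself
```

The lag at which the ACF tails off (and the PACF cuts off) guides the choice of the AR and MA orders in the ARIMA(p, d, q) model.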