I am an experienced Talend ETL, Snowflake, and DBT developer with 5 years in IT, 4 of them focused on ETL and data integration.
I have deep expertise in developing and maintaining ETL frameworks, managing data migrations from diverse sources, applying performance-tuning techniques, and using cloud services for data warehousing.
Skilled across multiple versions of Talend, I bring valuable experience from working in Agile environments and collaborating effectively with cross-functional teams.
I have extensive experience in ETL development, particularly with Talend, focusing on data extraction, transformation, and loading from diverse source systems.
My expertise includes creating complex mappings, pipeline orchestration, and leveraging cloud technologies like Snowflake and AWS. I excel in performance tuning and project management, and I am deeply committed to delivering high-quality data solutions.
Experienced in creating Snowflake objects such as databases, schemas, tables, stages, views, stored procedures, and file formats using SnowSQL. Expertise in working with Snowflake multi-cluster warehouses, Snowpipe, internal/external stages, stored procedures, cloning, tasks, and streams.
Experienced in bulk loading and unloading data in Snowflake tables, and in query performance tuning, cloning, and Time Travel.
Good exposure to Snowflake cloud architecture and Snowpipe for continuous data ingestion.
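A minimal SnowSQL sketch of this kind of object setup; every name here (demo_db, example-bucket, orders_raw, and so on) is illustrative rather than taken from a real project:

    -- Illustrative database and schema
    CREATE DATABASE IF NOT EXISTS demo_db;
    CREATE SCHEMA IF NOT EXISTS demo_db.staging;

    -- File format describing the incoming CSV files
    CREATE OR REPLACE FILE FORMAT demo_db.staging.csv_ff
      TYPE = CSV FIELD_DELIMITER = ',' SKIP_HEADER = 1;

    -- External stage pointing at an S3 bucket
    CREATE OR REPLACE STAGE demo_db.staging.s3_stage
      URL = 's3://example-bucket/landing/'
      FILE_FORMAT = demo_db.staging.csv_ff;

    -- Landing table plus a Snowpipe for continuous ingestion
    -- (AUTO_INGEST = TRUE also requires S3 event notifications to be configured)
    CREATE OR REPLACE TABLE demo_db.staging.orders_raw (
      order_id NUMBER, order_date DATE, amount NUMBER(12,2));

    CREATE OR REPLACE PIPE demo_db.staging.orders_pipe AUTO_INGEST = TRUE AS
      COPY INTO demo_db.staging.orders_raw
      FROM @demo_db.staging.s3_stage;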
Designed, developed, and maintained data models using DBT, and created and managed complex DBT transformations using SQL-based models.
Worked on data quality and testing: implemented data validation rules within DBT models.
Used DBT snapshots to capture historical data changes, ensuring accurate versioning of data over time.
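As a concrete illustration of the snapshot pattern, a minimal DBT snapshot might look like the following; the source and column names (crm.customers, customer_id, updated_at) are assumptions, not actual project code:

    {% snapshot customers_snapshot %}
    {{
        config(
          target_schema='snapshots',
          unique_key='customer_id',
          strategy='timestamp',
          updated_at='updated_at'
        )
    }}

    -- DBT compares updated_at across runs and versions changed rows
    select * from {{ source('crm', 'customers') }}

    {% endsnapshot %}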
Overview
5 years of professional experience
Work History
Senior Data Engineer
Tech Mahindra
Hyderabad
07.2023 - Current
Client Overview: Nissan Motors; project: One Data, D2P domain.
Responsible for all activities related to the development and implementation of a large-scale Snowflake cloud data warehouse.
Bulk loaded data from the external stage (AWS S3) into Snowflake using the COPY command.
Loaded data into Snowflake tables from the internal stage and from the local machine.
Adjusted Time Travel retention periods on tables, which in turn determines when data moves into Fail-safe. Used COPY, LIST, PUT, and GET commands for validating internal and external stage files, as sketched below.
Imported and exported data through both internal stages (Snowflake) and external stages (S3 buckets).
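A sketch of these staging and load commands, assuming a hypothetical orders table and s3_stage stage:

    -- Upload a local file to the table's internal stage
    PUT file:///tmp/orders.csv @%orders;

    -- Validate what is sitting in the stage before loading
    LIST @%orders;

    -- Load from the internal stage into the table
    COPY INTO orders
      FROM @%orders
      FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1);

    -- Or load directly from an external S3 stage
    COPY INTO orders
      FROM @s3_stage/orders/
      FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1);

    -- Unload (export) query results to the stage, then download them
    COPY INTO @%orders/export/ FROM (SELECT * FROM orders);
    GET @%orders/export/ file:///tmp/export/;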
Wrote complex Snowflake SQL scripts in the Snowflake cloud data warehouse to support business analysis and reporting.
Responsible for task distribution among the team.
Responsible for loading and unloading data from files as well as from the cloud.
Created a Snowpipe and performed operations such as zero-copy cloning, Time Travel, and the creation of tables, views, and functions. Handled troubleshooting, analysis, and resolution of critical issues.
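A sketch of the cloning and Time Travel operations in question; the orders table and the timestamp are placeholders:

    -- Zero-copy clone: metadata-only copy, storage is shared until data diverges
    CREATE TABLE orders_dev CLONE orders;

    -- Time Travel: query the table as it looked 30 minutes (1800 seconds) ago
    SELECT * FROM orders AT (OFFSET => -1800);

    -- Restore a past state by cloning the table as of a point in time
    CREATE TABLE orders_restored CLONE orders
      AT (TIMESTAMP => '2024-01-15 08:00:00'::TIMESTAMP_LTZ);

    -- Widen the Time Travel retention window
    ALTER TABLE orders SET DATA_RETENTION_TIME_IN_DAYS = 30;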
Involved in performance tuning of sessions, mappings, and SQL procedures.
Involved in data analysis and handled ad-hoc requests by interacting with business analysts and clients, resolving issues as part of production support.
Created UNIX shell scripts and SQL procedures to automate data loads.
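A minimal sketch of the kind of SQL procedure a shell script might call for load automation; the procedure name, stage, and table are assumed:

    -- Hypothetical procedure giving shell scripts a single entry point
    CREATE OR REPLACE PROCEDURE load_orders()
    RETURNS STRING
    LANGUAGE SQL
    AS
    $$
    BEGIN
      COPY INTO orders
        FROM @s3_stage/orders/
        FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1);
      RETURN 'orders load complete';
    END;
    $$;

    -- A UNIX shell script (e.g. cron-driven) can then run:
    --   snowsql -q "CALL load_orders();"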
Used AWS Glue Studio for visual ETL job design.
Built and optimized ETL processes using DBT, transforming raw data into
structured datasets for business intelligence, improving reporting speed.
Created automated DBT workflows to transform and load data from diverse sources into Snowflake, reducing manual intervention.
Developed robust DBT models for key business metrics, ensuring consistency across the reporting layer, and improving data reliability.
Implemented automated data quality checks within DBT, identifying and resolving data discrepancies early in the pipeline.
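As an illustration of these automated DBT data quality checks, a singular test is simply a SQL query that must return zero rows to pass; the model and column names (fct_orders, revenue) are hypothetical:

    -- tests/assert_revenue_non_negative.sql
    -- The test fails if any order carries negative revenue
    select order_id, revenue
    from {{ ref('fct_orders') }}
    where revenue < 0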
Integrated DBT with GitLab, automating data pipeline schedules and increasing operational efficiency across multiple teams.
Senior Data Engineer
LTIMindtree
06.2022 - 05.2023
Client Overview: National Hockey League, EDP project.
As part of the EDP project, the NHL maintains a data warehouse for team and player analysis. The project's objective was to migrate the existing data from Teradata to the Snowflake cloud while keeping the same sources.
Understood the requirements and developed the ETL design document used to build mappings.
Prepared a strategy for historical and incremental loading in the mappings.
Staged the data for the different source system codes.
Analyzed source system data, which helped in designing the ETL rules.
Designed and developed complex ETL mappings for Type 1 and Type 2 dimensions and complex fact tables.
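A simplified sketch of the Type 2 dimension load pattern; the tables (dim_customer, stg_customer) and the tracked column (address) are invented for illustration:

    -- Step 1: close out current rows whose tracked attributes changed
    UPDATE dim_customer
    SET end_date = CURRENT_DATE, is_current = FALSE
    WHERE is_current
      AND EXISTS (
        SELECT 1
        FROM stg_customer s
        WHERE s.customer_id = dim_customer.customer_id
          AND s.address <> dim_customer.address
      );

    -- Step 2: insert new versions and brand-new customers as current rows
    INSERT INTO dim_customer
      (customer_id, address, start_date, end_date, is_current)
    SELECT s.customer_id, s.address, CURRENT_DATE, NULL, TRUE
    FROM stg_customer s
    LEFT JOIN dim_customer d
      ON d.customer_id = s.customer_id AND d.is_current
    WHERE d.customer_id IS NULL;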
Prepared test cases and completed unit testing of the developed mappings. Coordinated ETL testing with business analysts and users.
Bulk loaded data from the external AWS S3 stage into Snowflake using the COPY command.
Created views and materialized views in the Snowflake cloud data warehouse for business analysis and reporting.
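An illustrative materialized view of the kind used for reporting; the schema (fact_goal_events and its columns) is hypothetical:

    -- Pre-aggregated metric; Snowflake maintains the view automatically
    CREATE OR REPLACE MATERIALIZED VIEW mv_player_goals AS
      SELECT player_id, season, COUNT(*) AS goals
      FROM fact_goal_events
      GROUP BY player_id, season;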
Responsible for all activities related to the development, implementation, and support of ETL processes for a large-scale Snowflake cloud data warehouse.
Analyzed ETL code and improved load performance by tuning or redesigning mappings.
Migrated code from the shared folder to the project folder.
Analyzed the defects identified by the project team and fixed them.
Used the Jira Agile project-management tool to plan, track, and manage project progress.
Associate Engineer
Capgemini
02.2020 - 03.2022
Client Overview: The National Employment Savings Trust is a defined contribution workplace pension scheme in the United Kingdom. It was set up to facilitate automatic enrolment as part of the government's workplace pension reforms.
The goal of this project was to build an OLAP warehouse for customer data.
Gathered requirements and implemented them as source-to-target mappings for staging and loading.
Interacted with business analysts to understand database requirements whenever debugging was required without changing the code.
Created source-to-target mapping documents from the staging area to the data warehouse, and from the data warehouse to various data marts.
Extensively identified and created reusable components and jobs using Talend Studio.
Designed an ETL pipeline to read data from a paginated API.
Loaded and tested data in Redshift tables using AWS S3.
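A sketch of that Redshift load step; the bucket, table, and IAM role are placeholders:

    -- Load a Redshift table from CSV files staged in S3
    COPY analytics.customers
    FROM 's3://example-bucket/customers/'
    IAM_ROLE 'arn:aws:iam::123456789012:role/redshift-load'
    FORMAT AS CSV
    IGNOREHEADER 1;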
Responsible for end-to-end data migration from RDBMS sources to the Snowflake cloud data warehouse; involved in integration testing.
Analyzed session error logs and used the debugger to test mappings and fix bugs in Dev, following change procedures and validation.
Documented mappings precisely in the ETL technical specification document at all stages for future reference.
Played a dynamic, multi-faceted role, taking on all challenges while also managing work on similar projects.
Product Cost Engineer at Tech Mahindra Ltd., India & Tech Mahindra (America) Inc., USA