7 years of experience in data engineering across retail, logistics, and pharmaceutical industries.
Skilled in designing and building ETL pipelines to support reporting, analytics, and cloud data migration.
Hands-on experience with tools and platforms such as Azure Data Factory, Azure Databricks, SQL Server, and BigQuery.
Comfortable working in Agile environments and collaborating with clients and cross-functional teams.
Overview
7 years of professional experience
1 Certification
Work History
Senior Data Engineer
Tredence Analytics
Pune
02.2025 - Current
Client: Retail Industry.
Built a data validation framework in Databricks using BigQuery as the source, ensuring accuracy during data ingestion and transformation.
Developed a PII masking framework based on existing Unity Catalog tables, applying role-based access to protect sensitive data.
Migrated datasets from BigQuery to Databricks, aligning with the client’s cloud modernization goals.
Handled the deployment of pipelines and frameworks using GitHub and Databricks Asset Bundles.
Used Databricks Workflows for orchestration and integrated with Control-M for production-grade job scheduling and monitoring.
Refactored legacy pipelines to improve performance, reliability, and ease of maintenance.
Created detailed technical documentation to support development, deployment, and client handover.
Collaborated closely with the client team for requirement gathering, progress updates, and ongoing support.
Azure Data Engineer
WNS Global Services
Pune
11.2022 - 02.2024
Client: Logistics Industry.
Implemented an ETL solution for the seamless migration and transformation of data into Azure storage.
Developed ETL pipelines in Azure Data Factory to move data smoothly from sources such as Azure PostgreSQL and Azure Data Lake into Azure Data Lake storage.
Maintained task-based execution and conducted post-deployment monitoring of the pipelines.
Created documentation outlining business requirements for seamless ETL integrations.
Data Engineer
Genpact
05.2022 - 11.2022
Client: Pharmaceutical Industry.
Developed launch trackers for the client's products.
Leveraged Spark SQL in Databricks to produce the necessary metrics for launch trackers.
Handled multiple business requests.
Associate Consultant
Capgemini
04.2018 - 05.2022
Client: Retail Industry.
Developed data pipelines with Azure Data Factory, Azure Databricks, and Azure Synapse Analytics, ensuring task-based execution and post-production deployment monitoring.
Integrated diverse data sources, including on-premises sources (Teradata, shared drives), Azure Cosmos DB, Azure Blob Storage, Azure DevOps, and Adobe Analytics, into Azure Data Explorer.
Used PySpark within Databricks for a brief period to perform data transformations.
Executed pipeline deployments through Azure DevOps.
Prepared detailed documentation delineating business requirements for ETL integrations.
Handled several applications, monitored jobs, and fulfilled customer requirements.
Education
B.E. - ET&T
SSGI
Bhilai, C.G.
05.2017
Senior Secondary
03.2013
Higher Secondary
03.2011
Skills
Azure Data Factory
Azure Databricks
SQL
Python
BigQuery
VS Code
Control-M
Certifications
AZ-900, Microsoft Azure Fundamentals
DP-900, Microsoft Azure Data Fundamentals
Databricks Certified Data Engineer Associate
Accomplishments
Awarded 'Firefighter' recognition by Capgemini for outstanding commitment and timely delivery during a high-pressure project
Received client appreciation for delivering end-to-end data solutions that closely aligned with business goals and exceeded expectations
Recognized with the 'Pat on the Back' award at Tredence for exceptional performance and successful delivery of key data engineering frameworks