Professional Data Engineer with 8+ years of experience developing and implementing big data analytics solutions. Skilled in designing and delivering dynamic data pipelines across diverse business domains.
Overview
9 years of professional experience
Work History
Data Engineer
Blenheim Chalcot
04.2023 - Current
Designed and maintained scalable, fault-tolerant data pipelines leveraging GCP Pub/Sub, Cloud Storage, and Snowflake, ensuring efficient data ingestion and processing.
Reduced data ingestion latency by 99% through Snowflake Snowpipe streaming, Streams, and Tasks, enabling near real-time updates with automated CDC processes.
Migrated post-load transformations to dbt, creating modular SQL models, custom macros, and dynamic templates to improve pipeline efficiency and maintainability.
Enhanced Snowflake query performance by 40% through optimized partitioning, clustering, and materialized views, supporting high-concurrency workloads with multi-cluster warehouses.
Developed robust pipelines for parsing XML files from third-party applications into JSON and integrating the data into MongoDB, ensuring seamless updates and efficient data management.
Secured sensitive data using Snowflake Dynamic Data Masking, Row Access Policies, and implemented Time Travel for reliable data recovery and audit compliance.
Key Skills: Python, SQL, Snowflake, GCP, dbt, Airbyte, Docker, Data Modeling, ETL, MongoDB
Consultant Data Engineer
ZS Associates
09.2022 - 03.2023
Supported one of the firm's top pharmacy clients in transitioning their data warehouse ecosystem from Teradata to Snowflake, upgrading their infrastructure and fine-tuning data processing for optimal performance.
Collaborated with the Operational Analytics team to design, test, and maintain an agile, precise end-to-end ETL data management system.
Developed robust data pipelines on Hortonworks Data Platform (HDP) 2.0, directly empowering the pre-sales analytics team to uncover compelling, data-driven insights.