Tarun Raj Sondi

Data Engineer
Bangalore

Summary

Data Engineer with 3.5+ years of experience in ETL/ELT development, data warehousing, and backend data troubleshooting. Proven expertise in Snowflake, DBT, SQL, Azure Databricks, and Python for orchestrating large-scale data workflows and analytics solutions. Strong understanding of dimensional modelling, CI/CD pipelines, and collaboration-driven project delivery across cloud platforms.

Overview

  • 4 years of professional experience
  • 4 years of post-secondary education
  • 1 certification

Work History

Data Engineer

Henkel Client
Bengaluru
09.2021 - Current
  • Migrated legacy data warehouse workflows to Snowflake, building modular and scalable DBT models for curated datasets.
  • Developed Git-based CI/CD workflows with DBT Cloud, enabling versioned, tested, and validated deployments of data models.
  • Designed and implemented Star Schema and Snowflake Schema for analytical workloads.
  • Created staging, intermediate, and mart layers using DBT models and macros.
  • Developed incremental models, tests (schema & data), snapshots, and seeds in DBT; a minimal model sketch follows this list.
  • Wrote SQL transformations tuned for performance on large datasets.
  • Handled data validation, deduplication, and audit mechanisms in Python and SQL.
  • Collaborated with analytics and BI teams for dashboard-ready model outputs.
  • Improved model refresh performance by 30% with partition pruning.
  • Implemented model-level testing and alerts for schema changes and data anomalies.
  • Enabled fully automated daily loads with zero manual intervention.
  • Collaborated on ETL (Extract, Transform, Load) tasks, maintaining data integrity and verifying pipeline stability.
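
For illustration, a minimal sketch of the DBT incremental-model pattern referenced above; the stg_orders source and the order_id and updated_at columns are hypothetical placeholders rather than details from the actual project.

    {{ config(materialized='incremental', unique_key='order_id') }}

    select
        order_id,
        customer_id,
        order_total,
        updated_at
    from {{ ref('stg_orders') }}

    {% if is_incremental() %}
    -- on incremental runs, only pick up rows newer than what is already loaded
    where updated_at > (select max(updated_at) from {{ this }})
    {% endif %}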

Data Engineer

ODS - Unisys Internal
09.2021 - 12.2022
  • Built a centralized ingestion framework to load structured and semi-structured data into Snowflake from multiple internal and external data sources including APIs, flat files, and databases.
  • Developed Python-based ingestion scripts to fetch and parse external JSON/CSV sources.
  • Designed Snowflake staging and raw layers for flexible schema onboarding.
  • Implemented reusable ingestion patterns with dynamic column mapping and null handling.
  • Scheduled data loads and validations using Python + Snowflake Tasks; a task sketch follows this list.
  • Created audit trails and control tables to manage pipeline metadata.
  • Built dashboards to track load status and SLA metrics using Power BI.
  • Reduced manual load failures by 80% by automating validation and error logging.
  • Created a reusable ingestion framework supporting 10+ data domains.
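
For illustration, a minimal sketch of scheduling a validation with a Snowflake Task, assuming hypothetical warehouse, task, and stored-procedure names not taken from the actual project.

    create task if not exists validate_daily_loads
      warehouse = ingest_wh                      -- hypothetical warehouse
      schedule  = 'USING CRON 0 6 * * * UTC'     -- once daily at 06:00 UTC
    as
      call raw_ingest.validate_loads();          -- hypothetical validation procedure

    alter task validate_daily_loads resume;      -- tasks are created suspended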

Education

Bachelor of Technology - Computer Science and Engineering

Lovely Professional University
01.2016 - 01.2020

Skills

  • Snowflake
  • DBT
  • SQL
  • Python
  • PySpark
  • Spark SQL
  • Azure DevOps
  • YAML pipelines
  • DBT models

Technical Skills

Snowflake, DBT, dimensional modelling, SQL, Python, PySpark, Spark SQL, Git, Azure DevOps, DBT Cloud, YAML pipelines, DBT models, scheduled triggers, GitOps workflows, data validation, reconciliation, metadata-driven pipelines, Power BI (basic), Azure (familiar)

Certification

Microsoft Certified: Azure Data Fundamentals (DP-900)
