Vishal Zadoo

Tech Lead - Data Engineering
Jaipur

Summary

Data Engineer with 9+ years of experience designing and delivering scalable cloud data platforms across Snowflake, AWS, and modern ETL frameworks. Proven track record in building high-volume data pipelines (terabyte+ scale processing) and integrating 10+ diverse enterprise data sources to enable analytics, reporting, and data-driven insights. Skilled at SQL optimization, Python development, and data pipeline architecture with a focus on cost efficiency (25% Snowflake savings achieved). Experienced in leading teams, aligning stakeholders, mentoring engineers, and driving business impact through data.

Overview

9 years of professional experience
3 languages

Work History

Tech Lead - Data Engineering

Simplilearn
02.2025 - Current
  • Architected and delivered cloud-native, mission-critical data pipelines on Snowflake and Matillion ETL on an AWS VM, integrating heterogeneous data sources and supporting analytics at scale.
  • Partnered with cross-functional teams (business, marketing, product, post-sales, and analytics) to deliver enterprise-ready datasets fueling dashboards for enrollment tracking, learner behavior, course completion, and revenue analysis for CXO-level decision-making.
  • Optimized Snowflake compute and storage resources, reducing warehouse costs by ~25% through query optimization and clustering, enabling scalable growth during high-enrollment periods.
  • Facilitated Agile delivery processes to align engineering, analytics, and business teams, ensuring consistent delivery of high-impact data products that support evolving educational strategies.
  • Partnered with the talent acquisition team to hire data engineers and mentored a team of data engineers on best practices in pipeline design, SQL performance, and cloud governance.

SDE4 - Data Engineering

Simplilearn
10.2022 - 01.2025
  • Led integration of data from an acquired company, building a centralized Snowflake warehouse (multi-TB scale) to unify reporting across geographies.
  • Built ETL pipelines in Matillion ETL on AWS and Python to ingest data from 10+ enterprise systems (Salesforce, HubSpot, Ordway, Alchemer, PandaDoc, telephony systems, marketing platforms).
  • Designed and implemented a dedicated Snowflake database with well-defined schemas for data extraction, transformation, and reporting, streamlining data flow and governance.
  • Delivered scalable, reusable data models to standardize analytics and reporting, reducing pipeline delivery time by 40%.
  • Designed and implemented a lead-scoring data pipeline using Snowpipe, enabling near-real-time analytics on customer behaviour for the sales team.
  • Partnered with analytics teams to enable revenue forecasting, marketing attribution, and leakage analysis across global regions.

SDE3 - Data Engineering

Simplilearn
06.2021 - 09.2022
  • Wrote complex SnowSQL queries in Snowflake to build data pipelines using the Matillion ETL for AWS tool.
  • Extracted data from heterogeneous sources such as AWS RDS MySQL, PostgreSQL, Salesforce, and social media marketing platforms using APIs built in Matillion ETL.
  • Analysed and implemented JSON data ingestion pipelines to expand coverage of product-related learner behaviour data.
  • Created key milestone behaviour data pipelines (multi-TB scale) to model a consumer's journey through the product to completion, supporting the product innovation and learner success teams in better shaping products.
  • Implemented an hourly job-status notification system to keep the team apprised of current data pipeline status, replacing reactive monitoring with proactive monitoring.
  • Served as sole subject matter expert for all post-sales data ingestion pipelines; analysed and implemented B2B consumer reporting data.

Analyst Programmer

Fidelity International Limited
06.2020 - 06.2021
  • Worked on enhancing existing Informatica workflows by onboarding new security types and capping the data leakage for one of the major data vendors.
  • Successfully upgraded Informatica PowerCenter v10.1 to Informatica PowerCenter Cloud v10.4 by executing various test-case scenarios and coordinating with downstream teams.
  • Worked upon providing root cause analysis for various data issues to the business analyst team as part of Bloomberg Enhancement.
  • Developed and maintained SQL statements (DML/DDL), stored procedures, triggers, packages, and functions in Oracle.
  • Implemented enhancements for scheduling changes in the Control-M scheduling tool.
  • Performed unit testing and support to the QA team. Supported end-to-end testing for User acceptance and release deployment activities.

Associate Consultant

Capgemini
09.2016 - 03.2020
  • Worked as BoW Developer on implementation of various market regulatory changes for Cash Equity, FI & F&O applications as PL/SQL, Informatica developer.
  • Understood, analyzed and corrected different data quality issues in the existing data warehouse.
  • Implemented time-series for corrected data reporting of various parameters like MIFID, BBTICKER, etc.
  • Onboarded new match and merge logic, parking logic for the unprocessed records using PL/SQL by creating generic & configurable packages.
  • Connected with onshore business analysts/stakeholders to understand the functional requirements and created technical documentations for the team.
  • Worked on upgrading Informatica v9.5.1/ v9.6 to Informatica v10.1 by running two different versions of Informatica on a single server for data reconciliation purposes.
  • Debugged packages, procedures, and triggers; created and executed unit test cases. Performed unit testing and provided support to the QA team. Supported end-to-end testing for user acceptance.
  • Performed dry-run activities for final releases using UBS deploy and TeamCity.

Education

Bachelor of Technology (B.Tech) - Electronics and Communication Engineering

Jaipur Engineering College And Research Centre
Jaipur, India
07.2016

Skills

Cloud Data Platforms: Snowflake, AWS (S3, RDS), Matillion ETL on AWS VM

Accomplishments

  • Aggressive performance tuning leading to a 25% reduction in Snowflake usage credits.
  • Maintained a 98% success rate/uptime for data ingestion pipelines.
  • Designed and implemented modular ETL frameworks integrating 10+ diverse sources (HubSpot, Salesforce, Ordway, QuickBooks) into Snowflake.
  • Built a lead-scoring prediction model by analyzing our own LMS data to predict high-conversion leads.
  • Designed and implemented a one-touch dashboard representing a customer's journey from order through product-journey completion, supporting the customer success team and better product analysis.

Domain Sector

  • Edtech
  • Investment Banking
  • Wealth Management
  • Insurance - Term & Life
