Vishal Zadoo

Tech Lead - Data Engineering
Jaipur

Summary

Experienced in leading data engineering teams and managing development projects, with extensive experience at Simplilearn specializing in Snowflake, Matillion ETL, and Python. Successfully optimized data pipelines, reducing costs by 25%, while mentoring junior engineers and facilitating Agile processes. Proven ability to drive data-driven decisions and strengthen collaboration across teams, demonstrating strong technical and leadership skills.

Overview

9 years of professional experience
4 years of post-secondary education
3 languages

Work History

Tech Lead - Data Engineering

Simplilearn
Bangalore
02.2025 - Current
  • Led the architecture and delivery of mission-critical data pipelines using Snowflake, Matillion, and Python, enabling CXOs to make timely, data-informed decisions on learner engagement, program performance, and revenue optimization.
  • Mentored junior developers through regular 1-on-1 meetings, providing guidance on best practices, coding standards, and career growth opportunities.
  • Facilitated Agile delivery processes to align engineering, analytics, and business teams, ensuring consistent delivery of high-impact data products that support evolving educational strategies.
  • Streamlined collaboration with product, learner success, marketing, and analytics teams to deliver reliable, well-structured datasets fueling dashboards for enrollment tracking, learner behavior, course completion, and revenue analysis.
  • Optimized Snowflake compute and storage resources, reducing warehouse costs by ~25% through strategic refactoring and governance, enabling scalable growth during high-enrollment periods.
  • Partnered with the talent acquisition team to hire talented data engineers, trained junior team members, and ensured best practices were followed across the team.

SDE4 - Data Engineering

Simplilearn
Bangalore
10.2022 - 01.2025
  • Led the end-to-end integration of a newly acquired firm’s data systems, enabling unified reporting and visibility for cross-country leadership and stakeholders.
  • Designed and implemented a dedicated Snowflake database with well-defined schemas for data extraction, transformation, and reporting, streamlining data flow and governance.
  • Independently designed, developed, and deployed robust ETL pipelines using Snowflake and Python to integrate diverse data sources, including HubSpot, Ordway, Alchemer, PandaDoc, telephony systems (Five9, Zeta, Dialpad), and various marketing platforms.
  • Developed scalable and reusable data models to standardize reporting across business units, significantly reducing turnaround time for pipeline development and data delivery.
  • Collaborated closely with the analytics team to power critical initiatives such as revenue forecasting, marketing spend attribution, and leakage analysis across geographies, contributing to data-driven performance improvements.
  • Delivered integration projects on schedule with minimal post-deployment issues, receiving recognition from senior leadership for enabling effective post-merger analytics and reporting.

SDE3 - Data Engineering

Simplilearn
Bangalore
06.2021 - 09.2022
  • Acted as Scrum Master, managing day-to-day scrum tasks and creating JIRA strategies and stories; led a team of 4 data engineers.
  • Worked with domain experts, engineers and other data scientists to develop, implement and improve upon the existing system.
  • Worked with Snowflake, writing complex SnowSQL queries to build data pipelines in the Matillion ETL for AWS tool.
  • Sourced data from AWS RDS MySQL, PostgreSQL, Salesforce, and social media marketing platforms using APIs built within Matillion ETL.
  • Analysed and implemented JSON data ingestion pipelines to increase coverage of product-related learner behaviour data.
  • Created key-milestone behaviour data pipelines to predict a consumer's journey through the product to completion; this analysis supported the product innovation team in better shaping our products.
  • Created a Python script for hourly job status notifications to apprise the team of the current ETL pipeline status, replacing reactive monitoring with proactive monitoring.
  • Served as the sole subject matter expert for all post-sales data ingestion pipelines; analysed and implemented B2B consumer reporting data.
  • Partnered with the talent acquisition team to hire talented data engineers, mentored and trained junior team members, and ensured best practices were followed across the team.

Analyst Programmer

Fidelity International Limited
06.2020 - 06.2021
  • Worked on enhancing existing Informatica workflows by onboarding new security types and capping the data leakage for one of the major data vendors.
  • Successfully upgraded Informatica PowerCenter v10.1 to Informatica PowerCenter cloud v10.4 by executing various test case scenarios and coordinating with multiple downstream teams.
  • Provided root cause analysis for various data issues to the business analyst team as part of the Bloomberg enhancement.
  • Developed and maintained SQL statements (DML/DDL), stored procedures, triggers, packages, and functions in Oracle.
  • Implemented scheduling-change enhancements in the Control-M scheduling tool.
  • Performed unit testing and provided support to the QA team; supported end-to-end testing for user acceptance and release deployment activities.

Associate Consultant

Capgemini
09.2016 - 03.2020
  • Worked as a BoW developer, implementing various market regulatory changes for Cash Equity, FI, and F&O applications using PL/SQL and Informatica.
  • Understood, analysed and corrected different data quality issues in the existing data warehouse.
  • Implemented time-series for corrected data reporting of various parameters like MIFID, BBTICKER, etc.
  • Onboarded new match-and-merge logic and parking logic for unprocessed records using PL/SQL, creating generic and configurable packages.
  • Connected with onshore business analysts and stakeholders to understand functional requirements and created technical documentation for the team.
  • Worked on upgrading Informatica v9.5.1/v9.6 to Informatica v10.1 by running two different versions of Informatica on a single server for data reconciliation purposes.
  • Debugged packages, procedures, and triggers; created and executed unit test cases; provided support to the QA team; supported end-to-end testing for user acceptance.
  • Performed dry-run activities for final releases using UBS deploy and TeamCity.

Education

Bachelor of Technology (B.Tech)

Jaipur Engineering College And Research Centre
Jaipur
08.2012 - 07.2016

Skills

  • Snowflake

  • SnowSQL

  • Matillion

  • Python

  • Scrum Master

Accomplishments

  • Performance tuning leading to a 25% reduction in Snowflake usage credits.
  • Maintained a 98% success rate/uptime for data ingestion pipelines.
  • Built a lead-scoring prediction model by analysing JSON data from BigQuery.
  • Built a single dashboard representing a customer's journey from order through product journey completion.

Domain Sector

  • Edtech
  • Investment Banking
  • Wealth Management
  • Insurance - Term & Life

Personal Information

Notice Period: 60 Days

Timeline

Tech Lead - Data Engineering

Simplilearn
02.2025 - Current

SDE4 - Data Engineering

Simplilearn
10.2022 - 01.2025

SDE3 - Data Engineering

Simplilearn
06.2021 - 09.2022

Analyst Programmer

Fidelity International Limited
06.2020 - 06.2021

Associate Consultant

Capgemini
09.2016 - 03.2020

Bachelor of Technology (B.Tech)

Jaipur Engineering College And Research Centre
08.2012 - 07.2016