Mahesh Ramaswamy Vulli

Hyderabad

Summary

Lead Data Engineer with over 13 years of experience — including 6 years in the USA — architecting and managing cloud-based data solutions and large-scale data pipelines. Proven track record in delivering innovative data products, enhancing platform observability, and driving successful outcomes through strong stakeholder engagement. Adept at Python programming and recognized for mentoring and leading high-performance engineering teams.

Overview

14 years of professional experience
1 Certification

Work History

Lead Data Engineer

LTIMindtree
Hyderabad
03.2024 - Current

Team: OpenDMP

  • Leading a team of Data Engineers to design and build scalable data pipelines for a Data Management Platform (DMP), enabling syndication to media partners such as The Trade Desk, Facebook, and TikTok
  • Architecting ingestion workflows that process user click-stream data from Tapad into the OpenDMP system, deriving attributes, segments, and audiences for downstream activation
  • Developed Looker dashboards to monitor pipeline performance and key campaign data points
  • Collaborating with SRE teams to instrument Spyglass metrics and implement Grafana-based observability dashboards and alerting
  • Driving delivery of customer-centric data products aligned with advertiser requirements

Senior Lead Engineer - Data

Quest Global
Hyderabad
10.2023 - 03.2024

Team: Airflow (WFO)

  • Served as Airflow administrator within the Ad-Platforms domain, managing multiple environments and ensuring high availability and operational stability
  • Led onboarding of new customers to Apple’s custom Airflow platform, tailoring configurations to meet specific business requirements
  • Managed and maintained Apple’s customized Airflow codebase, ensuring alignment with evolving functional and technology needs
  • Configured and monitored end-to-end observability using Airflow alert dashboards, proactively identifying and resolving issues
  • Provided expert support to customers and internal data engineering teams on Airflow deployments, CI/CD pipelines, repositories, and troubleshooting

Lead Data Engineer

Miracle Software Systems
Dearborn
01.2023 - 10.2023

Teams: Data Factory – Data Lake & Data Warehouse, Customer 360, Customer Experience

  • Led a team of Data Engineers in building Data Warehouses, Data Marts, and customer-facing products on GCP (BigQuery)
  • Designed and deployed data pipelines to ingest and transform large-scale datasets into Data Lake and Data Warehouse environments
  • Built scalable ETL workflows using GCP services to enable high-performance analytics across diverse data sources
  • Established stakeholder notification mechanisms for BigQuery and Airflow DAGs using email and Webex alerts
  • Conducted code reviews, mentored engineers, and ensured adherence to best practices for DAGs and BigQuery development
  • Implemented Infrastructure-as-Code approach by developing Terraform repositories for GCP resource provisioning
  • Integrated CI/CD pipelines with GitHub (Terraform) for automated deployment, incorporating FOSSA and SonarQube scans for security and compliance

Lead Data Engineer

Bed Bath and Beyond
Union
07.2022 - 12.2022
  • Designed and supervised data pipeline architecture within the Data & Analytics team
  • Led code reviews and mentored junior engineers to ensure adherence to development best practices
  • Developed end-to-end, scalable data pipelines using Apache Airflow, PySpark, and Beam
  • Managed migration of legacy Teradata workflows and objects into BigQuery on GCP
  • Implemented data quality checks on ingested datasets to ensure accuracy and reliability

Lead Data Engineer

Equifax Inc
Alpharetta
01.2022 - 07.2022
  • Developed Python-based credit score models and attributes within an Apache Beam framework
  • Led and mentored an offshore team to translate complex business requirements into scalable data models
  • Containerized models and deployed them on Google Kubernetes Engine (GKE) for production use
  • Built PySpark and Apache Beam jobs to enable distributed processing of large datasets
  • Implemented robust data quality checks on ingress data prior to loading into BigQuery
  • Provided application support for debugging and troubleshooting deployed models
  • Conducted code reviews to ensure maintainability, performance, and adherence to best practices

Senior Data Engineer

Cardinal Integrated Technologies
Union
08.2021 - 01.2022
  • Designed, developed, and scheduled data pipelines using Apache Airflow (Cloud Composer)
  • Served as code owner, conducting peer reviews and enforcing development best practices
  • Ingested data from diverse external sources into Google Cloud Storage and BigQuery
  • Developed custom Airflow operators to enable seamless integration with external APIs
  • Led migration of legacy Teradata tables and stored procedures into BigQuery
  • Built ETL/ELT pipelines using Nexla to streamline and optimize data workflows

Lead Data Engineer

Tata Consultancy Services
Alpharetta
08.2020 - 08.2021
  • Designed and implemented Python models for precise calculation of consumer credit scores
  • Migrated legacy data flows into modern data processing frameworks
  • Led a team of five developers as project lead
  • Deployed data processing jobs across multiple platforms
  • Participated in architecture reviews for BigQuery Data Warehousing
  • Evaluated project requirements and facilitated communication with stakeholders
  • Developed ad hoc scripts to execute the models in support of audit activities

Lead Data Warehouse Engineer

Tata Consultancy Services
New York
01.2018 - 08.2020
  • Designed and built an enterprise data warehouse along with robust ETL processes
  • Performed data modeling to shape the structure of the Enterprise Data Warehouse (EDW)
  • Created and modified database objects to support evolving business needs
  • Developed ETL testing programs in Python to guarantee the integrity of ETL processes
  • Designed and implemented ETL processes using Informatica and Pentaho Data Integration
  • Performed data transformations using a combination of SQL, Python, and ETL tools
  • Optimized queries and tuned ETL performance for faster, more reliable workloads

Data Warehouse and Backend Developer

Tata Consultancy Services
Hyderabad
11.2011 - 12.2017
Client: Moody’s
Project: Ratings applications, Single Ratings Data Source, One Ratings platform

  • Enhanced and maintained the backend databases supporting the Ratings applications
  • Executed maintenance and enhancement change requests
  • Collaborated with Business Analysts to gather project requirements
  • Worked extensively with SQL Server and T-SQL
  • Created backend APIs in Python and Java to enable seamless data access
  • Designed and implemented shell and Python scripts to automate critical data processing and report generation tasks
  • Scheduled and monitored batch jobs using Control-M

Education

Bachelor of Technology - Computer Science and Engineering

Koneru Lakshmaiah College of Engineering
Andhra Pradesh, India
01.2011

Skills

  • Python and Java programming
  • Cloud solutions and services
  • Data pipeline architecture
  • ELT and data integration
  • Database management systems
  • Big data processing
  • Technical documentation
  • Project coordination and leadership
  • Effective communication skills
  • Stakeholder engagement strategies

Certification

  • Google Cloud Platform (GCP) certified Professional Data Engineer, https://google.accredible.com/224ec58a-f415-40fb-a041-9f797906f49c
  • Google Cloud Platform (GCP) certified Associate Cloud Engineer, https://google.accredible.com/ec242e67-7147-49d2-a715-6aaa1917c8ee
  • Microsoft certified Technology Associate in Python programming, https://www.credly.com/badges/ebfc98b0-802c-451e-a881-18f1f2a78bbb/public_url
  • IBM certified Database Associate DB2

Awards

  • Ford appreciation: Modernizing Everywhere
  • TCS “CLP Faculty” award twice
  • TCS “On the spot” award
  • TCS “Applause” award
