
Abhishek Dasgupta

Bengaluru

Summary

Results-driven Data Engineering Leader with 18+ years of experience building and scaling enterprise data platforms. Currently leading a team of 15 at Zoom, driving Go-To-Market data strategy with a strong focus on GenAI, VectorDB, and real-time fraud detection. Expert in data architecture, ingestion, transformation, and modeling (Data Vault, Dimensional, OBT) using AWS (S3, Lambda, Glue, Step Functions), Snowflake, and DBT. Proven ability to benchmark SQL databases, optimize data pipelines, and enhance data quality, operational efficiency, and analytics at scale. Passionate about leveraging AI-powered insights to drive business growth and ensuring seamless data integration across sales, marketing, and customer intelligence. Adept at balancing strategy and execution, delivering high-performance, scalable data solutions that empower organizations to make faster, data-driven decisions.

Overview

20 years of professional experience
1 Certification

Work History

Data Engineering Manager

Zoom
Bangalore Urban
05.2024 - Current
  • At Zoom, I lead a Data Engineering team of 15, driving the Go-To-Market data charter with a focus on enabling and democratizing data through GenAI and VectorDB
  • My work involves architecting and scaling data products using GenAI, AWS, Snowflake, and DBT, with a strong emphasis on data modeling (Data Vault, Dimensional, OBT, etc.)
  • I spearhead initiatives to enhance data quality, operational efficiency, and engineering excellence, ensuring seamless integration across sales, marketing, and customer insights platforms
  • Additionally, I play a key role in data ingestion, transformation, and pipeline optimization, leveraging AWS services (S3, Lambda, Glue, Step Functions) and AI-powered analytics to drive scalable, data-driven decision-making that fuels business growth

Senior Manager Data Engineering

New Relic, Inc.
Bangalore Urban
04.2023 - 01.2025
  • As a Senior Engineering Manager - Data at New Relic, Inc., I lead a high-impact team in India, spearheading the development of a resilient data platform
  • My expertise in DBT, Snowflake, AWS, Python, Spark, Kafka, and Tableau reporting enables me to craft efficient data pipelines and optimize reporting
  • Proficient in dimensional modeling and AWS services such as S3, I design scalable, high-performance data solutions
  • I also serve as a seasoned Scrum Master, applying Agile methodologies and JIRA to drive effective team collaboration and roadmap delivery

Director - Risk Data and Product

Paytm
Bengaluru
07.2022 - 04.2023
  • Led the development of real-time fraud detection systems using Kafka, Cassandra, Spark, and ELK, demonstrating expertise in cutting-edge technologies
  • Initiated the establishment of a data lakehouse to democratize data, enabling scalable, data-driven decision-making across the organization
  • Provided strategic guidance, fostering innovation, and continuous improvement
  • Successfully scaled high-performing teams, attracting top talent and driving employee development
  • Collaborated with stakeholders to align technical initiatives with business goals
  • Mentored and coached team members for professional growth
  • Proficient in a technology stack encompassing AWS, Apache Spark, Kafka, SQL, DBT, ELK, SageMaker, GitLab/GitHub, Airflow, and DataHub for comprehensive cloud-based data solutions

Sr Manager Data Engineering

Vistaprint
Bangalore Urban
11.2020 - 07.2022
  • Leading the data organization's transformation at Vistaprint, I spearheaded the migration from an on-premise Oracle data warehouse to resilient big data cloud solutions within the AWS ecosystem
  • Key achievements include the design and implementation of the 'Common Data Model,' a centralized repository empowering over 150 global members with comprehensive business data for data scientists, analysts, and marketers
  • Significantly reduced Snowflake usage costs through scaling strategies and query performance tuning
  • Enhanced data ingestion pipeline performance, improving SLA effectiveness from 30% to over 75%
  • Established robust data governance for quality and discoverability, facilitating data democratization, and actively contributed to ML pipeline development
  • Navigating challenges, I have addressed issues in data sourcing and unification within the data lake, implemented fine-grained access control for data governance, and monitored data quality
  • Achieved excellence in SLA and observability, efficiently processed and optimized large datasets in Spark jobs, and focused on enhancing user experience and reducing the learning curve

Big Data Specialist

LogMeIn
Bangalore
11.2016 - 11.2020
  • In a leadership role, I drive the development of a state-of-the-art big data platform on AWS, leading a team of 10 engineers
  • My responsibilities span goal-setting, appraisals, conflict resolution, and project delivery planning, with a focus on technology specialization
  • Managing teams across different geographies, I collaborate closely with business users to understand needs and deliver innovative solutions, fostering business growth
  • I actively contribute to project planning, priority setting, and ensuring timely delivery
  • Proven expertise in building data pipelines using Apache Spark, including SparkSQL, DataFrame, and performance tuning
  • Extensive work on the Hadoop ecosystem, utilizing tools such as Hive, PIG, Oozie, Presto, and Airflow
  • Specialized in modeling data and ETL pipeline implementation in big data and OLTP/OLAP spaces
  • Proficient in Python, data structures, problem-solving, and data visualization using Tableau, SAP BO, and Apache Druid
  • Hands-on experience in streaming applications using Spark Streaming, along with proficiency in the AWS tech stack, including S3, SQS, Kinesis, and EMR
  • Utilized CI/CD tools like GitHub and Jenkins
  • Extensive experience in Oracle PLSQL coding, data warehousing concepts, dimension modeling, and performance tuning
  • Managed Splunk infrastructure, configuring data collectors and indexers
  • Demonstrated understanding of machine learning techniques, highlighted by winning the Best Internal Innovation Award in 2017 at LogMeIn
  • I developed a machine learning model predicting up-sell and cross-sell opportunities, resulting in $4M+ new business opportunities for LogMeIn

Senior Software Engineer

Intuit
Bangalore
01.2013 - 07.2016
  • During my tenure in Intuit India's QuickBooks desktop cloud data services division from January 2013 to July 2016, I was instrumental in facilitating robust data analysis, modeling, and ETL processes on cloud platforms
  • Conducted insightful daily analyses of 40 to 50 TB of QuickBooks data sourced from desktop systems
  • Applied dimensional modeling techniques to enhance data modeling processes
  • Proficient in Hadoop ecosystems such as PIG, HIVE, etc., contributing to a versatile skill set
  • Expertise in data analysis, data profiling, and real-time data warehousing, ensuring dynamic insights
  • Demonstrated end-to-end proficiency in data warehouse development
  • Implemented PLSQL frameworks (e.g., File Loader, File Generator, Process Scheduler, Reconciliation, and Alert framework) for streamlined data warehouse development

Senior Reporting Engineer

Citrix Systems
Bangalore
03.2011 - 01.2013
  • Data professional with extensive experience in leading teams technically, dimensional modeling, and implementing end-to-end data warehouse solutions
  • Proven expertise in BI reporting, ETL processes, and optimizing data performance across diverse platforms

Senior Associate

Sapient Global Markets
Bangalore
01.2011 - 03.2011
  • Joined the Sapient Global Markets team
  • Worked with Oracle PL/SQL and Unix

Associate

J.P. Morgan
Bangalore
04.2005 - 12.2010
  • Worked as a senior PL/SQL developer and data modeler
  • Used BusinessObjects XI R3 to implement complex credit risk reports
  • Hands-on experience developing PL/SQL, stored procedures, and UNIX scripting
  • Exposure to critical production support environments

Senior Software Engineer

Accenture
Bangalore
09.2008 - 06.2009
  • Worked on PL SQL development in data warehouse environment
  • Data warehousing designing
  • Performance tuning of SQL code
  • Unix shell scripting
  • UAT & production support from onshore

Education

B.Tech - Information Technology

Kalyani University
01.2004

Skills

  • Spark, Snowflake, DBT
  • AWS, AI/ML, Python
  • GenAI, RAG, LLMs, VectorDB
  • Data modeling
  • Data governance
  • Team leadership

Certification

Academy Accreditation - Generative AI Fundamentals

Languages

  • English, Full Professional
  • Bengali, Native or Bilingual
  • Hindi, Limited Working

Awards

  • STAR Of Small Business Group
  • Spotlight Awards
  • Best Internal Innovation Award 2017 (Falcon)

Personal Information

Title: Data Engineering Manager

Patents

Intelligent Database Queue System

