Sandeep Meda

Summary

Data Engineering Manager with 18+ years of experience in the analysis, design, development, leadership, and management of Data Warehousing, Big Data, and AWS cloud platforms. Worked with clients in the banking, healthcare, manufacturing, telecom, aviation, CRE, and financial services industries.

Overview

18 years of professional experience
5 certifications
2 languages

Work History

Data Engineering Manager

Berkadia, a Berkshire Hathaway Company
03.2024 - Current
  • Drive design discussions and technology choices, and participate in design and code reviews.
  • Ensure data quality and data integrity through effective data pipelines.
  • Scale data solutions and oversee data security and compliance.
  • Manage crises, solve problems, and contribute to strategy and vision setting.
  • Break down projects into executable tasks, allocate work, and track progress through Azure DevOps.
  • Lead and mentor team members in system design and architecture.
  • Guide the team's technical roadmap and help transition the platform to Big Data and AWS cloud services.
  • Designed and implemented Retrieval-Augmented Generation (RAG) pipelines using vector databases to improve LLM factual accuracy and reduce hallucinations by 30–60%.
  • Integrated small language models (SLMs) into data pipelines to handle complex data integration issues.
  • Automated job-failure handling using LLMs.
  • Extensive experience using Cursor AI.


Technologies Used: PySpark, AWS CloudFormation, Glue, Power BI, Azure DevOps, Git


Technical Account Manager

Amazon
09.2020 - 03.2024
  • Manage data engineers, including vendor consulting resources, to ensure successful delivery.
  • Provide governance and subject-matter-expertise support for the GE Aviation platform.
  • Lead the design, deployment, and maintenance of a critical data migration from Greenplum to Redshift.
  • Advocate best practices and processes across the data engineering life cycle.
  • Design build plans, assist developers with failures, and manage software configurations.
  • Automate solutions for repeatable problems observed during migration and post-migration activities.
  • Develop and enhance PL/SQL scripts to migrate from Greenplum to Redshift.
  • Identify workarounds for specific issues and corner cases observed during migration.
  • Conduct tech talks for internal teams, partners, and customers.


Technologies Used: S3, Redshift, SCT, DMS, Lake Formation, Glue, PySpark, Starfish, Jenkins, Git


Data Engineering Manager

Wipro
02.2019 - 09.2021
  • Collaborate closely with cross-functional teams and translate business requirements into engineering implementation documents.
  • Break down projects into smaller tasks and sub-tasks, identifying the specific activities to be performed and the project's dependencies and constraints.
  • Lead and mentor team members in system design and architecture.
  • Help teams troubleshoot, find ways to move faster, and raise the quality bar.
  • Created complex AWS Glue jobs using PySpark scripting.
  • Created Glue Crawlers on S3 buckets and queried data in S3 using Athena.
  • Optimized query performance on Redshift tables using distribution keys, sort keys, and compression encodings.
  • Designed and implemented the CI/CD process.
  • Scheduled AWS jobs through Step Functions.
  • Migrated reporting from Power BI to Amazon QuickSight.


Technologies Used: AWS (EMR, S3, Glacier, Redshift, Lake Formation, Glue, CodeCommit, CodePipeline, QuickSight), PySpark, Power BI


Technical Architect

Genpact Headstrong Capital Markets
10.2017 - 02.2019
  • Designed and implemented an efficient method of data collection from multiple sources.
  • Processed complex nested JSON and XML data using the DataFrame API.
  • Transformed data and implemented business logic using AWS EMR.
  • Processed large volumes of data from Amazon S3 into Amazon Redshift.
  • Developed NiFi flows to connect to remote host servers and ingest data into HDFS and Kafka topics.
  • Developed a PySpark framework to read data from HDFS, apply business rules, and load the data into partitioned Hive tables.
  • Worked with the team to convert requirements into design and technical specification documents.


Technologies Used: Hive, Spark, NiFi, AWS (EMR, S3, Redshift)


Lead Technical Specialist

J P Morgan Chase
05.2015 - 10.2017
  • Processed large data sets and supported system applications using Spark.
  • Moved data between HDFS and RDBMS using Sqoop.
  • Processed data files in formats such as CSV, JSON, and XML using the DataFrame API and loaded them into Cassandra.
  • Hands-on with Informatica Data Quality: data profiling (Quick Profile, Custom Profile, Midstream Profiling, scorecard generation, and reference table creation) and web services.
  • Played a major role in understanding business requirements and loading data into the data warehouse.


Technologies Used: Informatica, Spark, Cassandra, Sqoop, Unix shell scripting, Oracle Exadata

Team Lead

Symphony Teleca (HARMAN)
01.2014 - 03.2015
  • Analyzed source systems for erroneous, duplicate, and data-integrity issues.
  • Defined system Trust and Validation rules for base object columns.
  • Configured search strategies, match rule sets, and match columns for the Match and Merge process, improving data quality through match analysis.
  • Created queries, query groups, and packages in the MDM Hub Console.
  • Reviewed and suggested changes to the ETL process prior to code migration.


Technologies Used: Informatica IDQ, MDM, Autosys, Unix Shell scripting, SVN and Oracle


Senior Consultant

Oracle Financial Software Services
07.2007 - 12.2013

ETL Developer

  • Client: Citi Bank, Singapore
  • Client: Mashreq Bank, Dubai


Education

Master of Science - Information Technology (M.Sc. in IT)

Alagappa University
04.2001 -

Bachelor’s Degree - Computer Science (B.Sc. in Computers)

S K University

Skills

Sprint Planning Tools: Azure DevOps, Jira and Asana

Certification

AWS Certified Solutions Architect – Associate
