Madhvesh ML

Bangalore

Summary

12+ years of experience in Data Engineering, Data Analytics, and Data Visualization, working in both product companies and consulting organizations. Strong experience across domains including Semiconductor, Pharma, Digital Marketing, Electronics, Insurance, and Retail. Designed and built DWH architectures, data models, data marts, analytical platforms, and big data tech stacks to deliver robust data-driven technologies that support organizational growth. Designed and implemented architectures for Data Mesh, Data Fabric, data lineage, data security, and data cataloging at the organization level. Strong knowledge of cloud platforms such as AWS and Salesforce, and of ETL tools including Informatica, Matillion, AWS Glue, and DataStage. Built big data systems using Apache Atlas, Apache NiFi, Spark, PySpark, Kafka, and Hive. Used Python, SQL, and PL/SQL programming to build data-driven tech stacks. Worked with AWS services such as Redshift, S3, Glue, SNS, SQS, Lambda, and CloudWatch to build highly scalable, performant, low-latency systems. Good knowledge of cubes and BI tools such as SSRS and QuickSight. Led teams to create success stories across projects, helping management win bigger and better engagements.

Overview

12 years of professional experience
1 certification

Work History

Solution Architect

Siemens Technology And Services Private
01.2024 - Current
  • Guided and influenced existing partners on recommended upgrades and enhancements to integrated solutions.
  • Designed and implemented a data platform application built on Medallion and Lambda architectures, processing data into a data fabric (a centralized data repository) on AWS.
  • Worked with customers to develop integrated solutions on AWS using services such as Redshift, S3, Airflow, and Spark, and led detailed architectural discussions to facilitate delivery of comprehensive solutions.
  • Collaborated on ETL (Extract, Transform, Load) tasks, maintaining data integrity and verifying pipeline stability.
  • Installed, integrated, and deployed the centralized data platform product SHIP in client environments.
  • Supervised deployments and provided troubleshooting and user support.
  • Conducted research to evaluate systems design and process efficiency.
  • Executed application database upgrades, backups and restore duties.
  • Presented roadmap and technology infrastructure to customers, demonstrating deep familiarity with APIs, platform infrastructure, security and integration capabilities.

Lead Data Engineer

Jubilant FoodWorks LTD.
08.2022 - 12.2023
  • Designed and implemented an analytics and insights system for JFL, helping brands such as Domino's, Popeyes, Hong's Kitchen, and Dunkin' Donuts elevate their decision-making capability with data-driven methodologies.
  • Collaborated on ETL (Extract, Transform, Load) tasks, maintaining data integrity and verifying pipeline stability.
  • Built effective Data Mesh and data governance capabilities on domain data so that cross-functional teams can analyze the data they need.
  • Built insightful Data platform using AWS Services and Big data tech stacks to drive the marketing and sales of brands across JFL.
  • Proposed and enhanced the existing architecture, improving data efficiency by 40%, reducing cluster response latency, and speeding up cluster computation.
  • Led a team of data engineers to build a robust DWH and big data analytics platform on AWS in support of organizational growth.
  • Contributed hands-on to designing and developing ETL pipelines that streamline raw datasets from multiple sources (Amplitude, Google Analytics, SQL Server, MySQL, and flat files) into a single source of truth on AWS Redshift, using AWS Glue (Python, PySpark) and SQL/PL/SQL.
  • Built a Consumer 360 view on the Redshift DWH.
  • Implemented GDPR-style data protection techniques in data pipelines to safeguard customer data.
  • Built DWH models, data marts, and OLTP/OLAP systems on AWS Redshift.

Senior Associate Consultant

Eli Lilly Health Care
02.2020 - 08.2022
  • Designed and implemented Consumer Analytics Workbench, an end-to-end in-house cloud-based product.
  • Built the architecture for the digital media platform Consumer Analytics Workbench.
  • Led junior team members on the data engineering and data analytics fronts.
  • Built effective Data Mesh and Data Fabric layers, facilitated cross-product and cross-team access to domain data, and built a centralized data repository on Redshift.
  • Designed and developed ETL pipelines that streamline raw datasets from multiple sources (Google Ads, Google DoubleClick, Google Analytics, GA360, Facebook, SQL Server, MySQL, and flat files) into a single source of truth on AWS Redshift, using AWS Glue (Python, PySpark) and SQL/PL/SQL.
  • Set up DWH models, data marts, and OLTP/OLAP systems on AWS Redshift.
  • Designed data analytics applications on JupyterHub using Python and PySpark.
  • Designed data-driven applications using ETL, SQL, and the big data frameworks Spark and PySpark.
  • Integrated data using Glue across AWS S3 and AWS Redshift.
  • Used AWS for both application hosting and data storage: Redshift, EC2, and S3 for data; Glue, Lambda, and CloudWatch logs for ETL processing; SQS and SNS for notifications.
  • Built a data lake layer on S3 for storing raw datasets from multiple data sources.
  • Followed rigorous design standards, elegant code, and performance tuning with thorough unit tests.
  • Worked with code repositories, i.e., Git and Bitbucket.
  • Worked closely with Business and planning leadership to deliver targeted items.
  • Created high-level presentations and reports to outline technical review findings.

Senior Data Engineer

Epsilon
09.2018 - 01.2020
  • Designed and built the SaaS CRM product Agility Unite on AWS.
  • Built ETL pipelines using Matillion, Apache NiFi (PySpark), and SQL techniques to transform raw data from different vendors and sources for the data-centric application.
  • Used AWS services such as EMR, EC2, and Lambda to set up and host the SaaS application; used AWS Redshift and S3 for data storage.
  • Worked on Data governance and Data cataloging open-source tool called Apache Atlas.
  • Worked on setting up DWH models, Data mart, OLTP and OLAP system on AWS Redshift.
  • Data Integration using Matillion on AWS S3 and AWS Redshift.
  • Configuration of AWS services like EMR, EC2, SQS, SNS and defining Lambda.
  • Developed SCD DWH mappings in Matillion Cloud and Python, using SQL/Hive queries.
  • Wrote complex PostgreSQL, SQL, and Hive queries to build data pipelines.
  • Designed data flows using open-source tools such as Apache Atlas, NiFi, and Kylo.
  • Used JupyterHub and PySpark to build analytical dashboards and datasets that render business-centric reports, enhancing product value and informing business decisions.
  • Implemented European Union data protection (GDPR) compliance and encapsulation of customer data in ETL pipelines.
  • Verified rigorous design standards, elegant code, and performance tuning with thorough unit tests.
  • Worked with code repositories, i.e., Git and Bitbucket.
  • Defined architecture for data management systems based on needs of internal consumers.
  • Cleaned and processed raw data with different types of statistical software.
  • Delivered robust cloud-based solutions in collaboration with digital product managers.

Senior Data Engineer

Indecomm Global Service
09.2015 - 01.2018
  • Collaborated on ETL (Extract, Transform, Load) tasks, maintaining data integrity and verifying pipeline stability.
  • Set up a data workbench on Salesforce CRM for a healthcare organization (AHM).
  • Upgraded a legacy system to a cloud-based application on AWS (Sony India Electronics).
  • Migrated legacy systems built on MS Access to cloud-based applications on AWS.
  • Used AWS services such as EMR, EC2, and Lambda to set up the application; used AWS Redshift and S3 for data storage.
  • Involved in developing and testing Mappings on Informatica cloud.
  • Integrated data across ICS, AWS S3, and AWS Redshift; loaded data onto Redshift and S3 using ICS.
  • Integrated data between ICS and Salesforce; loaded data onto Salesforce using ICS.
  • Designed the data warehouse model on AWS Redshift and Salesforce.com.
  • Tuned SQL queries and set up databases on SQL Server and Oracle.
  • Wrote PL/SQL programs and functions.
  • Worked on slowly changing dimensions (SCDs).
  • Used PL/SQL programming and UNIX scripting to derive data and fetch the required records for reports and business analysis.
  • Implemented data protection regulations of European Union i.e., GDPR compliance and Data Encapsulation of customer data in ETL Pipelines on Salesforce.com.
  • Verified rigorous design standards, elegant code, and performance tuning with thorough unit tests.
  • Followed CI/CD methods using Git & Jira.
  • Collaborated with stakeholders on requirement gathering, design, and development presentations.
  • Worked with code repositories, i.e., Git and Bitbucket.

Software Engineer

Wipro Technologies
09.2013 - 09.2015
  • Specified requirements and supported software implementation, test case development, verification test execution, and certification.
  • Worked on Informatica PowerCenter 8.5 and 9.6.
  • Worked with databases including Oracle 8i and 11g.
  • Migrated applications from Informatica 8.5 to 9.6.
  • Created, coordinated, and conducted complex tests and debugged systems.
  • Drafted software requirements and managed design and testing processes.
  • Produced clean, error-free code to meet aggressive timelines.
  • Maintained software performance with regular updates.
  • Corrected errors in software designs during development and after installation.

Education

Bachelor of Science - Computer Applications Development

Sri Krishna Degree College
Bangalore
05.2013

Skills

  • AWS/GCP Data Services
  • ETL/ELT
  • Spark
  • Python
  • DWH (OLTP/OLAP), SQL/PLSQL & Data Lake
  • Apache tools: Atlas, Ranger, Iceberg, Airflow, Spark, Kafka
  • Big Data Processing
  • Component integration
  • Requirements gathering
  • Process evaluation
  • Database performance
  • Business process analysis

Certification

  • Informatica Cloud data modernization: cloud data warehouse and cloud data lake
  • Salesforce ADM 201

Awards

  • Best Functional Star Award
  • Best Team Player Award
  • Best Performer Award
