Mahesh D

Cloud Solution Manager
Bengaluru

Summary

Experienced Technical Project Manager with strong implementation and project management abilities. Highly organized, methodical, and skilled at overseeing daily milestones across high-performance teams. Well-versed in IT planning and deployment. Results-driven leader with a positive attitude and a passion for providing high-quality advice and guidance to clients. Proven ability to identify customer needs, resolve conflicts, and build strong relationships, with excellent problem-solving, communication, and interpersonal skills.

  • Over 13 years of experience in the development and implementation of data warehousing solutions using Informatica, Azure ADF, Azure Synapse, Data Quality, Snowflake, Oracle PL/SQL, OLTP, and OLAP, with expertise in data extraction, transformation, loading, reporting, and data analysis.
  • Responsible for designing, building, deploying, and maintaining data integration pipelines within the Azure Data Factory environment.
  • Strong data warehousing ETL experience with Informatica PowerCenter 8.6.1/9.1/9.6/10.1 client tools (Mapping Designer, Repository Manager, Workflow Manager/Monitor) and server tools (Informatica Server, Repository Server Manager).
  • Good knowledge of AI/ML and Gen AI.
  • Designed and developed data ingestion pipelines from CSV files and APIs into the different ADLS layers using Azure Data Factory (ADF V2); a minimal sketch follows this section.
  • Orchestrated data integration pipelines in ADF using activities such as Get Metadata, Lookup, ForEach, Wait, Execute Pipeline, Set Variable, Filter, and Until.
  • Provided production support for various Ab Initio and Informatica ETL jobs.
  • Configured the Ab Initio environment to load/unload data records from and to databases using the Input Table and Output Table components.
  • Extensive experience writing UNIX shell scripts and automating ETL processes with UNIX shell scripting.
  • Experienced with scheduling tools such as Autosys, UC4, and Control-M.
  • Worked extensively with slowly changing dimensions.
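
A minimal, illustrative sketch (not project code) of how a CSV-to-ADLS copy pipeline of the kind described above could be registered with the Azure Data Factory Python SDK (azure-mgmt-datafactory). The subscription, factory, dataset, and pipeline names are placeholders, and the referenced datasets are assumed to already exist in the factory.

# Sketch: register a CSV-to-Parquet copy pipeline in ADF. All names are placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    CopyActivity,
    DatasetReference,
    DelimitedTextSource,
    ParquetSink,
    PipelineResource,
)

SUBSCRIPTION_ID = "<subscription-id>"   # placeholder
RESOURCE_GROUP = "rg-dataplatform"      # placeholder
FACTORY_NAME = "adf-ingestion"          # placeholder

adf = DataFactoryManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

# One copy activity: delimited (CSV) landing dataset -> Parquet dataset in the ADLS raw layer.
# Dataset reference names are hypothetical; depending on SDK version the explicit
# type="DatasetReference" argument may be optional.
copy_to_raw = CopyActivity(
    name="CopyCsvToRawLayer",
    inputs=[DatasetReference(type="DatasetReference", reference_name="ds_landing_csv")],
    outputs=[DatasetReference(type="DatasetReference", reference_name="ds_adls_raw_parquet")],
    source=DelimitedTextSource(),
    sink=ParquetSink(),
)

pipeline = PipelineResource(activities=[copy_to_raw])
adf.pipelines.create_or_update(RESOURCE_GROUP, FACTORY_NAME, "pl_ingest_csv_to_raw", pipeline)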



Overview

15 years of professional experience

Work History

Cloud Solution Adviser

NTT DATA Global Delivery Services Pvt Ltd
12.2022 - Current
  • Overview: Ascot is a specialty insurance company with operations in the US, Bermuda, and London. It wants to apply technology to adopt a unified strategy across all of its business operations and provide cutting-edge solutions for atypical circumstances. The goal of this project is to modernize the Ascot Reinsurance Claim System (ARCS) to satisfy both technical and business requirements. The solution is constructed by gathering information from various sources, such as RI Security, warehouses, subscription services, and API applications; the data is then loaded into the ArcsV2 application via Microsoft Azure, where it is used by downstream subscription UI, analytics, and reporting.
  • Worked with various stakeholders to gather requirements and oversaw the project by guiding new developers, conducting code reviews, and establishing a structure for knowledge exchange within the team; also migrated on-premises data warehouses to Azure Synapse Analytics.
  • Integrated Azure Data Factory pipelines with other Azure services, including Azure Storage and Azure SQL Database, to enable end-to-end data workflows.
  • Designed and implemented scalable ETL pipelines in Azure Data Factory to process data from a variety of sources, such as SQL Server, REST APIs, and on-premises systems; automated pipeline scheduling and monitoring with ADF triggers and Azure Logic Apps; created and scheduled Databricks Jobs to automate workflows (see the run-and-monitor sketch below).
  • Monitored data pipeline performance, identified problems and bottlenecks, and tuned pipelines for greater effectiveness and performance as part of an Agile development process, participating in stand-ups, sprint planning, and retrospective sessions.
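
A minimal sketch of the run-and-monitor pattern referenced above, using the azure-mgmt-datafactory SDK. Resource and pipeline names are placeholders; in the actual project, runs were started by ADF triggers rather than ad hoc calls.

# Sketch: trigger an ADF pipeline run and poll its status. Names are placeholders.
import time

from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

SUBSCRIPTION_ID = "<subscription-id>"   # placeholder
RESOURCE_GROUP = "rg-arcs"              # placeholder
FACTORY_NAME = "adf-arcs-v2"            # placeholder

adf = DataFactoryManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

# Kick off the pipeline (normally done by a schedule or tumbling-window trigger).
run = adf.pipelines.create_run(RESOURCE_GROUP, FACTORY_NAME, "pl_load_claims", parameters={})

# Poll the run until it finishes; the surfaced status feeds monitoring and alerting.
while True:
    pipeline_run = adf.pipeline_runs.get(RESOURCE_GROUP, FACTORY_NAME, run.run_id)
    if pipeline_run.status not in ("Queued", "InProgress"):
        break
    time.sleep(30)

print(f"Pipeline finished with status: {pipeline_run.status}")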

Tech Lead/Developer

Genpact Headstrong Capital Market
11.2021 - 12.2022
  • Overview: The goal of this project is to offer a comprehensive analytical solution for ABB's intricate data structure covering orders, revenue, volume, and other factors. Developing the solution involves gathering vast amounts of data from multiple sources, loading the data into Microsoft Azure, processing and transforming it with Qlik Compose, creating complex mappings, and using Snowflake to build the Compose layer, which is used for analytics, self-service, and dashboarding (see the Snowflake access sketch below); a sophisticated dashboard supports reporting and analytics.
  • Designed and implemented scalable ETL pipelines using Azure Data Factory to process data from various sources, including SQL Server, REST APIs, and on-premises systems;
  • Facilitated end-to-end data workflows by integrating Azure Data Factory pipelines with other Azure services, such as Azure Storage, Azure SQL Database, and more
  • Actively participated in stakeholder communication and documentation
  • Prepared the release strategy and release plan for code movement to other environments
  • Conducted code reviews and ensured quality solutions were built
  • Participated in an Agile development process, contributing to sprint planning, stand-ups, and retrospective sessions
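
A minimal sketch of how a dashboard or self-service query might read an aggregate from the Snowflake Compose layer via snowflake-connector-python. The connection settings, table, and column names are hypothetical placeholders, not the project's actual objects.

# Sketch: query a hypothetical Compose-layer fact table in Snowflake.
import os

import snowflake.connector

conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    warehouse="ANALYTICS_WH",   # placeholder
    database="COMPOSE_DB",      # placeholder
    schema="COMPOSE",           # placeholder
)

try:
    cur = conn.cursor()
    # Example aggregate a self-service dashboard might request: revenue by order month.
    cur.execute(
        """
        SELECT DATE_TRUNC('MONTH', order_date) AS order_month,
               SUM(revenue)                    AS total_revenue
          FROM fact_orders
         GROUP BY 1
         ORDER BY 1
        """
    )
    for order_month, total_revenue in cur.fetchall():
        print(order_month, total_revenue)
finally:
    conn.close()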

DW-BI Developer

Genpact Headstrong Capital Market
01.2018 - 11.2021
  • Overview: CME Group Inc. (Chicago Mercantile Exchange & Chicago Board of Trade) is an American financial market company operating an options and futures exchange. It owns and operates large derivatives and futures exchanges in Chicago and New York City, as well as exchange facilities in London, using online trading platforms. It also owns the Dow Jones stock and financial indexes, and CME Clearing Services, which provides settlement and clearing of exchange trades. The exchange-traded derivative contracts include futures and options based on interest rates, equity indexes, foreign exchange, energy, agricultural commodities, rare and precious metals, weather, and real estate. It has been described by The Economist as "The biggest financial exchange you have never heard of."
  • Responsible for end-to-end implementation
  • Worked on Informatica Power Center tools- Designer, Repository Manager, Workflow Manager, and Workflow Monitor
  • Translated high-level design specifications into simple ETL coding and mapping standards
  • Developed mappings using Informatica and UNIX shell scripts, which expanded the flexibility of the project
  • Extracted data from flat files and other RDBMS databases into the staging area and populated the data warehouse
  • Maintained source definitions, transformation rules, and target definitions using Informatica Repository Manager
  • Used Type 1, Type 2, and Type 4 SCD logic with Snowflake Streams to update slowly changing dimension tables (see the Type 2 sketch below)
  • Used various transformations like Filter, Expression, Sequence Generator, Update Strategy, Joiner, Stored Procedure, and Union to develop robust mappings in the Informatica Designer
  • Developed mappings to load into staging tables and then to Dimensions and Facts
  • Created sessions and configured workflows to extract data from various sources, transform the data, and load it into the data warehouse
  • Modified existing mappings for enhancements of new business requirements
  • Used Debugger to test the mappings and fixed the bugs
  • Involved in Performance tuning at source, target, mappings, sessions, and system levels
  • Extracted data, performed data transformation, and created the data model in Power BI
  • Performed data load
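
A minimal sketch of the Type 2 slowly changing dimension pattern mentioned above, driven by a Snowflake stream on a staging table and executed through snowflake-connector-python. All table, stream, and column names are hypothetical; the project's real mappings handled more attributes and edge cases.

# Sketch: Type 2 SCD update from a Snowflake stream. All object names are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="<account>", user="<user>", password="<password>",   # placeholders
    warehouse="ETL_WH", database="EDW", schema="DIM",             # placeholders
)

statements = [
    "BEGIN",
    # Close out the current version of any customer whose tracked attributes changed.
    """
    UPDATE dim_customer
       SET is_current = FALSE,
           valid_to   = CURRENT_TIMESTAMP()
      FROM stg_customer_stream s
     WHERE dim_customer.customer_id = s.customer_id
       AND dim_customer.is_current
       AND (dim_customer.customer_name <> s.customer_name
            OR dim_customer.segment <> s.segment)
    """,
    # Insert a new current version for every new or changed customer in the stream.
    """
    INSERT INTO dim_customer (customer_id, customer_name, segment, valid_from, valid_to, is_current)
    SELECT s.customer_id, s.customer_name, s.segment, CURRENT_TIMESTAMP(), NULL, TRUE
      FROM stg_customer_stream s
    """,
    "COMMIT",  # the stream offset advances once the transaction commits
]

cur = conn.cursor()
for stmt in statements:
    cur.execute(stmt)
conn.close()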

ETL/BI Developer

Century Link Technology
07.2013 - 07.2017
  • Overview: The Century Link Company Data Warehouse (CCDW) is intended to serve as the single, official source for detailed billing and inventory information for the combined entities of Legacy Q and Legacy CTL
  • Responsible for Business Analysis and Requirements Collection
  • Translated high-level design specifications into simple ETL coding and mapping standards
  • Involved in building the ETL architecture and source-to-target mappings to load data into the data warehouse
  • Developed mappings using Informatica and UNIX shell scripts, which expanded the flexibility of the project
  • Extracted data from flat files and other RDBMS databases into the staging area and populated the data warehouse
  • Developed mappings to load into staging tables and then to Dimensions and Facts
  • Used existing ETL standards to develop these mappings
  • Created sessions and configured workflows to extract data from various sources, transform the data, and load it into the data warehouse
  • Used Type 1 and Type 2 SCD mappings to update slowly changing dimension tables
  • Modified existing mappings for enhancements of new business requirements
  • Used Debugger to test the mappings and fixed the bugs
  • Wrote UNIX shell scripts and pmcmd commands for FTP of files from remote servers and for backup of repositories and folders (see the wrapper sketch below)
  • Involved in Performance tuning at source, target, mappings, sessions, and system levels
  • Prepared migration document to move the mappings from development to testing and then to production repositories
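
A minimal sketch of the kind of automation the UNIX shell scripts above provided, shown here as a Python wrapper around the Informatica pmcmd CLI. The service, domain, folder, workflow, and credential values are placeholders.

# Sketch: start an Informatica workflow via pmcmd and wait for completion.
import subprocess

def start_workflow(workflow: str, folder: str = "EDW_LOAD") -> int:
    """Start an Informatica workflow with pmcmd and wait for it to finish."""
    cmd = [
        "pmcmd", "startworkflow",
        "-sv", "IS_EDW",        # integration service (placeholder)
        "-d", "Domain_EDW",     # domain (placeholder)
        "-u", "etl_user",       # user (placeholder)
        "-p", "REPLACE_ME",     # password (placeholder; use an env var or secret store)
        "-f", folder,
        "-wait",
        workflow,
    ]
    result = subprocess.run(cmd, capture_output=True, text=True)
    print(result.stdout)
    return result.returncode   # 0 indicates the workflow completed successfully

if __name__ == "__main__":
    raise SystemExit(start_workflow("wf_load_billing_dim"))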

ETL Developer

Century Link Technology
08.2011 - 07.2013
  • Overview: The purpose of the project is to extract data from different sources, including CRIS, IABS, LATIS, CUSTHUB, EXTERNAL, ACXIOM, etc.
  • Analysis of the specifications provided by the clients
  • Enhancement of the existing graph to the latest versions
  • Developed graphs using Ab Initio and UNIX shell scripts, which expanded the flexibility of the project
  • Unit testing of developed graphs
  • Prepare Release strategy and Release plan for code movement to other environments
  • Conducted code reviews and ensured quality solutions were built and best practices/standards were followed
  • Trained team members on Ab Initio to set up the team
  • Involved in Unit, Integration, System, and Performance testing levels
  • Migrated the code into QA (Testing) and supported QA team and UAT (User)

ETL Developer

Hewlett-Packard Global Soft Pvt. Ltd.
04.2010 - 05.2011
  • Overview: Citigroup is one of the largest commercial banks in the world. An initiative has been taken to build a data hub (DDI), which will source data from all source systems and store the data in a centralized place
  • Analyzing the Data Integration requirements and developing Ab-Initio graphs
  • Enhancement of the existing graph to the latest versions
  • Developing the autosys job to schedule the graph
  • Unit testing of developed graphs
  • Moving code to SIT (System Integration Testing)
  • Carry out the testing in SIT Environment

ETL Developer

Hewlett-Packard Global Soft Pvt. Ltd.
10.2009 - 04.2010
  • Transformation and delivery of real-time data from sources such as JMS topics, MQ queues, and batch files into a Common Message Model (CMM)
  • Persistence of messages into the ODS and files in the file server
  • Delivery of messages to JMS topics in XML format
  • Analyzing the Data Integration requirements and developing Ab-Initio graphs
  • Enhancement of the existing graph to the latest versions
  • Developing the autosys job to schedule the graph
  • Unit testing of developed graphs
  • Moving code to SIT (System Integration Testing)
  • Carry out the testing in SIT Environment

Education

Bachelor of Engineering - Instrumentation Technology

Visvesvaraya Technological University

Skills

ETL Tools: Microsoft Azure Data Factory, Azure Databricks, Azure Synapse, Informatica

Languages: SQL/PLSQL

Database: SQL Server, Oracle 9i/10g/11g

Operating System: Windows, UNIX (HP-UX/AIX), Linux

Application Software: Microsoft Excel, Microsoft PowerPoint


Timeline

Cloud Solution Adviser

NTT DATA Global Delivery Services Pvt Ltd
12.2022 - Current

Tech Lead/Developer

Genpact Headstrong Capital Market
11.2021 - 12.2022

DW-BI Developer

Genpact Headstrong Capital Market
01.2018 - 11.2021

ETL/BI Developer

Century Link Technology
07.2013 - 07.2017

ETL Developer

Century Link Technology
08.2011 - 07.2013

ETL Developer

Hewlett-Packard Global Soft Pvt. Ltd.
04.2010 - 05.2011

ETL Developer

Hewlett-Packard Global Soft Pvt. Ltd.
10.2009 - 04.2010

Bachelor of Engineering - Instrumentation Technology

Visvesvaraya Technological University