
Ramakant Chaturvedi

Summary

14+ years of experience in the IT industry, specializing in the Banking, Retail, Insurance, and Healthcare domains, with a focus on Azure Data Engineering for the last 4 years. Proficient in MS SQL, Azure Data Factory, Azure Databricks, Azure Data Lake Storage, and Azure SQL. Extensive experience working with clients in the US, UK, and Germany. Demonstrated ability in computational and analytical problem-solving, with a proven track record of working independently. Self-motivated professional with strong interpersonal skills, recognized for adaptability, problem-solving, and driving successful project outcomes.

Overview

15 years of professional experience

Work History

Module Lead

Mphasis Limited
03.2024 - Current
  • Designed the data lake architecture using the modern Medallion architecture on the Azure cloud platform
  • Produced flexible data models on Azure SQL Data Warehouse databases to redesign the operational data warehouse reporting platform
  • Developed metadata-driven Data Factory pipelines to ingest data from different source systems
  • Designed and developed reusable Data Factory pipelines to load dimension tables
  • Designed and developed Data Factory pipelines to load fact tables
  • Designed and developed Data Factory pipelines to process semi-structured data in the data lake Silver layer
  • Designed and developed Data Factory pipelines to integrate data and load it into the data lake Gold layer
  • Designed reusable data ingestion pipelines using Azure Databricks notebooks
  • Developed parameter-driven Databricks notebooks to ingest data from multiple source tables
  • Developed reusable Databricks PySpark notebooks to create Delta Lake tables for all source datasets
  • Developed Spark SQL scripts to implement CDC (Change Data Capture) mechanisms in Azure Databricks notebooks (see the sketch after this list)
  • Developed Databricks Spark SQL scripts to populate reporting dimension and fact tables
  • Developed Databricks PySpark and SQL scripts to transform source data and publish it into the newly designed data warehouse
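
A minimal sketch of the Delta Lake CDC merge pattern referenced in the CDC bullet above, written as it might appear in a Databricks notebook; the table names and join key (silver.customer_updates, gold.dim_customer, customer_id) are illustrative assumptions, not the actual project objects.

  # Hypothetical CDC merge: apply changed/new rows from a Silver table into a Gold Delta table.
  # `spark` is provided by the Databricks notebook runtime.
  from delta.tables import DeltaTable

  updates = spark.table("silver.customer_updates")          # incoming changed rows (assumed name)
  target = DeltaTable.forName(spark, "gold.dim_customer")   # existing Delta table (assumed name)

  (target.alias("t")
      .merge(updates.alias("s"), "t.customer_id = s.customer_id")
      .whenMatchedUpdateAll()      # update rows whose keys already exist
      .whenNotMatchedInsertAll()   # insert rows with new keys
      .execute())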

Technical Architect

HCL Technology Pvt Limited
04.2022 - 12.2023
  • Created Azure Data Factory pipelines to pull data from an on-premises Oracle database and Azure SQL DB into Azure Data Lake Storage as a daily full load into corresponding folders
  • Prepared queries to ensure source and target data loaded into the mart as per user requirements, and performed data testing to ensure all system components worked according to the mapping logic
  • Designed and implemented the delta load process to transfer only changed or new data from source to destination, reducing processing time and network bandwidth consumption (see the sketch after this list)
  • Extracted data from various sources such as MS SQL Server, manual files, and Blob Storage
  • Experienced in transforming and loading data into Delta tables in Azure Databricks
  • Good understanding of storing and reading data in cloud storage such as Azure Blob Storage
  • Worked as a project lead to integrate various aspects of the project team
  • Mentored junior team members
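
A minimal sketch of the watermark-style delta load described above, assuming a hypothetical last_modified column and a stored high-water mark; the actual pipeline was built in Azure Data Factory against an on-premises Oracle source, so this PySpark version is illustrative only.

  # Hypothetical incremental (delta) load: copy only rows changed since the last run.
  # `spark` is provided by the Databricks notebook runtime.
  last_loaded = "2023-11-30T00:00:00"        # previous high-water mark (e.g. from a control table)

  incremental = (
      spark.read.table("source_db.orders")               # assumed source table name
           .filter(f"last_modified > '{last_loaded}'")   # keep only changed or new rows
  )

  # Append the changed rows to the destination Delta table; the watermark is advanced afterwards.
  incremental.write.format("delta").mode("append").saveAsTable("lake.orders_delta")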

Technical Lead

Wipro Technology Pvt Limited
03.2021 - 04.2022
  • Understood the business requirements, coordinating with business analysts and core team members to gather specific requirements for application development/migration and define data conversion requirements
  • Played a key role in establishing connectivity for the Snowflake/S3 connector within DataStage on the AWS cloud platform
  • Analyzed existing conversion processes and identified/addressed problems impacting the quality of data conversions and implementations
  • Set up the EMFT (secure file transfer) environment to send and receive files from different sources
  • Liaised with business areas and SMEs to articulate and define high-level requirements and challenges where necessary
  • Worked with the IIS version migration tool and JFrog to deploy and migrate code from V9.1 to V11.7
  • Added SG (security group) entries in Bitbucket on AWS so that the InfoSphere infrastructure could communicate with the DB and file servers
  • Ported PowerShell and Windows scripts for compatibility with Linux servers
  • Built and modified InfoSphere MDD for the on-premises server and adapted it to new requirements for HC (Hybrid Cloud) compatibility
  • Built scripts for the BMC and Dynatrace tools to monitor applications and infrastructure
  • Participated in daily scrum calls to update the customer on project status
  • Facilitated team members as per their requirements
  • Designed and developed parallel jobs using DataStage Designer
  • Created test plans as per the requirements
  • Performed unit testing on developed jobs to ensure they met the requirements
  • Worked as a project lead to integrate various aspects of the project team
  • Mentored junior team members
  • Created estimates for the end-to-end ETL process

Consultant

Capgemini Technology Services Pvt Limited
03.2016 - 03.2021
  • Worked with application and database support teams to maintain systems and databases that support current data scenarios and easily adapt to changing business needs
  • Used the parallel extender extensively, working with processing, development/debug, file, and real-time stages
  • Involved in all phases, including requirement analysis, design, coding, testing, support, and documentation
  • Extensively used DataStage Designer to develop processes for extracting, transforming, integrating, and loading data from various sources into the data warehouse database
  • Extensively used QualityStage stages such as Standardize, Match Frequency, Duplicate and Unduplicate Match, and Survive
  • Widely used DataStage stages such as Modify, Sequential File, Copy, Aggregator, Surrogate Key, Transformer, Dataset, Lookup, Join, Remove Duplicates, CFF, Sort, Column Generator, CDC, and Funnel
  • Used database stages such as DB2/UDB Enterprise, DB2 Bulk Load, DRS, Sybase, and ODBC Connector
  • Designed and developed the process for real-time movement of reference data from Sybase to DB2 using a mix of DataStage jobs
  • Developed Information Services stages (WISD) for real-time data movement between DB2 and Sybase
  • Generated an SOA service from a DataStage/QualityStage job and deployed and managed the services using the IBM console to receive service requests
  • Worked with data modelers, technical architects, customers/end users, business analysts, and data analysts to design technical specification and mapping documents

Associate Project

Cognizant Technology Solutions India
02.2010 - 03.2016
  • Worked with business analysts to identify and develop business requirements, transformed them into technical requirements, and was responsible for deliverables
  • Implemented Join, Lookup, Aggregate, Filter, Rank, and Update Strategy transformations
  • Worked towards optimal performance when using stages such as Lookup, Join, and Merge
  • Extensively worked with DataStage Designer and Director to load data from source extract files into the warehouse
  • Provided staging solutions for data validation and cleansing with PL/SQL and DataStage ETL jobs
  • Developed various jobs using database connectors, Aggregator, and Sequential File stages
  • Designed and developed DataStage jobs for loading staging data from different sources such as files, Oracle, and SQL Server DB into the data warehouse, applying business rules covering data loads, data cleansing, and data massaging
  • Created a shared container to simplify the DataStage design and used it as a common job component throughout the project
  • Tuned DataStage jobs for better performance by creating DataStage lookup files for staging the data and lookups
  • Monitored scheduled jobs using the Control-M tool

Education

B. Tech - CSE

VBS Purvanchal University
Jaunpur, U.P

Skills

Cloud & Data Engineering

Azure Services: Azure Data Factory (ADF), Databricks, Delta Lake, Azure SQL Database, Azure Blob Storage

AWS Services: S3, Snowflake, EMFT, AWS DataStage

Data Warehousing & Modeling: Star Schema, Snowflake Schema, Data Lakehouse

Programming & Scripting

PySpark, Python, SQL, Unix/Linux Scripting

ETL & Data Integration

IBM DataStage (9 years), Control-M, Autosys

ETL pipeline development, Data Cleansing & Transformation

Project & Issue Tracking

Jira, ServiceNow, Bitbucket, Dynatrace

Certifications & Training

  • Microsoft Certified: Azure Data Engineer Associate, Microsoft, 03/01/25
  • Databricks Certified Associate Developer, Databricks, 04/01/25
