Ramesh Maliki

Summary

Around 8 years of experience in Microsoft Azure cloud computing and ETL development. Good knowledge of Microsoft Azure components such as Azure Data Factory, Azure Databricks, Azure Data Lake Storage, Azure Blob Storage, Azure SQL Database, Logic Apps, Key Vault, PySpark, and Delta Lake. Experienced in building pipelines, datasets, and linked services for automated data integration and transformation in the cloud, including creating pipelines to process, transform, manage, and compute data in Azure Data Factory, and in scheduling and monitoring pipeline execution. Good experience with Azure Databricks and with PySpark for cleaning, transforming, and analyzing datasets. AZ-900 and DP-203 certified. Knowledge of Azure Synapse, Python, and Microsoft Fabric. Familiar with data warehousing using data extraction, data transformation, and data loading (ETL). Knowledge of Power BI Desktop and the Power BI service. Experience in Informatica and ETL testing.

Overview

  • 8 years of professional experience
  • 2 Certifications

Work History

Data Analyst

DXC Technology
04.2023 - 05.2024
  • Developed data pipelines using ADF to automate data movement between various sources and sinks
  • Configured data transformations and data cleansing within the pipelines
  • Scheduled and monitored pipeline execution
  • Performed complex data transformations and aggregations on large datasets
  • Utilized PySpark for cleaning, transforming, and analyzing datasets
  • Developed automated data pipelines using PySpark for data ingestion, transformation, and loading
  • Optimized Spark jobs for performance and scalability

Data Engineer

ITC Infotech
12.2021 - 01.2023
  • Participated in business requirement document walk-throughs to understand functionality
  • Extracted, transformed, and loaded data from source systems to Azure Data Storage services
  • Created pipelines in Azure Data Factory for batch job automation
  • Ingested data into Azure services such as Azure Data Lake, Azure SQL, and Azure Synapse using Azure Databricks
  • Used Spark-SQL for faster data processing and testing
  • Interacted with client teams to gain a better understanding of their deployment needs

Application Development

Accenture Solutions
12.2020 - 10.2021
  • Worked with Microsoft Azure components such as Data Factory to move data from on-premises databases to Azure databases via the Azure portal
  • Gathered requirements from the client and proposed improved solutions
  • Mapped source system data elements to the target system and developed, tested, and supported extraction, transformation, and load processes
  • Developed pipelines in Azure Data Factory using multiple activities
  • Integrated data from on-premises sources such as Oracle and Blob storage into Azure Data Lake Store (ADLS)
  • Created pipelines and datasets using activities for automated data integration in Azure Data Factory
  • Worked with multiple activities while building a pipeline framework to process the data
  • Created schedule-based, event-based, and tumbling-window triggers on an on-demand basis
  • Performed data validation and data quality checks after completion of data ingestion

Consultant

Capgemini
11.2016 - 11.2019
  • Participated in business process walk-throughs to understand functionality
  • Analyzed the transformation rules using the mapping rule document
  • Reviewed business and functional requirement specifications and documented test scenarios, test cases, test execution, defect management, and status reporting
  • Coordinated with different teams for end-to-end testing
  • Ran workflows and shell scripts to validate results
  • Involved in ETL validations including structure validations, count validations, constraint validations, duplicate-check validations, and data validations

Software Engineer

Extarc Software solution
09.2014 - 10.2016
  • Participated in business process walk-throughs to understand functionality
  • Analyzed the transformation rules using the mapping rule document
  • Reviewed business and functional requirement specifications and documented test scenarios, test cases, test execution, defect management, and status reporting
  • Coordinated with different teams for end-to-end testing
  • Ran workflows and shell scripts to validate results
  • Involved in ETL validations including structure validations, count validations, constraint validations, duplicate-check validations, and data validations
  • Attended weekly dev sync-up calls with development teams for project discussions and participated in defect triage calls
  • Ran test cases and reported defects through HP ALM
  • Participated in defect triage and QA status calls and coordinated with the dev team regarding defects on a daily basis

Education

Bachelor of Technology (B.Tech) - ECE

JNTUH

Skills

  • Azure Data Factory
  • Azure Databricks
  • Azure Data Lake Storage
  • Azure Blob Storage
  • Azure SQL Database
  • Logic Apps and Key Vault
  • PySpark
  • SQL

Certifications

  • AZ-900
  • DP-203
