
Ravikiran Patil

Pune

Summary

Experienced Senior Software Engineer specializing in Azure data engineering, with a proven track record of transforming enterprise reporting and analytics. Adept at analyzing functional requirements, developing SQL code, optimizing data workflows, and conducting thorough data validations. Strong background in Azure Data Factory, Azure Synapse, Azure Data Lake, and SQL Server, with hands-on experience in project management, client interaction, and delivering high-quality results.

Overview

4 years of professional experience

Work History

Software Engineer

Impetus Technologies
Pune
04.2022 - Current

Project 1: Manitoba Public Insurance

Role: Azure Data Engineer

Platform & Skills: Azure Synapse Analytics, PySpark, Azure Key Vault, Azure Data Lake Gen2, SQL Server, Azure DevOps

  • Ingested and transformed data from multiple sources, achieving a 30% improvement in data processing efficiency.
  • Gathered requirements from clients and implemented robust solutions, enhancing client satisfaction.
  • Ensured code correctness and optimized performance, reducing query response time.
  • Deployed code through Azure DevOps, achieving a 95% success rate in deployments.
  • Created PySpark notebooks to extract and transform data, storing it in ADLS Gen2 and loading it into SQL Server (a representative sketch follows this list).
  • Conducted data transformations using PySpark on Parquet files, significantly improving data accuracy and integrity.
  • Loaded final tables using SQL stored procedures, contributing to a 25% reduction in data latency.
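
The PySpark notebook pattern described in the bullets above, read Parquet from ADLS Gen2, transform, and push to SQL Server, might look roughly like the following minimal sketch. All paths, column names, server names, and credentials are hypothetical placeholders, not the project's actual code.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("mpi_claims_transform").getOrCreate()

# Read Parquet files landed in ADLS Gen2 (storage account, container, and dataset are placeholders)
claims = spark.read.parquet("abfss://raw@mystorageacct.dfs.core.windows.net/claims/")

# Example cleanup: normalize the date column and drop rows with non-positive amounts
cleaned = (
    claims
    .withColumn("claim_date", F.to_date("claim_date"))
    .filter(F.col("claim_amount") > 0)
)

# Persist the curated output back to ADLS Gen2 as Parquet
cleaned.write.mode("overwrite").parquet(
    "abfss://curated@mystorageacct.dfs.core.windows.net/claims/"
)

# Load the curated data into SQL Server over JDBC; in practice the password
# would be retrieved from Azure Key Vault rather than hard-coded
(cleaned.write.format("jdbc")
    .option("url", "jdbc:sqlserver://myserver.database.windows.net:1433;database=analytics")
    .option("dbtable", "dbo.claims_clean")
    .option("user", "etl_user")
    .option("password", "<key-vault-secret>")
    .mode("append")
    .save())
```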

Project 2: RAC

Role: Azure Data Engineer

Platform & Skills: Azure Synapse Analytics, PySpark, Azure Key Vault, Azure Data Lake Gen2, SQL Server, Azure DevOps

  • Ingested and transformed insurance data, leading to a 35% increase in data integration efficiency.
  • Gathered and implemented client requirements, boosting solution effectiveness.
  • Ensured code correctness and performance optimization, resulting in a 50% reduction in processing time.
  • Deployed code via Azure DevOps, maintaining a high deployment success rate of 97%.
  • Developed PySpark notebooks to extract, transform, and load data, enhancing data processing workflows.
  • Conducted data transformations on Parquet files, improving data quality and reducing errors by 30%.
  • Loaded final tables using PySpark notebooks and SQL stored procedures (see the sketch below).
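
One common way to trigger the final stored-procedure load from a notebook is pyodbc; the sketch below is illustrative only and assumes a hypothetical procedure name, server, and credentials.

```python
import pyodbc

# Connection details are placeholders; in practice the password would be read
# from Azure Key Vault rather than hard-coded.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=myserver.database.windows.net;"
    "DATABASE=analytics;"
    "UID=etl_user;PWD=<key-vault-secret>",
    autocommit=True,
)

cursor = conn.cursor()
# Hypothetical stored procedure that merges staged data into the final reporting tables
cursor.execute("EXEC dbo.usp_load_final_tables")
cursor.close()
conn.close()
```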

Sr. Software Engineer | Analyst

Capgemini India Pvt Ltd
Pune
02.2020 - 12.2022

Project: Hewlett Packard – Transforming Enterprise Reporting & Analytics (HP-TERA)

Role: Azure Data Factory Developer

Platform & Skills: Azure Data Factory, Azure Synapse, Azure Key Vault, Azure Data Lake Gen2, SQL Server, Visual Studio

  • Analyzed Functional Requirement documents, reducing requirement gathering time by 20%.
  • Developed and enhanced SQL code, improving data retrieval efficiency by 25%.
  • Created and optimized multiple Data Flows and Pipelines using ADF, achieving a 30% increase in data processing speed.
  • Utilized Azure SQL to develop complex metrics in Azure Synapse Analytics, enhancing reporting accuracy by 15%.
  • Conducted data validations and generated reports using AAS and Power BI, resulting in 30% more accurate insights.
  • Developed, optimized, and executed ADW SQL queries, reducing query run time by 40%.
  • Collaborated with project managers on estimates, timelines, and resource plans, improving project delivery timelines by 10%.
  • Organized client and business team calls, ensuring clear communication and support requirements, leading to a 15% improvement in client satisfaction.
  • Performed unit and regression testing, maintaining a 98% defect-free deployment rate.
  • Identified and tracked defects to closure, ensuring compliance with SLA and achieving 100% defect resolution.

Education

Bachelor's Degree

AISSMS IOIT, Pune
01-2019

HSC

Pad.Dr.D.Y.Patil ACS College
01-2015

Secondary School Certificate

Maharashtra State Board
01-2013

Skills

  • SQL
  • Azure Data Factory
  • ETL
  • Azure Synapse
  • PySpark
  • Stored Procedures

Projects

  • Hewlett Packard - Transforming Enterprise Reporting & Analytics (HP-TERA), HP-TERA is a cloud-based project that moves HP data to the Azure cloud platform, making extensive use of Hadoop, Hive, Spark, Talend, and Azure services. Data is handled in three layers: Ingestion, Distillation, and Consumption. Ingestion pulls data from different sources, cleanses it using Hive, and passes it to Distillation; Distillation preps the data and drops the files into Azure Data Lake Storage using Talend; Consumption loads data into Azure Synapse Analytics and Azure SQL Database through Azure Data Factory and builds the complex merge tables and metrics in SQL required for reporting. The final step is to create tabular models and generate reports per business requirements using Power BI.
  • Manitoba Public Insurance (MPI), MPI is a public insurance company aiming to improve its business through data insights. The project uses the Azure cloud platform for data storage, transformation, and cleaning. Data is sourced from the CRM system and the Celtic source, fetched using Azure Synapse Link and transferred to ADLS Gen2. PySpark notebooks clean and transform the data, which is then loaded into SQL Server, and Power BI generates reports and insights from the SQL Server tables. This enables MPI to make informed decisions and enhance its business operations.
  • RAC, RAC is an automobile insurance company aiming to enhance business operations through data insights on the Azure cloud platform. Data was sourced from D365 systems, APIs, XML sources, and various databases, ingested using Azure Synapse Pipelines, and transferred to Azure Data Lake Gen2 (ADLS Gen2). PySpark notebooks cleaned and transformed the data in Delta Lake, moving it through landing, raw, integration, and data product zones to ensure data quality and enrichment (a simplified sketch of the zone-to-zone flow follows this list). Processed data was stored in a SQL Serverless Pool, with SQL views created for reporting, and Power BI was used to generate interactive reports and insights. This solution improved data integration efficiency, enabled real-time analytics, and supported informed decision-making for the insurer.
  • Cloud Springboard Programme '22-23, built with Azure Data Factory, Azure Key Vault, Azure Data Lake Gen2, Azure Logic Apps, Azure SQL Database, and Power BI. The project involved ingesting daily pollution records from a government API; the data was cleaned, transformed, and then statistically aggregated.
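
As referenced in the RAC project above, a simplified PySpark sketch of one zone-to-zone step (raw to integration) on Delta Lake could look like this; the storage paths, table layout, and business key are assumptions for illustration, not the project's actual objects.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("rac_raw_to_integration").getOrCreate()

# Hypothetical ADLS Gen2 paths for the raw and integration zones
raw_path = "abfss://raw@racstorage.dfs.core.windows.net/policies/"
integration_path = "abfss://integration@racstorage.dfs.core.windows.net/policies/"

# Read the raw Delta table, deduplicate on an assumed business key,
# and enrich each row with a load timestamp
raw_df = spark.read.format("delta").load(raw_path)
integration_df = (
    raw_df
    .dropDuplicates(["policy_id"])
    .withColumn("load_ts", F.current_timestamp())
)

# Write the enriched data into the integration zone as Delta
integration_df.write.format("delta").mode("overwrite").save(integration_path)
```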

Accomplishments

  • Received the Rising Star award for the first half of 2021
  • Completed a specialization in Data Visualization with Tableau on Coursera
  • Certified in Python Programming Essentials on Coursera
  • Earned the Microsoft Certified: Azure Fundamentals (AZ-900) certification

Timeline

Software Engineer

Impetus Technologies
04.2022 - Current

Sr. Software Engineer | Analyst

Capgemini India Pvt Ltd
02.2020 - 12.2022

Bachelor's Degree

AISSMS IOIT, Pune

HSC

Pad.Dr.D.Y.Patil ACS College

Secondary School Certificate

Maharashtra State Board