Ramesh K

Summary

Results-driven professional with nearly 9 years of experience leveraging diverse technologies within the Azure cloud ecosystem, including Databricks, PySpark, Synapse Analytics, and Azure Data Factory. Expertise in building and optimizing data pipelines using Data Lake Store Gen2, Azure SQL, and Synapse SQL Pool, complemented by strong proficiency in Python and Spark for performance enhancement and data transformations. Proven track record of delivering high-quality solutions that drive efficiency and support data-driven decision-making. Committed to continuous learning and staying abreast of emerging technologies to deliver innovative results.

Overview

9 years of professional experience
1 Certification

Work History

Lead Data Engineer

TEK Systems India Pvt Ltd
03.2024 - Current
  • Executed data transformation workflows in Azure Data Factory, using Pipeline, Get Metadata, and Filter activities against Azure SQL Server.
  • Loaded data from SAP systems into Data Lake and Azure SQL tables using Azure Data Factory.
  • Managed data transfer from ADLS to data frames via mount points, creating notebooks for efficiency.
  • Authored PySpark and Python scripts for effective data transformations in Azure Databricks.
  • Extracted JSON data, converting it into parquet format for optimized storage solutions.
  • Applied diverse data transformation techniques, encompassing joins and window functions.
  • Utilized delta format conversions from parquet and implemented merge logic for incremental loads (see the sketch after this list).
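
A minimal PySpark sketch of the parquet-to-delta conversion and merge-based incremental load pattern described above; the paths and the order_id key are illustrative assumptions, not project specifics.

    # Sketch: convert historical parquet data to Delta, then merge an incremental batch.
    # Paths and the join key (order_id) are illustrative assumptions.
    from pyspark.sql import SparkSession
    from delta.tables import DeltaTable  # requires the delta-spark package

    spark = SparkSession.builder.getOrCreate()  # provided automatically in Databricks

    # One-time conversion: persist the historical parquet data as a Delta table.
    history = spark.read.parquet("/mnt/curated/orders_parquet/")
    history.write.format("delta").mode("overwrite").save("/mnt/curated/orders_delta/")

    # Incremental load: merge the latest batch into the Delta table on the key.
    updates = spark.read.parquet("/mnt/raw/orders_incremental/")
    target = DeltaTable.forPath(spark, "/mnt/curated/orders_delta/")
    (target.alias("t")
        .merge(updates.alias("s"), "t.order_id = s.order_id")
        .whenMatchedUpdateAll()
        .whenNotMatchedInsertAll()
        .execute())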

Data Engineer

Kagool Data Pvt Ltd
10.2023 - 02.2024
  • Designed ADF pipelines to efficiently execute SSIS packages for lift and shift operations.
  • Executed data transfers from blob files to ADLS using mount point utilities.
  • Developed PySpark and Python code in notebooks for optimal data transformations.
  • Extracted and transformed JSON files into parquet format to enhance storage efficiency (see the sketch after this list).
  • Applied advanced data transformation techniques, including joins and window functions.
  • Implemented merge logic for seamless incremental data loads to improve workflow.
  • Converted parquet files to delta format while structuring semi-structured files effectively.
  • Managed ETL development lifecycle, ensuring comprehensive testing and deployment.
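
A minimal sketch of the JSON-to-parquet conversion referenced in this role; the mounted paths and column names are hypothetical.

    # Sketch: read semi-structured JSON from a mounted ADLS path, flatten a few
    # fields, and persist the result as parquet. Paths and columns are hypothetical.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.getOrCreate()

    raw = spark.read.option("multiLine", True).json("/mnt/raw/events/")

    flattened = raw.select(
        F.col("eventId").alias("event_id"),
        F.col("payload.customer.id").alias("customer_id"),
        F.to_timestamp("eventTime").alias("event_ts"),
    )

    flattened.write.mode("overwrite").parquet("/mnt/curated/events_parquet/")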

Consultant

Atos Syntel Global Pvt Ltd
04.2022 - 04.2023
  • Created Azure ADF pipelines to move data from on-premises sources and SFTP to Azure Data Lake Storage Gen2.
  • Loaded data from ADLS into DataFrames via mount points; created notebooks, mount-point utilities, and secret scopes, and wrote PySpark code in Azure Databricks for data transformations.
  • Extracted data from JSON files into DataFrames and transformed it into parquet format.
  • Applied data transformations such as joins, window functions, and broadcast joins (see the sketch after this list).
  • Converted semi-structured files into structured formats.
  • Converted parquet files to delta format.
  • Implemented merge logic for incremental loads.
  • Used Spark SQL alongside PySpark.
  • Worked closely with business users and the warehouse team to understand requirements and analyze issues.
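
A minimal sketch of the window-function and broadcast-join transformations mentioned above; DataFrame names, keys, and columns are assumptions for illustration.

    # Sketch: keep the latest record per customer with a window function, then
    # enrich it with a small dimension table via a broadcast join.
    # Paths, keys, and column names are illustrative assumptions.
    from pyspark.sql import SparkSession, Window
    from pyspark.sql import functions as F

    spark = SparkSession.builder.getOrCreate()

    orders = spark.read.parquet("/mnt/curated/orders_parquet/")
    dim_region = spark.read.parquet("/mnt/curated/dim_region/")  # small lookup table

    w = Window.partitionBy("customer_id").orderBy(F.col("order_ts").desc())
    latest_orders = (orders
        .withColumn("rn", F.row_number().over(w))
        .filter(F.col("rn") == 1)
        .drop("rn"))

    enriched = latest_orders.join(F.broadcast(dim_region), "region_id", "left")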

Sr Software Engineer

Tech Mahindra Pvt Ltd
06.2016 - 04.2022
  • Studied user stories to understand customer requirements.
  • Planned with the Product Owner and Scrum Master to define an optimal development strategy.
  • Designed reusable pipelines and stored procedures with dynamic parameterization.
  • Designed Azure ADF pipelines to move data from on-premises sources and SFTP to Azure Data Lake Storage Gen2.
  • Created pipelines to load data from ADLS into the Synapse data warehouse and created external tables.
  • Implemented control flow, data flow, and pipeline activities for cloud ETL processing.
  • Created pipelines in Synapse Studio to load data from SQL Server databases to ADLS Gen2 in parquet format (see the sketch after this list).
  • Loaded files into staging tables and then into data warehouse tables in the Synapse dedicated SQL pool.
  • Wrote views and stored procedures to transform and load data from staging tables into data warehouse tables.
  • Designed external tables over parquet files via the PolyBase mechanism; tested notebooks and validated scripts using SQL.
  • Converted views to external tables consumed by Power BI DirectQuery for faster retrieval.
  • Designed dashboard visuals with slicers, filters, and drill-through options.
  • Prepared views and data models for Power BI reports.
  • Provided views used to design Power BI reports.
  • Involved in Power BI data modeling and report design per business requirements.
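
The SQL Server-to-ADLS copy above was built with Synapse pipelines; as a rough PySpark equivalent only (not the original implementation), the same movement can be sketched with a JDBC read and a parquet write. The JDBC URL, credentials, table name, and output path below are placeholders.

    # Rough PySpark equivalent of a SQL Server -> ADLS Gen2 (parquet) copy.
    # The original work used Synapse pipelines; this is only an illustrative
    # alternative. URL, credentials, table, and output path are placeholders.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.getOrCreate()

    source = (spark.read.format("jdbc")
        .option("url", "jdbc:sqlserver://<server>.database.windows.net:1433;database=<db>")
        .option("dbtable", "dbo.SalesOrders")
        .option("user", "<user>")
        .option("password", "<password>")
        .load())

    source.write.mode("overwrite").parquet(
        "abfss://curated@<storage-account>.dfs.core.windows.net/sales_orders/")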

Education

Bachelor of Arts

Alagappa University
05.2016

Skills

  • Data engineering with Databricks
  • Data processing with PySpark and Python
  • Azure Data Factory
  • Azure Synapse Analytics
  • Azure SQL analytics solutions
  • Power BI data visualization
  • Spark applications for efficient data processing
  • DevOps collaboration tools
  • DAX formula development

Certification

  • Microsoft Certified: Azure Fundamentals (AZ-900)
  • Microsoft Certified: Azure Data Engineer Associate (DP-203)
  • RPA Developer Foundation
