Siddheshwar Kurkute

Data Engineer
Pune

Summary

Versatile, high-energy senior data engineer with 6+ years of experience building data-intensive pipelines: extracting data from multiple sources, wrangling large datasets, and loading them into target systems. Results-oriented project team leader experienced in delivering multiple IT projects on strict schedules, with a comprehensive understanding of all aspects of project delivery, targeting assignments in Snowflake, PySpark, Azure Databricks, Python, Informatica, Teradata, SQL, Unix, and data warehousing technologies. Flexible collaborator who also works well independently; adept at managing high-pressure, fast-paced environments, and eager to learn and take on new responsibilities.

Overview

5 years of professional experience
4 years of post-secondary education
1 language

Work History

Data Engineer

Directv Data Hub
Pune
09.2022 - Current
  • Map data from various sources to match the target data model.
  • Develop queries to source data from multiple systems and load it into staging tables.
  • Develop data pipelines for data cleansing and wrangling per standards and requirements.
  • Collaborate on ETL (Extract, Transform, Load) tasks, maintaining data integrity and verifying pipeline stability.
  • Create load schedules for staging, dimension, and fact tables.
  • Implement integrity checks between source and target datasets (sketched below).
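
A minimal sketch of one such integrity check in PySpark, assuming hypothetical table and column names (stg_orders, dim_orders, order_amount); the real checks run inside the scheduled loads:

    # Minimal sketch of a source-to-target integrity check in PySpark.
    # Table and column names here are hypothetical placeholders.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("integrity-check").getOrCreate()

    source = spark.table("stg_orders")   # staging load
    target = spark.table("dim_orders")   # dimension table after the load

    # 1. Row-count reconciliation between source and target.
    src_count, tgt_count = source.count(), target.count()
    assert src_count == tgt_count, f"count mismatch: {src_count} vs {tgt_count}"

    # 2. Column-level checksum: the same numeric measure summed on both sides.
    src_sum = source.agg(F.sum("order_amount")).first()[0]
    tgt_sum = target.agg(F.sum("order_amount")).first()[0]
    assert src_sum == tgt_sum, f"amount mismatch: {src_sum} vs {tgt_sum}"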

Data Engineer

AT&T CDO – Deep Ingestion
Pune
11.2021 - 08.2022
  • Ingested data from source systems such as Oracle, SQL Server, Teradata, and Vertica into Foundry.
  • Analyzed ingestion requirements and, based on record volume, decided the appropriate ingestion type: Append, Snapshot, or Update.
  • Created raw syncs for various source connections, such as JDBC or file-based sources.
  • Wrote PySpark code to transform raw syncs into clean datasets per business requirements (sketched below).
  • Created appropriate schedules for both raw and clean datasets per requirements.
  • Implemented integrity checks between source and deep datasets whenever needed.
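
An illustrative PySpark transform of the kind used to turn a raw sync into a clean dataset; the paths and column names are hypothetical, and Foundry's transform wrapper is omitted:

    # Illustrative raw-to-clean PySpark step; paths and columns are hypothetical.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("raw-to-clean").getOrCreate()
    raw = spark.read.parquet("/data/raw/customers")  # raw sync output

    clean = (
        raw
        .dropDuplicates(["customer_id"])                       # dedupe on the business key
        .filter(F.col("customer_id").isNotNull())              # keys must be present
        .withColumn("email", F.lower(F.trim(F.col("email"))))  # normalize strings
        .withColumn("signup_date", F.to_date("signup_date", "yyyy-MM-dd"))
    )

    clean.write.mode("overwrite").parquet("/data/clean/customers")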

Snowflake Developer & UAT Engineer

SAS Transformation
Pune
11.2020 - 10.2021
  • Analyzed source data from various systems to help decide on the ingestion mode: full refresh or incremental.
  • Created automated scripts to generate SQL for extracts, and Unix scripts for daily extracts and historical data loads.
  • Ran and monitored extraction and transfer of historical and incremental files to the Azure File Share.
  • Performed unit testing of the code developed to convert SAS workflows to Snowflake.
  • Analyzed and compared data between SAS and Snowflake to verify that all data loaded correctly.
  • Found and helped correct data issues introduced while converting SAS flows to Snowflake.
  • Helped with data loads from Azure containers into Snowflake (sketched below).
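
A sketch of one such load from an Azure container into Snowflake through an external stage, using the snowflake-connector-python driver; the account, stage, and table names are hypothetical placeholders:

    # Sketch of a bulk load from an Azure external stage into Snowflake.
    # Credentials, stage, and table names are hypothetical.
    import snowflake.connector

    conn = snowflake.connector.connect(
        account="my_account",
        user="loader",
        password="***",
        warehouse="LOAD_WH",
        database="ANALYTICS",
        schema="STAGING",
    )
    cur = conn.cursor()
    # COPY INTO pulls the staged CSV files into the target table in one bulk load.
    cur.execute("""
        COPY INTO STAGING.SAS_EXTRACT
        FROM @AZURE_FILESHARE_STAGE/daily/
        FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
        ON_ERROR = 'ABORT_STATEMENT'
    """)
    cur.close()
    conn.close()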

Application Developer - Data Warehouse

DIRECTV BI
Bangalore
11.2016 - 06.2020


  • Worked with business clients to understand and analyze requirements and implement solutions.
  • Created automated scripts to generate SQL for extracts, and Unix scripts for daily extracts and historical data loads.
  • Applied in-depth expertise in the Teradata cost-based query optimizer to identify potential bottlenecks, from query writing to skewed redistributions (see the plan-review sketch below).
  • Supported the building of data flow channels and processing systems to extract, transform, load, and integrate data from various sources.
  • Created and modified numerous mappings, reusable sessions, and workflows per requirements.
  • Created BTEQ, FastLoad, MultiLoad, and FastExport scripts per client requirements; experienced in performance tuning.
  • Prepared service-level metrics for projects and presented them monthly to the client and internal project management.
  • Coded scripts to execute a given query most efficiently by weighing the possible query plans.
  • Handled client communication for all project-related activities.
  • Was responsible for the completion of all critical business processes running daily, weekly, and monthly.
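
A sketch of the kind of plan review described above, asking the Teradata optimizer for an EXPLAIN before a query is scheduled; the connection details and query are hypothetical, and the teradatasql driver is assumed:

    # Sketch: inspect a Teradata query plan with EXPLAIN before scheduling it.
    # Host, credentials, and the query itself are hypothetical.
    import teradatasql

    query = "SELECT account_id, SUM(charge_amt) FROM billing_daily GROUP BY 1"

    with teradatasql.connect(host="tdhost", user="etl", password="***") as con:
        with con.cursor() as cur:
            cur.execute("EXPLAIN " + query)   # ask the optimizer for its plan
            for (step,) in cur.fetchall():    # one row per plan step
                # Skewed redistributions and product joins show up in these
                # steps; flag them for rewriting or stats collection.
                print(step)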

Education

Bachelor of Engineering

MIT Academy of Engineering, University of Pune
Pune
06.2012 - 02.2016

Skills

Snowflake, Azure Databricks
