
Avinash Kurra

Azure Data Engineer
Bengaluru

Summary

An accomplished professional with 8+ years of experience in development and support projects across Azure Databricks, Azure Data Factory, Big Data (Spark), Terraform, GitHub, SAP HANA XSA, and SAP BW. Seeking a role in IT where these engineering and consulting skills can be put to broad use.

Overview

9 years of professional experience
4 years of post-secondary education
1 certification

Work History

Azure Data Engineer & SRE

Xebia IT Architects
Bangalore
08.2023 - Current
  • Client: Shell India
  • Contributed significantly to platform and data engineering efforts in the Azure ecosystem by streamlining and automating manual processes; delivered timely, effective support and development for EDS Insights.
  • Contributed to Azure platform enhancements, including User Access Management (UAM) and Role-Based Access Control (RBAC) ownership transfers.
  • Automated UAM and RBAC processes, reducing manual effort and improving operational efficiency.
  • Managed daily and monthly data loading activities and deployments for EDS Insights in the production environment.
  • Played a key role in enhancing EDS Insights by implementing updates to the subscription master and business domains across non-production and production environments.
  • Led the successful lift-and-shift migration of EDS Insights from legacy to new business domains, involving Azure components such as Databricks, ADLS Gen2, Azure Key Vault, and Azure Data Factory (ADF).

Azure Data Engineer

Xebia IT Architects
Bangalore
03.2022 - 08.2023
  • Client: Etihad Airways (onsite and offshore)
  • End-to-end data migration and development from Cloudera to the Azure environment; responsible for design discussions, migration of all existing workflows, and on-time deliverables.
  • Analyzed the existing architecture design and ETL applications.
  • Extracted end-to-end workflow logic, including critical business rules and configurations, from the Zaloni data platform.
  • Re-implemented the workflows in Azure Databricks notebooks using a metadata-driven approach.
  • Created Data Factory pipelines as per requirements.
  • Created Logic Apps to send email alerts and notifications on job failures.
  • Proposed a straightforward metadata-driven approach to the client, with a clear implementation; received appreciation for the idea.
  • Received appreciation for developing a reusable logger component.
  • Implemented notebook logic to migrate historical data from Hue storage to ADLS in Parquet format.
  • Designed SIT, NFT (Non-Functional Testing), HLD (High-Level Design/Solution Design), and service design package documents for production deployment.
  • Handled production deployments using Databricks Git repos and Data Factory Git branches.
  • Owned HCI system production issues such as EDP1 vs. EDP2 data mismatches.
  • Supported production workflows such as HCI ingestion and transformations.
  • Converted SQL- and Python-based logic to Azure Databricks notebooks using PySpark DataFrames.
  • Created ADF pipelines that run in parallel to load data from source files into the raw ADLS layer.
  • Unit-tested all pipelines end to end with incremental data loads.
  • Parameterized all ADF pipelines.
  • Built a metadata approach for workflow business logic and configurations.
  • Worked with various file formats: Parquet, CSV, Avro, text, and Delta.
  • Rebuilt HCI workflows in Azure from logic extracted from the Dataiku tool.
  • Gathered and analyzed system requirements.
  • Collaborated effectively in a cross-team environment.
  • Worked extensively on defining Synapse external tables along with their data.
  • Created load summary charts and unit-testing results.
  • Created Stream Analytics jobs to load captured messages from Event Hubs into ADLS.
  • Created external tables in production SSMS and gave a detailed walkthrough session to the product owner.
  • Built operational reports in Power BI to monitor ETL statistics.
  • Monitored production loads in the ADF v2 Monitor and debugged failures.
  • Created schemas, tables, file formats, and views in Azure SQL Data Warehouse.
  • Involved in enhancement and maintenance activities for the data warehouse.

SAP HANA XSA Development/Support and Migration of Source Data

TCS
Bangalore
01.2020 - 12.2021
  • Client: Standard Chartered Bank
  • Implementation and support of SAP HANA XSA; data migration from SAP HANA to the Azure platform.
  • Created multiple containers (e.g., sabre, rdm, mdg, Murex) in SAP HANA XSA.
  • Prepared metadata for various tables across different sources.
  • Worked on different containers and hdbcds tables.
  • Experienced with manual and BPA data-loading processes into FSDP, based on sourcing data files to target containers.
  • Involved in Redwood load monitoring and job adjustments.
  • Used different transformations for aggregating, filtering, joining, and sorting business data.
  • Worked on the initial POC for data migration from SAP HANA to Azure.
  • Report parameters included single-value and multi-value parameters.
  • Generated weekly and monthly reports in Excel, CSV, and PDF formats.
  • Created RDDs and DataFrames for the required input data and performed data transformations using Spark Core.
  • Performed operations on source data using Spark.
  • Ran SQL queries on DataFrames.

SAP HANA XSA Development/Support

TCS
Bangalore
10.2016 - 12.2019
  • Client: NXP Semiconductors
  • End-to-end development and implementation of SAP HANA XSA; responsible for design discussions, development of HANA and BW object logic, and on-time deliverables.
  • Created metadata structures in the HANA XSA (Web IDE) database.
  • Knowledge of SAP BO and BW.
  • Worked on different containers and hdbcds tables.
  • Worked with manual and BPA data-loading processes into FSDP, making logic changes based on sourcing data files to target containers.
  • Worked on a data ingestion framework to load data from the file server into HANA CDS tables.
  • Modified stored procedures to load data from non-SAP sources into HANA tables.
  • Handled data integration functionality based on NXP requirements.
  • Used the BPA Redwood tool to load .dat and .csv files from the file server into different target containers.
  • Involved in an SAP BW/4HANA implementation.
  • Delivered development work on time with minimal defects in the test environment and no impact on production; worked as one team.
  • Received the Star of the Month award for work delivered in a short span, with appreciation from the lead, manager, and client.

Education

Bachelor of Science - Computer And Information Systems

Vignan University
Guntur, India
05.2012 - 06.2016

Skills

Data Engineering: Azure Data Factory (V2), Azure Databricks, Azure Synapse Analytics, PySpark, SAP HANA XSA, SAP BW 7.5, ZDP

Reporting: SAP BO, Power BI

CI/CD: GitHub, GitHub Actions, change management using the ChaRM process

Programming Languages: Python

Storage: Azure Blob Storage, Azure Data Lake Storage

Infrastructure Tools: Terraform

Certification

DP-203, Azure Data Engineer

Timeline

Azure Data Engineer & SRE

Xebia IT Architects
08.2023 - Current

DP-203, Azure Data Engineer

02.2023

Azure Data Engineer

Xebia IT Architects
03.2022 - 08.2023

SAP HANA XSA Development/Support and Migration of Source Data

TCS
01.2020 - 12.2021

SAP HANA XSA Development/Support

TCS
10.2016 - 12.2019

Bachelor of Science - Computer And Information Systems

Vignan University
05.2012 - 06.2016