Sravan

Bengaluru, KA

Summary

Experienced IT professional with 5+ years of expertise in IT development and infrastructure support, specializing in Microsoft Azure cloud technologies. Skilled in Azure Data Factory, CI/CD scripting, and DevOps implementation for seamless deployment, with a strong background in customer-facing work: requirements gathering, functional and technical design, testing plans, deployments, and project delivery. Proficient in Azure PaaS and analytics services, including Azure Data Factory (ADF), Azure Data Lake Storage (ADLS), Azure Data Lake Analytics (ADLA), Azure SQL DW, Azure Databricks (ADB), and Log Analytics. Well versed in ADF building components (Integration Runtime, Linked Services, Datasets, Pipelines, and Activities) and in orchestrating data integration pipelines using ADF activities. Proven track record supporting data extraction, transformation, and processing into Azure Data Lake Storage and Blob Storage using Azure Data Factory and ETL methodology. Designed and developed data ingestion pipelines from on-premises sources into the different layers of ADLS using ADF V2, and performed data transformations in Azure Databricks notebooks with Spark Scala, Python, and PySpark. Implemented dynamic pipelines that extract multiple files into multiple targets through a single pipeline, automated pipeline execution with triggers, handled basic ADF administration, and designed an Azure Logic App for pipeline alert emails and file-unavailability notifications.

Overview

5+ years of professional experience

Work History

Aviva
  • Company Overview: Aviva plc is a British multinational insurance company headquartered in London, England
  • It has about 18 million customers across its core markets of the United Kingdom, Ireland, and Canada
  • In the United Kingdom, Aviva is the largest general insurer and a leading life and pension provider
  • Aviva is also the second-largest general insurer in Canada
  • Aviva has a primary listing on the London Stock Exchange and is a constituent of the FTSE 100 Index
  • Created pipelines to extract data from on-premises source systems to Azure cloud data lake storage; extensively worked on copy activities and implemented copy behaviors such as flattening, preserving, and merging hierarchies
  • Implemented Error Handling concept through copy activity
  • Extracted data from on-premises systems such as SQL Server and loaded it into ADLS Gen2
  • Involved in End-to-end logging frameworks for Data factory pipelines
  • Exposure to Azure Data Factory activities such as Lookup, Stored Procedure, If Condition, ForEach, Set Variable, Append Variable, Get Metadata, Filter, and Wait
  • Created a dynamic pipeline to extract from multiple sources into multiple targets; extensively used Azure Key Vault to configure connections in linked services
  • Configured Azure Data Factory triggers and scheduled the pipelines; monitored the scheduled pipelines and configured alerts to be notified of pipeline failures (a trigger-and-monitor sketch follows this section)
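
A minimal, hedged sketch of how a parameterized pipeline like the one above can be triggered and monitored from Python with the azure-mgmt-datafactory and azure-identity SDKs. The subscription, resource group, factory, pipeline, and parameter names are placeholders, not the project's actual values; in the project itself, scheduling was done with ADF triggers and alerting was configured in the service, so the SDK calls below only illustrate the same run-and-monitor flow.

    # Hedged sketch: trigger a parameterized ADF pipeline run and poll its status.
    # Subscription, resource group, factory, pipeline, and parameter names are placeholders.
    import time

    from azure.identity import DefaultAzureCredential
    from azure.mgmt.datafactory import DataFactoryManagementClient

    SUBSCRIPTION_ID = "<subscription-id>"   # placeholder
    RESOURCE_GROUP = "<resource-group>"     # placeholder
    FACTORY_NAME = "<data-factory-name>"    # placeholder

    adf = DataFactoryManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

    # Start the dynamic copy pipeline, passing source/target names as pipeline parameters
    run = adf.pipelines.create_run(
        RESOURCE_GROUP,
        FACTORY_NAME,
        "pl_copy_onprem_to_adls",           # hypothetical pipeline name
        parameters={"sourceTable": "dbo.sales", "targetFolder": "raw/sales"},
    )

    # Poll the run until it leaves the Queued/InProgress states, then report the outcome
    while True:
        status = adf.pipeline_runs.get(RESOURCE_GROUP, FACTORY_NAME, run.run_id).status
        if status not in ("Queued", "InProgress"):
            break
        time.sleep(30)

    print(f"Pipeline run {run.run_id} finished with status: {status}")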

One Source
  • Company Overview: One Source is an end-to-end sales-cycle management tool that integrates the functionality of four existing applications (ePoing, VPP, and Bases II) used by the internal sales team (BPC) and external partners
  • Created linked services for multiple source systems (Azure SQL Server, ADLS, Blob Storage, REST API)
  • Created a pipeline to extract data from on-premises source systems into Azure Data Lake Storage; extensively worked on copy activities and implemented copy behaviors such as flatten hierarchy, preserve hierarchy, and merge hierarchy; implemented error handling through the copy activity
  • Exposure to Azure Data Factory activities such as Lookup, Stored Procedure, If Condition, ForEach, Set Variable, Append Variable, Get Metadata, Filter, and Wait
  • Extensively used Azure Key Vault to configure connections in linked services
  • Configured Azure Data Factory triggers and scheduled the pipelines; monitored the scheduled pipelines and configured alerts to be notified of pipeline failures
  • Implemented delta-logic extractions for various sources driven by a control table (a watermark-based sketch follows this section); implemented data frameworks to handle deadlocks, recovery, and pipeline logging
  • Deployed code to multiple environments through the CI/CD process; worked on code defects during SIT and UAT testing and supported data loads for testing
  • Implemented reusable components to reduce manual interventions
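
A minimal, hedged sketch of the control-table-driven delta extraction described above, written as PySpark. The Key Vault secret names, control-table layout, source table, and storage path are placeholders; in the project this logic lived in ADF pipelines and linked services rather than a standalone script, and a Databricks secret scope would normally replace the direct azure-keyvault-secrets calls shown here.

    # Hedged sketch: incremental (delta) extraction driven by a control-table watermark.
    # Secret names, table names, columns, and the storage path are illustrative placeholders.
    from azure.identity import DefaultAzureCredential
    from azure.keyvault.secrets import SecretClient
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("delta_extract_sketch").getOrCreate()

    # Connection details come from Key Vault instead of being hard-coded
    vault = SecretClient(
        vault_url="https://<key-vault-name>.vault.azure.net",   # placeholder
        credential=DefaultAzureCredential(),
    )
    jdbc_url = vault.get_secret("sql-jdbc-url").value            # hypothetical secret names
    jdbc_props = {
        "user": vault.get_secret("sql-user").value,
        "password": vault.get_secret("sql-password").value,
        "driver": "com.microsoft.sqlserver.jdbc.SQLServerDriver",
    }

    # 1. Read the last successful watermark for this source from the control table
    control = spark.read.jdbc(jdbc_url, "dbo.control_table", properties=jdbc_props)
    last_wm = (control.filter(F.col("source_name") == "sales")
                      .agg(F.max("last_loaded_at"))
                      .collect()[0][0])

    # 2. Pull only the rows changed since the watermark (the delta)
    delta_query = f"(SELECT * FROM dbo.sales WHERE modified_at > '{last_wm}') AS src"
    delta_df = spark.read.jdbc(jdbc_url, delta_query, properties=jdbc_props)

    # 3. Land the delta in the raw layer of ADLS Gen2
    (delta_df.write.mode("append")
             .parquet("abfss://raw@<storageaccount>.dfs.core.windows.net/sales/"))  # placeholder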

Azure Data Engineer

Custologix Solutions India Private Limited
01.2020 - Current
  • 5+ years of work experience in IT Development & Infrastructure support with good experience in Microsoft Azure Cloud technologies
  • Experienced in Azure Data Factory, preparing CI/CD scripts, and using DevOps for deployments
  • Customer-facing responsibilities including requirements gathering, functional specifications, technical design, customer reviews, project delivery, test planning, deployments, and project stabilization
  • Good exposure to Azure PAAS components like Azure Data Factory, Azure SQL DW, Azure Data Lake storage, and log analytics
  • Hands-on experience with Azure analytics services: Azure Data Lake Store (ADLS), Azure Data Lake Analytics (ADLA), Azure SQL DW, Azure Data Factory (ADF), and Azure Databricks (ADB)
  • Excellent knowledge of ADF building components: Integration Runtime, Linked Services, Datasets, Pipelines, and Activities
  • Extensively used ETL methodology to support extraction, transformation, and processing of data from SAP HANA and non-SAP sources (Oracle, SQL Server, Teradata, SFTP) into Azure Data Lake Storage and Blob Storage using Azure Data Factory
  • Designed and developed data ingestion pipelines from on-premises sources into the different layers of ADLS using Azure Data Factory (ADF V2)
  • Experience building Azure Databricks (ADB) notebooks in Spark Scala, Python, and PySpark to perform data transformations (a transformation sketch follows this section)
  • Orchestrated data integration pipelines in ADF using activities such as Get Metadata, Lookup, ForEach, Wait, Execute Pipeline, Set Variable, Filter, and Until
  • Implemented a dynamic pipeline to extract multiple files into multiple targets with the help of a single pipeline
  • Automated execution of ADF pipelines using Triggers
  • Familiar with basic ADF administration activities such as granting access to ADLS using service principals, installing integration runtimes, and creating services such as ADLS and Logic Apps
  • Designed an Azure Logic App to send pipeline success/failure alert emails and file-unavailability notifications
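
A minimal, hedged sketch of the kind of Databricks notebook transformation referenced above, assuming data has already been landed in ADLS Gen2 by ADF. The storage account, containers, and column names are placeholders rather than actual project objects.

    # Hedged sketch: a PySpark transformation from the raw layer to the curated layer.
    # Storage account, container, and column names are placeholders.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("curate_sales_sketch").getOrCreate()

    raw_path = "abfss://raw@<storageaccount>.dfs.core.windows.net/sales/"          # placeholder
    curated_path = "abfss://curated@<storageaccount>.dfs.core.windows.net/sales/"  # placeholder

    raw_df = spark.read.parquet(raw_path)

    curated_df = (raw_df
        .dropDuplicates(["order_id"])                                   # illustrative key column
        .withColumn("order_date", F.to_date("order_date"))
        .withColumn("net_amount", F.col("gross_amount") - F.col("discount"))
        .withColumn("load_ts", F.current_timestamp()))

    # Partition by date so downstream DW loads can prune partitions efficiently
    (curated_df.write.mode("overwrite")
               .partitionBy("order_date")
               .parquet(curated_path))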

Education

B.Tech

GVPCOE College of Engineering
01.2016

Skills

  • Linux
  • Windows
  • Azure Data Factory
  • Azure Data Bricks
  • Azure Data Lake Analytics
  • Azure Synapse Analytics
  • Python
  • PySpark
  • SQL

  • Azure DevOps CI/CD
  • GIT
  • REST API
  • Data modeling
  • ETL development
  • Data warehousing
  • Performance tuning
  • Agile Methodology

Timeline

Azure Data Engineer

Custologix Solutions India Private Limited
01.2020 - Current

Aviva

One Source

B.Tech

GVPCOE College of Engineering