
Raghavarao Pothuri

Cary, NC

Summary

Data Engineer with 9+ years of experience in ETL development, data integration, and cloud technologies.
Expertise includes Informatica PowerCenter, SQL, Azure Data Factory (ADF), and Azure Databricks, applied to design and implement efficient data pipelines and solutions.

Overview

10 years of professional experience
1 Certification

Work History

Azure Data Engineer

Carelon Global Solutions
Hyderabad
01.2021 - 07.2024
  • Ingested data from multiple source systems into Azure storage and processing environments, including Azure Data Lake Storage (ADLS) and Azure SQL Data Warehouse, using Azure Data Factory (ADF) for seamless data integration and orchestration.
  • Led development of the data transformation layer, implementing complex transformations with Apache Spark in Azure Databricks, including data cleaning, aggregation, and enrichment to prepare data for analytics.
  • Automated and orchestrated data ingestion and transformation workflows by scheduling ADF pipelines to run at defined intervals, ensuring continuous data processing with minimal downtime and reliable data flow.
  • Maintained and supported existing data pipelines, troubleshooting and optimizing workflows to ensure data accuracy, efficiency, and reliability throughout the project lifecycle.
  • Applied in-depth knowledge of Spark architecture, including DataFrames, Spark SQL, transformations, actions, and cluster management, to optimize distributed data processing and tune performance in large-scale environments, improving processing speed and reducing resource consumption.
  • Applied strong domain expertise in Facets claims processing and the US healthcare sector, ensuring that data processes and transformations adhered to industry requirements and regulatory standards.
  • Documented data pipeline architectures, transformation logic, and best practices for both ADF and Databricks, ensuring clarity, consistency, and knowledge transfer across teams.

Data Engineer

Virtusa Consulting Services
Hyderabad
07.2018 - 12.2020
  • Performed source system analysis to build ETL jobs, extracting data from diverse sources (database tables, CSV files) to raw layers and applying transformations to load data into ODS and data warehouse environments.
  • Built data lakes by extracting data from Salesforce to AWS S3, supporting analytics and reporting needs.
  • Developed and optimized ETL scripts using SQL, implementing dimensional modeling to load data into dimension and fact tables. Improved Redshift processing efficiency by up to 30% through strategic key selection, compression encoding, and SQL query optimization.
  • Designed and executed AWS data pipelines to trigger ETL jobs in Redshift, ensuring seamless data integration and transformation.
  • Managed SQL script versioning with Git, creating feature branches, pull requests, and leading migrations across environments to support agile development practices.
  • Developed complex data integration tasks in IICS Cloud Data Integration to transform data from Redshift into data warehouses, aligning with business logic and transformation requirements.
  • Configured deployment and migration models to ensure stable, secure code promotion across environments.
  • Hands-on in all SDLC stages, from requirements gathering to user acceptance testing.
  • Skilled in data mapping, unit testing, and production load monitoring.
  • Facilitated coordination with onsite and offshore teams to ensure seamless task delivery and adherence to best practices.

ETL Developer

Cognizant Technology Solutions
Chennai
01.2015 - 06.2018
  • Played a key role in the extraction, transformation, and loading (ETL) of data from various source systems to target systems using Informatica PowerCenter, ensuring the integrity and efficiency of the data pipeline.
  • Performed data cleansing and transformation to standardize and enrich raw data, ensuring it met business and analytical requirements.
  • Utilized Informatica Designer tools, such as Source Analyzer, Transformation Developer, Mapping Designer, and Mapplet Designer, to design, develop, and maintain reusable ETL workflows and transformations.
  • Created data mappings to extract data from diverse sources (e.g., flat files, relational databases), applying transformations such as Filter, Update Strategy, Aggregator, Expression, and Joiner to ensure accurate and consistent data loading into the target systems.
  • Implemented Slowly Changing Dimension Type 2 (SCD2) methodology to maintain historical data for accounts and transactions, ensuring full data lineage and auditability.
  • Managed ETL workflows using Informatica Workflow Manager, automating the process of reading data from source systems, transforming it, and loading it into target databases or flat files.
  • Coordinated the resolution of issues, optimization of processes, and timely delivery of ETL workflows in alignment with project timelines.

Education

Bachelor of Technology - Electronics and Communications Engineering

Narasaraopeta Engineering College
Andhra Pradesh, India
04-2014

Skills

  • Azure Data Factory
  • Azure Databricks
  • Oracle
  • Informatica PowerCenter
  • IICS
  • GIT
  • JIRA
  • AWS S3
  • SQL
  • FACETS
  • Data warehousing
  • PySpark

Accomplishments

  • Received multiple IMPACT awards for outstanding project performance.

Certification

  • Databricks Certified Data Engineer Associate
