Rekapalli Ome Sai Vijay

Azure Data Engineer/ETL Developer

Summary

Azure Data Engineer/ETL Developer with 3.1 years of experience enhancing operations for organizations through information-systems solutions. I joined Capgemini in April 2021, completed around three months of training in cloud technologies, and moved onto a project as an ADF developer. I am skilled in Azure Data Factory, where I build pipelines, datasets, and dataflows and optimize them to minimize run time. In my ongoing work I validate data from tables and identify mismatches when data issues arise, and I have learned code promotion to different environments, creating linked services, and modifying integration runtimes. Results-oriented analyst skilled at managing and breaking down large volumes of information, and proactive at heading off issues in operations, workflow, and production by uncovering trends affecting business success. Proven track record of transforming business goals for growth and efficiency into new system designs, and known for recommending new technologies to enhance existing systems and introduce new ones.

Overview

3 years of professional experience
5 years of post-secondary education
1 Certification

Work History

Analyst/Software Engineer (ADF Developer/Tester)

UD Trucks
02.2023 - Current
  • Migration of IPC (Informatica PowerCenter) code to IICS (Informatica Intelligent Cloud Services) and of IPC code to an ELT approach using Azure Data Factory; ingestion of data from flat files and .dat files into SQL Server using IICS
  • Technologies used in the project:
  • Azure Data Factory
  • Azure SQL MI
  • Informatica PowerCenter
  • Informatica Intelligent Cloud Services
  • SSMS
  • Unix operating system
  • Roles and responsibilities performed in the project:
  • In this project we ingested data from .dat files and XML files into Azure SQL MI and built pipelines for that task
  • Converted code from IPC to IICS using an RPA tool
  • Validated the IICS mappings and added the session properties and filter conditions that the tool could not carry over
  • Performed mapping-level performance tuning, such as adding pushdown optimization to the source or target
  • Performed database-side performance tuning as well, such as adding indexes
  • Created an automated script on Linux to move files from one location to another for IICS to pick up
  • Created database objects such as views and tables and validated them against the existing system
  • Created a stored procedure to test the data, with all transformations and counts, against the source
  • Also tested the data manually using built-in SQL operators such as EXCEPT
  • Training/Certification:
  • Trained in the Azure cloud platform (Azure Data Factory, Azure Databricks, Azure Synapse), PySpark, MS SQL, and Scala
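The file-move automation mentioned above was a Linux shell script; purely as an illustration, here is a minimal Python sketch of the same idea. The directory layout and the `.dat` pattern are assumptions, not the project's actual paths.

```python
import shutil
from pathlib import Path

def move_files(src_dir: str, dst_dir: str, pattern: str = "*.dat") -> list[str]:
    """Move matching files from a landing directory to the directory
    that IICS polls for pickup. Returns the names of the moved files."""
    src, dst = Path(src_dir), Path(dst_dir)
    dst.mkdir(parents=True, exist_ok=True)  # ensure pickup dir exists
    moved = []
    for f in sorted(src.glob(pattern)):
        shutil.move(str(f), str(dst / f.name))
        moved.append(f.name)
    return moved
```

In practice this logic would run on a schedule (e.g. cron) so that files land in the pickup location before the IICS job starts.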

Analyst/Software Engineer (ADF Developer/Tester)

Volvo Mobil
03.2022 - 03.2023

In this project, we migrated data from an on-prem Netezza server to Azure SQL DB/Azure Synapse/Azure SQL Managed Instance in the cloud.

My stream of work was ETL, and I worked as an ADF developer. Roles and responsibilities:

· Thoroughly reviewed the on-prem Informatica structure and prepared an analysis sheet for the DataMart.

· Built the models (dataflows/pipelines) in Azure Data Factory based on the on-prem models (workflows/worklets/tasks).

· Developed the ELT/ETL pipelines.

· Performed the initial/historical load using CSV source files and the Informatica DTT task.

· Performed data validations with the SIT test cases for the built models using DVE, a Capgemini integrated tool.

· Performed count checks and data checks using the DVE tool.
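A count check of this kind compares row counts per table between the source and target systems. The following is only a toy Python sketch of that comparison logic; the table names in the example are invented, and a real check would query both databases.

```python
def count_check(source_counts: dict[str, int],
                target_counts: dict[str, int]) -> dict[str, str]:
    """Compare per-table row counts between source and target.
    Returns a verdict per table: PASS, FAIL (with counts), or MISSING."""
    results = {}
    for table, src_count in source_counts.items():
        if table not in target_counts:
            results[table] = "MISSING"
        elif target_counts[table] == src_count:
            results[table] = "PASS"
        else:
            results[table] = f"FAIL ({src_count} vs {target_counts[table]})"
    return results
```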

· Scheduled triggers for end-to-end pipelines.

· Monitored the triggers for success or failure.

· Moved code to Dev, QA, and Prod using Azure DevOps.

· Tuned the pipelines when required.

· Served as the primary SPOC for the team.

· Implemented a pipeline that scales up the SQL DB through ADF whenever a load pipeline is triggered, to improve performance.
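One common way to scale an Azure SQL DB from a pipeline is to run a T-SQL `ALTER DATABASE` statement (via a script or stored-procedure step) before the heavy load. As a hedged sketch, the helper below only builds that statement; the database name and tier list are placeholders, not the project's actual configuration.

```python
ALLOWED_TIERS = {"S0", "S1", "S2", "S3", "P1", "P2"}  # illustrative subset

def scale_statement(database: str, service_objective: str) -> str:
    """Build the T-SQL that changes an Azure SQL DB service objective.
    A pipeline activity can execute this before a heavy load and scale
    back down afterwards."""
    if service_objective not in ALLOWED_TIERS:
        raise ValueError(f"unknown service objective: {service_objective}")
    # Brackets guard against special characters in the database name.
    return f"ALTER DATABASE [{database}] MODIFY (SERVICE_OBJECTIVE = '{service_objective}')"
```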


Consultant

Capgemini IT Solutions
Hyderabad
04.2021 - 10.2022

Project Name: Blue Shield of California (BSC)

Domain: Health Care

Role: ETL Developer (Informatica)

Description:

BSC (Blue Shield of California) is one of the biggest healthcare systems in the US. Our project integrates patients' data with their medical details. We extract the data and load it into new tables, which are used to send reminders for future medical tests and analysis.

Responsibilities

· Gathered business requirements and prepared source-to-target mapping specifications and transformation rules.

· Created mappings based on the data mapping document.

· Extensively used transformations such as Source Qualifier, Aggregator, Expression, Lookup, Filter, Update Strategy, Normalizer, and Sequence Generator.

· Created sessions, worklets, and workflows using Workflow Manager.


  • Used the Update Strategy transformation to update the target dimension tables
  • Created connected and unconnected Lookup transformations to look up data from the source and target tables

· Prepared a unit test case document for each mapping developed.

· Involved in code reviews of mappings developed by others.


  • Actively participated in team meetings and discussions to propose solutions to problems
  • Used PL/SQL and UNIX shell scripts for scheduling the sessions in Informatica
  • Wrote SQL and PL/SQL to implement business rules and transformations
  • Developed a batch file to automate executing the different workflows and sessions associated with the mappings on the development server
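Informatica workflows are normally started from scripts through the `pmcmd` command-line tool. Purely as an illustration, the helper below assembles such an invocation; the service, domain, and folder names in the usage example are placeholders.

```python
def pmcmd_start_workflow(service: str, domain: str, folder: str,
                         workflow: str, wait: bool = True) -> list[str]:
    """Assemble a pmcmd command that starts an Informatica workflow.
    Credentials are taken from environment variables (-uv/-pv) rather
    than being embedded in the script."""
    cmd = ["pmcmd", "startworkflow",
           "-sv", service, "-d", domain,
           "-uv", "PM_USER", "-pv", "PM_PASS",
           "-f", folder]
    if wait:
        cmd.append("-wait")  # block until the workflow finishes
    cmd.append(workflow)
    return cmd
```

A batch script would run this via `subprocess.run` (or the shell equivalent) once per workflow, checking each exit code before starting the next.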

Environment: Informatica PowerCenter 9.6.1, Oracle, Unix, DBeaver

Trainee

Capgemini
01.2022 - 03.2022
  • Trained in the Azure cloud platform
  • Technologies used in the project:
  • Azure Data Factory
  • PySpark
  • Azure Databricks
  • Scala
  • Azure Synapse
  • Roles and responsibilities performed in the project:
  • Participated in every training module and performed the activities/exercises
  • Completed a mini project, Olympics Analysis, performing activities in Azure SQL, Data Factory, and Databricks
  • It generates reports showing the facts, graphs, and ratios from analysis of the medals won by countries in various sports, based on the query requirements.
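The core of such a medal analysis is grouping by country and computing ratios. A toy Python sketch of that aggregation, with invented sample rows (the real project used Azure SQL and Databricks):

```python
from collections import Counter

def medal_ratios(rows: list[tuple[str, str]]) -> dict[str, float]:
    """Given (country, medal) rows, return each country's share of all medals."""
    counts = Counter(country for country, _medal in rows)
    total = sum(counts.values())
    return {country: n / total for country, n in counts.items()}
```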

Education

Bachelor of Science - Civil Engineering

Sagi RamaKrishnam Raju Engineering College
Bhimavaram
06.2017 - 06.2022

Skills


Certification

Azure Data Explorer

Timeline

Azure Data Explorer

04-2023

Analyst/Software Engineer (ADF Developer/Tester)

UD Trucks
02.2023 - Current

Analyst/Software Engineer (ADF Developer/Tester)

Volvo Mobil
03.2022 - 03.2023

Trainee

Capgemini
01.2022 - 03.2022

Consultant

Capgemini IT Solutions
04.2021 - 10.2022

Bachelor of Science - Civil Engineering

Sagi RamaKrishnam Raju Engineering College
06.2017 - 06.2022