
Durga Praveena Jakka

Hyderabad

Summary

Seasoned Data Engineer with a proven track record at Triple S, adept at leveraging Python and SQL to enhance data pipeline efficiency. Excelled in deploying Azure Data Factory solutions, leading to a 20% increase in process automation. Demonstrates strong analytical skills and a confident approach to managing complex data projects in the insurance domain.

Overview

9 years of professional experience
1 certification

Work History

Data Engineer

Triple S
Hyderabad
06.2022 - Current
  • Company Overview: Triple S is an insurance company in the USA; insurance data from member to provider needs to be pipelined for the front-end applications
  • Responsible for performing analysis of the Insurance Data Warehouse requirements as per the business mappings
  • Updated mapping rules for tables, modified SQL queries, SSIS packages, and stored procedures, and loaded the data through Azure pipelines into Cosmos DB
  • Involved in analyzing data as per the requirements to define the to-do list for development
  • Performed data validations in tables and validated reports after loading the data
  • Performed regression validations before implementation and carried out the implementation in production
  • Reported ETL development and deployment status to clients and other stakeholders regularly
  • Updated user stories on the ADO board as per the requirements and led scrum activities in the team
  • Responsible for managing resources and delivering data and reports to clients

Data Engineer

Quantexa
Hyderabad
01.2021 - 05.2022
  • Company Overview: Quantexa is an investigation tool used to detect fraud or defaulting customers by analyzing data patterns obtained from back-end data marts
  • Analyzed incoming data to perform data modelling from the source using curated assurance and data exploration methods
  • Developed ETL scripts to load data from source to the AWS S3 content zone through Hive wrappers for different patterns
  • Tuned queries to load large volumes of data in less time from the AWS S3 curated bucket to the content zone
  • Performed end-to-end testing and scenario-based testing (Insert, Delete, and Update) from source to target
  • Developed packages to deploy to production through Jenkins
  • Performed TVT after production deployment using Jupyter
  • Prepared the technical design document and other release documents
  • Provided round-the-clock support to the production support team during implementation

Analyst Programmer

Large Exposure
Melbourne
07.2018 - 12.2020
  • Company Overview: This project loads source files into the data warehouse according to the mapping sheet and Oracle queries
  • Involved in project-level analysis and documented the source and target mapping details for the design phase
  • Developed product setup rules to handle new products coming into the system
  • Developed SQL queries and shell scripts to load data into the data warehouse system as per the business requirement
  • Involved in end-to-end testing such as ST and SIT
  • Experienced in writing complex SQL queries to validate data according to the requirement
  • Delivered projects using agile methodology and was involved in end-to-end activities
  • Debugged and resolved loading failures by verifying the Unix job log files
  • Designed test case documents for ST and SIT testing, organized defect triage meetings, provided inputs to the offshore team, walked the client and support teams through the Design, MTP, and TSR documents, and supported the project during implementation

Java Developer

Bi Tester Automation tool
12.2015 - 10.2018
  • Company Overview: Bi Tester is an automation tool to test reports between source and target
  • Developed Java code to send the reports to the respective teams once validation is completed
  • Involved in developing the UI for validating source and target report files
  • Involved in debugging and fixing issues raised by the testing team during the SIT phase
  • Provided production support to the operations team during initial runs and presented KT sessions

Data Analyst

Instant Pricing Tool
09.2016 - 06.2018
  • Company Overview: The Instant Pricing Tool simplifies the process of pricing or re-pricing home loans and provides optimal pricing for home lending to the bank's customers
  • Involved in analyzing data for requirement gathering and system integration testing
  • Worked in JIRA to access and manage user stories through the different stages of the life cycle
  • Created test scenarios and test cases from user stories and executed them
  • Executed multiple jobs in Unix to load data into the test environment
  • Validated the data according to the requirement
  • Experienced in writing complex SQL queries to validate data according to the requirement
  • Raised and managed defects in ALM
  • Generated test execution reports and sent them to clients on a daily basis
  • Trained new team members on the execution process
  • Developed automation tools using Python, Java, and VBA to reduce the manual effort involved in testing the data
  • Involved in regression testing for new code drops

Education

B.Tech

Vasireddy Venkatadri Institute of Technology
Namburu
01.2015

Skills

  • Python
  • Shell Scripting
  • SQL
  • HTML
  • VBScript
  • Power BI
  • Toad
  • Jenkins
  • JIRA
  • Jupyter Notebook
  • WinSCP
  • GitHub
  • ETL
  • MySQL
  • Eclipse
  • Unix
  • qTest
  • Visual Studio
  • Git Bash
  • MobaXterm
  • Excel
  • Banking
  • Payments
  • Investments
  • Product Setup
  • Insurance Domain
  • AWS EC2
  • Azure
  • Azure Data Factory
  • AWS S3
  • Hadoop
  • Spark SQL

Certification

  • Certified in Python
  • DevOps
  • Agile
  • AZ-900
  • DP-900

Accomplishments

  • Received the Insta Award for Excellence three times from Infosys.
  • Received outstanding performance rating in 10/19 to 10/20 cycle and 10/21 to 10/22 cycle.
  • Received multiple appreciations from Direct client NAB Australia & Triple S.
  • Provided technical solutions that improved performance by 70% when migrating a vast data volume to the cloud, and delivered the project smoothly, on time, with zero defects.
  • Developed multiple tools using Python, Java, and VBA to automate repetitive tasks; some are used widely across the organization, saving effort by 20% and cost by 15%.

Primary Skills

  • ETL Development
  • Banking & Insurance Domain
  • Azure Data Factory
  • Oracle
  • UNIX
  • Control-M
  • Python
  • SQL
  • SSIS
  • Data warehousing
  • Data Marts

Visa Category

L2

Projects

Responsibilities for each project are detailed under Work History.

Triple S Insurance Domain, Hyderabad, 06.2022 - Current, Data Engineer
One of the leading insurance companies in the USA.
Technologies: Azure, ADF, DevOps, Python, GitHub, Jenkins, VDI, SSIS

Quantexa, Hyderabad, 01.2021 - 05.2022, Data Engineer
One of the leading banking firms in Australia.
Technologies: AWS, Python, GitHub, Jenkins, Rally, qTest, Spark, Hive, Visual Studio, SQL

Large Exposure, Melbourne, 07.2018 - 12.2020, Analyst Programmer
One of the leading banking firms in Australia.
Technologies: Data Integrator, Oracle, UNIX, Control-M, SQL, data warehouse

Instant Pricing Tool, 09.2016 - 06.2018, Data Analyst
One of the leading banking firms in Australia.
Technologies: Data Integrator, Oracle, UNIX, Control-M

Bi Tester Automation tool, 12.2015 - 10.2018, Java Developer
One of the leading banking firms in Australia.
Technologies: Java, JavaScript, Oracle

Role Designation

Data Engineer

Professional Snapshot

  • A passionate Data Engineer who is keen on learning new tools and technologies.
  • Established data warehouse analyst with 7+ years of experience in the banking and insurance domains.
  • Experienced in delivering and showcasing products to stakeholders using agile methodology.
  • Advanced SQL query knowledge spanning the complete ETL process, database concepts, and reporting technology.
  • Experience in building pipelines in Azure and AWS.
  • Skilled in designing tools and applications using Python, Java, VBA, and PySpark.
  • Able to adapt to a new work environment and collaborate with a team; attained cross-cultural communication skills while working in Melbourne for two years.
  • Excellent communication and interpersonal skills; involved in client interactions for scoping, understanding business requirements, effort estimation, and status reporting.
  • Excellent problem-solving skills with a strong technical background and good interpersonal skills.
  • Quick learner and excellent team player, with the ability to meet deadlines and work under pressure.

Timeline

Data Engineer

Triple S
06.2022 - Current

Data Engineer

Quantexa
01.2021 - 05.2022

Analyst Programmer

Large Exposure
07.2018 - 12.2020

Data Analyst

Instant Pricing Tool
09.2016 - 06.2018

Java Developer

Bi Tester Automation tool
12.2015 - 10.2018

B.Tech

Vasireddy Venkatadri Institute of Technology