GANESH VEMPARALA

Hyderabad

Summary

Results-driven Data Engineer with over 14 years of experience in application development and migration, specializing in programming languages, databases, and cloud technologies. Expertise includes Oracle SQL, PL/SQL, MS-SQL, PostgreSQL, Python, PySpark, and Azure services such as Azure Data Factory and Azure Databricks. Proven track record across all phases of the Software Development Life Cycle (SDLC), emphasizing data processing, workflow automation, and performance tuning in diverse database environments. Skilled in collaborating within Agile settings and implementing innovative solutions that enhance operational efficiency and support business objectives.

Overview

15 years of professional experience

1 Certification

Work History

Data Engineer

Concentrix Catalyst
05.2025 - Current
  • Frontline Care ITSM logs Atlas support tickets using the PIER ticket monitoring system; tickets created in PIER are stored in the NCI DB. As part of the Phase-I migration, the ticketing system is moving from PIER to ServiceNow (SNOW).
  • Responsibilities:
  • Migrated data from PIER to SNOW for modules such as Voice, Data, and Digits, among other applications, using ADF pipelines.
  • Worked with control flow activities such as Copy Data, Data Flow, Get Metadata, and ForEach, and transformations such as Union, Aggregate, Sort, and Exists.
  • Created new database objects required for the ServiceNow ticketing system.
  • Created and altered stored procedures per the low-level redesign requirements after understanding the existing PIER system, working closely with the application BA.
  • Involved in database table design in the new system.
  • Mapped columns from the source system to the SQL Server database.
  • Project: Frontline Care ITSM
  • Client: T-Mobile Telecommunications
  • Designation: Data Miner, Global Analytics
  • Role: Data Engineer
  • Technologies: SQL, ADF

Data Engineer

NSCC
04.2021 - 05.2025
  • Norfolk Southern CrewPro Cutover - railroad crew management and crew scheduling that provide real-time crew supply-and-demand matching, reduce employee qualification information upkeep, minimize vacation bidding time, ensure Guarantee Payments are awarded correctly, and simplify travel accommodation scheduling. NS railroad crew management was developed on the mainframe with a DB2 database. I worked on the migration team, migrating data from DB2 to the Oracle database for the Union Pacific Railroad (UPRR) project.
  • Responsibilities:
  • Worked on a solution that moves large volumes of data from the legacy CrewCall system to the new CrewPro system.
  • Worked on the CrewPro application redesign after understanding the existing CrewCall system, working closely with the application BA.
  • Migrated data from CrewCall to CrewPro for modules such as Employee, Employee History, Job History, and Crew Admin Controls.
  • Responsible for extracting, transforming, and loading data from different source systems into Azure data storage services using a combination of Azure Data Factory, T-SQL, and Spark SQL.
  • Worked with control flow activities such as Filter, ForEach, Set Variable, and Copy Data, and transformations such as Derived Column, Aggregate, Join, Sort, Lookup, and Conditional Split.
  • Created mapping data flows and implemented data transformations using Azure Data Factory.
  • Built pipelines to run the data flows and activities in ADF.
  • Developed pipelines in Azure Data Factory to fetch data from different sources, apply transformations, and load the results into Azure Synapse Analytics.
  • Scheduled pipelines and monitored data movement from source to destination.
  • Created stored procedures and scheduled them in the Azure environment.
  • Developed Spark applications using Spark SQL in Databricks to extract, transform, and aggregate data from multiple file formats.
  • Successfully completed the migration project, which involved large volumes of data and physical files for multiple districts, on aggressive timelines.
  • Project: NSCC
  • Client: Union Pacific Railroad (UPRR)
  • Designation: Data Miner, Global Analytics
  • Role: Data Engineer
  • Technologies: Oracle, DB2, ADF, ADB, REST Services, GIT

Oracle DB Developer

Sprint
08.2020 - 03.2021
  • RMS (Retail Management System) is a 'Point of Sale' application used by Sprint PCS to provide telecom services to customers in the USA. The application has two main interfaces: Back Office and Front Office.
  • Back Office - Sprint employees perform administrative operations such as adding/editing employee privileges, store details, registers, and inventory.
  • Front Office - Sprint employees perform customer-oriented operations such as activation/upgrade of customer devices and plans, and exchange of customer devices and plans.
  • Responsibilities:
  • As part of the legacy application migration, developed SQL code per the legacy data files.
  • Involved in database table design in the new system.
  • Developed activities and data flows and ran pipelines for the data migration in ADF.
  • Project: Sprint
  • Client: Sprint mobile Telecommunications
  • Designation: Senior Technical Specialist
  • Role: Oracle DB Developer
  • Technologies: Oracle, ADF, REST API

Technical Lead

Verizon
02.2020 - 07.2020
  • IntraDaPro is the order entry and order management system for Local Voice products, including VoIP and Long Distance (Opt1), and acts as a front end to the IXPlus billing system. It interfaces with over 40 upstream and downstream systems, including but not limited to PQ, OM+, Activation Manager, Prime Biller, NRM, NOES/CAMEO, OSS, IBRS, WIN, and PC. IntraDaPro housed the Local customer profile. CASPER, a part of IntraDaPro, is the database of record for VoIP Shared Trunk Groups and all VZB 911 network elements; it enables systematic migrations for VoIP solutions and Local translations for switch decommissions/migrations. IntraDaPro had the functional flow logic to support VoIP automation for all VoIP-enabled ebonded orders on Enterprise Ordering platforms.
  • Responsibilities:
  • Tracked Jira development tasks and assigned them to the team.
  • Provided technical support for the IntraDaPro application and fixed bugs.
  • Wrote complex SQL scripts as data solutions to identify affected processes.
  • Enhanced the design by implementing change requests.
  • Daily activities included attending AYS with the Project Director to present application status, DTP with the team, and issue tracking.
  • Project: IntraDaPro
  • Client: Verizon
  • Designation: Technical Lead
  • Role: Oracle and Java Developer
  • Technologies: Oracle, Python

TOMS Developer

NetCracker
11.2017 - 02.2020
  • Maxis is a leading telecommunications operator in Malaysia offering mobile services, fixed-line services, and international gateway services. Maxis Berhad is the only integrated communications service provider and the first operator to launch 4G LTE in Malaysia. Operational Support System (OSS) services were implemented for Maxis using NetCracker's TOMS product. The purpose of this project was to provide these services by configuring resource inventory and implementing integration modules, and to maintain the device library by configuring the latest devices and updating existing ones in the central repository for all vendors.
  • Responsibilities:
  • Enhanced the design by implementing change requests.
  • Day-to-day activities included L3 development, implementing CR designs, fixing bugs, and providing post-production support.
  • Developed integration modules to implement data reconciliation between NMS/EMS systems and the NetCracker system.
  • Participated in design discussions with the BA for project enhancements.
  • Responsible for developing several modules using Python, SQL, and PL/SQL per business requirements within committed timelines.
  • Involved in database design and in creating functions, stored procedures, and views to access the database.
  • Provided complex SQL scripts as data solutions to identify affected processes.
  • Worked on RESTful web services integrated with third-party systems.
  • Followed Agile/Scrum methodology for incremental software development using Jira.
  • Project: Maxis
  • Client: Maxis Telecommunications
  • Designation: Technical Analyst
  • Role: TOMS Developer
  • Technologies: Oracle, Python, Rest API

Sr. Software Engineer

DHL
09.2016 - 10.2017
  • The purpose of this project was to develop a graphical user interface on top of the relational database of the rating engine built by NetCracker for DHL Express. The new rating engine, named Global Rating Engine (GRE), replaces an existing engine that is more than 30 years old, obsolete, and hard to maintain. The solution was built using NetCracker's technology; GRE consists of the RBM and TOMS products.
  • Responsibilities:
  • Provided an online/real-time quote mechanism to external DHL client applications.
  • Rated batch shipment transactions from the legacy system and returned the results to it.
  • Enhanced the design by implementing change requests.
  • Implemented event file processing, batch rating, and pricing calculations in the RBM configuration.
  • Configured Pricing Identifiers (PIDs), zones, rates, product lines, discounts, S&S, and special lanes using the DHL TOMS interface.
  • Day-to-day activities included L3 development, implementing CR designs, fixing bugs, and providing post-production support.
  • Project: Global Rating Engine (GRE)
  • Client: DHL
  • Designation: Sr. Software Engineer
  • Role: RBM and DB Developer
  • Technologies: Oracle, Python, Rest API

Developer

Telefonica
05.2016 - 08.2016
  • Telefonica is a telecom company operating in Central America. The TFN CAM Solution Delivery SOW1 system is used by Telefonica to track the activation of services and service orders.
  • Responsibilities:
  • Data migration followed a big-bang approach: all business entities, domains, and data sources were migrated from the source system(s) into TOMS/RBM during a single cutover deployment window.
  • The data migration process was implemented as a standard ETL (Extract, Transform, Load) process.
  • Based on the data migration strategy and process, migrated legacy source system data of the Telefonica Central America (TFN CAM) data sources to the NetCracker (TOMS) solution.
  • Implemented the data migration process using migration strategy documents such as the IDB structure document, the Fallout Dictionary document, and the core configuration mapping document.
  • Loaded data into an intermediate database for validation, transformation, and cleansing before loading into the production system, and generated audit reports.
  • Worked on assigned TMS tickets daily, which were used for internal monitoring of implemented tasks.
  • Wrote validation queries to filter incorrect data into fallout reports.
  • Wrote transformations to load source data as specified by the customer by implementing the transformation logic.
  • Project: TFN CAM Solution Delivery SOW1
  • Client: Telefonica
  • Designation: Sr. Software Engineer
  • Role: Developer
  • Technologies: Oracle, Python, Rest API

Developer / Application support

Sprint
02.2015 - 04.2016
  • Sprint is a telecom company in the USA. The NRM-Sprint system, an OSS (Operational Support System), is used by Sprint to track the activation of services and service orders.
  • Responsibilities:
  • Understood Sprint Network Resource Management.
  • Provided provisioning support for the Samsung, Alcatel-Lucent, and Ericsson vendors for Sprint.
  • Worked on different backhaul types, such as Ethernet and Microwave, for vendors including Samsung, ALU, and Ericsson.
  • Wrote validation code for various business requirements.
  • Wrote SQL queries for various reporting requirements.
  • Provided first responses for new tickets and assessed ticket category and severity in accordance with SLA definitions.
  • Analysed reported Defect/Incident/Investigation Request/Info Request tickets and provided the root cause and a workaround until a permanent fix was delivered.
  • Tested and reproduced reported problems in the test environment.
  • Generated an SLA report and a priority list of tickets daily.
  • Published investigation results to the CSA support team in the respective project mail threads and added the same to the tickets as internal comments.
  • Handled customer communication for all project tickets.
  • Met and maintained SLAs for all project modules.
  • Project: NRM-Sprint
  • Client: Sprint
  • Designation: Sr. Software Engineer
  • Role: Developer / Application support
  • Technologies: Oracle, Python, Rest API

Developer / Application support

Tech Mahindra
01.2011 - 02.2015
  • Classic is a component of the Service Fulfilment and Workflow stack in the telecom domain. It is part of the Andes program run by British Telecom (BT) and is an order entry tool used by BT to raise orders and manage delivery of services to clients. Classic is a thick-client product of Amdocs.
  • Responsibilities:
  • Performed requirement analysis, coding in Oracle PL/SQL, testing, production and environment support, and user documentation.
  • Involved in design (CFT) calls, identifying the impact of new changes, creating Low-Level Design (LLD) documents, and providing cost estimates.
  • Developed packages, procedures, functions, and triggers for data uplifts and change requests as required.
  • Built and maintained SQL scripts, indexes, reports, and queries for data analysis and extraction as required.
  • Coordinated with upstream and downstream teams.
  • Analysed data discrepancies in the production environment and created scripts to rectify issues.
  • Built PL/SQL scripts and provided them to ASG (the L2 and L3 support teams) for uplifting values in the live (production) environment.
  • Used PL/SQL scripts to identify and eliminate redundant and duplicate data.
  • Participated in all release/deployment activities, such as data refresh, release planning, and QG activities.
  • Created defect reports and attended defect summary (defect program) calls.
  • Conducted KT/training sessions for juniors on system development and project functionality.
  • Project: SOE – Classic
  • Client: British Telecom
  • Designation: Sr. Software Engineer
  • Role: Developer / Application support
  • Technologies: Oracle, Python, Rest API

Education

Bachelor of Technology - Computer Science

JNTUK affiliated College of Engineering
Tuni, Andhra Pradesh
01.2009

Skills

  • ETL processes and Data Migration
  • Database design and programming
  • Azure Data Factory and PySpark
  • Proficient in Azure Databricks
  • SQL and PL/SQL development
  • UNIX and Python Programming
  • Team collaboration and communication

Certification

  • Databricks Certified Data Engineer Associate
  • Academy accreditation - Databricks Fundamentals
  • Academy accreditation - Generative AI Fundamentals

Awards

  • IDU-level 'Pat on the Back' award in appreciation of work, Tech Mahindra, FY 11-12 (Q4)
  • IDU-level 'Valuable Team Player' award for excellence, Tech Mahindra, FY 12-13 (Q3)
  • 'You Have Made the Difference' award for excellence in DHL, NetCracker, FY 17-18
  • Cluster-level 'Star Performer' award for excellence, NetCracker, FY 18-19 (Q2)
  • 'You Have Made the Difference' award in appreciation of work in Maxis, NetCracker, FY 18-19 (Q4)