Nishanth Chitumothu

DW ETL Professional
Hyderabad

Summary

✓ Over 12 years of experience in the analysis, design, development, and implementation of data warehouse applications using Oracle, Netezza, and Snowflake.

✓ Strong experience in designing and developing Business Intelligence solutions in data warehouse systems using IBM DataStage (8.1/8.5/9.1/11.5) as the ETL tool.

✓ Proficiency in database design, maintenance, and query development with Oracle, Netezza, and Snowflake.

✓ Worked with various parallel and sequence jobs in IBM DataStage using stages such as Netezza Connector, ODBC Connector, Lookup, Join, Transformer, Sequential File, Data Set, and Funnel.

✓ Hands-on experience building complex jobs from the stages above, implementing parallel processing and performance tuning with configuration files, schema files, the RCP option, and checkpoints.

✓ Experienced in job scheduling and monitoring, root-cause analysis of job failures, and fixes to cube processing and ETL processes.

✓ Experience in customer relationship management and addressing customer issues.

✓ Experience in developing and deploying DataStage jobs that source Excel, text, and CSV files.

✓ Implemented checkpoints and job configurations at the project level.

✓ Experience working in all phases of the SDLC.

✓ Thorough understanding of Business Intelligence and Data Warehousing Concepts.

✓ Developed effective working relationships with the client team to understand the support requirements, develop tactical and strategic plans to implement technology solutions, and effectively manage client expectations.

✓ Hands-on experience building custom SQL-based ETL processes and supporting large ETL integration processes.

✓ Experienced in full life-cycle development of data warehouses for large technology projects, including analysis, design, development, testing, maintenance, and documentation.

✓ Experience in the creation and customization of ETL jobs and load processes using IBM DataStage.

✓ Experience in integrating various data sources, from databases such as Oracle, SQL Server, Netezza, and Snowflake to formats such as flat files and CSV files.

✓ Experience in performance tuning (sources, mappings, targets, and jobs) and SQL tuning.

✓ Strong database experience with Oracle 11g/10g/9i, Toad 9.5, Netezza v 12.1, and Snowflake 3.43/4.12/5.3.

✓ Loaded data into Snowflake from AWS S3 buckets using Snowpipe (see the sketch at the end of this summary).

✓ Parsed JSON/XML data in Snowflake into variant tables.

✓ Created Streams and Tasks to populate JSON data into flat tables in Snowflake.

✓ Migrated existing stored procedures to Snowflake stored procedures.

✓ Experience with the Snowflake cloud data platform, including hands-on development with SnowSQL, Snowpipe, and Snowflake Tasks and Streams.

✓ Worked on deployments in the Snowflake environment.

✓ Experience with performance tuning in large-scale data warehouses and identifying bottlenecks.

✓ Managed the onsite-offshore model, delegating work and responsibilities to offshore resources and monitoring the work being carried out.

✓ Excellent analytical and logical programming skills, a good conceptual understanding, and excellent presentation and interpersonal skills, with a strong desire to achieve specified goals.

✓ Professionally trained by Infosys in 2010 in the Microsoft Business Intelligence suite: ETL (SSIS), report design (SSRS), and multidimensional cube development (SSAS). Completed the Infosys Mysore training with a CGPA of 4.94 out of 5.

✓ Professionally trained by Infosys in 2013 in the SAS Business Intelligence suite for ETL, report, and cube development.
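
As a hedged illustration of the Snowpipe pattern above, here is a minimal sketch in Snowflake SQL; every object name and the S3 path are hypothetical examples, not objects from any of the projects below:

    -- Minimal sketch: continuous JSON ingestion from S3 via Snowpipe.
    -- All object names and the S3 path are hypothetical.
    CREATE TABLE raw_orders (payload VARIANT);        -- landing table for raw JSON

    CREATE STAGE s3_orders_stage
      URL = 's3://example-bucket/orders/'             -- hypothetical bucket/path
      CREDENTIALS = (AWS_KEY_ID = '...' AWS_SECRET_KEY = '...')
      FILE_FORMAT = (TYPE = 'JSON');

    CREATE PIPE orders_pipe AUTO_INGEST = TRUE AS     -- AUTO_INGEST relies on S3 event notifications
      COPY INTO raw_orders
      FROM @s3_orders_stage;

    -- JSON attributes can then be queried straight from the VARIANT column:
    SELECT payload:orderId::STRING AS order_id FROM raw_orders;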

Overview

12 years of professional experience

6 years of post-secondary education

Work History

Team Leader

Aptude Software Solutions Pvt Ltd
Hyderabad
2017.03 - Current

Project: Hub Group

Project Name: TDW and EDW projects

Client: Hub Group Transportation, Logistics and Supply Chain Solutions, USA.

Team size: 16

Environment: Windows, Unix


Project Objective:

Hub Group began as a third-party transportation company, handling orders and shipments for vendors such as Amazon. It later grew into a logistics and transportation company by acquiring the logistics firm Unyson. Hub Group initially had a data warehouse (TDW), built by performing ETL (Extract, Transform, and Load) with SSIS from source systems such as OTM, TPS, and INFOTRAK. Later, Hub Group planned to remove the loopholes in the warehouse design and schedule, and the blockages in the existing MS SQL Server platform, by moving to a renewed version of the TDW data warehouse called EDW (Enterprise Data Warehouse), built primarily with DataStage, Snowflake, and Netezza.


Solution:

Aptude was partially involved in the development of the EDW, its primary task being support of both data warehouses, TDW and EDW. The migration, the development of new jobs and queries in DataStage, and the exploration of Snowflake to integrate the EDW's ETL were the main parts of the EDW project.


Role / Responsibilities:

Ø Parsed JSON data loaded via Snowpipe into a variant table in Snowflake.

Ø Created Streams and Tasks to populate JSON data into flat tables in Snowflake (see the sketch after this list).

Ø Migrated existing stored procedures to Snowflake stored procedures.

Ø Performed data analysis and documentation.

Ø Participated in PRD reviews, business-impact assessments, feedback, and final approvals.

Ø Collaborated and communicated in an Agile Scrum development environment.

Ø Performed SQL tuning and optimization of current applications.

Ø Collaborated with business analysts to discuss issues in interpreting the requirements.

Ø Performed integration and system testing and coordinated UAT testing.

Ø Prepared and executed unit test cases and documented the execution results.

Ø Participated in testing-effort estimation sessions and provided timely feedback on the progress of testing activity.

Ø Created drill-down, sub-, linked, snapshot, and drill-through reports.

Ø Worked on data design and JSON data modeling; developed schemas and implemented test data.

Ø Understood the business process and the design of the ETL flow in DataStage.

Ø Coordinated with business users to gather specific requirements.

Ø Developed DataStage jobs from scratch and became well versed in the OTM application, one of the sources.

Ø Diagnosed ETL and database issues, performed root-cause analysis, and recommended corrective actions to management.

Ø Worked with the project team to support the design, development, implementation, monitoring, and maintenance of new ETL programs.

Ø Validated tests by cross-checking back-end data on Netezza and Snowflake with SQL queries, covering verification of data mapping, data integration, and data validation across databases, tables, indexes, keys, and triggers, plus data-consistency checks.

Ø Executed system quality-assurance activities on complex n-tier systems to ensure quality control of test-plan design, test execution, defect tracking, and implementation plans.

Ø Used bug-tracking tools such as JIRA and Cherwell.

Ø Designed LLD documents and unit test cases for the project's jobs and was involved in HLD design.

Ø Logged issues arising during the development phase.

Ø Debugged issues raised in the CIT and SIT phases.

Ø Produced data reports giving an overview of Hub Group orders and shipments using IBM DataStage.

Ø Coordinated/facilitated product chartering, story mapping, sprint/release planning meetings, and daily Scrum stand-ups, and prepared weekly/monthly metrics.

Ø Prepared and enhanced test cases according to the functionality for manual testing and test execution (UTC and CIT).

Ø Since December 2021, working with Azure Data Factory (pipelines, data flows, scheduling), Synapse Analytics, Azure Data Warehouse, Azure SQL Server, and Azure datasets.
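
The Streams-and-Tasks pattern referenced in the list above can be sketched as follows, continuing the hypothetical raw_orders landing table from the summary sketch; the object names, JSON attributes, warehouse name, and schedule are all illustrative assumptions:

    -- Minimal sketch: a Stream tracks newly ingested VARIANT rows,
    -- and a Task periodically flattens them into a relational table.
    CREATE STREAM raw_orders_stream ON TABLE raw_orders;

    CREATE TABLE orders_flat (order_id STRING, status STRING, amount NUMBER);

    CREATE TASK flatten_orders
      WAREHOUSE = etl_wh                  -- hypothetical warehouse
      SCHEDULE  = '5 MINUTE'
    WHEN SYSTEM$STREAM_HAS_DATA('raw_orders_stream')
    AS
      INSERT INTO orders_flat
      SELECT payload:orderId::STRING,
             payload:status::STRING,
             payload:amount::NUMBER
      FROM raw_orders_stream;

    ALTER TASK flatten_orders RESUME;     -- tasks are created suspended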


Solution Environment: OTM, DataStage 9.1 and 11.5, Netezza v 12.1, Snowflake 3.43/4.12/5.3, UNIX, Cherwell, JIRA, Power BI Desktop, Power BI Cloud, Azure Data Factory

Data Specialist

IBM India Pvt Ltd
Hyderabad
2015.03 - 2017.03

Project: TEVA

Project Name: SAP ECC ETL

Client: TEVA Pharmaceutical, Israel

Team size: 6

Environment: Windows, Unix


Project Objective:

TEVA is a pharmaceutical company operating in more than 100 countries, and this project comprises many modules. The module we worked on has a centralized repository called ECC, maintained in SAP. To provide data to clients in their required table formats, an ETL step is applied before loading data from the source into SAP ECC, based on the client's requirements.

Solution:

By supplying transformed input data with the help of IBM DataStage, we reduced the effort the SAP side needed to spend on cleansing data and rejecting invalid records, as illustrated below.
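
A hedged illustration of this pre-load cleansing in SQL (the actual logic ran in DataStage jobs; every table, column, and rule below is a hypothetical example):

    -- Route malformed rows to a reject table ...
    INSERT INTO stg_material_reject
    SELECT *
    FROM   src_material
    WHERE  material_id IS NULL
       OR  plant_code IS NULL
       OR  NOT REGEXP_LIKE(plant_code, '^[A-Z0-9]{4}$');

    -- ... and pass only standardized, valid rows onward toward SAP ECC.
    INSERT INTO stg_material_clean
    SELECT material_id,
           UPPER(TRIM(plant_code)) AS plant_code,
           quantity
    FROM   src_material
    WHERE  material_id IS NOT NULL
      AND  REGEXP_LIKE(plant_code, '^[A-Z0-9]{4}$');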


Role / Responsibilities:

Ø Developed DataStage jobs from scratch.

Ø Coordinated with business users to gather specific requirements.

Ø Created ETL jobs to move data from source systems to data marts.

Ø Understood the business process and designed the ETL flow in DataStage.

Ø Prepared and executed unit test cases and documented the execution results.

Ø Performed peer code reviews of ETL and report development.

Ø Designed DWH architecture and ETL workflows.

Ø Interacted with the client and gathered requirements.

Ø Wrote SQL queries to retrieve report data from the data warehouse.

Ø Built procedures to develop and populate data warehouse relational databases, transform and cleanse data, and load it into data marts.

Ø Coordinated testing: wrote and executed test cases and scripts and created test scenarios.

Ø Worked on ad-hoc load requests from business users.

Ø Involved in business analysis and technical design sessions with business and technical staff to develop the requirements document and ETL specifications.

Ø Designed, developed, and tested processes for extracting data from legacy or production systems.

Ø Participated in performance, integration, and system testing.

Ø Logged issues arising during the development phase.

Ø Debugged issues raised in the CIT and SIT phases.

Ø Worked with TFS (Team Foundation Server) to check in/check out code.

Ø Involved in identifying the KPI list and sourcing data for ad-hoc report requests from the business.

Ø Monitored and collected performance data for DataStage jobs to maximize performance.

Ø Successfully deployed parallel and sequence jobs from the development to the production environment.

Ø Prepared and enhanced test cases according to the functionality for manual testing and test execution (UTC and CIT).

Ø Designed LLD documents and unit test cases for the project's jobs and was also involved in HLD design.


Solution Environment: DataStage 8.5, Oracle, JIRA

Associate IT Consultant

ITC Infotech
Hyderabad
2014.06 - 2015.01

Project: PRDS

Project Name: Unified Data Warehousing, PRDS Parity Pharmacy Claims

Client: United Health Group, Hyderabad

Team size: 4

Environment: Windows, Unix


Project Objective:

The Pacific Regional Data Store, also known as PRDS, is a data repository application that provides a single source of consolidated information for the dimension and fact tables that reconcile to the financials. In order to decommission PRDS, parity needs to be achieved within UDW.

Solution:

To support PRDS functionality within UDW, parity will be achieved through data acquisition, exploration, integration, enrichment/capability, and data-provisioning activities in UDW. Parity consists of acquiring, and making available via UDW, the missing data attributes, sources, domains, and history.

Current Process:

Currently, ORx sends the CHF7.0 feed, containing data from the E&I, M&R, and C&S lines of business, to Claim Highway (CHWY), but CHWY does not send the Part D (M&R) data to UDW due to a policy-number issue. In the current state, ORx sends the Claim History File data feed to the PRDS system in the CHF5.0 version. The PRDS system currently has 114 attributes, of which 77 are fed directly into the system from the CHF5 file and 37 are derived or lookup-table columns.

Proposed Process:

Source the Rx claims data feed directly from ORx to UDW, containing data from the E&I, M&R, and C&S lines of business, including 39 months of historical data. Replace the Rx claims feed from CHWY with this feed.

Benefits:

The format of the base table facilitates faster report generation, as data retrieval becomes much easier. A simple parity check between the feed and UDW is sketched below.
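
A parity check of this kind can be sketched as a source-to-target reconciliation; the table and column names below are hypothetical, not the project's actual objects:

    -- Compare per-load row counts between the staged ORx feed and the
    -- UDW claim fact, reporting any load cycle whose counts disagree.
    SELECT s.load_dt,
           s.src_cnt,
           COALESCE(t.tgt_cnt, 0) AS tgt_cnt,
           s.src_cnt - COALESCE(t.tgt_cnt, 0) AS diff
    FROM  (SELECT load_dt, COUNT(*) AS src_cnt
           FROM   stg_orx_claims
           GROUP  BY load_dt) s
    LEFT JOIN
          (SELECT load_dt, COUNT(*) AS tgt_cnt
           FROM   udw_rx_claim_fact
           GROUP  BY load_dt) t
      ON  s.load_dt = t.load_dt
    WHERE s.src_cnt <> COALESCE(t.tgt_cnt, 0);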


Role / Responsibilities:

Ø Understood the business process and designed the ETL flow in DataStage.

Ø Coordinated with business users to gather specific requirements.

Ø Extracted data from Excel, flat files, and Oracle, and applied business logic to load it into the central Oracle database.

Ø Developed test cases for unit testing of the mappings and was also involved in integration testing.

Ø Implemented error processing for DataStage jobs and SQL queries.

Ø Effectively documented all development work.

Ø Worked extensively on performance tuning of programs, ETL procedures, and processes.

Ø Used SQL functions and DataStage stages extensively to build business rules.

Ø Reviewed the ETL process design and ETL design documents with business analysts, end users, the data-quality team, and the enterprise test team.

Ø Used JIRA to file, resolve, and close defects.

Ø Designed LLD documents and unit test cases for the project's jobs and was also involved in HLD design.

Ø Logged issues arising during the development phase.

Ø Debugged issues raised in the CIT and SIT phases.

Ø Worked with the release management team on approvals of change controls and change requests, worked with the infrastructure team to keep deployments up to date, and provided production support as needed.

Ø Developed IBM DataStage jobs from scratch.

Ø Prepared and enhanced test cases according to the functionality for manual testing and test execution (UTC and CIT).


Solution Environment: DataStage 8.5, Teradata

Senior Systems Engineer

Infosys Limited
Hyderabad
2010.03 - 2014.05

Joined as a fresher recruited through the campus drive, completed the training, and took on responsibility as a team member.


Project: Standard Bank

Project Name: Datawarehouse Management System

Client: Stanbic Ibtc Plc, South Africa

Team size: 73

Environment: Windows, Unix


Standard Bank Group is one of the big four full-service South African banks, providing services inside and outside the African continent. The group operates in a range of banking and related financial services. It has wide representation spanning 17 African countries and 16 countries outside Africa, with an emerging-markets focus.

1. Equinox is a system implemented by Standard Bank Nigeria to provide the bank's customer-service and regulatory functions. Likewise, Bank Master (BM), Branch Accounting (BA), and Customer Information (CIF) are the systems used by Standard Bank Namibia. These two countries decided to replace their banking systems with Finacle. FODS (Finacle ODS) is the database that holds all the cleansed Finacle tables and is used to generate MOD extracts, from which reports are generated in MOD (report generation is out of scope for the project).

2. CACS is a solution, residing at Centre, that helps users (collectors) control the Arrears and Rehabilitations & Recoveries (R&R) areas effectively. The current CACS solution has three source (accounting) systems: Finacle, HP&L, and PRIME. ODS, residing in Country, will supply CACS with Finacle and HP&L data.

3. Prime is an outsourced, end-to-end credit-card processing solution for both issuers and acquirers that includes online authorization and switching, ATM management and device handling, fraud detection, and risk management. This module holds information about transactions made at merchants using POS.


Role / Responsibilities:

Ø Involved in understanding the measures and breaking user stories into technical stories.

Ø Communicated with the customer to understand functional requirements.

Ø Involved in creating column-level data quality/validation rules for the data warehouse and the data mart (see the sketch after this list).

Ø Involved in creating mapping documents from the source ingest file to the data warehouse.

Ø Created ETL jobs to move data from source systems to data marts.

Ø Prepared and executed unit test cases and documented the execution results.

Ø Worked on change requests against existing jobs and reports.

Ø Configured ETL jobs to load monthly aggregate tables and ran the month-end data load process.

Ø Deployed ETL packages and reports into the production environment.

Ø Ensured deliverables met quality standards.

Ø Understood the business process and designed the ETL flow in DataStage.

Ø Coordinated with business users to gather specific requirements.

Ø Involved in the high-level design of the project.

Ø Developed DataStage jobs from scratch.

Ø Prepared and enhanced test cases according to the functionality for manual testing and test execution (UTC and CIT).

Ø Designed LLD documents for the project's jobs and was also involved in HLD design.

Ø Logged issues arising during the development phase.

Ø Debugged issues raised in the CIT and SIT phases.
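
A column-level validation rule of the kind mentioned above might look like the following; the table, columns, and rules are hypothetical examples, not the project's actual rules:

    -- Flag staged account rows that violate simple column-level rules
    -- before they are loaded into the warehouse or the data mart.
    SELECT account_id,
           CASE
             WHEN branch_code IS NULL        THEN 'BRANCH_MISSING'
             WHEN LENGTH(currency_code) <> 3 THEN 'BAD_CURRENCY'
             WHEN open_dt > SYSDATE          THEN 'FUTURE_OPEN_DATE'
           END AS dq_rule_violated
    FROM   stg_fods_account
    WHERE  branch_code IS NULL
       OR  LENGTH(currency_code) <> 3
       OR  open_dt > SYSDATE;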


Solution Environment: DataStage 8.1/8.5, Oracle 11g.

Education

Bachelor of Science - Electrical, Electronics and Communications Engineering

P.V.P Siddhartha Institute of Technology
Vijayawada
2005.05 - 2009.05

Intermediate - Maths, Physics, Chemistry

Gowtham Junior College
Vijayawada
2003.05 - 2005.04

Skills

    IBM DataStage 8.1/8.5/9.1/11.5


Timeline

Team Leader

Aptude Software Solutions Pvt Ltd
2017.03 - Current

Data Specialist

IBM India Pvt Ltd
2015.03 - 2017.03

Associate IT Consultant

ITC Infotech
2014.06 - 2015.01

Senior Systems Engineer

Infosys Limited
2010.03 - 2014.05

Bachelor of Science - Electrical, Electronics and Communications Engineering

P.V.P Siddhartha Institute of Technology
2005.05 - 2009.05

Intermediate - Maths, Physics, Chemistry

Gowtham Junior College
2003.05 - 2005.04