DEEPTHI DHANANJAYA

Bengaluru

Summary

Solution-focused data warehousing specialist with 8+ years of experience in the IT sector. Technically savvy and proficient in data analysis, team support, and process improvement.

Overview

9 years of professional experience

Work History

Data Migration Developer

American Farmers
10.2021 - Current
  • Data Migration for the AFR Phase 1 project is a big-bang, mid-term migration of Customers, Policies, Membership (as Policies), Billing, Claims and Documents
  • The purpose is to decommission the BriteCore operation, replacing it with the Sapiens CoreSuite
  • The migration project follows decisions taken in the functional/implementation stream; during transformation, data is adapted to fit those functional decisions
  • Involved in building a data warehouse covering different aspects of the insurance domain (customer, billing, claims and policies) for the Dwelling Property, Farm & Ranch Umbrella, Farm Liability and Homeowners lines; the team's main process was to extract, transform and populate the various data marts
  • Coding, using K2View ETL:
  • From the simplified data structure in the predefined staging DB, produce XML matching the structure and requirements of the Customer, Billing and Policy web services
  • Execute the relevant web services
  • For Claims, transform the simplified data structure in the predefined staging DB to the ClaimsPro DB
  • Configure migration-related rules and questions in Sapiens Product Designer
  • Orchestration and monitoring of the load:
  • Add validation checks
  • Develop logs to manage failures during the load
  • Populate control tables and migration-list tables needed for the cross-relations between entities (e.g., Customer and Policy)
  • Replace legacy keys with Sapiens CoreSuite keys
  • Manage multi-threading during the load
  • Build the ETL workflow
  • Run iterative test loads, each time with a larger population and more entities, to identify failures and report them to the relevant owner for fixing; a separate ETL process per entity is recommended
  • Execute the ETL workflows to transform the data from the predefined Staging DB to Sapiens Core suite DB
  • Run Data integrity Checks within Sapiens Core suite
  • Build reconciliation processes to validate the Legacy source, to predefined Staging DB and to Sapiens Core suite DB
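
A minimal sketch of the kind of reconciliation query used between the predefined staging DB and the Sapiens CoreSuite DB, assuming hypothetical table names (stg_policy, core_policy); a real run would repeat this check per migrated entity:

    -- Hypothetical reconciliation check: compare row counts per entity between
    -- the staging DB and the target core-suite DB and flag any gap.
    SELECT s.entity_name,
           s.staging_count,
           c.core_count,
           s.staging_count - c.core_count AS missing_rows
    FROM   (SELECT 'POLICY' AS entity_name, COUNT(*) AS staging_count
            FROM   stg_policy) s
    JOIN   (SELECT 'POLICY' AS entity_name, COUNT(*) AS core_count
            FROM   core_policy) c
      ON   s.entity_name = c.entity_name
    WHERE  s.staging_count <> c.core_count;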

ETL Developer

SAPIENS TECHNOLOGIES, DATA MIGRATION TEAM
Bangalore
04.2020 - Current
  • Overall 8 years 10 months of experience in data warehouse development and migration using ETL tools such as Informatica, K2View and Talend across the Telecom, FMCG, Banking and Insurance domains
  • Experienced in the ETL process using Informatica PowerCenter and the K2View tool for transformations, including Designer, Workflow Manager, Workflow Monitor and Repository Manager
  • Worked on BI analytical tools such as Tableau for data analysis and report generation
  • End-to-end implementation and migration of data from Informatica 8.5 to Informatica 9.6
  • Experience in extraction, transformation and loading of data using Informatica, K2View and Talend from heterogeneous sources such as relational databases and flat files
  • Knowledge of data warehousing using data extraction, transformation and loading (ETL), and thorough knowledge of data warehouse concepts such as star schema, snowflake schema, dimension and fact tables
  • Involved in the design, development, unit testing and migration of individual mappings and update processes
  • Migrated data to ClaimsPro, PolicyPro and DocumentsPro
  • Used Type 1 and Type 2 SCD mappings to update slowly changing dimension tables (an equivalent SQL sketch follows this list)
  • Well versed in developing complex queries involving multiple join conditions, aggregation functions and analytical functions
  • Experienced with TOAD, Postgres, SQL Developer and SQL Server for source and target database activities
  • Good knowledge of SQL and Unix
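
A minimal SQL sketch of the Type 2 SCD pattern referenced above, assuming hypothetical dim_customer and stg_customer tables with address as the tracked attribute:

    -- Hypothetical SCD Type 2 pattern: expire the current dimension row when a
    -- tracked attribute changes, then insert a fresh current row for that key.
    UPDATE dim_customer d
    SET    d.expiry_date = CURRENT_DATE,
           d.is_current  = 'N'
    WHERE  d.is_current = 'Y'
    AND    EXISTS (SELECT 1
                   FROM   stg_customer s
                   WHERE  s.customer_id = d.customer_id
                   AND    s.address    <> d.address);

    INSERT INTO dim_customer (customer_id, address, effective_date, expiry_date, is_current)
    SELECT s.customer_id, s.address, CURRENT_DATE, DATE '9999-12-31', 'Y'
    FROM   stg_customer s
    WHERE  NOT EXISTS (SELECT 1
                       FROM   dim_customer d
                       WHERE  d.customer_id = s.customer_id
                       AND    d.is_current  = 'Y');

A Type 1 mapping would instead overwrite the changed attribute in place with a simple UPDATE, keeping no history.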

01.2020 - 05.2021
  • GEB & NAV (Navigators)
  • These are insurance domain projects; worked simultaneously on both migrations for the client
  • The migration project follows decisions taken in the functional/implementation stream; during transformation, data is adapted to fit those functional decisions
  • The major component developed and migrated here was Claims for a US-based company called Navigators
  • Actively involved in the migration of Claims from Sapiens CoreSuite to the client environments

ETL Developer

12.2017 - 01.2020
  • KDM, a Dutch landline and mobile telecommunications company with more than 33 million subscribers in the Netherlands, Germany, Belgium, France and Spain
  • The scope of this project is to maintain and support the client's business processes, enabling higher management to make better-informed decisions, forecast future market scenarios and track trends
  • Involved in building a data warehouse covering different aspects of the telecom business such as customer billing, service-call tracking, call details, number of products serviced, purchased and sold, customer feedback and emerging areas; the team's main process was to extract, transform and populate the various data marts
  • Created ETL mappings in Informatica that extract data from multiple sources such as Oracle and flat files, transform the data based on business requirements and load it into the data warehouse
  • Worked with Informatica Designer components: Source Analyzer, Warehouse Designer, Transformation Developer, Mapplet Designer and Mapping Designer
  • Experienced in implementing business rules by creating transformations (Expression, Aggregator, Unconnected and Connected Lookup, Router, Update Strategy, Filter, Joiner, Union) and developing mappings; an equivalent SQL upsert sketch follows this list
  • Extensively used TOAD for source and target database activities
  • Modified existing mappings for enhancements of new business requirements
  • Involved in the development and testing of individual data marts, Informatica mappings and update processes
  • Extensively used ETL to load data from a wide range of sources such as flat files (CSV, fixed-width or delimited)
  • Participated in project meetings to fully understand new business processes and identify any new data
  • Used slowly changing dimension process flows, Oracle 11g, SQL and TOAD
  • Created mappings to populate the target tables in an efficient manner
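
The Lookup plus Update Strategy pattern above amounts to an insert-or-update; a minimal SQL sketch of the equivalent logic, assuming hypothetical stg_customer_feedback and dw_customer_feedback tables:

    -- Hypothetical upsert equivalent of a Lookup + Update Strategy mapping:
    -- update the warehouse row when the business key already exists,
    -- otherwise insert a new row.
    MERGE INTO dw_customer_feedback t
    USING stg_customer_feedback s
    ON (t.feedback_id = s.feedback_id)
    WHEN MATCHED THEN
      UPDATE SET t.rating    = s.rating,
                 t.comments  = s.comments,
                 t.load_date = SYSDATE
    WHEN NOT MATCHED THEN
      INSERT (feedback_id, customer_id, rating, comments, load_date)
      VALUES (s.feedback_id, s.customer_id, s.rating, s.comments, SYSDATE);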

Data Migration Developer

HELIOS
08.2015 - 11.2017
  • HDFC Bank has various products such as Accounts & Deposits, Loans, Insurance and Premium Banking
  • This project is mainly to migrate the Accounts & Deposits module functionality
  • Accounts & Deposits covers different account types such as Savings, Salary, Current, Demat, Deposits, Rural and Safe Deposit Locker
  • Created mappings to migrate data from flat files and Oracle to the data warehouse, moving from Informatica version 8.x to version 9.x
  • Extensively used Informatica PowerCenter, an ETL tool, to extract, transform and load data from remote sources to the DW, and worked with various active and passive transformations
  • Developed complex Informatica mappings and tuned them for better performance
  • Made extensive use of SQL overrides, substituting them in place of multiple transformations; a sketch of this kind of override follows this list
  • Created complex mappings using Lookup, Aggregator and Router transformations to populate the target tables in an efficient manner
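
A minimal sketch of the kind of source-qualifier SQL override that replaces separate Joiner, Filter and Aggregator transformations, assuming hypothetical accounts, customers and transactions tables:

    -- Hypothetical SQL override: the join, filter and aggregation are pushed
    -- into one statement instead of separate transformations in the mapping.
    SELECT a.account_id,
           a.account_type,
           c.branch_code,
           SUM(t.txn_amount) AS total_deposits
    FROM   accounts a
    JOIN   customers c
      ON   c.customer_id = a.customer_id
    JOIN   transactions t
      ON   t.account_id = a.account_id
    WHERE  a.account_type IN ('SAVINGS', 'SALARY', 'CURRENT')
    AND    t.txn_type = 'DEPOSIT'
    GROUP BY a.account_id, a.account_type, c.branch_code;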

Data Management, ETL Developer and Business Analyst

10.2014 - 07.2015
  • Profile: Sales and Marketing
  • This project involved the development of a sales and marketing data warehouse for an FMCG-industry company
  • The scope of this project was to maintain the client's current and historical data and create intelligent data to support the client's business processes, enabling management to make better decisions, forecast future market scenarios and track trends
  • Worked on performance issues encountered during full loads and delta loads using Data Services (a delta-load filter sketch follows this list)
  • Performed source data assessment and identified the quality of the source data
  • Created Jobs, Workflows and Data Flows according to specifications and implemented the business logic.
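
A minimal sketch of the delta-load filter referenced above, assuming a hypothetical etl_load_control table that records the timestamp of the last successful run per source table:

    -- Hypothetical delta extract: pull only rows changed since the last
    -- successful load, so the job avoids re-reading the full source table.
    SELECT s.order_id,
           s.customer_id,
           s.order_amount,
           s.last_modified
    FROM   src_sales_orders s
    WHERE  s.last_modified > (SELECT MAX(c.last_load_ts)
                              FROM   etl_load_control c
                              WHERE  c.table_name = 'SRC_SALES_ORDERS');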

Education

Bachelor of Engineering - Information Science

NITTE MEENAKSHI INSTITUTE OF TECHNOLOGY
01.2013

Skills

  • Skills & Strengths
  • Organized, self-starting, dedicated and hardworking individual
  • Ability to handle multiple tasks and work independently as well as in a team
  • Strong analytical skills with an ability to manage multiple tasks under pressure
  • Recognized for reliability and "getting the job done" through persistence and a strong work ethic
  • Technical Skills
  • ETL Tools: Informatica PowerCenter 10/9.x/8.x, K2View, Talend
  • BI Tools: Tableau (basics)
  • Query Tools: SQL Developer, TOAD, Postgres, Microsoft SQL Server
  • Operating Systems: Windows XP/Vista/7/8.1, Linux (Fedora)
  • Languages: SQL, PHP
  • Databases: Oracle 11g/10g, MySQL
  • ETL, data warehousing, data modelling, snowflake schema
  • Data Aggregation Processes
  • Database Structures Expertise
  • Normalization Techniques
  • Agile Methodologies
  • Data Acquisitions
  • Requirements Gathering
  • Data Mapping
  • Verification and Testing
  • Data pipelines
  • SQL Programming
  • Data Warehousing Management
  • RDBMS
  • Staging tables

