Overall 15+ years of experience, including 8 years in data warehousing projects covering data analysis and data modelling, using OLAP and ETL tools such as the Snowflake cloud data warehouse, Informatica PowerCenter 10.x/9.x, Oracle 10g/9i, SQL, and Unix shell scripting.
Trained in Azure Data Engineering basics, with functional knowledge of its tools.
5 years of experience in Duck Creek rating engine, page manuscript, and form manuscript development.
Hands-on experience with star schema and dimensional modelling methodology.
Experience in bulk loading into Snowflake from external stages (AWS S3) and internal stages using the COPY command (a brief sketch follows this summary).
Loaded Snowflake tables from the internal stage using SnowSQL.
Used COPY, LIST, PUT, and GET commands to manage and validate internal stage files.
Used Snowpipe for continuous data ingestion from an S3 bucket.
Created Linux shell scripts to automate data loads.
Performed troubleshooting, analysis, and resolution of critical issues.
Designed and developed Informatica mappings to extract, transform, and load data into Oracle and SQL Server target tables.
Created workflows and tasks to schedule loads at the required frequency using the PowerCenter scheduler and cron jobs, and loaded the data into target tables.
Created email tasks to send auto-generated notifications on task failures.
Quick to ramp up on and provide production support for existing BI projects, DWH implementation activities, and data management operations.
Experienced in deploying Informatica objects (workflows, tasks, mappings, connections, and other ETL objects) and in Informatica server and WebLogic server maintenance.
Experienced in J2EE file deployments through the WebLogic server.
Expertise in handling typical technical issues to deliver the best performance and meet customer satisfaction.
Knowledge of installing and configuring the Informatica server with SQL Server and Oracle, and able to handle Informatica administrator tasks such as configuring DSNs, creating connection strings, copying and moving mappings and workflows, and creating folders.
Good understanding of Informatica Cloud and Amazon Web Services, with experience writing Unix shell scripts.
Thorough with the deployment process from DEV → QA → UAT → PROD; involved in unit- and system-level testing across different modules of the project.
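A minimal sketch of the shell-driven Snowflake bulk load summarized above, assuming a hypothetical SnowSQL connection name, database/schema, target table, file path, and support mailbox; it stages a local extract with PUT, validates the stage with LIST, and loads it with COPY INTO.

```bash
#!/bin/bash
# Sketch only: stage a local extract file and bulk-load it into a Snowflake table.
# Connection name, database objects, file path, and mail address are placeholders.

DATA_FILE=/data/exports/sales_extract.csv     # hypothetical extract file

if ! snowsql -c my_snowflake_conn -d EDW -s SALES -o exit_on_error=true -q "
      -- upload the extract to the table's internal stage (auto-compressed)
      PUT file://${DATA_FILE} @%FACT_SALES AUTO_COMPRESS=TRUE;
      -- check what landed in the stage before loading
      LIST @%FACT_SALES;
      -- bulk-load from the internal stage into the target table
      COPY INTO FACT_SALES
        FROM @%FACT_SALES
        FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
        ON_ERROR = 'ABORT_STATEMENT';"
then
    # auto-generated failure notification, as described in the summary above
    echo "Snowflake load failed for ${DATA_FILE}" | mailx -s "Snowflake load failure" dl-dwh-support@example.com
    exit 1
fi
```

Snowpipe covers the continuous-ingestion path from S3 mentioned above; this sketch shows only the batch COPY path.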
Overview
17 years of professional experience
4 years of post-secondary education
1 certification
Work History
Manager
CapGemini
02.2022 - Current
Creating new Informatica objects and implementing new business rules for new data loads.
Developing new parameter files for reusable code to automate Informatica objects (see the sketch after this list).
Monitoring PowerCenter workflows to minimize failures and drive performance improvements.
Designing and planning code movements.
Creating JIRA tickets to log daily work hours, code movements, and other processes, e.g., for the QA and UAT teams.
Performing thorough log analysis for both PowerCenter and IICS jobs, successfully improving performance.
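A minimal sketch, under assumed folder, workflow, session, and parameter names, of generating a PowerCenter parameter file from a shell script so that a reusable mapping can be re-pointed per run, as mentioned above.

```bash
#!/bin/bash
# Sketch only: write a fresh PowerCenter parameter file before each run.
# Folder, workflow, session, connection, and parameter names are placeholders.

PARAM_DIR=/informatica/param_files            # hypothetical parameter file location
LOAD_DATE=$(date +%Y-%m-%d)

cat > "${PARAM_DIR}/wf_daily_sales_load.param" <<EOF
[SALES_DWH.WF:wf_daily_sales_load.ST:s_m_load_sales]
\$\$LOAD_DATE=${LOAD_DATE}
\$DBConnection_SRC=ORA_SRC_SALES
\$DBConnection_TGT=ORA_EDW
\$InputFile_Sales=sales_${LOAD_DATE}.csv
EOF

echo "Parameter file generated for load date ${LOAD_DATE}"
```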
Tech Lead
Technosoft Global LLP
01.2020 - 02.2022
Understanding business requirements and preparing the BRD for detailed-level requirements.
Preparing Tspec design documents for developers covering the actual code changes.
Reviewing code change and migration documents for the support team.
Creating JIRA tickets against upstream systems for production bugs.
Senior Software Engineer
IDC Technologies Sol (I) Pvt. Ltd
05.2019 - 04.2020
SINGTEL is one of the leading telecom companies in Singapore and is enhancing its application through the CR FACTORY project, after which code needs to be migrated from the legacy system to the ALDM model.
Responsibilities
Prepared technical design documents and specifications for the HLIA (LLD) and design documents for the CRs to be delivered.
Developed mappings, workflows, and tasks using PowerCenter 10.1.1.
Wrote UNIX shell scripts for SQL reports and automation (see the sketch after this list).
Prepared test plans and test cases for Informatica and UNIX based on CR requirement documents.
Prepared release documents and code deployment plans.
Worked closely with clients to establish problem specifications and system designs
Listed all source tables, target tables, connection details, and parameters.
Converted Informatica transformation logic to SQL.
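A minimal sketch of the kind of UNIX shell script for SQL reports referred to above, assuming a hypothetical Oracle connection (password taken from an exported REPORT_PWD variable), report query, and mail recipient.

```bash
#!/bin/bash
# Sketch only: run a SQL*Plus report and mail the output.
# Connection string, query, and recipient are placeholders; REPORT_PWD is assumed exported.

REPORT=/tmp/daily_row_counts.txt

sqlplus -s report_user/"${REPORT_PWD}"@EDWDB <<EOF > "${REPORT}"
SET PAGESIZE 100 LINESIZE 200 FEEDBACK OFF
-- hypothetical reconciliation query: row counts per table
SELECT table_name, num_rows
FROM   user_tables
ORDER  BY table_name;
EXIT;
EOF

mailx -s "Daily EDW row count report" dl-etl-support@example.com < "${REPORT}"
```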
Tech Lead
Tech Mahindra Ltd
01.2017 - 08.2018
MasterCard runs a data lake on AWS that serves as a powerful data source for teams across the business to build data-driven products.
The goal was to continuously push data from Oracle to Snowflake.
Created complex mappings using Aggregator, Expression, Joiner, and Lookup transformations to transfer data from source to target.
Ensured batch jobs were scheduled and running properly.
Created UNIX scripts, scheduled as cron jobs, to automatically unschedule and reschedule workflows (see the sketch after this list).
Created UNIX scripts to send auto-generated emails for missing source files and workflow failures.
Performed configuration management to migrate Informatica mappings, sessions, and workflows from the development to test to production environments.
Worked with the performance DBA team to tune SQL for better performance.
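A minimal sketch of cron-driven workflow scheduling with pmcmd, in the spirit of the automation described above; the integration service, domain, folder, workflow name, and credential variables are placeholders.

```bash
#!/bin/bash
# Sketch only: unschedule and reschedule a PowerCenter workflow with pmcmd.
# Service, domain, folder, and workflow names are placeholders; credentials come from the environment.
# Example crontab entry (hypothetical):  0 22 * * * /home/etl/scripts/reschedule_wf.sh

INT_SVC=IS_EDW_PROD
DOMAIN=Domain_EDW
FOLDER=SALES_DWH
WORKFLOW=wf_daily_sales_load

# take the workflow off its schedule (e.g. ahead of a maintenance window)
pmcmd unscheduleworkflow -sv "${INT_SVC}" -d "${DOMAIN}" \
      -u "${INFA_USER}" -p "${INFA_PWD}" -f "${FOLDER}" "${WORKFLOW}"

# put it back on the schedule defined in Workflow Manager
pmcmd scheduleworkflow -sv "${INT_SVC}" -d "${DOMAIN}" \
      -u "${INFA_USER}" -p "${INFA_PWD}" -f "${FOLDER}" "${WORKFLOW}"
```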
Tech Lead
Tech Mahindra Ltd
08.2015 - 12.2016
AT&T, the world's largest telecommunications company, has various applications, among them Device Life Cycle (DLC) and Order Track (OT).
DLC deals with activation, deactivation, warranty, and lost-device data for devices purchased from AT&T, whereas OT deals with shipment data for devices ordered from AT&T.
Depending on the application front end and the data issues logged by users and customers, AT&T undertook quarterly development and sometimes immediate development.
The various sources involved included Apple, Samsung, Walmart, Target, Best Buy, etc.
AT&T collects RDBMS and flat-file data from the above sources and transfers it to the EDW.
Responsibilities
Acted as coordinator between the offshore ETL development and onshore teams for ETL development, use cases, gap analysis, and process flow documents.
Created several Informatica mappings across all EDW environments to pull data from sources such as SQL Server, Oracle, and flat files into Oracle tables and flat files, and monitored the workflows until successful execution.
Used the Source Analyzer and Warehouse Designer to import source and target database schemas, and the Mapping Designer to map sources to targets.
Created complex mappings using Aggregator, Expression, Joiner, and Lookup transformations to transfer data from source to target.
Created UNIX scripts, scheduled as cron jobs, to automatically unschedule and reschedule workflows.
Created UNIX scripts to send auto-generated emails for missing source files and workflow failures (see the sketch after this list).
Performed configuration management to migrate Informatica mappings, sessions, and workflows from the development to test to production environments.
Worked with the performance DBA team to tune SQL for better performance.
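A minimal sketch of the missing-source-file alert described above, with a hypothetical landing directory, file-name pattern, and distribution list.

```bash
#!/bin/bash
# Sketch only: verify the expected source file arrived and alert if it is missing.
# Directory, file-name pattern, and mail address are placeholders.

SRC_DIR=/data/inbound/att_dlc                  # hypothetical landing directory
EXPECTED_FILE="device_feed_$(date +%Y%m%d).dat"

if [ ! -s "${SRC_DIR}/${EXPECTED_FILE}" ]; then
    echo "Source file ${EXPECTED_FILE} not found or empty in ${SRC_DIR}" \
        | mailx -s "Missing source file: ${EXPECTED_FILE}" dl-etl-support@example.com
    exit 1
fi

echo "Source file ${EXPECTED_FILE} present; workflow can proceed."
```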
Tech Lead
Tech Mahindra Ltd
07.2014 - 08.2015
My AT&T App MDM Enhancement & Support
Analyzed the source systems for integrity issues in the data residing in various source systems and defined the same in Systems and Trust.
Involved in implementing the land process of loading the customer/product data set into Informatica MDM from various source systems.
Defined the base objects, staging tables, foreign key relationships, static lookups, dynamic lookups, queries, packages, and query groups.
Configured Address Doctor, which can cleanse worldwide address data, and enhanced it with some modifications during installation.
Worked on data cleansing and standardization using the cleanse functions in Informatica MDM.
IT Analyst
Tata Consultancy Services Limited
05.2012 - 06.2014
As part of ONEPMS, there is an increased focus on consolidating source systems to reduce IT costs, and this project is one of the first steps toward enhancing the process of creating loan products in a leasing source system.
The leasing data is loaded into the EDW data warehouse, and to support this new set of loan data, an enhancement needs to be made to the existing ETL code.
Responsibilities
Involved in business analysis and technical design sessions with business and technical staff to develop requirements documents and ETL design specifications.
Involved in dimensional modelling to design and develop the star schema (see the sketch after this list).
Designed and developed mappings, transformation logic, and processes in Informatica to implement business rules and standardize source data from multiple systems into the data warehouse.
Created Informatica mappings to load the data mart using transformations such as Aggregator, Filter, Router, Expression, Joiner, Lookup, Sequence Generator, and Update Strategy.
Defined the target load order plan and constraint-based loading to load data correctly into the different target tables.
Involved in unit and integration testing of mappings.
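A minimal sketch of a star-schema fragment, under illustrative table and column names rather than the project's actual model, showing why constraint-based loading populates dimension tables before the fact table that references them.

```bash
#!/bin/bash
# Sketch only: trimmed-down star-schema DDL run through SQL*Plus.
# Connection and object names are illustrative; DWH_PWD is assumed exported.

sqlplus -s dwh_user/"${DWH_PWD}"@EDWDB <<'EOF'
-- dimension table: one row per product
CREATE TABLE dim_product (
    product_key   NUMBER       PRIMARY KEY,
    product_code  VARCHAR2(30),
    product_name  VARCHAR2(100)
);

-- fact table: references the dimension, so the dimension must be loaded first
CREATE TABLE fact_lease_loan (
    lease_id      NUMBER,
    product_key   NUMBER REFERENCES dim_product (product_key),
    loan_amount   NUMBER(12,2),
    load_date     DATE
);
EXIT;
EOF
```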
Analyst Programmer
Atos Syntel
10.2008 - 01.2012
Worked as a Duck Creek developer for well-known insurance clients such as American Safety Insurance and Horace Mann Insurance Company.
Responsibilities
Prepared approach documents after discussing the SRS with business users and the BA team.
Discussed requirement discrepancies with the BA team and then with the client.
Experienced in Duck Creek architecture and its integration with other data warehouse and middleware systems.
Well experienced in rating engine calculation procedures, page manuscript design, and form manuscript development.
Experienced in debugging issues with Data Tester and the TSV debugger.
Hands-on knowledge of the SQL Server database.
Executed unit test cases and performed system testing.
Software Developer
Capgemini - TCube Solutions Pvt. Ltd
08.2008 - 10.2008
Duck Creek developer for clients such as Horace Mann Insurance Company.
Software Developer
Discoverture Solutions, A Mindtree Company
01.2007 - 06.2008
Duck Creek developer (Odisha) for clients such as Farm Family Insurance Company, Horace Mann Personal Auto Insurance, and Great American Insurance commercial lines insurance.
Education
Bachelor in Engineering - Information Technology
C. V. Raman College of Engineering
Bhubaneswar, Odisha
06.2002 - 05.2006
Skills
Informatica Power Center
Certification
Informatica IICS R38 CDI Certified Professional, 07-2022