Seeking professional assignments in a reputed organization with a progressive environment that encourages new ideas, where I can apply my technical knowledge, communication skills, and leadership qualities toward a successful and challenging career.
11+ years of experience in the IT industry with diversified skills across the Banking, Telecom, and Healthcare domains. Motivated engineer recognized for strong critical thinking and problem-solving abilities, coupled with a successful track record in the industry. Dedicated to offering innovative solutions that eliminate legacy issues and elevate performance metrics. Diligent creator of innovative workflows and exceptional final products.
English, Hindi, Nepali
Technologies: IICS, UNIX, SQL Server
Project Overview:
KDRP – Keurig Dr Pepper. This project develops processes with IICS (Informatica Intelligent Cloud Services) to extract, transform, and load data from source to target, and to migrate the on-premises RPDM (RedPoint Data Management) system into IICS for the CDP (Customer Data Platform).
Roles & Responsibilities:
➢ Working in the KDRP project to develop mappings and taskflows in the cloud environment for data transformation using IICS (Informatica Intelligent Cloud Services), Unix, and the SQL Server database.
➢ Migrating the RPDM on-premises system to Informatica Cloud (IICS).
➢ Involved in developing taskflows using IICS and testing them against the SQL Server database.
➢ Produce clean, efficient code based on specifications.
➢ Discuss business requirements with clients.
➢ Involved in unit testing of taskflows and Unix shell scripts.
➢ Participate in daily meetings with the client to understand requirements and contribute ideas technically.
➢ Write and update technical documentation.
➢ Address standardization and cleansing of source data arriving from different source systems, loading it into a single database.
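The address standardization and cleansing step can be sketched as a minimal Unix example; the expansion rules and the sample address are illustrative placeholders, not rules from the actual project:

```shell
#!/bin/sh
# Minimal address-standardization sketch: uppercase, drop punctuation,
# squeeze whitespace, and expand common street-type abbreviations
# before the record is loaded into the single target database.
standardize() {
  printf '%s\n' "$1" | tr '[:lower:]' '[:upper:]' | tr -d '.' | awk '{
    $1 = $1   # rebuild the record with single spaces (also trims ends)
    for (i = 1; i <= NF; i++) {
      if ($i == "ST")       $i = "STREET"
      else if ($i == "AVE") $i = "AVENUE"
      else if ($i == "RD")  $i = "ROAD"
    }
    print
  }'
}

standardize "  12 main st.  "   # -> 12 MAIN STREET
```

In the real pipeline this kind of rule set would live in the IICS mapping logic rather than a shell function; the sketch only shows the shape of the transformation.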
Technologies: IICS, Informatica Power Center 10.4.1, UNIX, Big Query, Teradata
Project Overview:
CDW – Cloud Data Warehouse. This migration project moves the on-premises data warehouse footprint to the cloud (GCP) for Charles Schwab.
Roles & Responsibilities:
➢ Working in the CDW migration project to convert the on-premises system to the cloud environment for data transformation using IICS (Informatica Intelligent Cloud Services), Unix, GCP (Google Cloud Platform – BigQuery), Teradata, and Informatica PowerCenter.
➢ Converting BTEQ scripts into BigQuery scripts with the help of the Datametica team and embedding them in Unix shell scripts.
➢ Involved in developing taskflows using IICS and testing them by loading data into BigQuery tables.
➢ Involved in unit testing of taskflows and Unix shell scripts.
➢ Handling a team of 4 and providing technical assistance to meet the sprint goals.
➢ Fixing issues faced by team members and helping them go the extra mile.
➢ Participate in daily meetings with the client to understand requirements and contribute ideas technically.
➢ Giving project KT (knowledge transfer) to newcomers.
➢ Conducted multiple interviews for candidates from outside the company.
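The BTEQ-to-BigQuery conversion work can be sketched as a small sed-based rewrite embedded in a shell script; the rewrite rules and the sample script below are illustrative only (the real conversions used the Datametica team's tooling, and the output SQL would then be submitted via the BigQuery CLI):

```shell
#!/bin/sh
# Sketch: strip BTEQ session commands and normalize Teradata shorthand
# so the remaining SQL can be run against BigQuery.
bteq_to_bq() {
  sed -e '/^\.LOGON/d' -e '/^\.LOGOFF/d' -e '/^\.QUIT/d' \
      -e 's/^SEL /SELECT /' \
      -e 's/\bSAMPLE \([0-9]\{1,\}\)/LIMIT \1/'
}

bteq_to_bq <<'EOF'
.LOGON tdprod/etl_user,secret
SEL * FROM sales_db.orders SAMPLE 10;
.LOGOFF
EOF
```

A real conversion also has to map data types, date arithmetic, and QUALIFY clauses; the point of the wrapper approach is that the translated SQL slots back into the existing Unix shell scripts with minimal orchestration changes.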
Technologies: Informatica Power Center 9.6.1, UNIX, Oracle SQL, Tidal
Project Overview:
PDW – Pharmacy Data Warehouse houses all e-PIMS data and all POS data from all regions. PDW application processes build the data from the various regions into an integrated database. The PDW GL process transforms the functional data into General Ledger (GL) financial entries. PDW maintains the sub-ledger-level GL entries and posts entries to the One Link system. GL financial entries are built for both ePIMS and POS data.
The PDW Bridge feed processes both ePIMS and POS data to generate the data feed for downstream users. Downstream customers use the PDW data to build data marts and integrated data warehouse systems. The PDW BI reporting team developed several POS End-of-Day operational reports and sends daily reports to many users across all the regions (GA, MAS, CO, SC, NC, HI).
PDW also processes data received from upstream source systems like HTR (Catamaran), INO (Prescription Solutions), MIP (MedImpact), MDW, KPHC data, ECC, and the PERC source system. Data from these source systems is used in building the GL data, the EPS Bridge feed, the PERC data feed, and the POS Bridge feed.
PDW processes build the ePIMS financial data and the POS transactional sales & returns data.
Roles & Responsibilities:
➢ Worked on Informatica development and ETL methodology for data transformation using Informatica PowerCenter 9.6.1, Unix, Oracle SQL, and the Tidal scheduling tool.
➢ Running multiple Unix scripts ad hoc as per requirements and meeting the handshake with downstream and upstream systems.
➢ Strong analytical, problem-solving, organizational, and learning skills.
➢ Exposure to batch processes and Informatica web services.
➢ Documenting each change in detail after the fix has been made and monitoring how the changes affect the system.
➢ Incident management, problem management, and change management using tools such as ServiceNow.
➢ Involved in smooth deployments of production code.
➢ Coordinated within the team and served as the point of contact for all issue and bug fixes from offshore.
➢ Impact analysis with upstream as well as downstream systems, and setting SLAs at all levels.
➢ Debugging at the script level and circumventing issues while proceeding with a permanent fix.
➢ Responding to customers and clients at a fast pace whenever required.
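The handshake with upstream and downstream systems mentioned above is commonly implemented with marker (touch) files; a hypothetical sketch, with placeholder paths and a placeholder load command:

```shell
#!/bin/sh
# File-based handshake sketch: wait for the upstream "done" marker,
# run the load step, then publish our own marker for downstream jobs.
# Paths and the load command are hypothetical placeholders.
UPSTREAM_DONE=/tmp/handshake_demo/upstream.done
OUR_DONE=/tmp/handshake_demo/pdw_load.done

mkdir -p /tmp/handshake_demo
: > "$UPSTREAM_DONE"          # simulate the upstream job finishing (demo only)

wait_for_marker() {
  i=0
  while [ ! -f "$1" ]; do
    i=$((i + 1))
    [ "$i" -ge 10 ] && { echo "timed out waiting for $1" >&2; return 1; }
    sleep 1
  done
}

wait_for_marker "$UPSTREAM_DONE" || exit 1
echo "running load step..."   # placeholder for the real ETL command
: > "$OUR_DONE"               # signal downstream consumers we are done
```

In production, a scheduler such as Tidal typically owns these dependencies, but ad hoc reruns still honor the same marker-file contract so downstream jobs do not read partial data.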
Technologies: Informatica Power Center 9.6.1, UNIX, Teradata, UC4
Project Overview:
KPN Netherlands provides its customers with many promotional offers. Customers can be contacted through various media such as phone, email, and SMS. Often a customer instructs KPN to block these offers, but because these instructions are given to a single party, such as a call center or NBR, other parties may miss them. Sometimes instructions are violated because the contacted party does not have the latest instruction. CPR aims to consolidate every customer instruction and build a single repository, thereby providing a single version of truth.
The primary objective is to ensure that KPN is fully compliant with current regulations around opt-out and restriction registers and that the following processes are in place:
➢ Correct and timely registration, processing, and use of permissions in outbound campaigns.
➢ Define and implement a uniform Permissions Policy throughout.
➢ Timely signaling and rapid resolution of compliance incidents.
➢ Achieve a high grade of process automation.
➢ To provide and implement a solution to support dynamic nature of Permissions.
CPR (Central Permission Register) is critical from a financial aspect. It registers permissions (opt-in) and restrictions (opt-out) according to privacy law. When CPR is not available, the customer restrictions (opt-out) will not be known for campaign activities, and KPN will be at risk of severe penalties: for every complaint, a fine of 0.5 million euros is charged. The file must be processed every day without fail, or at the latest by the next day, as a delay could cause financial loss.
CPR follows a stovepipe architecture. CPR receives source data on two interfaces: batch processing and web services. CPR loads incremental data and does not handle full dumps. If there is any issue with a record, it is backtracked: the record is pulled out, sent back to the source for correction, and loaded the next day.
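The backtracking behavior described above amounts to splitting each incremental feed into accepted records and rejects that are returned to the source; a sketch, where the pipe-delimited layout and the validation rule are illustrative only:

```shell
#!/bin/sh
# Sketch: split a pipe-delimited incremental permission feed into
# accepted records and rejects to be sent back to the source system.
# A record is rejected here if its permission flag is neither IN nor OUT
# (a made-up rule standing in for the real CPR validations).
split_feed() {
  awk -F'|' '{
    if ($3 == "IN" || $3 == "OUT")
      print > "accepted.dat"
    else
      print > "rejected.dat"   # returned to source, reloaded next day
  }'
}

cd /tmp && split_feed <<'EOF'
1001|EMAIL|IN
1002|SMS|OUT
1003|PHONE|??
EOF
```

The accepted file feeds the normal incremental load, while the reject file is the "backtracked" set that goes back to the source for correction.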
Roles & Responsibilities:
➢ Experienced in the enhancement project and ETL methodology for data transformation using Informatica PowerCenter 9.6.1.
➢ Strong analytical, problem-solving, organizational, and learning skills.
➢ Exposure to batch processes and Informatica web services.
➢ Documenting each change in detail after the fix has been made and monitoring how the changes affect the system.
➢ Incident management, problem management, and change management using tools such as ServiceNow.
➢ Involved in smooth deployments of production code.
➢ Coordinated within the team and served as the point of contact for all issue and bug fixes from offshore.
➢ Impact analysis with upstream as well as downstream systems, and setting SLAs at all levels.
➢ Debugging at the script level and circumventing issues while proceeding with a permanent fix.
➢ Responding to customers and clients at a fast pace whenever required.
➢ Giving KT to new joiners/team members and helping them with any issues in the application.
Technologies: Informatica Power Center, UNIX, MySQL
Project Overview:
Singapore & Hong Kong project for Bank Julius Baer & Co. Ltd in the BFS domain. Here, the Operational Data Store (ODS) EOD holds end-of-day and historical data that is used mainly for financial and management reporting. The ETL process is responsible for populating the ODS tables from the staging tables: it listens on the queue, fetches the data, and transforms it into the form required by the ODS tables in both RT and EOD. The PowerCenter ETL picks up messages from the queue and writes them to ODS staging in real time. A separate queue is maintained for SG and HK.
The static and master tables as well as the transactional and positional tables are populated using two approaches: a message-based approach for RT, and a file-based approach for the batch process, where data is loaded into the staging table based on the file received for every COB (close of business).
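The two load approaches above (message-based RT vs file-based batch) amount to a dispatch on the input type; a hypothetical sketch, where the prefixes and messages are illustrative only:

```shell
#!/bin/sh
# Hypothetical dispatch between the RT (message) and EOD (file) load paths.
# "msg:" / "file:" prefixes stand in for the real queue vs COB-file inputs.
route_load() {
  case "$1" in
    msg:*)  echo "RT path: write payload '${1#msg:}' to ODS staging" ;;
    file:*) echo "EOD path: bulk-load COB file '${1#file:}' to staging" ;;
    *)      echo "unknown input: $1" >&2; return 1 ;;
  esac
}

route_load "msg:TRADE-42"
route_load "file:positions_20240131.dat"
```

In the actual system the RT path is driven by PowerCenter reading from the per-region (SG/HK) queue, while the batch path is triggered by the arrival of each COB file.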
Roles & Responsibilities:
➢ Involved in designing mappings, creating workflows, and providing support with Informatica 9 and above.
➢ Exposure to end-to-end support, from requirement analysis through system study, design, coding, unit testing, debugging, and implementation.
➢ Good at using the Debugger wizard to find bottlenecks in mappings.
➢ Skills in data analysis, data requirement analysis, and data mapping for ETL processes.
➢ Excellent communication, documentation, and presentation skills.
➢ Attended code walkthroughs with developers/SMEs to discuss and review changes and their impact on the system.
Technologies: Informatica Power Center 9.6.1, UNIX, DB2, Control M
Project Overview:
Worked with an American Express client in Marketing and Banking Technologies, in the BNFS Cards and Payments – 1 banking domain.
DQME – DATA QUALITY AND MATCHING ENVIRONMENT
Provides the capability to perform advanced internal matching and data quality functions using the Informatica Data Quality platform. This capability enables real-time and batch address cleansing and standardization for the US, JAPA, NA, and EMEA regions. Business applications of this capability provide identification and linkage processing for consumer and institutional strategic partner customers.
ICLIC – INSTITUTIONAL CUSTOMER LINKING AND IDENTIFICATION CAPABILITY.
ICLIC identifies and links an institutional customer's relationship with American Express. It also provides different relationship views of institutional customers, such as Global New Accounts, Global Establishment Accounts, and Global Commercial Accounts.
MNM\DM – MARKETING NACH MERGE\DATA MANAGEMENT.
The application collects financial and demographic data from all US-based accounts receivable (AR) systems, such as Triumph, Optima, Globestar, SROC, and WROC, along with corporate and foreign spend at domestic SEs, and acts as a warehouse for the same.
Roles & Responsibilities:
➢ Overall, 3 years of experience on a production support project for the client American Express in the BFS domain, with Informatica as the ETL tool, DB2 as the database, and the Unix operating system.
➢ Monitoring batch jobs, Informatica web services, and PowerExchange, and resolving complex issues related to job failures.
➢ Responsible for 8x5 monitoring and troubleshooting of issues pertaining to production environments and analyzing the root cause.
➢ Performing day-to-day operational support functions for applications, checking logs, and providing workarounds.
➢ Root cause analysis for job failures, ensuring data deliveries to the target.
➢ Attending meetings with the development team about upcoming changes.
➢ Involved in new deployments/migrations happening over weekends.
➢ Stopping and starting the web services during downtime and validating server health checks.
➢ Operational excellence through continuous analysis of the current system, suggesting improvement ideas to the team along with the benefits of the proposed system.
➢ Implementing changes that result from operational excellence, permanent issue fixes, or new business requirements.
➢ Documenting each change in detail after the fix has been made and monitoring how the changes affect the system.
➢ Incident management, problem management, and change management using tools such as ServiceNow.
➢ Experience with Control-M batch jobs and debugging in case of any failure.
➢ Debugging at the script level and circumventing issues while proceeding with a permanent fix.