Experienced IT leader with over 20 years of expertise in enterprise-wide IT solutions, specializing in Data Lake, cloud, and traditional projects. Proficient in AWS, Azure, ETL tools such as Informatica and Talend, and a range of databases. Successfully delivered projects in the Financial, Pharmaceutical, Insurance, Wealth Management, and Telecom sectors. Led diverse onshore and offshore teams and fostered collaboration across business groups, earning the trust of business partners in both traditional and cloud-based solutions. Provided guidance on design and architecture, driving project improvements and client satisfaction. Strong analytical, programming, and communication skills, with a record of process improvements that deliver stable, standardized platforms.
Overview
25 years of professional experience
5 Certifications
Work History
ETL Tech Lead / Architect / Delivery Lead
Accenture - BMS, UHC, Charles Schwab, RBC & TD Bank
10.2019 - Current
Spearheaded and implemented diverse projects across platforms and technologies for BMS, UHC, Charles Schwab, RBC, and TD Bank, building strong client confidence.
Led end-to-end Project Delivery, Design, Development, Planning, and Implementation efforts, coordinating onshore and offshore teams to ensure client satisfaction.
Collaborated with Business Partners and SMEs to consolidate landscapes, instilling confidence and encouraging continued collaboration on traditional and cloud-based solutions, notably on Azure and AWS platforms.
Reviewed project scope, requirements, architecture, and proof of concept (POC) design across Data Lake Projects, ensuring alignment with client needs.
Applied industry best practices and continuous self-development to drive project improvements and growth across client engagements.
Conducted technical walk-throughs/reviews with clients after each sprint to maintain alignment with requirements and coding standards.
Ensured coding standards adherence through meticulous code reviews, enhancing quality and alignment with client expectations.
Worked closely with the Data Model team to capture and align business requirements and logic in Source-to-Target Mapping documents critical to the development team's success (a minimal mapping-coverage check is sketched after this role).
Identified and implemented process, procedure, and control improvements, promoting stable and standardized development platforms.
Mentored junior developers, promoting a culture of continuous learning and skill enhancement.
Established coding standards for consistent quality and security measures to safeguard sensitive data.
Optimized resource allocation by prioritizing tasks against business needs and project timelines.
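Illustration (hypothetical): the Source-to-Target Mapping work above lends itself to lightweight automation. The sketch below checks that every required target column is covered by an STM row; the file name stm_mapping.csv, the column layout, and the DIM_CUSTOMER example are illustrative assumptions, not artifacts from these engagements.

```python
import csv
from collections import defaultdict

# Hypothetical STM layout: source_system, source_column, target_table, target_column, rule.
def load_stm(path):
    with open(path, newline="") as fh:
        return list(csv.DictReader(fh))

def unmapped_targets(stm_rows, required):
    """Return (table, column) pairs in `required` that no STM row maps to."""
    mapped = defaultdict(set)
    for row in stm_rows:
        mapped[row["target_table"]].add(row["target_column"])
    return {
        (table, col)
        for table, cols in required.items()
        for col in cols
        if col not in mapped[table]
    }

if __name__ == "__main__":
    # `required` would normally be generated from the physical data model.
    required = {"DIM_CUSTOMER": {"CUSTOMER_KEY", "CUSTOMER_NAME", "SOURCE_SYSTEM_CD"}}
    gaps = unmapped_targets(load_stm("stm_mapping.csv"), required)
    for table, col in sorted(gaps):
        print(f"Unmapped target column: {table}.{col}")
```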
ETL Tech Lead / Architect
Knowledgent Group -> PAREXEL
03.2018 - 09.2019
Led the Clinical Trials Operations and Research Azure Cloud Data Lake project, developing a dynamic ingestion framework in Azure to load data efficiently (the pattern is sketched after this role), significantly boosting project performance and generating substantial cost savings.
Implemented a dynamic framework with potential for company-wide utilization, enhancing code reusability and maintainability, thereby reducing development cycle time.
Collaborated seamlessly with diverse teams, including data modelers, business analysts, Quality teams, and Business Intelligence teams/SMEs.
Managed offshore-onsite engagement, fostering adoption and collaboration among a wide spectrum of colleagues and teams.
Hands-on experience with Azure Cloud Data Lake, Azure Data Factory, and SSIS.
Authored code fixes and enhancements for inclusion in future code releases and patches.
Built strong relationships with stakeholders, ensuring clear communication channels for project updates and progress reports.
Estimated work hours and tracked progress using Scrum methodology.
Collaborated with product managers to define technical requirements and develop innovative solutions for complex problems.
Designed and implemented scalable applications for data extraction and analysis.
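Illustration (hypothetical): a minimal sketch of the metadata-driven ingestion pattern behind the dynamic framework described above, written as a plain Python driver rather than the actual Azure Data Factory implementation. Source names, paths, and the metadata structure are assumptions; writing Parquet this way also requires pyarrow.

```python
from datetime import datetime, timezone
from pathlib import Path

import pandas as pd

# Illustrative metadata: in Azure Data Factory this would typically be a control
# table or JSON config that parameterizes a single generic pipeline.
SOURCES = [
    {"name": "site_visits", "path": "raw/site_visits.csv", "zone": "curated/site_visits"},
    {"name": "enrollment", "path": "raw/enrollment.csv", "zone": "curated/enrollment"},
]

def ingest(source: dict) -> None:
    """Load one source described by metadata and land it in the curated zone."""
    df = pd.read_csv(source["path"])
    df["load_ts"] = datetime.now(timezone.utc).isoformat()  # simple audit column
    Path(source["zone"]).mkdir(parents=True, exist_ok=True)
    df.to_parquet(f"{source['zone']}/part-000.parquet", index=False)  # needs pyarrow

if __name__ == "__main__":
    # Adding a feed means adding a metadata row, not writing a new pipeline.
    for src in SOURCES:
        ingest(src)
```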
Senior Talend AWS Developer / Architect
Knowledgent Group -> Regeneron Pharmaceuticals
02.2017 - 02.2018
Spearheaded Information ETL MDM Project on AWS Cloud Data Lake, catering to Regeneron RIT's need for rapid, integrated, and reliable data access.
Developed and tested an Information Management Framework, integrating diverse data capabilities to meet research needs effectively.
Evaluated AWS Data Lake components and ETL technologies including Talend, Athena, S3, EC2, and Redshift to construct a robust Research Data Lake with distinct data zones (the raw-zone-to-Redshift load step is sketched after this role).
Implemented key business use cases through data ingestion, cataloging, search, and provisioning of prioritized data sources.
Enabled critical Data Lake capabilities such as Data Ingestion, Data Discovery leveraging Data Catalog/Quick Data Integration, and Service-Based Architecture.
Established a foundational Enterprise Information Management Framework while addressing specific Research organization use cases.
Environment: AWS, Talend, Redshift & Data Lake infrastructure.
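Illustration (hypothetical): a sketch of the raw-zone-to-Redshift load step implied by the data zones above, using boto3 and a Redshift COPY statement. Bucket, prefix, schema, table, IAM role, and connection details are placeholders, and the actual project used Talend rather than hand-written Python.

```python
import boto3
import psycopg2

RAW_PREFIX = "raw/assays/"                 # hypothetical raw-zone prefix
BUCKET = "example-research-lake"           # placeholder bucket name
IAM_ROLE = "arn:aws:iam::123456789012:role/redshift-copy"  # placeholder role

def raw_files(bucket: str, prefix: str) -> list[str]:
    """Return object keys currently sitting in the raw zone."""
    s3 = boto3.client("s3")
    resp = s3.list_objects_v2(Bucket=bucket, Prefix=prefix)
    return [obj["Key"] for obj in resp.get("Contents", [])]

def copy_to_redshift(conn, key: str) -> None:
    """Issue a Redshift COPY for one staged file."""
    sql = f"""
        COPY research_stage.assays
        FROM 's3://{BUCKET}/{key}'
        IAM_ROLE '{IAM_ROLE}'
        FORMAT AS PARQUET;
    """
    with conn.cursor() as cur:
        cur.execute(sql)
    conn.commit()

if __name__ == "__main__":
    conn = psycopg2.connect("dbname=dev host=example.redshift.amazonaws.com port=5439 user=etl")
    for key in raw_files(BUCKET, RAW_PREFIX):
        copy_to_redshift(conn, key)
```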
ETL Lead / Architect
Knowledgent Group -> Akorn Pharmaceuticals
01.2016 - 08.2017
Led Revenue Contract Management System on AWS Cloud Data Lake project for Akorn pharmaceuticals, enhancing data management and analytics capabilities.
Reviewed project scope, requirements, architecture, POC design, and Talend development guidelines for AWS Data Lake Project.
Orchestrated the migration of data to the new revenue management system using AWS Cloud Computing components, including S3, EC2, and Amazon Redshift.
Spearheaded revenue management (RM) month-end reporting to ensure accuracy and compliance.
Established a flexible data management and analytics platform, extendable to diverse functional areas such as sales, marketing, and supply chain.
Environment: Talend, Informatica, AWS Redshift & Data Lake infrastructure.
Information ETL MDM Project, ETL Lead
Knowledgent Group -> John Hancock / Manulife Insurance Company
07.2014 - 12.2015
Managed end-to-end development life cycle for 17 source systems, establishing a scalable framework for additional system support and ensuring operational integrity across multiple projects and environments.
Conducted thorough analysis of source systems for data migration, integration, cleansing, and staging, ensuring data standardization and consistency.
Provided leadership in Informatica code development, collaborating with an Offshore Malaysian team to ensure client satisfaction.
Contributed to data modeling and design discussions, offering valuable insights and recommendations.
Developed and enhanced Informatica mappings for loading data into Dimension, Fact, and Aggregate Fact tables in the Enterprise Data Warehouse (EDW) and downstream Data Marts.
Implemented optimization techniques for performance tuning, enhancing overall system efficiency.
Conducted testing, knowledge transfer, and provided technical support, utilizing Informatica parameters for streamlined management.
Designed Informatica mappings for automatic error handling processes and audit log tables, enhancing data quality and traceability.
Coordinated closely with the MDM team, facilitating the generation of XML files for loading into the MDM database.
Developed UNIX shell scripts for job scheduling and Informatica pre and post mapping session processes, ensuring smooth workflow automation.
Conducted data profiling, cleansing, and analysis for the Customer and Product data domains using Informatica Data Quality (IDQ) capabilities (a simplified profiling sketch follows this role).
Participated in system integration testing (SIT) and user acceptance testing (UAT), ensuring alignment with business and technical requirements.
Created process flow and mapping/data flow diagrams based on business/process specifications, enhancing clarity and communication.
Led Proof of Concept (POC) initiatives during project phases, driving innovation and validation of key strategies.
Identified and implemented process/procedure/control improvements to enhance project efficiency and effectiveness.
Cultivated and maintained effective relationships with staff, clients, developers, and stakeholders.
Environment: Informatica PowerCenter, Oracle, SQL Server, Mainframes, DB2, Erwin R7, PL/SQL, Unix Shell Scripting, CAW Scheduler, and IBM MDM.
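Illustration (hypothetical): a simplified, pandas-based approximation of the column-level profiling performed with IDQ above (completeness, distinct counts, sample values). The customer_extract.csv file and its columns are assumptions, and this is not IDQ itself.

```python
import pandas as pd

def profile(df: pd.DataFrame) -> pd.DataFrame:
    """Column-level completeness/uniqueness statistics, roughly the kind of
    profile the IDQ work above would produce."""
    rows = []
    for col in df.columns:
        series = df[col]
        rows.append({
            "column": col,
            "non_null_pct": round(100 * series.notna().mean(), 2),
            "distinct_values": series.nunique(dropna=True),
            "sample_value": series.dropna().iloc[0] if series.notna().any() else None,
        })
    return pd.DataFrame(rows)

if __name__ == "__main__":
    # Hypothetical customer extract; file and column names are illustrative only.
    customers = pd.read_csv("customer_extract.csv")
    print(profile(customers).to_string(index=False))
```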
Senior ETL Informatica Developer / Data Warehouse Consultant / Data Analyst
Knowledgent Group -> Daiichi Sankyo
05.2013 - 06.2014
Spearheaded the Aggregate Spend 360 project, integrating spend-related transaction data into Daiichi Sankyo's EDW and AggregateSpend360 solution, ensuring Compliance reporting met Legislative requirements.
Analyzed compliance needs, providing deeper insights into spend data for Compliance reporting, auditing, and reconciliation.
Managed IMS to Symphony Health Systems (SHS) vendor data Conversion/Integration project, ensuring a seamless transition with minimal business impact, while guiding the Compliance team to optimal solution delivery.
Orchestrated end-to-end development life cycles, implementation, deployment, and quality assurance for multiple production projects.
Collaborated with functional teams to understand business requirements, creating BRDs and technical specifications.
Designed, developed, and implemented ETL Informatica mappings, data load, and validation processes using PL/SQL, SQL, and Unix Shell.
Documented source feed systems for one-to-one mapping, facilitating ETL routine development.
Analyzed the impact of data vendor transition on Data Warehouse facets including Retail sales, DDD, and Plantrak.
Mentored team members in data analysis, integration, and migration activities, while employing various data cleansing scripts and batch processes using Unix Shell Scripts.
Addressed production issues and validated data across staging, IDS, DWH, and Business Objects reporting layers, utilizing Informatica Data Quality (IDQ) for data validation, profiling, and developing processes for data cleansing and standardization.
Environment: Informatica Power Center 9.1.0, Netezza, Oracle 10G, SQL Server, UNIX, Informatica Data Quality 9.1.0, Tidal, Business Objects.
Led the design and implementation of the Global Market Logistics (GML) application, integrating logistics performance metrics across markets globally.
Integrated Consumption Data from IRI and AC Nielsen with sales data in the Data Warehouse, facilitating analytical environments for business users.
Enhanced and supported the Operational Data Store (ODS) for the Health Care Division, incorporating changes and enhancements to integrate processes.
Analyzed source systems, conducted performance tuning, system integration, testing, and data validations for new systems.
Developed Informatica mappings for error handling, audit log tables, and loading data into various tables within the Enterprise Data Warehouse (EDW) and downstream Data Marts (the audit-logging pattern is sketched after this role).
Leveraged Informatica Data Quality tool for data profiling, scorecard creation, and common data rule derivation.
Developed PL/SQL stored procedures, triggers, and UNIX shell scripts for job scheduling and pre/post-mapping session processes.
Created GML Data Model using ERWIN and set up production jobs for ETL batch scheduling using Informatica and Autosys.
Identified and implemented process/procedure/control improvements and maintained effective relationships with staff, clients, and developers.
Environment: Informatica Power Center 9.1.0, Netezza, Oracle 10G, UNIX, Informatica Data Quality 9.1.0, Data Stage, Autosys, Tidal, Cognos, Business Objects.
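Illustration (hypothetical): a minimal sketch of the audit-logging pattern behind the error-handling and audit-log mappings above. The real implementation used Informatica and PL/SQL against Oracle/Netezza; SQLite, the etl_audit_log table, and the job name here are placeholders.

```python
import sqlite3
from datetime import datetime, timezone

# Placeholder audit table definition.
DDL = """CREATE TABLE IF NOT EXISTS etl_audit_log (
    run_id INTEGER PRIMARY KEY AUTOINCREMENT,
    job_name TEXT, started_at TEXT, finished_at TEXT,
    rows_loaded INTEGER, status TEXT, error_msg TEXT)"""

def run_with_audit(conn, job_name, load_fn):
    """Record start/end time, row count, and failure detail for one load step."""
    conn.execute(DDL)
    started = datetime.now(timezone.utc).isoformat()
    try:
        rows = load_fn()
        conn.execute(
            "INSERT INTO etl_audit_log (job_name, started_at, finished_at, rows_loaded, status, error_msg) "
            "VALUES (?, ?, ?, ?, 'SUCCESS', NULL)",
            (job_name, started, datetime.now(timezone.utc).isoformat(), rows),
        )
    except Exception as exc:  # audit the failure, then re-raise
        conn.execute(
            "INSERT INTO etl_audit_log (job_name, started_at, finished_at, rows_loaded, status, error_msg) "
            "VALUES (?, ?, ?, 0, 'FAILED', ?)",
            (job_name, started, datetime.now(timezone.utc).isoformat(), str(exc)),
        )
        raise
    finally:
        conn.commit()

if __name__ == "__main__":
    conn = sqlite3.connect("etl_audit.db")
    run_with_audit(conn, "load_gml_shipments", lambda: 1250)  # stub load returning a row count
```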
Senior ETL Informatica Developer / ETL Architect
Morgan Stanley Smith Barney
01.2011 - 08.2011
Directed the development and deployment of the Enterprise Data Warehouse (GWIM) – Compliance Surveillance, serving as the primary repository for GWIM brokerage data.
Ensured data integrity and quality for Compliance Surveillance and AML, fostering intelligent correlation and knowledge sharing with downstream systems.
Provided multi-dimensional data views for Account Securities, Client Relationships, Business Sectors, and Transactions, enabling consolidated trade monitoring for Legal, Risk, and Compliance.
Designed an adaptable framework for integration with various information areas and business partners, utilizing Business Objects for business intelligence reporting.
Managed end-to-end development life cycles, overseeing BRD creation, technical specification conversion, and source-to-target mapping documentation.
Developed Informatica mappings for data loading into EDW and downstream DataMarts, employing optimization techniques for enhanced performance.
Conducted testing, knowledge transfer, and offered technical support, utilizing Informatica parameters and workflows for streamlined automation.
Implemented UNIX shell scripts for job scheduling and set up Production jobs for ETL batch scheduling.
Identified and implemented process/procedure/control improvements, fostering effective relationships with staff, clients, and developers.
Environment: Informatica Power Center 8.6.1, Oracle 10G, Teradata 13, UNIX, Autosys, IBM Tivoli Workstation, Business Objects XI R2, Mainframe for file access.
Senior ETL Informatica Developer / Analyst
Bank of America and US Trust
12.2005 - 12.2010
Oversaw the development and deployment of the Enterprise Data Warehouse (Retail Banking) and Client Management Online (GWIM/PWIM), serving as a comprehensive repository for business information.
Managed the entire life cycle of multiple production projects, ensuring operational integrity and quality assurance.
Collaborated with functional teams to translate business requirements into BRDs and technical specifications.
Led the development of Informatica mappings for loading data into various tables within the EDW and downstream Data Marts, implementing optimization techniques for enhanced performance.
Conducted testing, knowledge transfer, and provided technical support, utilizing Informatica parameters for efficient management.
Implemented incremental data loading and set up Informatica workflows for complete automation (the high-water-mark approach is sketched after this role).
Developed UNIX shell scripts for job scheduling and handled Production job setup using Control-M (US Trust) and AutoSys (Bank of America).
Played a key role in the upgrade of ETL Informatica from version 7.1.4 to 8.6.1.
Conducted POCs, estimated project timelines, and managed ETL Administration and Production support for Informatica tool.
Identified and implemented process/procedure/control improvements, fostering effective relationships with stakeholders.
Environment: Informatica Power Center 8.6.1, Oracle 9i, 10g, MS-SQL Server, UDB DB2, UNIX, TOAD, Autosys, OBIEE 10.1.3.4-Siebel Analytics 7.7, Business Objects XI R2, Siebel CRM tool, Mainframe for file access.
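Illustration (hypothetical): the incremental-loading bullet above follows a high-water-mark pattern; the sketch below shows that idea in Python with SQLite. The etl_watermarks and src_accounts tables and their columns are placeholders, and the actual logic lived in Informatica mappings and workflows.

```python
import sqlite3

def last_high_water_mark(conn, job_name: str) -> str:
    """Most recent timestamp already extracted for this job."""
    row = conn.execute(
        "SELECT COALESCE(MAX(extracted_through), '1900-01-01') "
        "FROM etl_watermarks WHERE job_name = ?", (job_name,)
    ).fetchone()
    return row[0]

def incremental_extract(conn, job_name: str):
    """Pull only rows changed since the last run, then advance the watermark."""
    hwm = last_high_water_mark(conn, job_name)
    changed = conn.execute(
        "SELECT account_id, balance, last_update_ts FROM src_accounts "
        "WHERE last_update_ts > ? ORDER BY last_update_ts", (hwm,)
    ).fetchall()
    if changed:
        conn.execute(
            "INSERT INTO etl_watermarks (job_name, extracted_through) VALUES (?, ?)",
            (job_name, changed[-1][2]),  # newest timestamp in this batch
        )
        conn.commit()
    return changed

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE etl_watermarks (job_name TEXT, extracted_through TEXT)")
    conn.execute("CREATE TABLE src_accounts (account_id TEXT, balance REAL, last_update_ts TEXT)")
    conn.execute("INSERT INTO src_accounts VALUES ('A1', 100.0, '2024-01-02')")
    print(incremental_extract(conn, "load_accounts"))
```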
Senior ETL Informatica Developer / Data Warehouse Analyst
Citigroup
01.2005 - 12.2005
Managed data operations for Citigroup Global Consumer Group, overseeing diverse product offerings.
Led Retail Banking Data Warehouse initiatives covering branch banking, mortgages, and more.
Modeled incoming source feeds, ensured alignment with enterprise standards, and collaborated on enhancements.
Designed and implemented ETL Informatica mappings for Retail Banking Data Warehouse.
Implemented optimization techniques and conducted stress testing for ETL routines.
Administered ETL tasks, maintained privileges, and scheduled jobs.
Mentored developers on Informatica best practices and contributed to ETL development.
ETL Informatica Developer / Analyst / Production Support
Pfizer Inc
02.1999 - 12.2004
Directed Sales and Marketing Enterprise Data Warehouse initiative for informed decision-making and trend prediction.
Led development of ETL and Oracle routines to extract data from sources such as NDC, IMS, ACXIOM, and DENDRITE.
Managed transaction data feeds, analyzing 20 million records weekly for sales, marketing, and manufacturing insights.
Implemented incremental data load logic and Oracle replication strategies for real-time processing.
Developed DataStage jobs and conducted performance tuning for Sales/Marketing data.
Oversaw data modeling using snowflake and star schema methodologies, ensuring appropriate grain for reporting (a minimal star-schema fact-load sketch follows this role).
Conducted gap analysis, designed processes, and developed performance metrics.
Ensured project milestones were met, providing estimates and resolving issues.
Supported production processes, maintaining over 12 TB of data in a complex environment.
Environment: Unix (HP-UX 11.x), SUN E15K, Sun Solaris (2.6/2.8), Windows 2K, Oracle 9i/8i, Unix Shell Scripts, PL/SQL, Informatica PowerCenter 6.x and 7.x, DataStage 5.2, First Logic, ERWIN, PVCS, Autosys, and Maestro.
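Illustration (hypothetical): a minimal pandas sketch of the star-schema fact load described above, resolving natural keys against dimension surrogate keys and diverting lookup failures to a reject set. Column names (product_ndc, product_key, etc.) are assumptions; the actual jobs were built in Informatica and DataStage against Oracle.

```python
import pandas as pd

def load_fact_sales(sales: pd.DataFrame, dim_product: pd.DataFrame):
    """Resolve natural keys (NDC) to dimension surrogate keys for the fact load."""
    fact = sales.merge(
        dim_product[["product_ndc", "product_key"]],
        on="product_ndc",
        how="left",
    )
    # Rows that fail the lookup go to a reject set instead of the fact table.
    rejects = fact[fact["product_key"].isna()]
    loaded = fact.dropna(subset=["product_key"])
    return loaded[["product_key", "week_ending", "units", "amount"]], rejects

if __name__ == "__main__":
    dim_product = pd.DataFrame({"product_key": [1, 2], "product_ndc": ["0001-01", "0001-02"]})
    sales = pd.DataFrame({
        "product_ndc": ["0001-01", "0009-99"],
        "week_ending": ["2004-06-04", "2004-06-04"],
        "units": [120, 30],
        "amount": [2400.0, 600.0],
    })
    loaded, rejects = load_fact_sales(sales, dim_product)
    print(loaded)
    print("rejected rows:", len(rejects))
```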
Programmer Analyst
American Home Shield Insurance (AHS)
10.1998 - 01.1999
Designed and developed a Data Warehouse for American Home Shield, focusing on accounting department reporting and ad hoc queries.
Migrated data from UDB database to Oracle using client/server technology for improved efficiency.
Analyzed and evaluated legacy data from UDB Database on HP-3000.
Developed SQL*Loader control scripts for loading data into the Oracle database from raw data files (a sketch follows this role's bullets).
Incorporated database triggers to enforce business rules and generated on-demand automated financial reports using SQL*Plus.
Coded PL/SQL statements for dynamic processing to support new processing patterns within COBOL programs, enhancing report-handling flexibility.
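Illustration (hypothetical): a sketch of the SQL*Loader step above, generating a control file and invoking sqlldr from Python. The table, columns, delimiter, file names, and connect string are placeholders; sqlldr must be installed and on PATH, and the target table must already exist.

```python
import subprocess
from pathlib import Path

# Placeholder control file: table, columns, delimiter, and file names are illustrative.
CONTROL = """\
LOAD DATA
INFILE 'claims_export.dat'
APPEND INTO TABLE ahs_stage.claims
FIELDS TERMINATED BY '|'
(claim_id, policy_no, claim_amount, claim_date DATE "YYYY-MM-DD")
"""

def run_sqlldr(userid: str) -> None:
    """Write the control file and invoke SQL*Loader."""
    Path("claims.ctl").write_text(CONTROL)
    subprocess.run(
        ["sqlldr", f"userid={userid}", "control=claims.ctl", "log=claims.log"],
        check=True,
    )

if __name__ == "__main__":
    run_sqlldr("etl_user/secret@ORCL")
```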
Education
Bachelor of Science - Computer Science & Engineering
PDA College of Engineering
India
Skills
Project Management - End-to-End SDLC
ETL Tools - Informatica, Azure Data Factory, Talend, SSIS, DataStage, Ab Initio and