To obtain a position in a value-driven organization where I can apply my creative skills to present and disseminate technical knowledge, expand my expertise, and be an asset to the firm.
17+ years of extensive experience in Big Data, Data Warehousing, and Cloud technologies within the Hadoop ecosystem, using ETL/ELT models to develop, construct, test, and maintain data architectures such as databases and large-scale processing systems. Experienced across the Azure, AWS, and GCP cloud platforms; design and implement data frames and schemas to support data requirements, ensuring data integrity, efficiency, and scalability.
Requirements Analysis: Collaborate with business stakeholders, data analysts, and other team members to gather and analyze requirements for data integration projects. Understand business processes, data sources, and integration needs to design effective solutions.
Data Integration Design: Design data integration workflows, mappings, and transformations using Informatica PowerCenter or other Informatica products. Define data mappings, transformations, and business rules to extract, transform, and load (ETL) data from source systems to target systems.
Development and Implementation: Develop and implement data integration solutions using Informatica PowerCenter. Configure source and target connections; create mappings, sessions, and workflows; and schedule data integration jobs.
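The extract-transform-load pattern described above can be sketched outside Informatica as plain Python; this is an illustrative minimal sketch only, and the source data, table, and column names are hypothetical:

```python
import csv
import io
import sqlite3

# Hypothetical source data standing in for a flat-file extract.
SOURCE_CSV = "id,amount\n1,10.5\n2,20.0\n3,-3.25\n"

def extract(text):
    """Extract: read rows from a CSV source."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Transform: apply a simple business rule (drop negative amounts)."""
    return [(int(r["id"]), float(r["amount"]))
            for r in rows if float(r["amount"]) >= 0]

def load(rows, conn):
    """Load: write transformed rows to the target table."""
    conn.execute("CREATE TABLE IF NOT EXISTS target (id INTEGER, amount REAL)")
    conn.executemany("INSERT INTO target VALUES (?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract(SOURCE_CSV)), conn)
print(conn.execute("SELECT COUNT(*) FROM target").fetchone()[0])  # prints 2
```

In Informatica the same flow is drawn visually as source qualifier, transformation, and target instances; the code form simply makes the three stages explicit.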
Data Quality and Cleansing: Implement data quality and cleansing processes to ensure the accuracy, completeness, and consistency of data. Use Informatica Data Quality (IDQ) or other data quality tools to profile, cleanse, and standardize data.
Performance Tuning and Optimization: Optimize data integration workflows, mappings, and transformations for performance and efficiency. Identify and address performance bottlenecks, optimize SQL queries, and fine-tune session and workflow configurations.
Metadata Management: Manage metadata for data integration processes, including data lineage, impact analysis, and version control. Document data integration workflows, mappings, and transformations to facilitate knowledge sharing and maintainability.
Error Handling and Monitoring: Implement error handling and logging mechanisms to track data integration errors, exceptions, and failures. Monitor data integration jobs, troubleshoot issues, and resolve errors in a timely manner to ensure data integrity and availability.
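The error-handling approach above can be illustrated with a small retry wrapper that logs each failure before surfacing it; the step name and retry counts are hypothetical examples:

```python
import logging
import time

logging.basicConfig(level=logging.INFO, format="%(levelname)s %(message)s")
log = logging.getLogger("etl")

def run_with_retries(job, retries=3, delay=0.0):
    """Run an ETL step, logging each failure and retrying before giving up."""
    for attempt in range(1, retries + 1):
        try:
            return job()
        except Exception as exc:
            log.error("attempt %d/%d failed: %s", attempt, retries, exc)
            if attempt == retries:
                raise          # surface the failure to the scheduler
            time.sleep(delay)  # back off before the next attempt

# Hypothetical flaky step: fails twice, then succeeds.
calls = {"n": 0}
def flaky_load():
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("target unavailable")
    return "loaded"

print(run_with_retries(flaky_load))  # prints "loaded" on the third attempt
```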
Collaboration and Documentation: Collaborate with cross-functional teams including data architects, database administrators, and business analysts to design and implement data integration solutions. Document data integration requirements, designs, and processes to facilitate communication and knowledge sharing.
Compliance and Security: Ensure compliance with data governance policies, regulatory requirements, and security standards. Implement data encryption, access controls, and auditing mechanisms to protect sensitive data and ensure data privacy and security.
Continuous Learning and Improvement: Stay updated on emerging technologies, best practices, and trends in data integration and Informatica tools. Participate in training programs, workshops, and conferences to enhance skills and knowledge in data integration and Informatica development.
Data Integration Design: Design data integration workflows, mappings, and transformations to extract, transform, and load (ETL) data from source systems to target systems. Define data mappings, transformations, and business rules based on requirements provided by business stakeholders and data analysts.
Development and Implementation: Develop and implement data integration solutions using Informatica PowerCenter or other Informatica products. Configure source and target connections; create mappings, sessions, and workflows; and schedule data integration jobs.
Data Quality and Cleansing: Implement data quality and cleansing processes to ensure the accuracy, completeness, and consistency of data. Use Informatica Data Quality (IDQ) or other data quality tools to profile, cleanse, standardize, and enrich data as it moves through the data integration pipeline.
Performance Tuning and Optimization: Optimize data integration workflows, mappings, and transformations for performance and efficiency. Identify and address performance bottlenecks, optimize SQL queries, and fine-tune session and workflow configurations to improve data processing throughput and reduce execution times.
Error Handling and Logging: Implement error handling and logging mechanisms to track data integration errors, exceptions, and failures. Handle data integration errors gracefully, log error messages, and notify stakeholders of issues that require attention.
Metadata Management: Manage metadata for data integration processes, including data lineage, impact analysis, and version control. Document data integration workflows, mappings, and transformations to facilitate knowledge sharing, collaboration, and maintainability.
Scripting and Custom Development: Develop custom scripts, expressions, and functions to extend the functionality of Informatica PowerCenter. Use scripting languages such as Shell Scripting, Python, or JavaScript to implement custom logic and data transformations as needed.
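Custom scripting around PowerCenter often means wrapping its pmcmd command-line client. A sketch of building such a call in Python follows; the service, domain, folder, and workflow names are hypothetical, and the password is taken from an environment variable (pmcmd's -pv option) rather than embedded in the command:

```python
import shlex

def pmcmd_start_workflow(service, domain, user, folder, workflow):
    """Build a pmcmd command line to start a PowerCenter workflow.

    pmcmd is Informatica's command-line client. All connection values
    passed in here are illustrative; -pv tells pmcmd to read the
    password from the named environment variable ($PM_PASSWORD)."""
    return [
        "pmcmd", "startworkflow",
        "-sv", service, "-d", domain,
        "-u", user, "-pv", "PM_PASSWORD",
        "-f", folder, "-wait", workflow,
    ]

cmd = pmcmd_start_workflow("IS_PROD", "Domain_ETL", "etl_user",
                           "FINANCE", "wf_daily_load")
print(shlex.join(cmd))
```

In a shell-scripted batch environment the same command would typically be emitted by a wrapper script and scheduled through a tool such as AutoSys.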
Integration with Source and Target Systems: Integrate Informatica PowerCenter with various source and target systems such as relational databases, flat files, XML files, web services, and cloud platforms. Configure connections, drivers, and protocols to enable data extraction and loading from/to different data sources and destinations.
Testing and Validation: Conduct unit testing, integration testing, and user acceptance testing (UAT) to validate the functionality and performance of data integration solutions. Verify data accuracy, completeness, and consistency against predefined acceptance criteria and business requirements.
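Validation checks like the ones above can be sketched as a small acceptance routine that verifies completeness, key uniqueness, and a non-empty result set; the field names are illustrative assumptions:

```python
def validate(rows, key="id", required=("id", "amount")):
    """Run simple acceptance checks on loaded rows:
    no missing required fields, a unique key column, and
    at least one row loaded. Returns a list of error strings."""
    errors = []
    if not rows:
        errors.append("no rows loaded")
    for i, row in enumerate(rows):
        for field in required:
            if row.get(field) is None:
                errors.append(f"row {i}: missing {field}")
    keys = [r.get(key) for r in rows]
    if len(keys) != len(set(keys)):
        errors.append(f"duplicate values in key column '{key}'")
    return errors

good = [{"id": 1, "amount": 10.5}, {"id": 2, "amount": 20.0}]
bad = [{"id": 1, "amount": None}, {"id": 1, "amount": 5.0}]
print(validate(good))  # prints [] -- all checks pass
print(validate(bad))   # missing-field and duplicate-key errors
```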
Documentation and Knowledge Sharing: Document data integration requirements, designs, configurations, and processes to facilitate knowledge sharing and collaboration among team members. Create technical documentation, user guides, and training materials for data integration solutions.
Continuous Learning and Improvement: Stay updated on Informatica product releases, updates, and best practices in data integration and ETL development. Participate in training programs, workshops, and conferences to enhance skills and knowledge in Informatica development and data integration technologies.
Requirements Gathering: Collaborate with business stakeholders, data analysts, and project managers to understand data integration requirements. Gather requirements related to data sources, data transformations, business rules, and integration workflows.
Solution Design: Design data integration solutions based on the gathered requirements. Define data integration workflows, mappings, and transformations to extract, transform, and load (ETL) data from source systems to target systems. Design data models, schemas, and mappings to ensure efficient and effective data processing.
Development: Develop data integration solutions using Informatica PowerCenter or other Informatica products. Configure source and target connections and create mappings, sessions, and workflows to implement the designed solution. Write and maintain ETL code, scripts, and transformations using Informatica's visual development environment.
Testing: Conduct unit testing, integration testing, and system testing of data integration solutions. Verify data accuracy, completeness, and consistency against predefined acceptance criteria and business requirements. Identify and troubleshoot issues, errors, and performance bottlenecks in data integration workflows.
Documentation: Document data integration requirements, designs, configurations, and processes. Create technical documentation, user guides, and training materials for data integration solutions. Maintain documentation repositories and version control systems to ensure documentation accuracy and accessibility.
Deployment and Maintenance: Deploy data integration solutions to production environments and ensure smooth transition to operations teams. Monitor and maintain data integration workflows, mappings, and transformations in production. Perform regular maintenance activities such as backups, updates, and performance tuning to optimize system performance and reliability.
Collaboration: Collaborate with cross-functional teams including data architects, database administrators, business analysts, and quality assurance testers. Coordinate with project managers, team leads, and stakeholders to ensure alignment with project goals, timelines, and deliverables.
Continuous Improvement: Stay updated on Informatica product releases, updates, and best practices in data integration and ETL development. Participate in training programs, workshops, and conferences to enhance skills and knowledge in Informatica development and data integration technologies. Proactively identify opportunities for process improvements and tool enhancements to optimize data integration workflows and achieve business objectives.
Data Transformations
Azure (Data Factory, Azure Data Lake, Delta Lake, Databricks, Azure Synapse, Azure Active Directory, Azure Key Vault, Power BI) | GCP (Dataflow, Dataproc, Pub/Sub, Cloud Composer (Apache Airflow), Dataprep, BigQuery, Cloud Storage, Compute Engine) | NiFi | NoSQL | Kafka | Cassandra | Informatica | Kubernetes Engine | IICS | Salesforce | Apache Spark (Scala) | PySpark | Python | Hive | Sqoop | MapReduce | Oracle | SQL | Unix | Eclipse | GitHub | Jenkins | JIRA | Agile | ServiceNow | IDQ 9.6 | AutoSys | Looker Studio | JSON | ORC | Parquet
CRV (Customer Relationship View), Wells Fargo Centre, Strategic Services Innovation Technologies
IBCM (Investment Banking Capital Markets), Wells Fargo Centre, Investment Banking Capital Markets
EA-Integration, Clinical Trials Integration Program, Pfizer, Perceptive Informatics (PAREXEL)
Client Technology Solutions, Credit Suisse
INVESCO-ATPW (Atlantic Trust Private Wealth Management), INVESCO
Standard & Poor's, Standard & Poor's: a leading provider of financial market intelligence
Global Integrated Data Store (GID), Franklin Templeton Investments
Advice, Principal Global Services (PGS): the main purpose of this project is the loading of Affiliate Financial Planning (AFP)
Super-Fix, General Electric