Arvindh GV

Kochi

Summary

  • Overall 6.10 years of IT experience: around 1.4 years as a Data Engineer, 2+ years as an Informatica MDM Developer and 3+ years as an Oracle PL/SQL Developer.
  • Experience in Master Data Management (MDM) Multidomain Edition Hub Console configurations, the Provisioning Tool and E360 UI application creation.
  • Domain experience in Healthcare and Storage Management.
  • Extensive experience in MDM Hub Console configurations such as the Staging process, Loading process, Match and Merge process and Cleansing functions.
  • Experience in configuring and running jobs in batch groups and real-time processing.
  • Experience in trust configurations and setting up validation rules for base objects.
  • Involved in extracting data from flat files and Snowflake database shares into the staging area.
  • Used the Matillion ETL tool to extract, transform and load data from Staging to Integration and from Integration to the Data Warehouse.
  • Created SCD1 and SCD2 workflows for Integration and Data Warehouse tables.
  • Created JSON batch files for Apache Airflow to create DAGs, and triggered the DAGs to load data from the Staging to the Target layer.
  • Experienced in creating and triggering Airflow DAGs, and in troubleshooting failed DAGs by reviewing logs.
  • Good experience modifying and updating the Data Catalog, Lineage and BRMS modules.
  • Expertise in client-server application development using Oracle, SQL*Plus, PL/SQL and SQL*Loader.
  • Involved in all phases of the Software Development Life Cycle (SDLC), from analysis, design and development through testing, implementation and maintenance, with timely delivery against aggressive deadlines.
  • Good exposure to development, testing, production support, debugging, implementation and documentation.
  • Wrote complex SQL queries using joins, sequences and cursors.
  • Developed complex database objects such as stored procedures, functions, packages and triggers using SQL and PL/SQL.
  • Experienced in PL/SQL performance tuning using the BULK COLLECT technique.
  • Experience in unit testing and system integration testing.
  • Loaded data into Oracle tables using SQL*Loader.
  • Used performance tuning techniques such as hints, indexes and EXPLAIN PLAN for better performance.
  • Post-production support, bug fixing and enhancements.
  • Good knowledge of data warehouse concepts and ETL.

Overview

8
years of professional experience
1
Certification

Work History

Applications/Software Developer II

IQVIA
12.2023 - Current
  • Company Overview: IQVIA (formerly Quintiles and IMS Health, Inc.) is a leading global provider of advanced human data science and analytics, technology solutions, and clinical research services to the life sciences industry.
  • Novartis is a Swiss multinational pharmaceutical corporation headquartered in Basel, Switzerland, and one of the largest pharmaceutical companies in the world.
  • Novartis Brazil is going through a data and analytics transformation journey, aiming to leverage the IMI data product.

Roles and Responsibilities:

  • Creating data ingestion jobs to load data from Snowflake shares/AWS S3 into STAGING, and Data Quality checks for DI and INT jobs, in Glassbox (IMDNA portal).
  • Updating the Data Catalogue and Lineage in the Glassbox DEV and QA environments.
  • Creating BRMS processes for DI jobs in Glassbox.
  • Creating Matillion ETL Integration jobs for DI jobs as SCD1 workflows to load data from STAGING to INTEGRATION, then validating and running the jobs.
  • Creating Matillion Data Warehouse jobs for INT jobs as SCD2 workflows to load data from INTEGRATION to DATAWAREHOUSE, then validating and running the jobs.
  • Creating unit test case evidence for data loaded into the STAGING, INTEGRATION and DATAWAREHOUSE layers.
  • Working on change requests, such as inserting new columns into tables, adding new primary keys, modifying workflows in the Matillion ETL tool, and updating or deleting jobs in Airflow DAGs.
  • Creating JSON batch files to import and generate DAGs, and triggering the DAGs in Airflow.
  • Exporting JSON scripts from Matillion and Airflow to deploy to the QA and PROD environments.

MDM Developer

HCL Technologies Ltd
06.2020 - 07.2022
  • Company Overview: Merck Sharp & Dohme, one of the largest global pharmaceutical, chemical and life science companies, manufacturing drugs, medicines and related products.
  • Merck develops and produces medicines, vaccines, biologic therapies and animal health products.

Roles and Responsibilities:

  • Good understanding of Informatica MDM architecture and business processes.
  • Involved in gathering the business requirements for the data warehouse.
  • Designed and created Base Objects, staging tables, mappings and transformations as per business requirements.
  • Worked on updating the Hub environment as per data model changes.
  • Developed and configured MDM Landing, Staging and Base Objects.
  • Configured the data flow as Land, Stage, Load, Tokenize, Match, Consolidate and Publish.
  • Good experience working on Match and Merge configuration and performance tuning.
  • Created users and roles in the Hub Console and Customer 360.
  • Experience in trust and validation settings for the source systems.
  • Understanding of the Informatica MDM APIs (SIF and Business Entity Services).
  • Developed and deployed Hub user exits for custom transformations.
  • Basic understanding of ActiveVOS BPM.
  • Took care of inbound and outbound data flows.
  • Created and updated Layouts, Transformations, Entities and Views in Provisioning.
  • Updated search queries and tasks in Customer 360.
  • Good understanding of and experience in data warehousing concepts and ETL.

Software Developer

HCL Technologies Ltd
04.2019 - 05.2020
  • Company Overview: Merck Sharp & Dohme, one of the largest global pharmaceutical, chemical and life science companies, manufacturing drugs, medicines and related products.
  • Merck develops and produces medicines, vaccines, biologic therapies and animal health products.

Roles and Responsibilities:

  • Developed packages, functions and triggers for different functionalities in the project.
  • Created database objects such as tables, synonyms, indexes, views and materialized views.
  • Experience in UNIX application environments.
  • Monitored application servers and services, and provided fixes for users' requirements in a timely manner.
  • Involved in data loading using PL/SQL and SQL*Loader, calling UNIX scripts to download and manipulate files.
  • Responsible for developing design documents.
  • Provided daily status updates to the onsite counterpart.
  • Shared project business knowledge with new team members.
  • Involved in the code review process.
  • Good knowledge of key Oracle performance features such as hints, EXPLAIN PLAN and indexes.
  • Took care of ARIES POLLER loads on multiple servers.
  • Took care of stuck transactions during automated server loads, as well as manual load issues.
  • Troubleshot the servers during automated loads.
  • Deleted duplicates from the processed data.
  • Reprocessed the data in case of errors.
  • Prepared daily, weekly and monthly load reports and presented them to stakeholders and the business.
  • Worked on the BMC Remedy and ServiceNow tools for reporting ARIES load issues.

Software Developer

HCL Technologies Ltd
01.2017 - 03.2019
  • Company Overview: Iron Mountain Inc.
  • (IRM) is an American enterprise information management services company founded in 1951 and headquartered in Boston, Massachusetts.
  • IRM provides records management, secure information shredding, and data backup and recovery services to more than 220,000 customers throughout North America, Europe, Latin America, Africa and Asia.

Roles and Responsibilities:

  • Worked with the team to design, develop, test and implement the system.
  • Created database objects such as tables and procedures using Oracle tools like PL/SQL and TOAD.
  • Wrote stored procedures using PL/SQL.
  • Interacted with clients to understand their requirements.
  • Developed PL/SQL packages, procedures, functions, indexes and collections to implement business logic.
  • Rich experience in writing SQL queries, views, cursors, collections, BULK COLLECT, REF CURSOR, cursor variables, SYS_REFCURSOR and dynamic SQL.
  • Troubleshot existing PL/SQL procedures, functions and triggers.
  • Developed stored procedures and triggers for business rules.
  • Worked with Production Support and Operations teams to resolve production issues in a timely and efficient manner.
  • Loaded data into Oracle tables using SQL*Loader.
  • Debugged the code using tools and prepared root cause analyses for fixes.
  • Understood client Project Change Request requirements and prepared Detailed Design and Impact Analysis documents.
  • Worked on the AMEX Credit Card Expense integration.
  • Responsible for developing design documents.
  • Developed packages, functions and triggers for different functionalities in the project.
  • Provided weekly status updates to the onsite counterpart.
  • Shared project business knowledge with new team members.
  • Involved in the code review process.

Education

ME - CSE

SSIET
Chennai
01.2013

BE - CSE

KCET
Virudhunagar
01.2009

HSC

TMHNU Mat. Hr. Sec. School
Theni
01.2005

SSLC

TMHNU Mat. Hr. Sec. School
Theni
01.2003

Skills

  • INFORMATICA MDM
  • SQL
  • PL/SQL
  • GLASSBOX
  • MATILLION ETL
  • APACHE AIRFLOW
  • Snowflake
  • MS Office
  • Jira
  • Service Now

Certification

IQVIA Certification: Orchestrated Analytics [OA] - IDP [INTEGRATED DATA PLATFORM], KPI Library, NBA [NEXT BEST ACTION].

LinkedIn Learning: Snowflake Badges, Apache Airflow.

Matillion ETL: Matillion Badge

Timeline

Applications/Software Developer II

IQVIA
12.2023 - Current

MDM Developer

HCL Technologies Ltd
06.2020 - 07.2022

Software Developer

HCL Technologies Ltd
04.2019 - 05.2020

Software Developer

HCL Technologies Ltd
01.2017 - 03.2019

ME - CSE

SSIET

BE - CSE

KCET

HSC

TMHNU Mat. Hr. Sec. School

SSLC

TMHNU Mat. Hr. Sec. School