
Mohammad Salman Ansari

Mumbai

Summary

  • Extensive experience in IT data analytics projects, including migrating on-premise ETLs to Google Cloud Platform using cloud-native tools such as BigQuery, Google Cloud Storage, Cloud Composer, GCS buckets, Cloud Dataflow, gsutil, and Python.
  • Practical understanding of data modeling (dimensional and relational) concepts such as star-schema modeling, snowflake-schema modeling, and fact and dimension tables.
  • Highly experienced in developing data marts and warehouse designs using distributed SQL concepts, Python, Cloud Composer, Cloud Data Fusion, etc., to cope with increasing data volumes.
  • Experience in building and architecting multiple data pipelines and end-to-end ETL processes for data ingestion and transformation in GCP.
  • Worked with Cloud Composer to run end-to-end data pipelines, scheduling jobs and managing dependencies.
  • Diverse experience in all phases of the software development life cycle (SDLC), especially analysis, design, development, testing, and deployment of applications.
  • Strong knowledge of data preparation, data modeling, and data visualization using OAC; experienced in developing reports and dashboards using various visualizations in OAC.
  • Extensively worked with ODI 12c.
  • Knowledge of various file formats such as XML and Avro.
  • Able to work on GCP and Oracle Cloud in parallel.



Overview

10 years of professional experience

Work History

Senior GCP Data Engineer

Segmetriq Ltd
01.2021 - Current

Technologies Used: BigQuery, Cloud Data Fusion, Composer, Apache Airflow, Python

Responsibilities:

  • Worked on core GCP services such as Compute Engine, Cloud Load Balancing, Cloud Storage, and Cloud SQL.
  • Migrated an entire Oracle database to BigQuery and used Power BI for reporting.
  • Built data pipelines in Airflow on GCP for ETL jobs using different Airflow operators.
  • Experience with GCP Cloud Data Fusion, Cloud Functions, and BigQuery.
  • Experience in moving data between
  • Post-migration, executed all jobs and performed data validation on all target tables.
  • Used Cloud Shell in GCP to configure services such as Dataproc, Cloud Storage, and BigQuery, and to install pip packages and drivers.
  • Wrote scripts in BigQuery to create complex stored procedures, applying performance features such as partitioning and clustering.
  • Extensively worked on Cloud Data Fusion and Apache Airflow to create ETL pipelines for SCD Type 1 and Type 2 architectures.
  • Created BigQuery authorized views for row-level security and for exposing data to other teams.
  • Worked extensively with Cloud Composer for scheduling tasks in serial and parallel, executing Cloud Data Fusion pipelines, executing BigQuery procedures, and creating connections to relational and cloud technologies.
  • Worked extensively with Composer for monitoring sessions and handling errors from logs.
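The SCD Type 2 pattern mentioned above can be sketched in plain Python. This is a minimal illustration of the logic, not the actual Cloud Data Fusion pipeline; the table shape, key, and column names (`customer_id`, `city`, `is_current`, etc.) are hypothetical:

```python
from datetime import date

def apply_scd2(dim_rows, incoming, today=date(2021, 1, 1)):
    """SCD Type 2: expire the current version of a changed record and
    append a new current version; brand-new keys are simply inserted."""
    by_key = {r["customer_id"]: r for r in dim_rows if r["is_current"]}
    out = list(dim_rows)
    for rec in incoming:
        cur = by_key.get(rec["customer_id"])
        if cur is None:
            # New key: insert as the first current version.
            out.append({**rec, "valid_from": today, "valid_to": None,
                        "is_current": True})
        elif cur["city"] != rec["city"]:
            # Changed attribute: close the old row, open a new current one.
            cur["valid_to"] = today
            cur["is_current"] = False
            out.append({**rec, "valid_from": today, "valid_to": None,
                        "is_current": True})
    return out

history = apply_scd2(
    [{"customer_id": 1, "city": "Mumbai", "valid_from": date(2020, 1, 1),
      "valid_to": None, "is_current": True}],
    [{"customer_id": 1, "city": "Pune"}, {"customer_id": 2, "city": "Delhi"}],
)
```

In BigQuery the same effect is typically achieved with a `MERGE` statement inside a stored procedure, which is where the partitioning and clustering mentioned above pay off.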


Senior Technical Analyst

Evolutionary Systems
03.2020 - 01.2021

Technologies Used: ODI 12c, ADW, Oracle Cloud Infrastructure, BICC, OTBI

Description: Archrock is an energy infrastructure company with a pure-play focus on midstream natural gas compression. The scope of the project was to extract data from Oracle SaaS applications such as FSCM, CRM, and HCM and load it into ADW. The extraction from SaaS to ADW was done in ODI 12c.

Responsibilities :

  • Designed the end-to-end implementation to load data from Oracle SaaS applications (ERP, FSCM, CRM, HCM, etc.) into Autonomous Data Warehouse.
  • Extensively worked with REST and SOAP APIs for extracting data from Oracle SaaS.
  • Used OCI Object Storage as a staging area to process data from views.
  • Used Oracle Business Intelligence Cloud Connector (BICC) to extract business intelligence and other data in bulk into Object Storage.
  • Used BIP reports for views that were not available in BICC.
  • Designed automation scripts to edit offering settings (selecting only the required columns, reset to full extract, reset to shipped content, etc.) using JSON and REST APIs.
  • Extensively used the auto-scaling and auto-indexing features of ADW.
  • Generated server-side PL/SQL scripts for data manipulation and validation, and materialized views for remote instances.
  • Performed DBA tasks such as generating AWR reports and running SQL Tuning Advisor to optimize loads.
  • Migrated all database objects from Dev to UAT and then to Prod using the Data Pump wizard.
  • Migrated all scenarios and load plans from Dev to UAT and then to Prod using Smart Export in ODI.
  • Provided professional services and support in a dynamic work environment.
  • Managed database admin roles by creating users and profiles.
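The automation scripts described above essentially assemble a JSON body describing the desired offering settings and send it to the BICC console's REST interface. The sketch below shows only the payload-building step; the field names and the endpoint are hypothetical placeholders, not the real BICC schema:

```python
import json

def build_offering_update(offering, columns, full_extract=False):
    """Build a hypothetical JSON body to update a BICC offering's extract
    settings (keep only required columns, optionally reset to full extract).
    Field names are illustrative, not the actual BICC API schema."""
    return {
        "offering": offering,
        "columns": sorted(columns),          # keep only the required columns
        "resetToFullExtract": full_extract,
        "resetToShippedContent": False,
    }

payload = build_offering_update("FSCM", ["INVOICE_ID", "AMOUNT"],
                                full_extract=True)
body = json.dumps(payload)
# The real script would then send `body` to the BICC console's REST
# endpoint with appropriate authentication (omitted here).
```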

ODI Developer (ICICI Bank)

Clover Infotech
08.2019 - 02.2020

Technologies Used: ODI 12c, OAC, Oracle 12c

Description: ICICI Bank produces financial dashboards and reports using Microsoft Excel. Data is fetched from different source systems such as OFSAA, Perform, and SAP to prepare various reports. The current reporting process is completely manual and, in most cases, person-dependent. ICICI is looking to build a solution that will consolidate information from the different applications used across the bank and prepare financial reports with the expected cuts of view. The purpose of the Financial Reporting Datamart for ICICI is to provide a conceptual understanding, design, and description of the key functioning elements of the Datamart.

Responsibilities:

  • Identified business requirements, such as which source systems, tables, and columns were required.
  • Identified the datamart business entities and classified the fact and dimension tables (product master, customer master, branch master, etc.).
  • Configured the database and application server on different instances.
  • Extracted the required data from source systems and applied business transformations before loading it into the target.
  • Indexed the data where required, partitioned the fact tables, and applied incremental gather stats, in-memory caching, archiving, and compression to improve performance.
  • Maintained different layers in the data-loading strategy (landing, staging, target, ETL, and reporting) to simplify debugging on execution failure and to keep the design flexible against source-system changes.
  • Designed the datamart in star-schema format.
  • Used ODI to orchestrate the ETL process from source to the target layer.
  • Used OBIEE 12c (classic) to build dashboards and reports.
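Loading a star schema like the one above comes down to resolving each staging row's natural keys to dimension surrogate keys before inserting the fact row. A minimal Python sketch of that lookup step, with hypothetical dimension contents and column names:

```python
# Natural key -> surrogate key maps, standing in for dimension tables.
customer_dim = {"C001": 1, "C002": 2}
branch_dim = {"BR9": 10}

def load_fact(staging_rows):
    """Resolve surrogate keys for each staging row; rows with unresolved
    keys go to a reject list (the error/landing layer in the real design)."""
    fact, rejects = [], []
    for row in staging_rows:
        cust_sk = customer_dim.get(row["customer_code"])
        branch_sk = branch_dim.get(row["branch_code"])
        if cust_sk is None or branch_sk is None:
            rejects.append(row)
            continue
        fact.append({"customer_sk": cust_sk, "branch_sk": branch_sk,
                     "amount": row["amount"]})
    return fact, rejects

fact, rejects = load_fact([
    {"customer_code": "C001", "branch_code": "BR9", "amount": 250.0},
    {"customer_code": "C999", "branch_code": "BR9", "amount": 100.0},
])
```

In ODI this lookup is what the mapping's join to the dimension tables performs; routing unresolved keys to a separate layer is what makes failures debuggable, as described above.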

ODI Developer (Yes Bank Integration)

Clover Infotech
06.2014 - 02.2019

Technologies Used: Oracle data Integrator, Oracle, MS Access, Flat Files, MS SQL Server

Description: Developed the ELT design for more than 20 banking applications for India's 4th-largest private-sector bank. Extracted data from multiple databases including MS SQL Server, Oracle 10g/11g/12c, Sybase, MySQL, and flat files using ODI (Oracle Data Integrator) 11g. The client wanted the extracted data to be transformed directly at the target server; doing so saved significant loading time, which in turn increased the productivity of the banking system. More than 500 tables had to be populated daily on an append and incremental-update basis, and almost the same number of MVs and views were refreshed on those tables. Apart from the core banking applications, took sole responsibility for developing the ELT for the Yes Bank credit-card system. As a senior ODI developer, trained other ODI developers and support members on the BI project by creating new users and assigning customized profiles to them. Bug fixing, raising SRs (service requests) with Oracle, customizing developed code, and covering every aspect of the project from development through testing and deployment onto production servers were key features of the ODI Developer role at Yes Bank.

Responsibilities:

  • Monitored daily ELT executions in the production environment and fixed issues as they arose.
  • Collected information from different users about changes and new enhancements across multiple core banking applications, and created a prototype of how development would proceed in the UAT environment.
  • Developed ELTs using ODI objects such as mappings, procedures, packages, and load plans within limited time by making full use of repository tables.
  • Smoothly handled data transfer from heterogeneous source systems into target systems.
  • Loaded more than a year's worth of data files for the Yes Bank credit-card system within a day.
  • Migrated more than 100 load plans and 2,000+ scenarios during the Yes Bank Oracle database migration from 11g to 12c.
  • Designed exception handling in ODI load plans: raising e-mails for failed jobs, skipping execution of dependent jobs, and resuming execution of non-dependent jobs.
  • Responsible for loading correct data from files and tables into the target by ensuring all transformations were in place.
  • Configured ODI Console for the support team to execute and monitor jobs, and debugged issues related to the WebLogic server.
  • Involved in Yes Bank's ongoing ODI 11g to 12c high-availability (HA) migration by gathering all the necessary system requirements.
  • Involved in all POCs done for projects at Yes Bank.
  • Provided training and support to the support team for production issues.
  • Took responsibility for 24x7 server availability via on-call support when not in office.
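The load-plan exception handling described above (skip children of a failed job, keep independent branches running) can be sketched as a few lines of Python. The job names, the dependency map, and the simulated failure are all illustrative, not the actual Yes Bank load plans:

```python
def run_plan(jobs, deps, runner):
    """Run jobs in dependency order. A failure marks the job failed and
    causes every job depending on it (directly or transitively) to be
    skipped; unrelated branches continue to execute."""
    failed, skipped, done = set(), set(), []
    for job in jobs:
        if any(d in failed or d in skipped for d in deps.get(job, [])):
            skipped.add(job)                 # a parent failed or was skipped
            continue
        if runner(job):
            done.append(job)
        else:
            failed.add(job)                  # would trigger an alert e-mail
    return done, failed, skipped

deps = {"load_fact": ["load_dim"], "refresh_mv": ["load_fact"], "audit": []}
done, failed, skipped = run_plan(
    ["load_dim", "load_fact", "refresh_mv", "audit"],
    deps,
    runner=lambda job: job != "load_dim",    # simulate load_dim failing
)
```

In ODI itself this behavior is configured through load-plan step exception handling and restart settings rather than hand-written code; the sketch only shows the control flow.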

ODI Developer (Yes Bank Migration)

Clover Infotech
11.2017 - 03.2018

Technologies: ODI 11g / ODI 12c, Microsoft Excel

Description: Migrating ODI 11g to 12c using ODI migration assistant and smart export/import feature.

Responsibilities:

  • Created ETL jobs for a planning and scheduling application using Oracle Data Integrator, with various sources such as Oracle, text files, MS Access, and MS Excel.
  • Responsible for analyzing the basic requirements, such as which code had to be changed in ODI 12c and the number of ODI objects (load plans, interfaces, packages, procedures, etc.) to migrate, and prepared the detailed technical specification document.
  • Installed ODI on the UAT, DR, and PROD setups with standalone and Java EE agents, configuring the WebLogic admin and managed servers along with WebLogic administrators in HA mode.
  • Migrated ODI objects using the Smart Export feature in the UAT and PROD environments.
  • Developed SQL scripts to match object counts from the repository tables across the UAT and PROD environments.
  • Successfully tested modified code that had used parameters deprecated in the ODI 12c work repository.
  • Successfully installed ODI Studio and Console in the PROD environment.
  • Successfully tested data at table level by executing jobs from both environments.
  • Successfully applied multiple patches to overcome bugs in ODI 12.2.1.2.6.
  • Found 2 new bugs during migration in version 12.2.1.2.6 and raised them with Oracle Support, which took one month to develop the patch.
  • Migrated more than 150 load plans and 500+ interfaces and procedures.
  • Created table partitions on SCD tables for faster data flow.

Education

Bachelor of Engineering - Computer Science

Rajiv Gandhi Technical University
India
06.2013

Skills

Technical Skills :

  • ETL Tools: Cloud Data Fusion, BigQuery, Composer, Apache Airflow, Oracle Data Integrator 11g/12c
  • Reporting: OBIEE, OAC
  • Databases: GCP BigQuery, Oracle, ADW, MySQL, MS SQL Server
  • Application Tools: SQL & PL/SQL Developer
  • OS: Windows Server, Red Hat Linux
  • Languages: SQL/PL SQL, Groovy scripting, Batch Script, Python
  • Cloud Technologies: GCP, OCI, ADW, BICC, RESTful services

Timeline

Senior GCP Data Engineer

Segmetriq Ltd
01.2021 - Current

Senior Technical Analyst

Evolutionary Systems
03.2020 - 01.2021

ODI Developer (ICICI Bank)

Clover Infotech
08.2019 - 02.2020

ODI Developer (Yes Bank Migration)

Clover Infotech
11.2017 - 03.2018

ODI Developer (Yes Bank Integration)

Clover Infotech
06.2014 - 02.2019

Bachelor of Engineering - Computer Science

Rajiv Gandhi Technical University