
MOHAMMED JAWAD

Bengaluru

Summary

Data engineer with over 12 years of experience building and maintaining data pipelines to ensure seamless data flow. Applies advanced knowledge of big data technologies to drive data-driven decision-making. Track record of enhancing data architecture for improved performance and reliability.

Overview

13 years of professional experience
1 Certification

Work History

Senior Data Engineer

Emirates NBD
11.2023 - 03.2025

Project #1:

Data Products Implementation

Responsibilities:

  • Designed and developed PySpark ETL pipelines to process structured and semi-structured data (a representative sketch follows this list).
  • Implemented Spark DDI ingestion for efficient data processing.
  • Optimized data transformations to improve performance and reduce execution time.
  • Scheduled and monitored PySpark jobs using Oozie.
  • Worked closely with stakeholders to ensure business requirements were met.
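
A minimal, hypothetical sketch of the kind of PySpark ETL pipeline described in the list above; the source paths, column names, and transformations are illustrative assumptions, not the actual project code.

    # Hypothetical illustration only: paths and column names are placeholders.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("etl_sketch").getOrCreate()

    # Read a semi-structured JSON source and a structured CSV source.
    events = spark.read.json("/data/raw/events/")
    accounts = spark.read.option("header", True).csv("/data/raw/accounts.csv")

    # Basic transformations: type casting, filtering, and an enrichment join.
    curated = (
        events
        .withColumn("event_ts", F.to_timestamp("event_ts"))
        .withColumn("event_date", F.to_date("event_ts"))
        .filter(F.col("event_type").isNotNull())
        .join(accounts, on="account_id", how="left")
    )

    # Write the result partitioned by date for downstream consumers.
    curated.write.mode("overwrite").partitionBy("event_date").parquet("/data/curated/events/")

    spark.stop()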

Project #2:

Central Bank Regulatory Reporting

Responsibilities:

  • Developed and optimized Informatica BDM workflows for data ingestion and transformation.
  • Integrated multiple source systems and implemented error handling mechanisms.
  • Designed and scheduled Oozie workflows for automated job execution.
  • Maintained historical data in Hadoop for compliance and audit requirements.
  • Collaborated with business teams to ensure data accuracy for regulatory reporting.

Senior Consultant

JPMorgan Chase
06.2022 - 09.2023

Project #1:

Snowflake Data Integration

Responsibilities:

  • Migrated terabytes of data from Oracle to Snowflake.
  • Ingested data and files using Snowpipe from external stages (AWS S3 bucket and Azure container) and used the COPY INTO command for bulk loading (see the sketch after this list).
  • Created and executed streams in Snowflake for CDC.
  • Created internal and external stages to load data into Snowflake tables.
  • Created DBT models and macros to persist data in different data marts in Snowflake.
  • Gained hands-on experience with the CI/CD process for deploying new and modified components to higher environments.
  • Wrote complex SQL queries and optimized them for performance.
  • Designed and built internal Snowflake applications, transferring data to and from Snowflake using the DBT ETL tool.
  • Used Snowflake materialized views to publish data.
  • Used Clone, Time Travel, and Swap to recover corrupted data.
  • Used transient tables to reduce the storage costs of Time Travel and Fail-safe.
  • Built an end-to-end pipeline using Prefect flows and automated the data load process.
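
A minimal, hypothetical sketch of the stage, COPY INTO, Snowpipe, and stream pattern described in the list above, using the snowflake-connector-python client; all account details, object names, bucket URLs, and credentials are placeholder assumptions, not the actual project setup.

    # Hypothetical illustration only: placeholder account, stage, table, and pipe names.
    import snowflake.connector

    conn = snowflake.connector.connect(
        account="my_account", user="my_user", password="***",
        warehouse="LOAD_WH", database="ANALYTICS", schema="STAGING",
    )
    cur = conn.cursor()

    # External stage pointing at an S3 bucket (placeholder URL and credentials).
    cur.execute("""
        CREATE STAGE IF NOT EXISTS raw_s3_stage
        URL = 's3://example-bucket/raw/'
        CREDENTIALS = (AWS_KEY_ID = '***' AWS_SECRET_KEY = '***')
        FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
    """)

    # One-off bulk load of staged files into a target table.
    cur.execute("COPY INTO customers FROM @raw_s3_stage")

    # Snowpipe for continuous ingestion of newly staged files.
    cur.execute("""
        CREATE PIPE IF NOT EXISTS customers_pipe AUTO_INGEST = TRUE AS
        COPY INTO customers FROM @raw_s3_stage
    """)

    # Stream on the target table to capture changes (CDC) for downstream marts.
    cur.execute("CREATE STREAM IF NOT EXISTS customers_stream ON TABLE customers")

    cur.close()
    conn.close()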

Project #2:

Data Integration

Responsibilities:

  • Understood project requirements, prepared design documents and source-to-target mappings, and obtained client approval.
  • Developed complex ETL (Extract/Transform/Load) mappings using Informatica to support business requirements, loading data from source systems such as Oracle and flat files into downstream integration applications for the reporting layer.
  • Enhanced and created Unix shell scripts for encryption of flat files sent to external vendors.
  • Developed PL/SQL stored procedures, packages, triggers, and sequences to handle large data volumes with better performance.
  • Performed Informatica code reviews and deployed code to higher environments using deployment groups (via Jenkins).
  • Fine-tuned existing Informatica mappings for performance optimization.
  • Performed unit testing with SQL scripts, implemented test cases through ALM/QTest, and supported System Integration Testing by triggering Control-M jobs.
  • Monitored application batch processes to ensure successful completion of jobs.
  • Handled DBA and ETL production issues within limited time, tracked as priority-based incidents in ServiceNow.

Senior Programmer Analyst

FIS Global
11.2015 - 06.2022

Project #1:

NAC Base2000 & TBS

Responsibilities:

  • Studied the high-level design document and helped prepare the low-level design documents.
  • Developed ETL jobs using Sequential File, Data Set, database, Aggregator, Head, Sort, Remove Duplicates, Transformer, Change Capture, Funnel, Lookup, and Join stages.
  • Implemented logic such as incremental loading, change capture, and slowly changing dimensions.
  • Wrote UNIX shell scripts and implemented parameterization of jobs.
  • Wrote test cases for unit testing, supported system testing, and fixed defects across modules.
  • Migrated jobs between environments.
  • Took complete ownership of all production implementation activities.
  • Monitored ETL jobs scheduled in the production environment.
  • Ensured issues were resolved within SLA.
  • Proactively identified and highlighted issues in the current system.
  • Performed weekly and monthly production patching and fixed vulnerabilities.
  • Engaged different teams for quick resolution of production issues, raised incident/CMS/SNOW/Change/ECR/Service tickets as required, and escalated issues to incident managers by involving the ECC support team.

Project #2:

NAC Data Migration

Responsibilities:

  • Migrated daily and weekly extracts into the Snowflake data warehouse.
  • Performed data migration from RDBMS sources to the Snowflake cloud data warehouse.
  • Participated in weekly and daily project status meetings with the onsite coordinator.
  • Designed and built ETL processes on Snowflake for Tableau reports and dashboards.
  • Created new data pipelines and data flows in Snowflake, applying the required transformations to the data.
  • Moved data from Exadata/Oracle databases to Snowflake and the AWS platform.
  • Worked on SnowSQL and Snowpipe.
  • Created Snowpipe for continuous data loading.
  • Used COPY to bulk load data.
  • Set up data sharing between two Snowflake accounts.
  • Created internal and external stages and transformed data during load.
  • Worked with AWS services.
  • Worked on Oracle DB, MS SQL Server, Redshift, and Snowflake.
  • Defined virtual warehouse sizing in Snowflake for different types of workloads.

Project #3:

Falcon and Triad (FICO Applications)

Responsibilities:

  • Performed production support activities such as load monitoring and handling session failures.
  • Performed weekly and monthly production patching and fixed vulnerabilities.
  • Participated in weekly and daily project status meetings with the onsite coordinator.
  • Performed daily and weekly health check reviews.

Software Engineer

Aroha Technologies
07.2012 - 11.2015

Project #1:

Bank of America – Biz$core NPA Reporting to RBI.

Responsibilities:

  • Analyzed, articulated, and refined requirements.
  • Implemented ETL logic using Filter, Aggregator, Joiner, Expression, Lookup, Router, and Update Strategy transformations.
  • Provided daily operational support, including batch job monitoring and application health checks.
  • Performed code deployments in the production environment.
  • Created database objects such as views, stored procedures, functions, tables, and cursors per business requirements.
  • Used UNIX commands, including head and tail, to search, replace, and check logs.
  • Wrote batch scripts to run workflows on Windows.
  • Monitored batch jobs through the Autosys scheduling tool.
  • Used mapping parameters to make mappings more flexible.

Project #2:

Syndicate Bank – Automation of Regulatory Reporting to RBI.

Responsibilities:

  • Studied the source systems to develop technical data mappings (source systems to destination system), covering data length, data fields, data types, and business logic.
  • Reviewed code for mappings created by team members.
  • Developed mappings using transformations such as Source Qualifier, Aggregator, Expression, Router, Normalizer, Lookup, and Joiner, and optimized them for better performance.
  • Implemented business logic with appropriate error handling and data auditing.
  • Created integrated workflows and scheduled them to run on a regular basis.

Education

Master of Science - Computer Engineering

Visvesvaraya Technological University (VTU)
Bangalore
07.2012

Bachelor of Science - Computer Science

Gulbarga University
Raichur
07.2009

Skills

  • Informatica PC/IICS/BDM
  • PySpark
  • Databricks
  • Snowflake Cloud
  • Azure and AWS Services
  • Database (Oracle/SQLServer/Sybase/Postgres)
  • Service Now/BMC Remedy
  • Git/SVN/Jenkins
  • Splunk
  • Autosys/Apache Oozie/Control-M
  • DBT Tool
  • Linux/Shell Scripting

Certification

AWS Certified Solutions Architect - Associate

Microsoft Azure Fundamentals (AZ-900)
