Hi, I’m

Md Sadaf Sadique

Kolkata

Summary

Senior Software Engineer with 9.5 years of experience in IT, including 6.5 years in Informatica PowerCenter (10.4, 10.1) development, enhancement, implementation and maintenance, 3 years in IICS (Data Integration) and 1+ year in Snowflake development. Worked as a senior ETL developer on two migration projects (legacy-to-digital and PowerCenter-to-IICS) and on data integration projects in domains such as manufacturing operations, banking operations and utility corporate data handling. Experienced in transforming and loading data into various systems through IICS and Informatica PowerCenter (Designer, Workflow Manager, Workflow Monitor) in data warehouse/data lake environments. Worked as a Snowflake developer loading data into dimension, fact, stage and history tables using procedures, streams and tasks. Worked on databases such as Oracle, MS SQL Server, PostgreSQL and Snowflake. Familiar with code deployment using Bitbucket/Jenkins pipelines. Working knowledge of IT operations, AWS Cleo, AWS RDS, AWS EC2, Crystal Reports, and the Autosys and TWS job schedulers. Proven ability to design, develop, implement and maintain organizational IT projects based on requirements, with knowledge of data profiling and data analysis where applicable and a solid understanding of data warehousing concepts as an ETL developer. Experience leading teams of 7-9 associates on individual projects.

Overview

10 years of professional experience
1 certification

Work History

Tech Mahindra

Senior Software Engineer
01.2022 - Current

Job overview

  • Currently working for a large manufacturing client on their AfterSales data mart
  • Loading various dimension tables and their related fact tables in Snowflake for the service retention factory and related projects
  • Implemented SCD Type 2 both in IICS and by writing SQL queries in Snowflake
  • Used IICS where the source and target systems differ; used Snowflake Snowpipe where both source and target are Snowflake
  • Worked on requirement gathering from clients for the project
  • Developed mappings, mapping tasks and taskflows in IICS to load data from heterogeneous sources into Snowflake stage/history tables using various transformations
  • Scheduled jobs in TWS by writing scripts to execute them
  • Prepared UTR, STR and design documents for business and platform architect approvals
  • Performed unit testing and UAT to get the code deployed to production
  • Developed Snowpipe to load data from S3 bucket files into Snowflake history tables
  • Developed a Snowflake procedure to load data into a dimension table as SCD Type 2, and created streams and tasks to schedule the procedure (see the SQL sketch at the end of this job overview)
  • Working on a data migration project for a foreign government's Food Ministry
  • A manual-to-digital project built from scratch
  • Digitization of the Food Ministry's existing manual DG FOOD system with the help of SAP S/4HANA, ETL, MDM and a Liferay UI portal
  • Existing data (farmer, dealer, miller, movement contractor, etc.) is migrated to the new system with the help of ETL, and master data for all sources is kept in MDM
  • Farmer, dealer, miller, etc. registrations and updates are handled through the new portal system
  • Invoice generation and regular business activities are handled with the help of SAP
  • Gathered requirements from the client and vendor; created design, data migration strategy, SRS, data mapping template, testing and data profiling documents
  • As senior ETL developer, migrated data from various source systems (Excel, databases, flat files) to the portal database (Oracle), SAP and MDM by developing mappings, workflows and sessions
  • Delivered all development requests on time with minimal defects
  • Implemented data cleaning, duplicate checks and SCD Type 1 at the ETL level using various ETL capabilities
  • Data migration was divided into two phases
  • Data was classified as transaction data and master data
  • The migration strategy differed for transaction data and master data
  • Led a team of 3-4 people; responsible for their regular work assignments, performance and quality checks of their deliverables, and mentoring and guidance as lead
  • Coordinated and provided technical support for debugging and customization requests
  • Established regular communication with teammates
  • Monitored jobs and provided fixes for any issues
  • Coordinated and provided technical support for debugging and customization requests post go-live
  • Technology: IICS, Snowflake, PostgreSQL, AWS Cleo, AWS EC2, AWS RDS, Informatica PowerCenter 10.4, Oracle 19c, TWS
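
Illustrative sketch (not the project code): the Snowflake pieces mentioned above (Snowpipe loading from S3, a stream on the history table, an SCD Type 2 procedure and a task to schedule it) could fit together roughly as below. Every object name here (the AFTERSALES schemas and tables, the S3_INT storage integration, the ETL_WH warehouse, the S3 path) is a hypothetical placeholder, and a real dimension would carry more attributes than the single CUSTOMER_NAME column used for change detection.

-- Landing (history) table fed by Snowpipe, and the SCD2 dimension table
CREATE OR REPLACE TABLE AFTERSALES.RAW.CUSTOMER_HIST (
    CUSTOMER_ID   NUMBER,
    CUSTOMER_NAME STRING
);

CREATE OR REPLACE TABLE AFTERSALES.DW.DIM_CUSTOMER (
    CUSTOMER_ID    NUMBER,
    CUSTOMER_NAME  STRING,
    EFF_START_DATE TIMESTAMP_NTZ,
    EFF_END_DATE   TIMESTAMP_NTZ,
    IS_CURRENT     BOOLEAN
);

-- External stage over the S3 bucket (storage integration assumed to exist)
CREATE OR REPLACE STAGE AFTERSALES.RAW.S3_STAGE
  URL = 's3://example-bucket/aftersales/'
  STORAGE_INTEGRATION = S3_INT
  FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1);

-- Snowpipe: continuous load of S3 files into the history table
-- (AUTO_INGEST also needs S3 event notifications configured on the bucket)
CREATE OR REPLACE PIPE AFTERSALES.RAW.CUSTOMER_PIPE AUTO_INGEST = TRUE AS
  COPY INTO AFTERSALES.RAW.CUSTOMER_HIST
  FROM @AFTERSALES.RAW.S3_STAGE;

-- Stream that captures rows newly landed in the history table
CREATE OR REPLACE STREAM AFTERSALES.RAW.CUSTOMER_HIST_STRM
  ON TABLE AFTERSALES.RAW.CUSTOMER_HIST;

-- Procedure implementing SCD Type 2: expire the current version of changed
-- records, then insert the latest version as the new current row
-- (assumes at most one row per key in each stream delta)
CREATE OR REPLACE PROCEDURE AFTERSALES.DW.LOAD_DIM_CUSTOMER_SCD2()
RETURNS STRING
LANGUAGE SQL
AS
$$
BEGIN
  BEGIN TRANSACTION;

  -- Close out current rows whose tracked attribute has changed
  MERGE INTO AFTERSALES.DW.DIM_CUSTOMER d
  USING AFTERSALES.RAW.CUSTOMER_HIST_STRM s
    ON d.CUSTOMER_ID = s.CUSTOMER_ID AND d.IS_CURRENT
  WHEN MATCHED AND d.CUSTOMER_NAME <> s.CUSTOMER_NAME THEN
    UPDATE SET EFF_END_DATE = CURRENT_TIMESTAMP(), IS_CURRENT = FALSE;

  -- Insert new and changed records as the current version
  INSERT INTO AFTERSALES.DW.DIM_CUSTOMER
         (CUSTOMER_ID, CUSTOMER_NAME, EFF_START_DATE, EFF_END_DATE, IS_CURRENT)
  SELECT s.CUSTOMER_ID, s.CUSTOMER_NAME, CURRENT_TIMESTAMP(), NULL, TRUE
    FROM AFTERSALES.RAW.CUSTOMER_HIST_STRM s
   WHERE NOT EXISTS (SELECT 1
                       FROM AFTERSALES.DW.DIM_CUSTOMER d
                      WHERE d.CUSTOMER_ID = s.CUSTOMER_ID
                        AND d.IS_CURRENT);

  COMMIT;
  RETURN 'SCD2 load complete';
END;
$$;

-- Task that runs the procedure on a schedule, but only when the stream has data
CREATE OR REPLACE TASK AFTERSALES.DW.TASK_LOAD_DIM_CUSTOMER
  WAREHOUSE = ETL_WH
  SCHEDULE  = '60 MINUTE'
  WHEN SYSTEM$STREAM_HAS_DATA('AFTERSALES.RAW.CUSTOMER_HIST_STRM')
AS
  CALL AFTERSALES.DW.LOAD_DIM_CUSTOMER_SCD2();

ALTER TASK AFTERSALES.DW.TASK_LOAD_DIM_CUSTOMER RESUME;

Running the MERGE and the INSERT inside one explicit transaction means both statements see the same stream delta and the stream offset advances only once, on COMMIT; the schedules, change-detection columns and error handling in the actual project follow its own requirements.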

Tata Consultancy Services

Information Technology Analyst
06.2019 - 01.2022

Job overview

  • Worked on a development/enhancement ETL project for a large US-based banking client
  • Developed workflows and mappings that fetch records from different sources (flat files, databases, Excel files) and store them in a single database table or a text file on the Unix server, implemented through Informatica PowerCenter
  • This data is consumed by other downstream sources and applications
  • Enhanced workflows and mappings that pick up data from PowerExchange sources and transform it to meet requirements
  • Gathered requirements from the client and vendor; created design documents
  • Worked on major/minor development requests end to end: understanding the requirement, analysis, coding and implementation in mappings/workflows
  • Delivered all development requests on time with minimal ST defects
  • Worked on major/minor enhancement requests end to end: understanding the requirement, analysis, coding and implementation in previously developed mappings
  • Delivered all enhancement requests on time with minimal ST defects
  • Used several transformations such as Lookup, Joiner, Rank, Aggregator and Expression
  • Used target load order to sequence multiple loads within the same mapping
  • Applied various performance tuning techniques to reduce workflow runtime
  • Created JIL files to schedule jobs in Autosys to run daily at a particular time
  • Supported multiple client change orders simultaneously through prioritization and task delegation
  • Debugged and fixed defects
  • Technology: Informatica PowerCenter, Autosys, Oracle

Infosys Ltd.

Technology Analyst
01.2015 - 05.2019

Job overview

  • Corporate Person application (handling HR data) for a US-based utility client
  • SumTotal Learning Central (sending data related to employee and vendor training activities to a cloud-based application) for a US-based utility client
  • Maximo to PowerPlan (picking up transaction invoices from Maximo and processing them in the PowerPlan application)
  • Created workflows and mappings that fetch records from different sources and store them in a single database or text file on the Unix server, implemented through Informatica PowerCenter
  • Maintained workflows and mappings that pick up data from the Maximo database and perform various calculations to produce a file through the ETL tool Informatica
  • This file is then processed in the PowerPlan application
  • Gathered requirements from the client and vendor; created design documents
  • Worked on major/minor development requests end to end: understanding the requirement, analysis, coding and implementation in mappings/workflows
  • Delivered all development requests on time with minimal ST defects
  • Worked on major/minor enhancement requests end to end: understanding the requirement, analysis, coding and implementation in previously developed mappings
  • Delivered all enhancement requests on time with minimal ST defects
  • Created mappings and workflows that generate a unique identification and user ID for each new employee or vendor joining the client organization
  • Created a mapping where ETL writes the user details to a file on a Unix server path
  • SumTotal then processes this file in their cloud database
  • Created JIL files to schedule jobs in Autosys to run daily at a particular time
  • Supported multiple client change orders simultaneously through prioritization and task delegation
  • Debugged and fixed defects
  • Coordinated and provided technical support for debugging and customization requests
  • Established regular communication with teammates
  • Monitored jobs daily and provided fixes for any issues
  • Technology: Informatica PowerCenter, Autosys, Crystal Reports, SQL Server

Education

Dream Institute Of Technology

B.Tech in Computer Science and Engineering
01.2014

University Overview

  • Affiliated with WBUT
  • GPA: 7.35

St Thomas’s Day School

10+2 (Science stream)
01.2010

University Overview

  • Under the ISC board
  • GPA: 67%

The Park English School

Class 10 (Science stream)
01.2008

University Overview

  • Under the ICSE board
  • GPA: 80%

Skills

  • TOOLS: IICS (Data Integration), Informatica PowerCenter, Snowflake, AWS RDS, AWS S3, AWS EC2, PostgreSQL, Oracle, Autosys, TWS, Bitbucket, Git, Jenkins, Crystal Reports, FileZilla, PuTTY
  • LANGUAGES: SQL, ETL Development, ETL Design, Autosys JIL
  • Operating Systems: Windows, Linux
  • Domain: Manufacturing, Utility, Banking, Retail
  • Certifications: AWS Cloud Practitioner, Azure DP-900, Cloud Data Integration for PowerCenter Developers (Informatica)

Certification

Microsoft Azure Data Fundamentals (DP-900) and Cloud Data Integration for PowerCenter Developers Foundation

Timeline

Senior Software Engineer

Tech Mahindra
01.2022 - Current

Information Technology Analyst

Tata Consultancy Services
06.2019 - 01.2022

Technology Analyst

Infosys Ltd.
01.2015 - 05.2019

St Thomas’s Day School

10+2 (Science stream)

The Park English School

Class 10 (Science stream)

Dream Institute Of Technology

B.Tech in Computer Science and Engineering