Project 1: Migration from Snowflake Platform to Azure Synapse
Client: Microsoft Direct - L3Harris
Role: Azure Data Engineer
Tools and Technologies:
- Azure Synapse Analytics
- Delta Lake
- Azure Synapse Dedicated SQL Pool
- ADLS Gen 2
- Synapse Notebooks
- Synapse Pipeline
- PySpark
- Spark SQL
Client Description: L3Harris Technologies, Inc. is a leading American technology company specializing in defense solutions, information technology services, and wireless communications. The company provides a range of products, including command and control systems, tactical radios, avionics, and electronic systems, serving government, defense, and commercial sectors.
Responsibilities:
- Converted Snowflake DDL scripts for tables, views, stored procedures, and semi-structured tables into Azure Delta table scripts for execution in the Azure environment.
- Parameterized the schema names and ADLS locations in the DDL scripts for tables in both the Bronze and Silver layer schemas.
- Created over 130 Delta tables in ADLS Gen2 for both the RAW and RPT schemas.
- Developed Logging and Error Handling Notebooks for Pipeline Orchestration.
- Integrated over 60 stored-procedure notebooks into the pipeline framework.
- Established more than 70 views and 60 tables in the Azure Synapse Dedicated SQL Pool for the Gold layer.
- Created a snapshot notebook for the Bronze and Silver layers to restore table data.
- Orchestrated the deployment of Main and Child pipelines for data migration across Bronze, Silver, and Gold layers, integrating Stored Procedures and Logging Notebooks.
- Implemented pipelines for extracting data from cloud sources into ADLS Gen2, working extensively with Copy activities and incorporating error-handling logic.
- Utilized Azure Data Factory activities such as Lookup, Stored Procedure, If Condition, ForEach, Set Variable, Append Variable, Get Metadata, Filter, and Wait.
- Configured Logic Apps for email notifications to end users and key stakeholders, invoked through the Web activity.
- Stored processed and raw data in Azure Data Lake Gen2 containers.
- Stored processed data in Azure Synapse Analytics.
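The parameterized Delta DDL work above can be sketched as follows. This is a minimal illustration only: the schema, table, column, and storage-account names are hypothetical placeholders, not the project's actual values, and in a Synapse notebook the generated statement would be run with spark.sql().

```python
# Sketch: generate a parameterized CREATE TABLE DDL for an external Delta
# table in ADLS Gen2. Schema names and storage paths are illustrative
# placeholders, not the actual project values.

def delta_table_ddl(schema: str, table: str, columns: dict, adls_root: str) -> str:
    """Build a Spark SQL DDL string for an external Delta table.

    schema and adls_root are parameters so the same script can target the
    Bronze or Silver layer without edits.
    """
    cols = ",\n  ".join(f"{name} {dtype}" for name, dtype in columns.items())
    location = f"{adls_root}/{schema}/{table}"
    return (
        f"CREATE TABLE IF NOT EXISTS {schema}.{table} (\n"
        f"  {cols}\n"
        f") USING DELTA\n"
        f"LOCATION '{location}'"
    )

ddl = delta_table_ddl(
    schema="bronze",  # parameterized layer schema (placeholder)
    table="orders",   # placeholder table name
    columns={"order_id": "BIGINT", "amount": "DECIMAL(18,2)", "load_ts": "TIMESTAMP"},
    adls_root="abfss://data@mylake.dfs.core.windows.net",  # placeholder account
)
print(ddl)
# In a Synapse notebook, the statement would then be executed via spark.sql(ddl).
```

Keeping the layer schema and ADLS root as function parameters is what lets one script create the same table shape in both the Bronze and Silver layers.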
Project 2: Huntsman BOLT
Client: Huntsman Corporation, USA
Role: Azure Data Engineer
Tools and Technologies:
- Azure Databricks
- Azure Blob Storage
- ADLS Gen 2
- Azure Synapse Analytics
- Azure SQL
- Azure Logic Apps
- Spark SQL
Responsibilities:
- Served as offshore lead, overseeing end-to-end project implementation.
- Implemented an end-to-end data flow: extraction from on-premises sources into ADLS Gen2 via ADF, pre-processing in Azure Databricks, and storage in Azure Data Warehouse for further processing.
- Set up Integration Runtimes (IR) to connect to SAP and non-SAP systems for data extraction into Azure through ADF.
- Processed data through Databricks using Spark and Python, storing both processed and raw data in Azure Data Warehouse.
- Presented processed data reports through Tableau.
- Created over 20 Tables in Azure Data Warehouse to store specified data.
- Scheduled triggers in ADF for daily pipeline runs, ensuring regular data updates.
- Tracked, processed, and updated specific tables daily, ensuring timely deliveries of action items.
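The Databricks pre-processing step above can be illustrated with a minimal sketch. The field names and cleaning rules here are assumptions for illustration; in the actual notebooks this kind of logic ran as Spark DataFrame transformations over the extracted SAP and non-SAP data.

```python
# Sketch of record-level pre-processing of the kind applied in Databricks
# before loading to the warehouse. Field names ("material", "plant_code")
# and rules are hypothetical examples, not the project's schema.

def clean_record(rec: dict) -> dict:
    """Normalize a raw extracted record: trim string fields and default a
    missing plant code."""
    out = {}
    for key, value in rec.items():
        out[key] = value.strip() if isinstance(value, str) else value
    out.setdefault("plant_code", "UNKNOWN")  # hypothetical default value
    return out

raw = {"material": "  STEEL-01 ", "qty": 5}
print(clean_record(raw))
# → {'material': 'STEEL-01', 'qty': 5, 'plant_code': 'UNKNOWN'}
```

In Spark the same normalization would typically be expressed with built-in column functions (e.g. trim and coalesce) rather than a per-record Python function, to keep the work distributed.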
Project 3: FEA (Fraud Enterprise Aggregation)
Client: Standard Chartered Bank
Role: Database Developer
Tools and Technologies:
- SQL Server Management Studio (SSMS)
- SQL DB
Responsibilities:
- Used SSMS to work with Microsoft SQL Server; participated in business requirement document walkthroughs to understand functionality.
- Contributed to understanding System Design and Database Designs.
- Created SQL Databases, Tables, indexes, and Stored Procedures based on PO requirements.
- Developed User-defined Functions and DB objects.
- Involved in Bug Fixing and root cause Analysis.
- Participated in Performance tuning.
- Planned and developed ETL pipelines using ADF.
- Ingested data from Azure Data Lake Storage into Azure Data Factory pipelines.
- Prepared transformations using activities and data flows as per requirements.
- Scheduled and monitored pipelines, loading transformed data from ADF into Azure SQL Database.
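Loading transformed data into a target SQL table, as in the last step above, is commonly done with a T-SQL MERGE (upsert). The sketch below builds such a statement; the table, key, and column names are placeholders, not the project's actual schema.

```python
# Illustrative sketch: build a T-SQL MERGE statement of the kind used when
# loading transformed staging data into a target Azure SQL table.
# Table and column names below are hypothetical placeholders.

def build_merge_sql(target: str, staging: str, key: str, columns: list) -> str:
    """Generate an upsert: update matched rows, insert unmatched ones."""
    set_clause = ", ".join(f"t.{c} = s.{c}" for c in columns)
    col_list = ", ".join([key] + columns)
    val_list = ", ".join(f"s.{c}" for c in [key] + columns)
    return (
        f"MERGE {target} AS t\n"
        f"USING {staging} AS s ON t.{key} = s.{key}\n"
        f"WHEN MATCHED THEN UPDATE SET {set_clause}\n"
        f"WHEN NOT MATCHED THEN INSERT ({col_list}) VALUES ({val_list});"
    )

sql = build_merge_sql("dbo.FraudCase", "stg.FraudCase", "case_id", ["status", "amount"])
print(sql)
```

Generating the statement from a column list keeps the load step driven by metadata, so adding a column to the staging table does not require hand-editing the SQL.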