Skilled Azure Data Engineer with 3+ years of experience in designing and optimizing data pipelines using Azure Data Factory, SQL, Synapse Analytics, Databricks, and Logic Apps. Proficient in ETL/ELT processes, data modeling, and cloud data architecture. Strong in performance tuning, issue resolution, and delivering clear knowledge transfer (KT) sessions. Adept at translating business needs into scalable, secure, and reliable data solutions.
OTVF (Ocean Transport Volume Forecast)
A web application used to analyze volumes and book containers well in advance for finished products moving between plants. It enables prebooking of containers ahead of crises in which transportation becomes costlier. Planners at Nestlé use the application to analyze inter-market transfer volumes, giving better visibility into demand by route and volume for ocean freight.

- Built reporting along with the underlying logic, assumptions, and structured data for route and volume analysis.
- Created Snowflake stored procedures to extract data from multiple stage tables and merge it into a single integration table.
- Created storage integrations, stages, file formats, tasks, and views on top of integration tables in Snowflake.
- Unloaded data from Snowflake to Azure Blob Storage using COPY INTO commands.
- Migrated code components to higher environments using Azure DevOps.
- Implemented Azure Data Factory pipelines using Copy, Stored Procedure, and other activities; created, edited, and deleted data factories and child resources, including datasets, linked services, pipelines, triggers, and integration runtimes.
- Implemented logging and error-handling mechanisms using the Execute Pipeline and Web activities.
- Implemented Logic Apps for email notifications.
- Delivered KT sessions to the sustain team, provided the supporting documents and mapping sheets they required, and supported them whenever they faced hurdles.
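The Snowflake work above (stage-to-integration merges and unloading to Blob Storage) can be sketched roughly as follows. This is an illustrative sketch only: the procedure, table, stage, and column names (merge_to_integration, stg_volume_forecast, int_volume_forecast, blob_stage, route_id, etc.) are hypothetical, not the project's actual objects.

```sql
-- Hypothetical Snowflake Scripting procedure: merge a stage table
-- into a single integration table (names are illustrative).
CREATE OR REPLACE PROCEDURE merge_to_integration()
RETURNS STRING
LANGUAGE SQL
AS
$$
BEGIN
  MERGE INTO int_volume_forecast t
  USING stg_volume_forecast s
    ON t.route_id = s.route_id
   AND t.forecast_week = s.forecast_week
  WHEN MATCHED THEN
    UPDATE SET t.volume = s.volume
  WHEN NOT MATCHED THEN
    INSERT (route_id, forecast_week, volume)
    VALUES (s.route_id, s.forecast_week, s.volume);
  RETURN 'merge complete';
END;
$$;

-- Unload the integration table to an external stage backed by
-- Azure Blob Storage (stage name is hypothetical).
COPY INTO @blob_stage/otvf/volume_forecast
  FROM int_volume_forecast
  FILE_FORMAT = (TYPE = CSV)
  HEADER = TRUE
  OVERWRITE = TRUE;
```

In practice a Snowflake task would schedule the procedure, and an ADF pipeline would orchestrate the unload and downstream copy steps.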