Accomplished IT professional with over 10 years of experience in application development and enhancement across high-impact domains such as Retail, EIS, and Life Sciences.
Expertise in SQL and PL/SQL programming, including the creation of complex queries to drive business insights and optimize data management.
Proven ability in developing and optimizing T-SQL and PL/SQL queries by efficiently joining multiple tables, and in designing stored procedures, views, triggers, and functions to ensure high-quality data manipulation and storage aligned with business rules.
3.5 years of experience working with Microsoft Azure technologies, including ADF, Azure Synapse Analytics (data warehouse), and ADLS, to build scalable, cloud-based solutions.
Skilled in developing Azure Data Factory pipelines for ETL processes, implementing automated job scheduling, and facilitating seamless data transfers between diverse sources and destinations.
Extensive experience with Microsoft Fabric, driving the development of data pipelines and working with lakehouse architecture to enable data transformations using PySpark.
Expertise in architecting and implementing cloud migration strategies for seamlessly transitioning traditional on-premises systems to Azure Cloud, improving scalability and performance.
Adept in agile methodologies, ensuring iterative development and quick adaptation to changing business requirements.
Demonstrated leadership and collaboration skills, with a strong ability to guide cross-functional teams and cultivate a high-performance team environment.
Involved in all stages of the SDLC, from analysis and design to development, testing, implementation, and ongoing maintenance, delivering projects on time and within budget, despite aggressive deadlines.
Overview
12 years of professional experience
1 Certification
Work History
Senior Data Engineer
Ernst & Young (EY) | Alberta Health Services
Kochi
05.2024 - Current
Modernized AHS corporate analytics by migrating from eManager (OBIEE and Oracle BI Applications) to a Microsoft Fabric SaaS solution.
Designed a metadata-driven ingestion framework leveraging Fabric pipelines to enhance processing efficiency.
Developed complex data transformation processes leveraging PySpark and SQL.
Senior Data Engineer
Ernst & Young (EY) | Fonterra
Kochi
10.2021 - 05.2023
Worked as a Data Engineer for a major commercial client based in New Zealand, involved in development activities for various use cases.
Built a metadata-driven ADF ingestion framework that extracts data from diverse source systems, including SAP, network drives, manual files, Teams folders, and APIs.
Orchestrated Databricks notebooks using ADF and performed advanced transformations with Azure Databricks.
Loaded transformed data into Azure SQL Server databases, ensuring optimal performance and adherence to data governance standards.
Senior Data Engineer
Tata Consultancy Services (TCS) | Walgreens
Kochi
01.2020 - 10.2021
The LEAP D&A Legacy Migration project re-platformed and modernized WBA data systems on Microsoft Azure, enabling the business to focus on core functions while leveraging cloud capabilities.
The project objective is to ingest and curate retail data domains from the existing Hadoop/DB2 environment to the Azure environment and build a Global Data Hub.
Designed and implemented migration strategies for moving traditional systems onto the Azure platform.
Developed Azure Data Factory pipelines for on-cloud ETL processing.
Built complex stored procedures to facilitate efficient data manipulation and consistent data storage according to business rules.
Improved database performance and loading speed by 80% through SQL query optimization.
Big Data Developer
Tata Consultancy Services (TCS) | Johnson & Johnson
Kochi
07.2019 - 12.2019
The aim of the project was to ingest various files coming from the SAP system into Hive tables in the Enterprise Data Lake, with transactional data then loaded into Kudu tables.
Developed advanced methodologies to improve ATTP EDL ingestion workflows.
Developed a shell script for data ingestion and data transformation.
Engaged with multiple big data platforms within the Hadoop ecosystem like Hive, Impala, and Kudu.
SQL Developer
Tata Consultancy Services (TCS) | Duke Energy
Kochi
07.2016 - 06.2019
Duke Energy is an American electric power holding company.
Our project was to migrate the distribution data from GTech to Smallworld using the FME ETL tool.
Designed robust functions and views for efficient data migration processes.
Created FME workflows for data transfer.
Facilitated transfer of GIS data attributes and geometry from source Oracle database to staging database utilizing FME.
Developed Python scripts to facilitate data migration from sources to staging environments.
Junior SQL Developer
TCS | Nielsen
Kochi
07.2013 - 07.2016
Nielsen, a leading global data analytics company, collects shopper data through handheld scanners.
Our tools provide clients with a holistic view of the marketplace.
Built database-backed modules including functions, procedures, packages, and triggers according to client needs.
Re-engineered database structures to enhance performance.