Dynamic professional with over 6.5 years of experience in Big Data engineering, skilled in Python, Scala, Unix scripting, AWS, and GCP. As a Data Engineer III at Walmart Global Tech India, I have enhanced and supported multiple projects, optimizing data pipelines and reducing processing times. Key contributions include automating GCP bucket cleanup, improving SLA metrics, and developing cost-efficient data pipeline processes. Previously, at Publicis Sapient and Cognizant, I managed data models and data pipelines and ensured data quality.
Description: The Trade Surveillance System analyzes complex data sets to detect anomalies in trading behavior and generate alerts for further investigation. It integrates trade and order data from exchanges, enriched trade, position, and PNL data from Endure, and additional market data from Refinitiv. The project supports the ETS Compliance teams and is continuously enhanced to meet evolving user needs.
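For context, the following is a minimal PySpark sketch of the kind of rule-based alerting such a pipeline performs; the table names, join keys, columns, and the notional-value threshold are hypothetical and not taken from the actual system.

from pyspark.sql import SparkSession, functions as F

spark = (SparkSession.builder
         .appName("trade_surveillance_sketch")
         .enableHiveSupport()
         .getOrCreate())

trades = spark.table("surveillance.trades")        # hypothetical trade/order feed from exchanges
positions = spark.table("surveillance.positions")  # hypothetical enriched position/PNL data

# Join trades with positions and flag trades whose notional value exceeds a
# simple placeholder threshold; real surveillance rules are far more involved.
alerts = (trades.join(positions, on=["account_id", "trade_date"], how="left")
                .withColumn("notional", F.col("quantity") * F.col("price"))
                .filter(F.col("notional") > 1000000)
                .select("account_id", "trade_id", "trade_date", "notional"))

alerts.write.mode("append").saveAsTable("surveillance.volume_alerts")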
Technical Responsibilities:
Description: The Supply Chain Data Lake - Cloud Platform Migration project manages and delivers Walmart business data to the user community. It involves migrating supply chain application data from various sources (Oracle, Informix, Teradata, DB2, SQL, Mainframe) to the Google Cloud Platform and loading it into Hive tables. The goal is to enhance data accessibility, usability, and quality.
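A minimal sketch of this ingestion pattern, assuming a Spark environment with Hive support on GCP; the JDBC URL, credentials, and source and target table names below are placeholders rather than real endpoints.

from pyspark.sql import SparkSession, functions as F

spark = (SparkSession.builder
         .appName("scdl_migration_sketch")
         .enableHiveSupport()
         .getOrCreate())

# Pull a source table from Oracle over JDBC; Informix, Teradata, DB2, and the
# other sources follow the same pattern with their own drivers and URLs.
source_df = (spark.read.format("jdbc")
             .option("url", "jdbc:oracle:thin:@//source-host:1521/ORCL")  # placeholder
             .option("dbtable", "SUPPLY_CHAIN.ORDERS")                    # placeholder
             .option("user", "etl_user")                                  # placeholder
             .option("password", "****")                                  # placeholder
             .option("driver", "oracle.jdbc.OracleDriver")
             .load())

# Land the data in a date-partitioned Hive table on the cloud platform.
(source_df.withColumn("load_date", F.current_date())
          .write.mode("overwrite")
          .partitionBy("load_date")
          .saveAsTable("supply_chain_dl.orders"))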
Technical Responsibilities:
Description: Quantum Migration moves data from 42 regional distribution centers (DCs) to GCP via batch uploads, covering applications such as PO download, receiving, order filling, and shipping invoicing. The goal is to enhance data accessibility, usability, and quality.
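A hypothetical sketch of the batch-upload step using the google-cloud-storage client; the bucket name, local extract paths, and DC identifiers are placeholders introduced for illustration only.

from pathlib import Path
from google.cloud import storage

BUCKET_NAME = "quantum-dc-landing"                 # placeholder bucket
DC_IDS = [f"dc{n:02d}" for n in range(1, 43)]      # the 42 regional DCs

client = storage.Client()
bucket = client.bucket(BUCKET_NAME)

for dc_id in DC_IDS:
    # Upload every extract produced for this DC (PO download, receiving,
    # order filling, shipping invoicing) into a per-DC prefix.
    for path in Path(f"/data/extracts/{dc_id}").glob("*.csv"):
        bucket.blob(f"raw/{dc_id}/{path.name}").upload_from_filename(str(path))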
Technical Responsibilities:
Description: The project enhances ECE Ab Initio on the cloud platform and migrates data for existing workflows (WFs) to an S3 bucket by adding dropdown components. Because the DDE platform is being migrated to the cloud, source file locations must be updated accordingly.
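A minimal boto3 sketch of relocating workflow source files to S3; the bucket, key prefix, and local directory are assumptions used to illustrate the location change, not the actual DDE paths.

from pathlib import Path
import boto3

s3 = boto3.client("s3")
BUCKET = "dde-workflow-source"             # placeholder bucket
LOCAL_DIR = Path("/data/wf_source_files")  # placeholder legacy location

for path in LOCAL_DIR.glob("*.dat"):
    # Mirror each source file under an S3 prefix so the migrated workflows can
    # point at the new location instead of the legacy file system.
    s3.upload_file(str(path), BUCKET, f"ece/source/{path.name}")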
Technical Responsibilities:
Description: The project implements Slowly Changing Dimension (SCD) Type 2 using Hive so that historical changes to dimension records are tracked and updated automatically.
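A minimal PySpark sketch of the SCD Type 2 pattern over Hive tables, shown for context; the table and column names (dw.dim_customer, stage.stg_customer, customer_id, address) and the single tracked attribute are hypothetical placeholders, not the project's actual schema.

from pyspark.sql import SparkSession, functions as F

spark = (SparkSession.builder
         .appName("scd2_sketch")
         .enableHiveSupport()
         .getOrCreate())

dim = spark.table("dw.dim_customer").alias("d")      # has start_date, end_date, is_current
stg = spark.table("stage.stg_customer").alias("s")   # latest source snapshot

current = dim.filter("is_current = true")
history = dim.filter("is_current = false")

joined = current.join(stg, F.col("d.customer_id") == F.col("s.customer_id"), "full_outer")

# 1) Current rows whose tracked attribute changed are closed out.
expired = (joined.filter("d.customer_id IS NOT NULL AND s.customer_id IS NOT NULL "
                         "AND d.address <> s.address")
                 .select("d.*")
                 .withColumn("end_date", F.current_date())
                 .withColumn("is_current", F.lit(False)))

# 2) Changed keys and brand-new keys get a fresh current version.
fresh = (joined.filter("s.customer_id IS NOT NULL AND "
                       "(d.customer_id IS NULL OR d.address <> s.address)")
               .select("s.*")
               .withColumn("start_date", F.current_date())
               .withColumn("end_date", F.lit(None).cast("date"))
               .withColumn("is_current", F.lit(True)))

# 3) Untouched current rows are carried over unchanged.
unchanged = (joined.filter("d.customer_id IS NOT NULL AND "
                           "(s.customer_id IS NULL OR d.address = s.address)")
                   .select("d.*"))

# Write to a new snapshot table to avoid overwriting a table being read in the
# same job; swapping it in (e.g. ALTER TABLE ... RENAME) would be a separate step.
result = history.unionByName(expired).unionByName(fresh).unionByName(unchanged)
result.write.mode("overwrite").saveAsTable("dw.dim_customer_new")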
Responsibilities:
Python