Detail-oriented professional with over 8 years of experience in Data Warehouse, Business Intelligence, and Big Data Analytics projects.
Expertise in data modeling and migration, including successful transitions from on-premises data warehouses to cloud environments.
Results-driven data engineering professional with a solid foundation in designing and maintaining scalable data systems.
Expertise in developing efficient ETL processes and ensuring data accuracy, contributing to impactful business insights.
Known for strong collaborative skills and ability to adapt to dynamic project requirements, delivering reliable and timely solutions.
Positive, analytical problem-solver with a strong foundation in data systems and processes. Possesses a solid understanding of data modeling and database design, coupled with skills in SQL and Python. Capable of driving data-driven decision-making and improving data infrastructure.
Extensive experience in managing the complete Data Warehouse Development Life Cycle, including Data Analysis, Database Design, Database Performance Tuning & Optimization, and Data Warehouse Optimization.
Keen analyst with a proven track record in understanding and gathering requirements from clients, vendors, consultants, and multiple stakeholders, followed by delivering Data Warehouse, Business Intelligence, and Data Analytics solutions.
Highly resourceful in technical solution architecture and conceptualization for enterprise-wide BI and Data Warehousing projects, aligning business and information systems using cutting-edge technologies and adhering to high-quality standards.
ETL (Azure Data Factory, Informatica)
Settlement to Trade Lineage (S.T.L.A) model development at Settlement System: the project helps the settlement system extract the lineage between a parent and its subsequent child mission information, enabling the system to analyze the link between trades and settlements. Additionally, it extracts failed missions, which lets users get the information in advance and reduce the loss of payment on failed missions.
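A minimal sketch of the parent-to-child lineage traversal idea, in Python; the record fields (msg_id, parent_id, status) are hypothetical stand-ins for the actual settlement-system schema:

```python
from collections import defaultdict

def build_lineage(records):
    """Index records as parent -> [children] for fast traversal."""
    children = defaultdict(list)
    for rec in records:
        if rec["parent_id"] is not None:
            children[rec["parent_id"]].append(rec["msg_id"])
    return children

def trace(children, root):
    """Walk from a root trade down to every descendant mission."""
    chain, stack = [], [root]
    while stack:
        node = stack.pop()
        chain.append(node)
        stack.extend(children.get(node, ()))
    return chain

records = [  # hypothetical sample data
    {"msg_id": "T1", "parent_id": None, "status": "SETTLED"},
    {"msg_id": "S1", "parent_id": "T1", "status": "SETTLED"},
    {"msg_id": "S2", "parent_id": "S1", "status": "FAILED"},
]
children = build_lineage(records)
print(trace(children, "T1"))  # ['T1', 'S1', 'S2']
# Failed missions can be flagged up front from the same records:
print([r["msg_id"] for r in records if r["status"] == "FAILED"])
```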
Created a data pipeline and ETL design for loading and extracting datasets in real time from various external sources and computing netting logic on the Azure cloud, using Azure Data Factory as the orchestration tool.
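The netting step can be illustrated as below; this is a hedged sketch, and the (client, symbol) keying and trade fields are assumptions rather than the production schema:

```python
from collections import defaultdict

def net_positions(trades):
    """Collapse gross buy/sell trades into one net quantity per key."""
    net = defaultdict(int)
    for t in trades:
        sign = 1 if t["side"] == "BUY" else -1
        net[(t["client"], t["symbol"])] += sign * t["qty"]
    return dict(net)

trades = [  # hypothetical sample trades
    {"client": "C1", "symbol": "INFY", "side": "BUY",  "qty": 100},
    {"client": "C1", "symbol": "INFY", "side": "SELL", "qty": 40},
]
print(net_positions(trades))  # {('C1', 'INFY'): 60} -> net buy of 60
```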
Developed a data pipeline for full and incremental loads from the on-premises data warehouse to the cloud data warehouse, and optimized the reporting solutions on the cloud for improved performance.
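A watermark pattern is one common way to drive such incremental loads; the sketch below uses sqlite3 as a stand-in for both warehouses, and every table and column name is hypothetical:

```python
import sqlite3

src = sqlite3.connect(":memory:")  # stand-in for the on-premises source
dst = sqlite3.connect(":memory:")  # stand-in for the cloud warehouse
src.execute("CREATE TABLE src_table (id INTEGER PRIMARY KEY, payload TEXT, modified_at TEXT)")
dst.execute("CREATE TABLE dst_table (id INTEGER PRIMARY KEY, payload TEXT, modified_at TEXT)")
src.executemany("INSERT INTO src_table VALUES (?, ?, ?)",
                [(1, "a", "2024-01-01"), (2, "b", "2024-01-03")])

def incremental_load(src, dst, watermark):
    """Copy only rows modified after the last run; return the new watermark."""
    rows = src.execute(
        "SELECT id, payload, modified_at FROM src_table WHERE modified_at > ?",
        (watermark,),
    ).fetchall()
    dst.executemany("INSERT OR REPLACE INTO dst_table VALUES (?, ?, ?)", rows)
    return max((r[2] for r in rows), default=watermark)

# Only row 2 moves; the returned watermark ('2024-01-03') is kept for the next run.
print(incremental_load(src, dst, "2024-01-02"))
```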
Developed a monitoring mechanism for comparing cloud data warehouse performance against on-premises data warehouse performance, to identify points of improvement.
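In spirit, the comparison runs the same benchmark query against both systems and records the delta; the helper below is a simplified illustration (real measurements would more likely come from warehouse telemetry than wall-clock timing):

```python
import time

def timed(run_query, sql):
    """Wall-clock a single query execution."""
    start = time.perf_counter()
    run_query(sql)
    return time.perf_counter() - start

def compare(run_cloud, run_onprem, queries):
    """Time the same benchmark queries on both systems, side by side."""
    return {name: {"cloud_s": timed(run_cloud, sql),
                   "onprem_s": timed(run_onprem, sql)}
            for name, sql in queries.items()}

# run_cloud / run_onprem would wrap real connections; sleep stubs shown here.
report = compare(lambda q: time.sleep(0.01), lambda q: time.sleep(0.02),
                 {"daily_positions": "SELECT ..."})
print(report)
```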
Commodity Market - A New Market Released on 12th Oct'18: the project involved performing the development role in which master data was required for the commodity segment (Client Master, Trading Master, Clearing Member Master, Contract Master). In addition, performed the client-level position calculation and sent it to the Unique Client Identification Team intraday, and executed the computation of open interest for members at PAN and member level, sending the required data to the clearing team and web team of NSE to publish the top three members' data on a daily basis.
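The open-interest aggregation can be sketched as below; the field names and the convention of summing only the long side are assumptions for illustration:

```python
from collections import defaultdict

def open_interest(positions, key_fields):
    """Sum long open positions per grouping key (member, PAN, ...)."""
    oi = defaultdict(int)
    for p in positions:
        if p["net_qty"] > 0:  # count each open contract once, on the long side
            oi[tuple(p[f] for f in key_fields)] += p["net_qty"]
    return dict(oi)

positions = [  # hypothetical sample positions
    {"member": "M1", "pan": "P1", "net_qty": 50},
    {"member": "M2", "pan": "P2", "net_qty": 30},
    {"member": "M1", "pan": "P3", "net_qty": -30},
]
member_oi = open_interest(positions, ("member",))        # member level
pan_oi = open_interest(positions, ("member", "pan"))     # PAN level
top3 = sorted(member_oi.items(), key=lambda kv: kv[1], reverse=True)[:3]
print(member_oi, top3)
```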
Interoperability Implementation for Cash Segment: the project directed market participants to consolidate their clearing and settlement functions, for a segment, at a single clearing corporation. The trading and settlement components were no longer tightly coupled, which required a redesign of the entire data warehouse for the CM segment; all the facts and dimensions for the cash segment were changed accordingly.
Client Code Modification: NSE provides members a provision for modifying client details post trading hours. Such modifications attract penalties that the member has to bear. Since the Data Warehouse holds all trade and modification data, the required penalty and leverage calculations are performed at the DWH end. Post-interop this became a challenge, because modifications were managed at two sources: the Trading Source (for clients) and the Clearing Source (for Clearing Members).
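A simplified sketch of a slab-based penalty computation on modified trades; the slab thresholds and rates below are illustrative placeholders, not the exchange-prescribed ones:

```python
def penalty(modified_value, total_value, slabs=((0.01, 0.01), (0.05, 0.02))):
    """Charge a percentage of modified trade value based on the modification ratio.

    slabs: (ratio_threshold, penalty_rate) pairs -- hypothetical values.
    """
    ratio = modified_value / total_value if total_value else 0.0
    rate = 0.0
    for threshold, slab_rate in slabs:  # pick the highest slab crossed
        if ratio > threshold:
            rate = slab_rate
    return modified_value * rate

# 2,000 modified out of 100,000 traded -> 2% ratio -> first slab -> 1% penalty
print(penalty(modified_value=2_000, total_value=100_000))  # 20.0
```

Post-interop, the same calculation would first have to merge modification records from both the trading and clearing sources before applying the slabs.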