Seasoned professional with 10 years of experience, including 5 years of specialized expertise in designing, implementing, and managing data solutions in the Microsoft Azure ecosystem using modern Big Data technologies. Seeking an opportunity to bring my skills and knowledge to a dynamic team where I can innovate, collaborate, and drive impactful data projects forward.
Project : WREF
Client : Pharma Client
Project Description : We oversee the WREF data integration platform, which powers 45+ reporting dashboards. Our role involves collecting, cleansing, and transforming data from diverse sources to meet specific business requirements. Using Azure data services, we maintain and manage the integration platform, ensuring refined data is reliably available to our Power BI team; a representative ingestion step is sketched below.
Tools and Technologies Used : Databricks, PySpark, ADF, ADLS Gen2, REST API, Logic Apps, SQL
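A minimal sketch of the kind of ingestion-and-refinement step this platform runs, written in PySpark. The endpoint URL, storage paths, and the record_id key column are hypothetical placeholders, not the project's actual configuration.

import requests
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# Pull one source's records from a REST endpoint (hypothetical URL).
resp = requests.get("https://api.example.com/v1/records", timeout=60)
resp.raise_for_status()

# Land the raw JSON in ADLS Gen2; dbutils is available on Databricks clusters.
raw_path = "abfss://raw@storageacct.dfs.core.windows.net/wref/records.json"
dbutils.fs.put(raw_path, resp.text, True)

# Refine: stamp the load date, drop duplicates, and publish for Power BI.
df = (spark.read.json(raw_path)
      .withColumn("load_date", F.current_date())
      .dropDuplicates(["record_id"]))  # assumed business key
(df.write.mode("overwrite").format("delta")
   .save("abfss://curated@storageacct.dfs.core.windows.net/wref/records"))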
Roles & Responsibilities:
Project : NOC operations data management for Colt
Project description : Our project involves daily extraction of ticketing data from the Siebel ticketing tool via its APIs, along with network logs sourced from various vendor-specific network management systems. This data is stored systematically in Azure Data Lake Storage (ADLS). Our primary objective is to process the data in line with business requirements and generate essential reports such as SLA reports, outage reports, and device performance assessments across vendors. We also compile network availability reports to give comprehensive insight into network operations; the SLA-report step is sketched below.
Tools and Technologies Used : Databricks, PySpark, ADF, ADLS Gen2, Spark SQL, SQL
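A hedged sketch of the SLA-report step in PySpark: flag tickets that exceeded a per-priority resolution target and aggregate breaches by vendor. Table paths, column names, and the hour thresholds are illustrative assumptions, not the project's actual schema.

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

tickets = spark.read.format("delta").load(
    "abfss://curated@storageacct.dfs.core.windows.net/noc/tickets")

# Resolution time in hours, compared against an assumed per-priority target.
sla = (tickets
       .withColumn("resolution_hrs",
                   (F.col("resolved_ts").cast("long")
                    - F.col("opened_ts").cast("long")) / 3600)
       .withColumn("target_hrs",
                   F.when(F.col("priority") == "P1", 4)
                    .when(F.col("priority") == "P2", 8)
                    .otherwise(24))
       .withColumn("breached", F.col("resolution_hrs") > F.col("target_hrs")))

# Breach counts per vendor and priority, written out for the reporting layer.
(sla.groupBy("vendor", "priority")
    .agg(F.count("*").alias("tickets"),
         F.sum(F.col("breached").cast("int")).alias("breaches"))
    .write.mode("overwrite").format("delta")
    .save("abfss://reporting@storageacct.dfs.core.windows.net/noc/sla_report"))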
Roles & Responsibilities:
Developed a project to analyze office space requirements and enable smart offices, using data on employees coming to the office and connecting to office Wi-Fi. The business uses this data to determine daily needs for office space, parking, room bookings, and facility usage; the incremental-load pattern is sketched below.
Tools and Technologies Used : Databricks, PySpark, ADF, ADLS Gen2, Cisco API, Delta tables
Load Type : Incremental Load, File Format : JSON, Data Size : 1 TB
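A minimal sketch of the incremental load, assuming each day's Wi-Fi session JSON is upserted into a Delta table by a session_id key; the paths, date partition, and key are placeholders.

from delta.tables import DeltaTable
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Read one day's raw Wi-Fi session JSON (hypothetical path layout).
updates = spark.read.json(
    "abfss://raw@storageacct.dfs.core.windows.net/smartoffice/sessions/2024-01-15/")

target = DeltaTable.forPath(
    spark, "abfss://curated@storageacct.dfs.core.windows.net/smartoffice/sessions")

# Upsert: update sessions already seen, insert new ones.
(target.alias("t")
 .merge(updates.alias("s"), "t.session_id = s.session_id")  # assumed key
 .whenMatchedUpdateAll()
 .whenNotMatchedInsertAll()
 .execute())

Merging by key keeps daily reruns idempotent, which matters for a recurring 1 TB feed.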
Developed a project to ingest and transform finance data received on a monthly basis and feed it to Power BI, making it available to stakeholders; the Excel ingestion is sketched below.
Tools and Technologies Used : Databricks, PySpark, ADLS Gen2, SQL, Logic Apps
File Format : Excel
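One plausible shape for the monthly Excel step, assuming pandas and openpyxl are available on the cluster; the mount path, sheet name, and target location are placeholders.

import pandas as pd
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Read the monthly workbook via pandas, then hand it to Spark.
pdf = pd.read_excel("/dbfs/mnt/raw/finance/2024-01.xlsx", sheet_name="Summary")
df = spark.createDataFrame(pdf)

# Append the month's figures to the curated table Power BI reads from.
(df.write.mode("append").format("delta")
   .save("abfss://curated@storageacct.dfs.core.windows.net/finance/monthly"))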
Collected and transformed Microsoft Teams Planner data from the API and fed it to Power BI for visualization; a sketch of the pull follows below.
Tools and Technologies Used : Databricks, ADLS Gen2, SQL, Logic Apps, ADF
File Format : JSON
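A sketch of the Planner pull, assuming the Microsoft Graph planner tasks endpoint and a token stored in a Databricks secret scope; the plan id, secret scope, and selected fields are illustrative.

import requests
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Token retrieval is environment-specific; a secret scope is assumed here.
token = dbutils.secrets.get("kv-scope", "graph-token")
resp = requests.get(
    "https://graph.microsoft.com/v1.0/planner/plans/PLAN_ID/tasks",
    headers={"Authorization": f"Bearer {token}"}, timeout=60)
resp.raise_for_status()

# Flatten the JSON task list into a DataFrame with an explicit schema.
tasks = spark.createDataFrame(
    [(t["id"], t["title"], t.get("percentComplete"), t.get("dueDateTime"))
     for t in resp.json()["value"]],
    "task_id string, title string, percent_complete int, due_date string")

tasks.write.mode("overwrite").format("delta").save(
    "abfss://curated@storageacct.dfs.core.windows.net/planner/tasks")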
Collected data from different service providers and performed ingestion and transformation over it using ADF and Databricks. Visualized the data in Power BI for stakeholders so they can analyze SLA agreement compliance during vendor meetings; the availability calculation is sketched below.
Tools and Technologies Used : Databricks, PySpark, ADLS Gen2, SQL, ADF
File Format : CSV
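An illustrative sketch of the provider-CSV step: read each vendor's outage feed, aggregate downtime by month, and compare availability against a contracted target. The file layout, column names, and the 99.9% threshold are assumptions.

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# One CSV per provider folder; outage_date and downtime_min are assumed columns.
df = (spark.read.option("header", "true")
      .csv("abfss://raw@storageacct.dfs.core.windows.net/providers/*/outages.csv")
      .withColumn("downtime_min", F.col("downtime_min").cast("double")))

# Monthly availability per provider, assuming a 30-day month for simplicity.
monthly = (df.groupBy("provider",
                      F.date_format("outage_date", "yyyy-MM").alias("month"))
           .agg(F.sum("downtime_min").alias("downtime_min"))
           .withColumn("availability_pct",
                       100 * (1 - F.col("downtime_min") / (30 * 24 * 60)))
           .withColumn("meets_sla", F.col("availability_pct") >= 99.9))

(monthly.write.mode("overwrite").format("delta")
    .save("abfss://reporting@storageacct.dfs.core.windows.net/providers/sla"))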