Yogesh Garg

Faridabad

Summary

Seasoned professional with 10 years of experience, including 5 years of specialized expertise in designing, implementing, and managing data solutions in the Microsoft Azure ecosystem, leveraging cutting-edge Big Data technologies. Seeking an opportunity to bring my skills and knowledge to a dynamic team where I can foster innovation, collaborate across functions, and drive impactful data projects forward.

Overview

13 years of professional experience
1 Certificate

Work History

Group Manager - Data Engineering

WNS Global
Gurugram
09.2022 - Current

Project: WREF

Client: Pharma Client

Project description: We oversee the WREF data integration platform, which powers 45+ dashboards for reporting needs. Our role involves collecting, cleansing, and transforming data from diverse sources to meet specific business requirements. Leveraging Azure data services, we maintain and manage this integration platform, ensuring seamless availability of refined data to our Power BI team for reporting purposes.

Tools and technologies used: Databricks, PySpark, ADF, ADLS Gen2, REST API, Logic Apps, SQL

Roles & Responsibilities:

  • Provide a single data integration platform for various dashboard reporting needs.
  • Managed migration of the database from SQL to the Databricks Lakehouse.
  • Developed ETL processes using Azure Data Factory, Logic Apps, Databricks, and the data lakehouse to ingest, transform, and load data from various sources (see the sketch below).
  • Manage Azure data services and the Databricks platform: creating new clusters and cluster pools, attaching existing clusters to pools, and managing workflows and pipeline scheduling in Databricks.
  • Collaborated with the product owner and cross-functional teams to understand business requirements and translate them into technical solutions.
  • Manage a team of 10 data engineers.
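
An illustrative PySpark sketch of the kind of ingestion step this pipeline performs on Databricks: read raw source files from ADLS, apply basic cleansing, and append the result to a Delta table consumed by Power BI. The storage path, table name, and record_id column are hypothetical placeholders, not the actual WREF objects.

    # Read raw files from ADLS, apply basic cleansing, and append to a Delta table.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("wref-ingest-sketch").getOrCreate()

    raw_path = "abfss://raw@examplestorage.dfs.core.windows.net/source_system/"  # hypothetical path
    curated_table = "curated.example_dataset"                                    # hypothetical table

    df = (
        spark.read.option("header", "true").csv(raw_path)
        .dropDuplicates()
        .withColumn("load_date", F.current_date())
        .filter(F.col("record_id").isNotNull())  # assumes a record_id column in the source
    )

    df.write.format("delta").mode("append").saveAsTable(curated_table)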

Technical Lead

Colt Technology Services
Gurugram
04.2019 - 09.2022

Project: NOC operation data management for Colt

Project description: Our project involves daily extraction of ticketing data from the Siebel ticketing tool using APIs, along with network logs sourced from various vendor-specific network management systems. This data is systematically stored in Azure Data Lake Storage (ADLS). Our primary objective is to process this data in alignment with business requirements, facilitating the generation of essential reports such as SLA reports, outage reports, and assessments of device performance across different vendors. Additionally, we compile network availability reports to ensure comprehensive insight into network operations.

Tools and technologies used: Databricks, PySpark, ADF, ADLS Gen2, Spark SQL, SQL

Roles & Responsibilities:

  • Developed data pipelines using ADF, Azure Databricks, and PySpark for data engineering tasks, including data ingestion, transformation, and loading.
  • Managed business rules, standardization, and reference data, ensuring clean and up-to-date data for downstream consumption.
  • Handled requirements from scratch: requirement gathering, solution design, providing ETAs, and the complete course of development and documentation.
  • Coordinated with various NOC teams and support groups to provide continuous service to business users and to implement system enhancements.
  • Created complex SQL queries to extract data from multiple tables, schemas, and databases with the required transformations, conditions, and rule sets (see the sketch below).
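
An illustrative Spark SQL query of the kind used for the SLA reporting described above: join ticket data against an SLA reference and flag breaches. The tickets and sla_reference tables and their columns are hypothetical placeholders, not Colt's actual schema.

    # Join tickets against SLA targets and flag breaches over the last 30 days.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("sla-report-sketch").getOrCreate()

    sla_report = spark.sql("""
        SELECT t.ticket_id,
               t.vendor,
               t.opened_ts,
               t.resolved_ts,
               s.sla_hours,
               CASE WHEN (unix_timestamp(t.resolved_ts) - unix_timestamp(t.opened_ts)) / 3600.0 > s.sla_hours
                    THEN 'BREACHED' ELSE 'MET' END AS sla_status
        FROM   tickets t                 -- hypothetical table
        JOIN   sla_reference s           -- hypothetical table
          ON   t.priority = s.priority
        WHERE  t.opened_ts >= date_sub(current_date(), 30)
    """)

    sla_report.write.format("delta").mode("overwrite").saveAsTable("reporting.sla_last_30_days")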

Senior Testing Engineer

ECI Telecom India Pvt. Ltd.
Navi Mumbai
06.2018 - 01.2019
  • Performed manual testing on the LightSoft NMS application.
  • Reviewed technical design documents to ensure all requirements were met.
  • Identified performance bottlenecks and worked collaboratively with support and infrastructure teams on resolution.

Senior Engineer

Tejas Networks
Gurugram
02.2016 - 04.2018
  • Performed testing for the ULDC project and verified the solution design.
  • Created test scripts in Python to automate feature testing; optimized test cases to maximize the effectiveness of manual software testing.
  • Developed new automation methods for existing processes, analyzed complex problems, and formulated creative solutions.

Engineer

Ciena India Pvt. Ltd. (on FDS payroll)
Gurugram
02.2015 - 01.2016
  • Developed an MPLS-TP-based EPT network for Bharti Airtel using Ciena Carrier Ethernet Switch (CES) devices 5160 and 8700 and the 6200 access device (supporting Ethernet and SDH).
  • Implemented software upgrades of network elements as per customer requirements and migrated traffic from other networks to the Ciena network.
  • Interacted with customers to troubleshoot and repair complex circuits and hardware faults.

Engineer

Bharti Airtel (on TDS payroll)
Manesar
12.2013 - 02.2015
  • Validated performance of various L2 services such as E-LAN, E-Line, and E-Tree on different devices in the lab.
  • Validated performance and specification functionality of LTS, GTS, and GX devices before every new release.

Engineer

Alcatel-Lucent (on STCS payroll)
Manesar
08.2011 - 03.2012
  • Validated performance of various L2 services such as E-LAN, E-Line, and E-Tree on different devices in the lab.
  • Worked closely with the vendor team to investigate test failures and issues logged during testing and reported defects to the vendor team.

Education

MBA - Production Management

Maharshi Dayanand University
Rohtak
01.2014

B.Tech - Electronics And Communications

Haryana College of Technology and Management
Kaithal
07.2010

Senior Secondary - Non Medical

Tagore Academy Public School
Faridabad
03.2006

Higher Secondary

Vishwa Bharti Shiksha Kendra
Ballabgarh
03.2004

Skills

  • Azure Services: Azure Data Factory, Azure Databricks, Azure SQL Database, Azure Data Lake Storage, Azure Logic Apps, Power Platform, Azure DevOps (ADO)
  • Programming Languages: Python, SQL, PySpark
  • Version Control: Git
  • Project and ticket tracking tool: Jira

Certification

  • Microsoft Certified: Azure Data Engineer Associate - Microsoft
    Issued Jun 2022 - Expires Jun 2023
  • Academy Accreditation - Databricks Lakehouse Fundamentals - Databricks
    Issued Sep 2023 - Expires Sep 2024 - 82717737

Languages

Hindi
First Language
English
Advanced (C1)

Timeline

Group Manager - Data Engineering

WNS Global
09.2022 - Current

Technical Lead

Colt Technology Services
04.2019 - 09.2022

Senior Testing Engineer

ECI Telecom India Pvt. Ltd.
06.2018 - 01.2019

Senior Engineer

Tejas Networks
02.2016 - 04.2018

Engineer

Ciena India Pvt. Ltd. (on FDS payroll)
02.2015 - 01.2016

Engineer

Bharti Airtel (on TDS payroll)
12.2013 - 02.2015

Engineer

Alcatel-Lucent (on STCS payroll)
08.2011 - 03.2012

MBA - Production Management

Maharshi Dayanand University

B.Tech - Electronics And Communications

Haryana College of Technology and Management

Senior Secondary - Non Medical

Tagore Academy Public School

Higher Secondary

Vishwa Bharti Shiksha Kendra

Project - Global Attendance Dashboard

Developed a project to analyze office space requirements and enable smart offices using data on employees coming to the office and connecting to office Wi-Fi. The business uses this data to determine daily office space needs, parking requirements, room bookings, and facility usage.

Tools and technologies used: Databricks, PySpark, ADF, ADLS Gen2, Cisco API, Delta tables

Load type: Incremental, File format: JSON, Data size: 1 TB
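
A minimal sketch of the incremental JSON load described above, assuming a Databricks runtime with Delta Lake, new files landing daily in ADLS, and an event_ts field on each record; the path, table, and column names are placeholders rather than the actual dashboard schema.

    # Incrementally merge newly landed JSON events into a Delta table.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F
    from delta.tables import DeltaTable

    spark = SparkSession.builder.appName("attendance-incremental-sketch").getOrCreate()

    source_path = "abfss://raw@examplestorage.dfs.core.windows.net/wifi_events/"  # hypothetical path
    target_table = "curated.attendance_events"                                    # hypothetical table

    incoming = (
        spark.read.json(source_path)
        .withColumn("event_date", F.to_date("event_ts"))  # assumes an event_ts field in the JSON
    )

    if spark.catalog.tableExists(target_table):
        # Insert only records not already present (employee_id + event_ts as the key).
        (
            DeltaTable.forName(spark, target_table).alias("t")
            .merge(incoming.alias("s"), "t.employee_id = s.employee_id AND t.event_ts = s.event_ts")
            .whenNotMatchedInsertAll()
            .execute()
        )
    else:
        incoming.write.format("delta").saveAsTable(target_table)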

Project - Finance Dashboard

Developed a project to ingest and transform finance data received on a monthly basis and feed it to Power BI, making it available to stakeholders.

Tools and technologies used: Databricks, PySpark, ADLS Gen2, SQL, Logic Apps

File format: Excel
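
A minimal sketch of the monthly Excel ingestion described above, assuming the workbook is reachable from the cluster (for example via mounted storage) and that pandas with openpyxl is available; the file path, sheet name, and target table are placeholders.

    # Read the monthly workbook with pandas, normalize headers, and append to Delta.
    import pandas as pd
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("finance-excel-sketch").getOrCreate()

    pdf = pd.read_excel("/dbfs/mnt/finance/2024-01-actuals.xlsx", sheet_name="Summary")  # hypothetical file
    pdf.columns = [str(c).strip().lower().replace(" ", "_") for c in pdf.columns]        # clean column names

    df = spark.createDataFrame(pdf)
    df.write.format("delta").mode("append").saveAsTable("curated.finance_monthly")       # hypothetical table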

Project - MST Planner Data

Collected and transformed Microsoft Teams Planner data from the API and fed it to Power BI for visualization.

Tools and technologies used: Databricks, ADLS Gen2, SQL, Logic Apps, ADF

File format: JSON
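
A minimal sketch of pulling Planner tasks over the Microsoft Graph REST API and landing them as a Delta table for Power BI. The plan ID and access token are placeholders, the target table name is hypothetical, and in a production pipeline the token would typically come from an Azure AD service principal.

    # Pull Planner tasks from Microsoft Graph and land them as a Delta table.
    import requests
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("planner-ingest-sketch").getOrCreate()

    plan_id = "<plan-id>"            # placeholder
    access_token = "<access-token>"  # placeholder; obtained via Azure AD in practice

    resp = requests.get(
        f"https://graph.microsoft.com/v1.0/planner/plans/{plan_id}/tasks",
        headers={"Authorization": f"Bearer {access_token}"},
        timeout=30,
    )
    resp.raise_for_status()
    tasks = resp.json().get("value", [])

    # Keep a few scalar fields per task for the sketch.
    rows = [
        {"task_id": t.get("id"), "title": t.get("title"), "percent_complete": t.get("percentComplete")}
        for t in tasks
    ]

    if rows:
        df = spark.createDataFrame(rows)  # one row per Planner task
        df.write.format("delta").mode("overwrite").saveAsTable("curated.planner_tasks")  # hypothetical table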

Project - Vendor Utilization

Collected data from different service providers and performed data ingestion and transformation using ADF and Databricks. Visualized the data in Power BI for stakeholders so they can analyze SLA conditions during vendor meetings.

Tools and technologies used: Databricks, PySpark, ADLS Gen2, SQL, ADF

File format: CSV
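
A minimal sketch of the CSV ingestion and a simple per-vendor aggregation of the kind that could feed the Power BI SLA view; the storage path, column names (vendor, report_month, downtime_minutes), and target table are assumptions, not the actual vendor schemas.

    # Read all vendor CSV extracts, standardize types, and aggregate downtime per vendor.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("vendor-utilization-sketch").getOrCreate()

    raw = (
        spark.read.option("header", "true")
        .csv("abfss://raw@examplestorage.dfs.core.windows.net/vendor_reports/*.csv")  # hypothetical path
    )

    utilization = (
        raw.withColumn("downtime_minutes", F.col("downtime_minutes").cast("double"))  # assumed column
        .groupBy("vendor", "report_month")                                            # assumed columns
        .agg(F.sum("downtime_minutes").alias("total_downtime_minutes"))
    )

    utilization.write.format("delta").mode("overwrite").saveAsTable("reporting.vendor_utilization")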

Professional Summary

  • Proven proficiency in data engineering, ETL processes, data warehousing, and analytics, ensuring robust and scalable solutions.
  • Responsible for the end-to-end lifecycle of data pipeline development, including Azure infrastructure management for data services: design, coding, documentation, post-implementation support, validation, and rigorous review.
  • Orchestrated the execution of ETL pipelines, processing over 50 TB of data daily from diverse sources.
  • Consistent track record of meeting stringent deadlines, with client appreciation for delivering high-quality solutions.
  • Good interpersonal skills, put to use in coordinating with team members and delivering optimal solutions, with a keen focus on addressing performance-related challenges.
  • Collaborated with cross-functional teams to understand business requirements and translate them into technical solutions.
  • Manage a team of 10 data engineers and drive the recruitment of new personnel, including interviewing and onboarding.