Nishchint Jagdale

Mumbai

Summary

Results-driven Data Engineer with over 6 years of experience in designing and implementing scalable data solutions across AI/ML and enterprise systems. Proficient in building data pipelines, MLOps frameworks, and cloud-based architectures (AWS, Azure, GCP) to enable AI-driven insights and automation. Expertise in handling large structured and unstructured datasets, integrating advanced NLP models (e.g., OpenAI GPT-4, Gemma 3 27B), and delivering end-to-end solutions for AI-powered chatbots, product classification systems, and business intelligence applications. Skilled in presales and GTM engagements, translating business requirements into technical solutions, and contributing to the successful acquisition of high-impact projects. Adept at leveraging tools like Python, PySpark, Terraform, and Power BI to drive data-driven decision-making and process optimization.

Overview

6 years of professional experience
1 Certification

Work History

Assistant Manager

KPMG
Mumbai
01.2024 - Current

Project: AI-Powered Grievance Redressal Chatbot for Department of Administrative Reforms and Public Grievances (DARPG), Government of India

Description:
Designed and implemented an AI-driven chatbot to streamline the grievance redressal process for citizens by intelligently routing queries to the correct department and subcategory. The chatbot system was built to handle over 500 categories, 15,000+ subcategories, and 91 departments. The solution utilized OpenAI’s GPT-based LLM, with experimentation using Gemma 3 27B, and achieved an accuracy of ~90% in category prediction. The system was deployed entirely on Microsoft Azure, leveraging modern DevOps and MLOps practices for scalable and secure delivery.

Tech Stack: Azure App Service, Azure Kubernetes Service (AKS), Azure DevOps, OpenAI API, Marqo (vector DB), Python, Terraform, Git, Gemma 3 27B (LLM)

Roles and Responsibilities:
1) Led data collection and preprocessing, handling 91 Excel-based departmental datasets; cleaned and standardized data while generating synthetic data to enhance LLM contextual understanding.
2) Designed and implemented the vector indexing pipeline using Marqo to enable fast and relevant semantic search over grievance data.
3) Contributed to prompt engineering and fine-tuning to optimize the chatbot’s performance for accurate routing.
4) Procured and configured cloud infrastructure on Azure, including AKS for backend microservices and Azure App Services for frontend hosting.
5) Set up CI/CD pipelines using Azure DevOps, enabling automated deployment and monitoring of services.
6) Collaborated with cross-functional teams for requirement gathering, stakeholder communication, and regular project updates.
7) Designed the chatbot flow and conversational UX, ensuring seamless interaction and correct category prediction through guided prompts, and led the integration of the chatbot with the CPGRAMS portal, enabling real-time grievance submission and category routing.
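
Illustrative sketch of the Marqo vector-indexing and semantic-search step described above (the endpoint, index name, and document fields are assumptions for illustration, not the actual DARPG configuration):

# Sketch: index grievance categories in Marqo and retrieve candidates for LLM routing.
# Index name, fields, and endpoint are illustrative assumptions.
import marqo

mq = marqo.Client(url="http://localhost:8882")  # assumed local Marqo instance

INDEX = "grievance-categories"  # hypothetical index name
mq.create_index(INDEX)

# Each document pairs a department/category with a short description used for
# semantic matching against the citizen's grievance text.
docs = [
    {"_id": "1", "department": "Department of Posts",
     "category": "Delivery delay", "description": "Parcel or letter not delivered on time"},
    {"_id": "2", "department": "Ministry of Railways",
     "category": "Ticket refund", "description": "Refund not received for a cancelled train ticket"},
]
mq.index(INDEX).add_documents(docs, tensor_fields=["description"])

# Shortlist the most relevant categories for a grievance before prompting the LLM.
results = mq.index(INDEX).search(q="I cancelled my train ticket but never got the refund", limit=3)
for hit in results["hits"]:
    print(hit["department"], "->", hit["category"], hit["_score"])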

Project: AI-Powered Automated Product Discovery and Classification System

Description: The solution involved multiple steps: scraping data from company websites, filtering and refining it, extracting product details, and finally classifying each product into predefined categories. Classification spanned seven tiers, 45 sub-sectors, and 1,045 classes. Key components of the solution included web scraping tools such as BeautifulSoup and Selenium, natural language processing (NLP) models powered by OpenAI GPT-4, and classification models fine-tuned on industry-specific data.

Roles and Responsibilities:

1) Preprocessed and filtered raw data to standardize formats and structures, enabling seamless integration with NLP and classification models.

2) Collaborated on leveraging OpenAI GPT-4 for product detail extraction and fine-tuned classification models using industry-specific data for accurate categorization.

3) Developed scalable workflows and optimized data pipelines to support automated product discovery and classification, ensuring efficiency and reliability of the AI-driven solution.
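
A minimal sketch of the GPT-4 extraction and classification step described above, assuming the OpenAI Python SDK; the prompt, category list, and JSON output contract are illustrative assumptions rather than the production configuration:

# Sketch: extract product details from scraped text and assign a category with GPT-4.
# The category list, prompt, and JSON contract are illustrative assumptions.
import json
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

CATEGORIES = ["Industrial Pumps", "Medical Devices", "Packaging Machinery"]  # placeholder subset

def classify_product(scraped_text: str) -> dict:
    """Extract the product name and description, then map it to one category."""
    response = client.chat.completions.create(
        model="gpt-4",
        temperature=0,
        messages=[
            {"role": "system",
             "content": ("Extract the product name and a one-line description from the text, "
                         f"then assign exactly one category from: {', '.join(CATEGORIES)}. "
                         "Respond as JSON with keys: name, description, category.")},
            {"role": "user", "content": scraped_text},
        ],
    )
    return json.loads(response.choices[0].message.content)

print(classify_product("Heavy-duty centrifugal pump for chemical processing plants, 5 HP motor."))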

Project: GTM & Pre-Sales

Roles & Responsibilities:

1) RFP Analysis & Proposal Development: Reviewed and analyzed client RFPs to understand project scope, estimated effort in terms of man-months and costs, and identified required tools and technologies to create comprehensive technical and financial proposals.

2) Client Engagement & Solutions Presentation: Presented case studies and similar successfully delivered projects to clients, showcasing technical expertise and alignment with their requirements to secure business opportunities.

3) Effort Estimation & Resource Planning: Collaborated with cross-functional teams to accurately estimate resources, timelines, and budgets for proposed solutions, ensuring feasibility and alignment with client expectations.

4) Business Impact: Contributed to securing five projects for KPMG by delivering tailored proposals and effectively communicating the value of AI-powered and data engineering solutions during presales discussions.

Some of the Engagements:

1) AI-Powered Chatbot for ONGC to answer queries received from Parliament

ONGC faces a significant volume of queries and information requests from Parliament. Responding to these queries efficiently and accurately is crucial to maintaining transparency and effective communication with stakeholders. Currently, this process involves manual retrieval and analysis of data, which can be resource-intensive and time-consuming. To streamline and enhance this process, KPMG proposed an advanced AI-based chatbot solution. This chatbot leverages cutting-edge natural language processing (NLP) and machine learning technologies to autonomously interpret and respond to parliamentary queries, ensuring prompt and accurate information dissemination.

2) AI-Powered Dashboard solution for the Prime Minister's Office (PMO) for monitoring welfare schemes

This powerful tool delivers actionable insights and visualizations, enabling efficient tracking and management of nationwide initiatives. By leveraging AI, the Insight Bot provides in-depth analysis and real-time updates, ensuring that every aspect of the Prime Minister’s welfare schemes is meticulously monitored and optimally managed.

3) AI Assistant for NITI Aayog

NITI Aayog seeks to enhance the scalability and user experience of its NITI for States platform by integrating a multimodal, multilingual conversational bot that uses the BHASHINI API for multilingual capabilities. The proposed solution will enable the platform to provide more intelligent, efficient, and user-friendly services to state governments.

Senior Data Engineer

Quantiphi
Mumbai
06.2021 - 01.2024

Client: Coke One North America

  • ETL and orchestration of end-to-end streaming data pipelines using Azure Data Factory.
  • Data ingestion in Databricks using PySpark and REST APIs (a simplified sketch follows this list).
  • MLOps using Azure DevOps.
  • Lakehouse management using Azure Databricks.
  • Data lake management using ADLS.
  • Visualization using Power BI.
  • Played a key role in successfully onboarding 6 distributors for this product.
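
A simplified sketch of the REST-API-to-Databricks ingestion pattern referenced in the list above; the endpoint, payload shape, and Delta table name are placeholders rather than client-specific details:

# Sketch: pull a batch of records from a REST API and land it as a Delta table.
# Endpoint, payload shape, and table name are hypothetical placeholders.
import requests
from pyspark.sql import SparkSession, Row

spark = SparkSession.builder.getOrCreate()  # provided automatically in Databricks notebooks

API_URL = "https://example.com/api/orders"  # hypothetical source endpoint

payload = requests.get(API_URL, timeout=30).json()
rows = [Row(**record) for record in payload["items"]]  # assumes a flat, uniform record shape

# Land the batch in the lakehouse for downstream transformation.
df = spark.createDataFrame(rows)
(df.write
   .format("delta")
   .mode("append")
   .saveAsTable("bronze.orders_raw"))  # hypothetical bronze-layer table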

Data Engineer

Quantiphi
Mumbai
06.2020 - 05.2021

Client: Philips

  • Data preprocessing using Excel VBA.
  • Process automation using Python to create a one-click solution.
  • Maintenance and development of optimal data pipeline architectures (ETL) in Visual Studio to transform raw data into the format required for various Excel dashboards.
  • Creation and implementation of database designs and data models.
  • Maintenance, design, and optimization of dashboards to meet business requirements.

Data Engineer

Quantiphi
Mumbai
07.2019 - 05.2020

Client: Merkle

  • Created an end-to-end pipeline for ingesting data from various data source APIs, such as Salesforce Marketing Cloud, LinkedIn, Facebook, GA360, DCM, and DMP, by making RESTful and SOAP requests.
  • Staged all the data and ingested it into BigQuery.
  • Performed analytics on the ingested data to generate leads.
  • Developed a lead scoring and delivery process.
  • Created an App Engine service that makes REST/SOAP calls to the source APIs, pulls the data, stores it in GCS, and pushes it to BigQuery (see the sketch after this list).
  • The entire process was implemented in Python.
  • A DLP mechanism was used for PII data protection, and concerns such as API pagination were handled.
  • Performed analysis on the data in BigQuery to derive actionable insights.
  • Performed ID resolution, i.e., identifying records across multiple data sources that relate to the same entity.
  • Lead scoring, i.e., assigning points to leads based on their activities during engagement.
  • Responsible for handling the UI authentication process and logs.
  • Provided the client with a dashboard built using Stackdriver logging tools.
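
A rough sketch of the API-to-GCS-to-BigQuery flow described in the list above; the source API, bucket, and table identifiers are hypothetical placeholders:

# Sketch: stage API records in GCS as newline-delimited JSON, then load into BigQuery.
# API URL, bucket, and table names are hypothetical placeholders.
import json
import requests
from google.cloud import bigquery, storage

API_URL = "https://example.com/marketing/leads"   # hypothetical source API
BUCKET = "marketing-staging-bucket"               # hypothetical GCS bucket
TABLE_ID = "my-project.marketing.leads_raw"       # hypothetical BigQuery table

# Pull records from the source API and stage them in GCS.
records = requests.get(API_URL, timeout=30).json()
ndjson = "\n".join(json.dumps(r) for r in records)
blob = storage.Client().bucket(BUCKET).blob("staging/leads.json")
blob.upload_from_string(ndjson, content_type="application/json")

# Load the staged file into BigQuery, letting it infer the schema.
bq = bigquery.Client()
job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.NEWLINE_DELIMITED_JSON,
    autodetect=True,
)
load_job = bq.load_table_from_uri(
    f"gs://{BUCKET}/staging/leads.json", TABLE_ID, job_config=job_config
)
load_job.result()  # block until the load job completes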

Education

B.E - Electronics and Telecommunications

Sardar Patel Institute of Technology
Mumbai
05.2019

Skills

Tools: Linux, Python, PySpark, Excel, SQL, LangChain, Flask, Power BI, Airflow, SQL Server, OpenAI, FastAPI

Azure - ADF, Databricks, Key Vault, ADLS, DevOps, App Service, Synapse

AI - Marqo, OpenAI, LangChain

GCP - BigQuery, Cloud SQL, App Engine, Cloud Functions, GCS, Compute Engine

Personal Information

Date of Birth: 20/02/1998

Certification

  • Google Associate Cloud Engineer, 12/01/2019
  • Microsoft AZ-900 – Azure Fundamentals, 12/01/2020
  • Microsoft DP-900 – Azure Data Fundamentals, 07/01/2021
  • Microsoft DP-203 – Data Engineer Associate, 12/01/2021
  • Microsoft AZ-104 – Azure Administrator, 03/01/2023

Languages

  • English – Advanced (C1)
  • Hindi – Advanced (C1)
  • Marathi – Advanced (C1)

Certification Statement

I certify that, to the best of my knowledge and belief, this CV correctly describes me, my qualifications, and my experience. I understand that any willful misstatement herein may lead to my disqualification or dismissal, if engaged. I also certify that I shall be available for the entire duration of the contract.

Accomplishments

  • Aryabhatta Award - KPMG, May 2024
  • Crackerjack - Quantiphi Newsletter, Feb 2023
  • Super Ninja Award - Quantiphi, H2 2022
