Alok kumar

Bengaluru, KA

Summary

Results-driven and self-motivated IT professional with 13 years of experience, including over a decade of hands-on expertise in developing AI-powered products. Known for exceptional time management, a strong user-centric mindset, and a proactive, solution-oriented approach. Adept at delivering high-impact solutions independently and collaboratively, with a proven track record of driving innovation through artificial intelligence.

Overview

13 years of professional experience

Work History

Data Scientist

WNS Global Services
10.2022 - Current

QBE Insurance - Binders Extraction

Developed an AI-driven solution leveraging the OpenAI API to automate the extraction and validation of key policy details from insurance binders. This initiative enhanced the accuracy and efficiency of data entry into pricing tools, reducing manual errors and supporting compliance in underwriting workflows.

  • Used Python to build an end-to-end pipeline for document extraction, validation, and logging.
  • Fetched binder PDF documents securely via SFTP and handled corrupted PDFs by converting them to images and applying Tesseract OCR for text extraction.
  • Leveraged the OpenAI API (GPT LLM) for semantic extraction and organization of key policy fields such as policy number, term, and coverage limits (see the sketch after this list).
  • Collaborated with underwriting and actuarial teams to define data validation rules and critical policy attributes.
  • Improved document processing efficiency and data accuracy, reducing manual intervention and turnaround time in underwriting workflows.
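
A minimal sketch of the extraction step described above. File paths, field names, and the model name are placeholder assumptions, not the production implementation; the real pipeline also covers SFTP transfer, validation rules, and logging.

import json
from pypdf import PdfReader
from pdf2image import convert_from_path
import pytesseract
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def read_binder_text(pdf_path: str) -> str:
    # Try native text extraction first; fall back to OCR for scanned or corrupted PDFs.
    try:
        text = "\n".join(page.extract_text() or "" for page in PdfReader(pdf_path).pages)
        if text.strip():
            return text
    except Exception:
        pass
    images = convert_from_path(pdf_path)  # rasterize pages for Tesseract
    return "\n".join(pytesseract.image_to_string(img) for img in images)

def extract_policy_fields(binder_text: str) -> dict:
    # Ask the LLM to return key policy attributes as a JSON object.
    prompt = (
        "Extract policy_number, policy_term and coverage_limits from the insurance "
        "binder text below. Respond with a JSON object only.\n\n" + binder_text
    )
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=[{"role": "user", "content": prompt}],
        response_format={"type": "json_object"},
    )
    return json.loads(response.choices[0].message.content)

fields = extract_policy_fields(read_binder_text("binder.pdf"))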


QBE Insurance - Pricing Models

Migrated the existing GLM-based pricing models, built in EMBLEM, to Python.

  • Used PySpark in Azure Databricks to prepare master data by bringing together data from different source systems.
  • Prepared a custom library to be reused as a package across projects.
  • Performed thorough EDA to understand the behaviors and patterns present in the data.
  • Binned the variables based on the factor values of the currently deployed production model.
  • Built the GLM using the statsmodels package (illustrative sketch after this list).
  • Used one-way and two-way lift charts to compare the performance of the existing EMBLEM model against the new Python-based model.
  • Exposed the model to the rater through an API by deploying it in Domino Data Lab.
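
A hedged sketch of the GLM refit and one-way lift comparison. The rating factors, claim-frequency target, Poisson family, and file path are illustrative assumptions rather than the actual QBE model specification.

import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Master data prepared upstream in Azure Databricks with PySpark (path is a placeholder)
df = pd.read_parquet("pricing_master.parquet")

# Frequency GLM on pre-binned rating factors, mirroring the deployed EMBLEM factors
glm = smf.glm(
    formula="claim_count ~ C(vehicle_age_band) + C(driver_age_band) + C(region_band)",
    data=df,
    family=sm.families.Poisson(),
    offset=np.log(df["exposure"]),
).fit()

df["pred_python"] = glm.predict(df, offset=np.log(df["exposure"]))

def one_way_lift(frame: pd.DataFrame, pred_col: str, actual_col: str = "claim_count", buckets: int = 10) -> pd.DataFrame:
    # Average actual vs. predicted by prediction decile; run once per model to compare them.
    frame = frame.assign(bucket=pd.qcut(frame[pred_col], buckets, duplicates="drop"))
    return frame.groupby("bucket")[[actual_col, pred_col]].mean()

print(one_way_lift(df, "pred_python"))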


Grange Insurance - Claims Subrogation Model

Automated the process of identifying recovery opportunities by integrating the subrogation model into the claims processing workflow.

Problem statement: In the current process, recovery opportunities are identified at a later stage of the claim life cycle. Recovery claims may be missed along the way, potentially leading to leakage.

  • Prepared master data containing claim-level information by joining data from different source systems.
  • Performed thorough EDA to understand the behavior and patterns of each variable.
  • Utilized the claim description captured at FNOL for text mining.
  • Iteratively evaluated multiple algorithms to select the model that performed best on the available data (see the sketch after this list).
  • Deployed the selected model in Amazon SageMaker to make it available to upstream and downstream systems.
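
An illustrative sketch of the subrogation scoring approach: TF-IDF features from the FNOL claim description combined with structured claim fields in a scikit-learn pipeline. The column names and the logistic regression classifier are placeholder assumptions; several algorithms were compared before the final model was chosen.

import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import OneHotEncoder

claims = pd.read_parquet("claims_master.parquet")  # placeholder path
X = claims[["fnol_description", "loss_type", "claim_amount"]]
y = claims["recovery_flag"]  # 1 if the claim had a recovery opportunity

# Text mining on the FNOL description plus encoded structured fields
features = ColumnTransformer([
    ("text", TfidfVectorizer(max_features=5000, stop_words="english"), "fnol_description"),
    ("cat", OneHotEncoder(handle_unknown="ignore"), ["loss_type"]),
], remainder="passthrough")

model = Pipeline([("features", features), ("clf", LogisticRegression(max_iter=1000))])

X_train, X_test, y_train, y_test = train_test_split(X, y, stratify=y, random_state=42)
model.fit(X_train, y_train)
print("AUC:", roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]))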

Data Scientist

Tata Consultancy Services
07.2015 - 09.2022

Worked as part of the automation team, responsible for writing Python and machine learning scripts to automate processes that were previously performed manually.


Credit Suisse - eFX Group

Automated the process of reading emails from a shared Outlook inbox and creating incidents (INCs) in ServiceNow based on Incident Type and Configuration Item.

  • Converted the received sample email data into a pandas DataFrame.
  • Extracted features and targets from the DataFrame.
  • Cleaned the text features using the NLTK library and regular expressions, then vectorized them for model training.
  • Created two deep learning models: one to predict the Incident Type and the other to predict the Configuration Item (sketched after this list).
  • Created a Python script to filter emails that qualify for INC creation and run them against the models for prediction.
  • Created a Python script to create incidents based on the model predictions.
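
A simplified sketch of one of the email classifiers (the Incident Type model; the Configuration Item model follows the same pattern). The column names, layer sizes, and Keras architecture are assumptions for illustration only.

import re
import pandas as pd
from nltk.corpus import stopwords  # requires nltk.download("stopwords") once
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.preprocessing import LabelEncoder
from tensorflow import keras

emails = pd.read_csv("sample_emails.csv")  # placeholder: columns body, incident_type, config_item
stop_words = set(stopwords.words("english"))

def clean(text: str) -> str:
    # Lowercase, strip non-alphabetic characters, and drop stop words
    text = re.sub(r"[^a-z ]", " ", text.lower())
    return " ".join(word for word in text.split() if word not in stop_words)

vectorizer = TfidfVectorizer(max_features=3000)
X = vectorizer.fit_transform(emails["body"].map(clean)).toarray()

encoder = LabelEncoder()
y = encoder.fit_transform(emails["incident_type"])  # train a second model on config_item

model = keras.Sequential([
    keras.layers.Input(shape=(X.shape[1],)),
    keras.layers.Dense(128, activation="relu"),
    keras.layers.Dense(len(encoder.classes_), activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy", metrics=["accuracy"])
model.fit(X, y, epochs=10, validation_split=0.2)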

Software Developer

HCL Technologies
08.2012 - 06.2015

Commonwealth Bank of Australia - MDS

CBA's MDS (Mandatory Data Standards) enables the transfer of superannuation funds of employers and employees from one bank to another.

  • Involved in requirement analysis of the functionality.
  • Responsible for coding core functionality and contributed to developing all the components of MDS.
  • Built DAAs and deployed them in the ActiveMatrix environment using ANT scripts.
  • Responsible for unit testing the components using SoapUI.
  • Involved in code review of peers to ensure code quality.
  • Coordinated with the onsite architect for design/domain-related clarifications.
  • Involved in providing integration support to the onsite team.

Education

Bachelor of Engineering - Computer Science and Engineering

Annamalai University
Tamil Nadu
08-2012

Skills

  • Python
  • Data Analysis
  • Machine Learning
  • Generative AI
  • LLM
  • Deep Learning
  • Statistics
  • AWS (S3, SageMaker)
  • Domino Data Lab
  • Azure Databricks

Achievement

Winner of TCS Global MFDM - AI Knockdown Hackathon held in August 2019.
