
Vikas Kumar

GCP Data Engineer / BI Engineer / Developer - 13 Years' Experience (Financial Services)
Gurgaon

Summary

GCP Data Engineer / BI Developer / Software Developer, experienced in evaluating business needs to support key business goals. 12 years of hands-on experience in business intelligence, advanced analytics, and development. Proficient in natural language processing and open-source technologies (Python, Linux, Java, C#, SQL) for high-performance analytics that enhance business results and decision-making.

Overview

15 years of professional experience
7 years of post-secondary education

Work History

Manager - Google Cloud

Capgemini
05.2022 - Current

Currently working with HSBC Bank (Client)


Project - CS Data BackBone

Duration - Apr-2024 To Nov-2024

Team size - Managing a team of 5 people.


GCP components used - Dataflow, BigQuery, Python, Terraform, Data Fusion, PostgreSQL, Cloud Composer/Airflow


I developed both event and batch pipelines:


Events pipelines - There were two different sources (Zone and Client) arriving as streams from the source system. While consuming these events we had to look up reference data (some already ingested as batch loads, some ingested by our own batch pipelines) before writing to the final data asset table. We created around 5-7 pipelines that consumed data from different Pub/Sub subscriptions and, after enriching the events, published them to different Pub/Sub topics.
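A minimal sketch of this enrichment flow as a streaming Dataflow (Apache Beam) job, assuming hypothetical project, subscription, topic, and field names; the real pipelines looked up reference data in the PostgreSQL reference store rather than an in-memory dict:

import json
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

class EnrichEvent(beam.DoFn):
    """Attach reference attributes to each incoming event before republishing."""
    def __init__(self, reference_data):
        self.reference_data = reference_data  # e.g. zone_id -> zone details

    def process(self, message):
        event = json.loads(message.decode("utf-8"))
        event["zone_details"] = self.reference_data.get(event.get("zone_id"), {})
        yield json.dumps(event).encode("utf-8")

def run(reference_data):
    options = PipelineOptions(streaming=True)
    with beam.Pipeline(options=options) as p:
        (p
         | "ReadEvents" >> beam.io.ReadFromPubSub(
               subscription="projects/my-project/subscriptions/zone-events-sub")
         | "Enrich" >> beam.ParDo(EnrichEvent(reference_data))
         | "Publish" >> beam.io.WriteToPubSub(
               topic="projects/my-project/topics/enriched-events"))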


Batch pipelines - Some vendors sent data as CSV files, which we ingested into BigQuery using Cloud Dataflow. Other pipelines (reference data for the event flows) received data from various systems (BigQuery, files, SQL) and stored it in the reference store (PostgreSQL) for lookups.
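A minimal sketch of the CSV-to-BigQuery batch path with Apache Beam / Dataflow, assuming hypothetical bucket, dataset, table, and column names:

import csv
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

def parse_line(line):
    """Turn one CSV line into a BigQuery row dict; column names are illustrative."""
    vendor_id, amount, booking_date = next(csv.reader([line]))
    return {"vendor_id": vendor_id,
            "amount": float(amount),
            "booking_date": booking_date}

def run():
    with beam.Pipeline(options=PipelineOptions()) as p:
        (p
         | "ReadCSV" >> beam.io.ReadFromText(
               "gs://my-bucket/vendor-files/*.csv", skip_header_lines=1)
         | "Parse" >> beam.Map(parse_line)
         | "WriteToBQ" >> beam.io.WriteToBigQuery(
               "my-project:backbone.vendor_data",
               schema="vendor_id:STRING,amount:FLOAT,booking_date:DATE",
               write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
               create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED))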



Project - Electronic Data Processing

Duration - May-2022 To Mar-2024


My role here is to build pipelines on Google Cloud, leading a team of 6 people. We process data from the RDL (Raw Data Layer) to the CDL (Confirmed Data Layer) for business use cases. We first gather requirements from the business in the form of DIS/DRS documents, which describe how the team wants the data and which logic should be applied while processing it into the CDL. We then create models and analyze the data for specific use cases.

We also monitor the pipeline schedules to make sure everything is running correctly.

We use BigQuery as the data warehouse, Python for scripting, Jenkins for CI/CD, and Control-M for scheduling the processes.
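As an illustration of one RDL-to-CDL step, a minimal sketch using the BigQuery Python client; the dataset, table, and column names are hypothetical and the real DIS/DRS logic is richer than this:

from google.cloud import bigquery

client = bigquery.Client()

# Hypothetical RDL -> CDL transformation; names are illustrative only.
query = """
CREATE OR REPLACE TABLE cdl.customer_accounts AS
SELECT
  account_id,
  UPPER(TRIM(customer_name)) AS customer_name,
  SAFE_CAST(balance AS NUMERIC) AS balance,
  CURRENT_DATE() AS load_date
FROM rdl.customer_accounts_raw
WHERE record_status = 'ACTIVE'
"""
client.query(query).result()  # wait for the transformation job to finish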


Sr. Tableau Developer / Data Engineer

Team Computers Pvt. Ltd.
04.2019 - 05.2022

Worked in the NBFC domain; the client was SBI Cards.

Played a "Business Analyst" role here, interacting with the client to gather requirements and provide end-to-end solutions from data preparation to the final dashboard.

After gathering requirements, we planned how to source data from the various systems and built data pipelines / ETL processes so that any BI tool could read the data efficiently.

  • Managed a team of 5 people and distributed work accordingly.
  • Managed the Tableau application/server and access control; involved in patching, backup/restore, installation, drills, and support activities.
  • Addressed ad hoc analytics requests and facilitated data acquisition to support internal projects, special projects, and investigations.
  • Developed, implemented, supported, and maintained data analytics protocols, standards, and documentation.

Projects/Achievements

  • Profitability / P&L process and dashboard - Covers all the metrics needed to calculate profit and loss for the organization, such as revenue, expenses, COF, write-offs, provisions, and various ratios. It takes data from many sources (RDM data mart, flat files, ODS) and processes it into summarized final tables so that Tableau can handle the data easily.
  • CEO / Digital dashboard - A single dashboard covering almost 60% of the key metrics (spends, payments, delinquency, sourcing, NEA, etc.) in one view. It processes data daily and shows daily and monthly figures.
  • Authorization metrics - Shows transactions/spends made by customers along with forecast values to catch any unwanted dip in the numbers. It refreshes hourly and is shared over email with the SMT.
  • Chargeback dashboard - Monitors customer disputes at various levels: network, transaction type (payment/spend), transaction mode (offline/online), top merchants, daily inflow, TAT distribution, and many more metrics.

-

NBFC domain; the client is Manappuram.

Designed and developed a data warehousing solution on BigQuery. The client uses a standard SQL-based system, and due to size limitations and query performance (the SQL schema is in normalized form) we keep the analytical data in BigQuery.

-

Food industry; the client is Bikano.

The client uses third-party DMS (Distributor Management System) software that relied on a traditional FTP approach to sync data from client nodes (distributors) to the head office / primary system. The problems with this setup were inconsistency, file corruption, and partially read/written files.

We replaced this with GCP Pub/Sub: client nodes publish data to GCP, and the head-office application reads it and pushes it into the database. The sketch below shows the distributor-side publisher.
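A minimal sketch of the publisher side, assuming hypothetical project, topic, and field names; the real DMS integration uses its own naming and payload format:

import json
from google.cloud import pubsub_v1

# Hypothetical project and topic names.
publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path("my-project", "dms-secondary-sales")

def push_sales_rows(distributor_id, rows):
    """Publish a distributor's sales records instead of dropping a file on FTP."""
    for row in rows:
        data = json.dumps(row).encode("utf-8")
        future = publisher.publish(topic_path, data,
                                   distributor_id=str(distributor_id))
        future.result()  # block until Pub/Sub acknowledges the message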


Sr System Analyst

Indus Software Technologies Pvt. Ltd.
11.2016 - 04.2019

Domain - NBFC

Skills used - Java, SQL, Python, Tableau

I worked as a data engineer. My role was to build data pipelines / ETL, manage the existing processes and workload, and build Tableau dashboards.

I developed a new multithreaded solution (in Java) to replace the existing system, which relied on sequential processing to maintain the daily business process; the sketch below illustrates the idea.
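The original solution was written in Java; for consistency with the other sketches in this document, the idea (replacing a sequential daily run with a worker pool) is shown here in Python, with hypothetical task and function names:

from concurrent.futures import ThreadPoolExecutor, as_completed

def process_account(account_id):
    """Placeholder for one unit of the daily business process."""
    # ... fetch, transform, and write back this account's data ...
    return account_id

def run_daily_process(account_ids, workers=8):
    # Previously the accounts were processed one by one; a worker pool
    # processes them concurrently and surfaces failures per account.
    results, failures = [], []
    with ThreadPoolExecutor(max_workers=workers) as pool:
        futures = {pool.submit(process_account, a): a for a in account_ids}
        for future in as_completed(futures):
            try:
                results.append(future.result())
            except Exception:
                failures.append(futures[future])
    return results, failures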

Sr. Dot Net Developer

Eazy Business Solutions Pvt. Ltd.
02.2013 - 11.2016

Domain - ERP

  • ERP

I worked at this company as a Sr. .NET Developer.

I worked on and developed the ERP system (Finance, Purchase, and Sales modules).

My role was to modify existing modules and develop new ones as per client requirements.

  • Tally Integration/ Busy Integration

I developed a module through which any transaction created in the ERP is pushed directly into Tally or Busy (both leading accounting software packages).

  • DMS (Distributor Management System)

Developed this system from scratch.

This software tracks customers' secondary sales and pushes and receives data to and from Tally/Busy via FTP transfer.

Sr Developer

Triangle Solutions
01.2010 - 02.2013

I started my career here as a VB6 Developer, building software with VB6 and SQL Server 2008. I interacted with clients on new requirements and enhancements and delivered end-to-end solutions.

List of projects done

1. Accounting Software

2. Academic Software

3. Restaurant/Hotel Management Software

4. Export Management Software

5. ERP

Education

Master of Science - IT

MD University
Haryana
04.2008 - 04.2010

Master of Arts -

MJP Rohilkhand University
Bareilly
04.2004 - 04.2006

Bachelor of Arts -

MJP Rohilkhand University
Bareilly
04.2001 - 04.2004

Skills

Project Management / Analytical skills

BigQuery / Bigtable / Pub/Sub / Dataflow / PostgreSQL

SQL / Oracle / HQL (Hive Query Language)

ETL / Data Migration / Dell Boomi

Python / JavaScript / jQuery

Tableau / Power BI / Looker

C# / ASP.NET / MVC.NET / Core Java

Timeline

Manager - Google Cloud

Capgemini
05.2022 - Current

Sr. Tableau Developer / Data Engineer

Team Computers Pvt. Ltd.
04.2019 - 05.2022

Sr System Analyst

Indus Software Technologies Pvt. Ltd.
11.2016 - 04.2019

Sr. Dot Net Developer

Eazy Business Solutions Pvt. Ltd.
02.2013 - 11.2016

Sr Developer

Triangle Solutions
01.2010 - 02.2013

Master of Science - IT

MD University
04.2008 - 04.2010

Master of Arts -

MJP Rohilkhand University
04.2004 - 04.2006

Bachelor of Arts -

MJP Rohilkhand University
04.2001 - 04.2004