Mikhil Dinesh Vartak

Pune, Mumbai

Summary

  • Experienced developer and technical lead with 14 years of experience building applications to specific requirements; adaptable, diligent, and knowledgeable in big data technologies.
  • Developed and implemented big data solutions on Databricks and Hadoop platforms using Spark (Scala) and PySpark.
  • Developed data pipelines that process large-scale JSON datasets with PySpark in Databricks notebooks on Azure (a representative sketch follows this list).
  • Good understanding of Databricks concepts such as Delta tables, versioning, and time travel.
  • Led the migration from an on-premise Hadoop platform to Databricks on Azure.
  • Well versed in Agile methodology; participates in daily scrum meetings and demo ceremonies.
  • Worked with managed tables, external tables, dynamic partitions, static partitions, and bucketing.
  • Hands-on with complex Spark DataFrame transformations such as map and flatMap.
  • Enhanced data pipeline security and compliance by managing secrets with Azure Key Vault.
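
A minimal PySpark sketch of the kind of pipeline described above, as it might run in a Databricks notebook. The secret scope, storage account, paths, and table names are hypothetical placeholders, not the project's actual values.

    # Databricks notebook sketch: ingest a daily JSON drop into a Delta table.
    # All names below (secret scope, storage account, paths, table) are hypothetical.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.getOrCreate()  # pre-created in Databricks notebooks

    # Read a storage credential from an Azure Key Vault-backed secret scope.
    # (dbutils is provided by the Databricks notebook runtime.)
    storage_key = dbutils.secrets.get(scope="kv-backed-scope", key="storage-account-key")
    spark.conf.set("fs.azure.account.key.mystorageacct.dfs.core.windows.net", storage_key)

    # Load the day's raw JSON files and stamp each record.
    raw = (spark.read
           .json("abfss://landing@mystorageacct.dfs.core.windows.net/events/2024-01-01/")
           .withColumn("ingested_at", F.current_timestamp())
           .withColumn("event_date", F.to_date("ingested_at")))

    # Append the increment to a partitioned Delta table.
    (raw.write
        .format("delta")
        .mode("append")
        .partitionBy("event_date")
        .saveAsTable("bronze.events"))

    # Delta versioning enables time travel over earlier table states:
    # spark.sql("DESCRIBE HISTORY bronze.events")
    # spark.sql("SELECT * FROM bronze.events VERSION AS OF 0")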

Overview

14 years of professional experience

Work History

Developer, Tech Lead

Amdocs
10.2018 - Current

Client: AT&T

Role: Developer, Tech Lead

Project: Streaming Data Platform (Azure migration)

  • Technologies: Azure, Databricks, Delta tables, Apache Kafka
  • Guided the team in designing and implementing cloud-based data architecture, optimizing performance, and ensuring data integrity
  • Created Spark jobs in Databricks to load daily incremental data into Delta tables
  • Performed data loading, extraction, and transformation using the medallion architecture
  • Implemented REST API calls to invoke Databricks Spark jobs (see the sketch after this list)
  • Conducted thorough testing of the new environment to ensure everything worked as expected, and optimized the configuration for performance, cost, and reliability
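
For illustration, a minimal Python sketch of triggering a Databricks job run through the Jobs 2.1 REST API. The workspace URL, token handling, and job ID are hypothetical placeholders.

    # Minimal sketch: trigger a Databricks job run via the Jobs 2.1 REST API.
    # Workspace URL, token source, and job_id are hypothetical placeholders.
    import os
    import requests

    WORKSPACE_URL = "https://adb-1234567890123456.7.azuredatabricks.net"  # hypothetical
    TOKEN = os.environ["DATABRICKS_TOKEN"]  # personal access token from the environment

    resp = requests.post(
        f"{WORKSPACE_URL}/api/2.1/jobs/run-now",
        headers={"Authorization": f"Bearer {TOKEN}"},
        json={"job_id": 42},  # ID of the Spark job defined in the workspace
        timeout=30,
    )
    resp.raise_for_status()
    print("Started run:", resp.json()["run_id"])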


Project: Streaming Data Platform (on-premise)

  • Technologies: Spark SQL, Spark (Scala), Hadoop cluster, shell scripts, Python, Airflow, Kafka, Flume


  • Summary:
  • The SDP project processes compensation-related data for AT&T dealers and distributors
  • Payments are generated based on reports produced by SDP
  • Designed and implemented a real-time incremental data pipeline to process high-volume streaming JSON data (~4 million records per hour) from 14 Kafka topics using Apache Flume and Spark (Scala), as sketched below
  • Automated the ELT (extract, load, transform) process using Airflow
  • Moved services to Docker containers for high availability (99% uptime).
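
A rough PySpark Structured Streaming analogue of the ingest described above. The production pipeline used Apache Flume and Spark (Scala); the brokers, topics, schema, and paths here are hypothetical placeholders, and the Kafka source requires the spark-sql-kafka connector on the cluster.

    # Rough PySpark analogue of the SDP streaming ingest (real pipeline: Flume + Spark-Scala).
    # Brokers, topics, schema, and paths are hypothetical placeholders.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F
    from pyspark.sql.types import StructType, StructField, StringType, LongType

    spark = SparkSession.builder.appName("sdp-stream-sketch").getOrCreate()

    schema = StructType([
        StructField("dealer_id", StringType()),
        StructField("amount", LongType()),
        StructField("event_ts", StringType()),
    ])

    # Subscribe to multiple Kafka topics in one stream (14 topics in production).
    events = (spark.readStream
              .format("kafka")
              .option("kafka.bootstrap.servers", "broker1:9092,broker2:9092")
              .option("subscribe", "topic-a,topic-b")
              .load()
              .select(F.from_json(F.col("value").cast("string"), schema).alias("e"))
              .select("e.*"))

    # Append each micro-batch to HDFS as Parquet, with checkpointing for recovery.
    query = (events.writeStream
             .format("parquet")
             .option("path", "hdfs:///data/sdp/events")
             .option("checkpointLocation", "hdfs:///checkpoints/sdp/events")
             .start())
    query.awaitTermination()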

Senior Software Engineer

ATTRA INFOTECH
02.2018 - 09.2018
  • Project: MI Issuing
  • Client: MasterCard
  • Team size: 8 (Agile team)

Technical Analyst

FISERV
05.2015 - 01.2018
  • Project: EPOC (Enterprise Platform for Online Commerce)
  • Team size: 35
  • EPOC is a transaction-processing software product developed by Fiserv
  • The EPOC host interface application communicates with hosts, processors, and networks using various electronic funds transfer message formats, including ISO 8583
  • EPOC handles transactions for ATM, POS, mobile payments, and internet channels
  • Developed a Spark (Scala) based process to create monthly reports from daily transactions (sketched below).
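
A minimal sketch of a monthly report rolled up from daily transaction data. The original process was written in Spark (Scala); this PySpark version uses hypothetical column and path names.

    # Sketch: roll daily transactions up into a monthly report.
    # Column names and HDFS paths are hypothetical placeholders.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("epoc-monthly-report-sketch").getOrCreate()

    daily = spark.read.parquet("hdfs:///data/epoc/daily_transactions")

    monthly = (daily
               .withColumn("month", F.date_trunc("month", F.col("txn_date")))
               .groupBy("month", "channel")  # e.g. ATM, POS, mobile, internet
               .agg(F.count("*").alias("txn_count"),
                    F.sum("amount").alias("total_amount")))

    monthly.write.mode("overwrite").parquet("hdfs:///reports/epoc/monthly")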

Developer, Analyst

ATOS WORLDLINE INDIA PRIVATE LIMITED
02.2012 - 04.2015
  • Project: MAGNUS Payment Switch (high-performance credit/debit transaction management system)
  • Technologies: C++, UNIX, SQL, DB2, Sybase, multithreading
  • Summary:
  • MAGNUS is an EFT (electronic funds transfer) acquiring and issuing switch for point-of-sale terminals and IPG (Internet Payment Gateway) transactions
  • TCP/IP sockets, UNIX message queues, and signaling mechanisms are used to process multiple transaction requests simultaneously
  • It is used to acquire, authenticate, route, switch, and authorize financial transactions across multiple channels
  • Accepts cards from all major associations, including Visa, MasterCard, American Express, and RuPay

Application Engineer

Cognition Solutions Pvt Ltd, Kotak Securities Ltd
04.2010 - 01.2012

BO Server:

  • Team size: 20
  • Summary:
  • This is a dialog-based application designed to run various back-office processes
  • Developed using a TCP/IP client-server architecture
  • The BO Server (the Back Office Server application) is responsible for all back-office activities and processing.

Education

MSc - IT

University of Mumbai
2009

Bachelor of Science - IT

University of Mumbai
06.2007

Skills

Technical Skill Set

  • Languages: Python, Spark (Scala), PySpark
  • Scripts: Shell scripts, Databricks notebooks
  • Technologies: UNIX, Hadoop, HDFS, Kafka, Apache Flume, Azure Databricks, Azure Data Factory
  • Databases: Hive, Oracle, MySQL, Databricks Delta tables
  • Tools: Git, AgileCraft, Ambari, Grafana, Docker, Airflow
  • Certifications: Databricks Certified Data Engineer Associate

Personal profile

Father's Name: Dinesh Atmaram Vartak
Sex: Male
Marital Status: Married
Nationality: Indian

Hobbies: Playing cricket, chess, trekking
