
Nikhil U Rao

Bengaluru

Summary

Experienced software engineer with a demonstrated history of working in the product industry. Proficient in functional and reactive programming paradigms. Engaged in diverse projects involving the development and enhancement of data-intensive applications, platforms, and Software as a Service (SaaS) products in hybrid cloud environments. Skilled in Java 8, microservices architecture, big data technologies, distributed systems, and databases. Highly motivated to solve complex technical and business challenges by leveraging distributed systems.

Overview

7
years of professional experience

Work History

Senior Software Engineer, Data Reliability Cloud

Acceldata
03.2023 - Current
  • Redesigned the Analysis microservice (a containerized Dataplane service) into a distributed, stateless architecture with replicas, making it highly available for data-processing requests, using Redis and Javalin
  • Developed a system to compute data-quality metrics such as completeness, skewness, and correlation on underlying data assets using Amazon's Deequ library, and performed constraint verification by means of user-defined data validation rules using Kotlin, Java, and Spark
  • Developed high-quality, scalable RESTful APIs using Kotlin, Ktor, and PostgreSQL
  • Designed and developed cadence and anomaly detection on data assets using the Prophet library
  • Designed and developed a scalable, reliable distributed job scheduler (5k job schedules per second, over 2k callbacks made every day) responsible for workflow management with a Temporal cluster and worker processes
  • The workflow service provides strong guarantees of durable execution of application code in the face of transient or intermittent failures and crashes by maintaining an append-only event history log
  • Using the workflow service instead of the Quartz job scheduler reduced workflow failures by 18%
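The durable-execution idea behind the workflow service can be sketched as follows. This is an illustrative toy, not the Temporal SDK: each completed step is recorded in an append-only event history, and a retry after a crash replays that history so finished steps return their logged results instead of re-executing.

```python
# Sketch of durable execution via an append-only event history.
# Assumption: step results are deterministic and serializable; a real
# system would persist the history, not keep it in memory.

class DurableWorkflow:
    def __init__(self):
        self.history = []  # append-only log of (step_name, result)

    def run_step(self, name, fn):
        # Replay: if this step already completed, return the logged result.
        for step_name, logged_result in self.history:
            if step_name == name:
                return logged_result
        result = fn()
        self.history.append((name, result))  # log before moving on
        return result

calls = []  # tracks actual side-effecting executions

def workflow(wf):
    a = wf.run_step("fetch", lambda: calls.append("fetch") or 10)
    b = wf.run_step("transform", lambda: calls.append("transform") or a * 2)
    return b

wf = DurableWorkflow()
workflow(wf)           # first run executes both steps
result = workflow(wf)  # simulated retry: replays history, no re-execution
```

Because the second run replays from the history, `calls` still contains only one execution per step, which is the property that mitigates duplicate work on transient failures.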

Senior Software Engineer, Kelsa Core

Target Corporation
12.2021 - 12.2022
  • Built a platform offering core pipeline development capabilities at scale, leveraging OSS CDAP (Cask Data Application Platform) within Target's data ecosystem
  • Integrated CDAP with Target's on-premise data platform (Hadoop 3) and in-house cluster-management and application-platform software
  • Refactored and improved CDAP's metadata management for Elasticsearch 8.0, increasing indexing efficiency and adding document versioning by implementing optimistic concurrency control
  • Developed custom Java-based Hydrator plugins for data cleansing and analytics using Spark APIs
  • Participated in code reviews and query fine-tuning
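The optimistic concurrency control used for document versioning can be illustrated with a minimal sketch (plain Python, not the Elasticsearch client, which exposes the same idea via `if_seq_no`/`if_primary_term`): an update succeeds only if the caller's expected version matches the stored version, so concurrent writers cannot silently overwrite each other.

```python
# Sketch of optimistic concurrency control with per-document versions.
# DocumentStore and VersionConflict are illustrative names, not a real API.

class VersionConflict(Exception):
    pass

class DocumentStore:
    def __init__(self):
        self.docs = {}  # doc_id -> (version, body)

    def get(self, doc_id):
        return self.docs[doc_id]

    def put(self, doc_id, body, expected_version=None):
        current = self.docs.get(doc_id)
        current_version = current[0] if current else 0
        if expected_version is not None and expected_version != current_version:
            # Another writer updated the document since we read it.
            raise VersionConflict(
                f"expected v{expected_version}, found v{current_version}")
        self.docs[doc_id] = (current_version + 1, body)
        return current_version + 1

store = DocumentStore()
store.put("asset-1", {"rows": 100})                       # creates v1
v, _ = store.get("asset-1")
store.put("asset-1", {"rows": 150}, expected_version=v)   # v matches: v2

try:
    # A second writer still holding v1 loses the race and must re-read.
    store.put("asset-1", {"rows": 90}, expected_version=v)
    conflict = False
except VersionConflict:
    conflict = True
```

The losing writer gets an explicit conflict instead of clobbering the newer document, which is what makes the versioning safe under concurrent indexing.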

Software Development Engineer, Data & ML Platform

247.ai
10.2020 - 12.2021
  • Built a distributed, event-driven stream-processing framework capable of handling real-time data, such as user-agent interaction and engagement data along with core metrics, using Apache Beam's Java API for internal business applications, client self-serve platforms, and other mission-critical products

Analyst, Analytics & Cognitive

Deloitte
10.2017 - 09.2020
  • Developed a scalable (3k TPS, 100 concurrent) distributed data-factory ingestion framework that automatically inserts data from relational data sources into an ADLS data lake based on source data patterns and practices
  • Designed and developed a streaming data ingestion and transformation pipeline with the Apache Spark Streaming API and Apache Kafka to extract metrics, stats, and KPIs, which were pushed to Elasticsearch for text-based search and analysis
  • Implemented object-level dependency in the existing data-intensive application, reducing job execution time by up to 20%
  • Operationalized analytics for end users via purpose-built, multi-dimensional data stores and marts on Hive and PostgreSQL, and exposed views for consumption of processed metrics
  • Involved in system integration and stress/load testing of pipelines for the entire data life cycle
  • Deployed, orchestrated, and scheduled production pipelines with Apache Airflow and Oozie
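The windowed-aggregation step of such a streaming pipeline can be sketched in plain Python (illustrative only, not Spark Streaming): events are grouped into fixed time windows and per-window KPIs are computed, in a form ready to be pushed to a search index. The event shape and KPI names below are assumptions for the example.

```python
# Sketch of fixed-window KPI aggregation over a stream of events.
# Each event is (timestamp_seconds, latency_ms); a real pipeline would
# consume these from Kafka and emit results continuously.

from collections import defaultdict

def window_kpis(events, window_seconds=60):
    """Group events into fixed windows and compute per-window
    count and average latency."""
    windows = defaultdict(list)
    for ts, latency in events:
        window_start = ts - ts % window_seconds  # align to window boundary
        windows[window_start].append(latency)
    return {
        start: {"count": len(vals), "avg_latency_ms": sum(vals) / len(vals)}
        for start, vals in windows.items()
    }

events = [(0, 100), (10, 200), (65, 300)]
kpis = window_kpis(events)  # two windows: [0, 60) and [60, 120)
```

Spark Streaming performs the same grouping per micro-batch across a cluster; the sketch only shows the per-window computation.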

Education

Professional Diploma - Digital Transformation Big Data with Apache Hadoop

NIIT
Bengaluru, Karnataka
09.2017

Bachelor of Engineering - Telecommunication

Visvesvaraya Technological University
Bengaluru, Karnataka
12.2016

Skills

  • Java
  • Python
  • Scala
  • Kotlin
  • Bash
  • MySQL
  • Apache Hive
  • MongoDB
  • PostgreSQL
  • Cassandra
  • Clickhouse
  • Apache Hadoop
  • Spark
  • Beam
  • Flink
  • Kafka
  • NiFi
  • Zookeeper
  • Sqoop
  • Flume
  • Apache Airflow
  • Oozie
  • Git
  • Maven
  • JSON
  • Gradle
  • JUnit
  • Docker
  • Kubernetes
  • Redis
  • MS Azure
  • GCP
  • AWS
