ABHISHEK HARSH

Senior Data Engineer - Google Cloud
Bangalore

Summary

Innovative and enthusiastic software development professional with a hardworking, conscientious approach. Experienced in designing cloud architectures and building data and IoT platforms.

Excellent communicator with the ability to meet deadlines and quickly resolve issues.

Overview

  • 6 years of professional experience
  • 3 Certifications

Work History

Senior Data Engineer - Google Cloud

66degrees - Google Cloud Partner
01.2023 - Current

ViewLift Cloud Modernization (Sports Streaming Platform):

  • Designed the GCP architecture for ViewLift's data platform modernization, moving their live sports streaming platform's batch jobs, databases, and data warehousing solutions from AWS to GCP.
  • Migrated AWS DynamoDB data to MongoDB Atlas (deployed on GCP) across Dev and Prod environments.
  • Built Python scripts for the NoSQL database migration from AWS DynamoDB to MongoDB Atlas on GCP.
  • Migrated relational data from AWS RDS to GCP Cloud SQL.
  • Built a Change Data Capture (CDC) script for MongoDB Atlas with respect to AWS DynamoDB, adopted by the company as a general solution for similar migrations.
  • Migrated data from AWS Redshift to GCP BigQuery and built out its architecture per best practices.
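The CDC approach above can be sketched in miniature. The snippet below is a simplified, hypothetical model (the real pipeline read from DynamoDB and wrote to MongoDB Atlas): it replays DynamoDB Streams-style change events onto a target store, represented here as a plain dict.

```python
# Minimal CDC sketch: apply DynamoDB-style change events to a target
# document store. Hypothetical stand-in for a DynamoDB -> MongoDB Atlas
# replication job; the target is an in-memory dict here.

def apply_change_events(target, events):
    """Replay a batch of change events onto the target store.

    Each event mimics a DynamoDB Streams record:
      {"eventName": "INSERT" | "MODIFY" | "REMOVE",
       "key": <primary key>, "new_image": <document or None>}
    """
    for event in events:
        name = event["eventName"]
        key = event["key"]
        if name in ("INSERT", "MODIFY"):
            target[key] = event["new_image"]   # upsert the latest image
        elif name == "REMOVE":
            target.pop(key, None)              # idempotent delete
    return target
```

Replaying events in stream order keeps the target converged on the source's latest state, which is what makes such a script reusable across similar migrations.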

Express Inc (America's Fashion Retailer):

  • Solved the data validation problem for data migrated from Teradata to GCP BigQuery.
  • Performed manual validation of 700 tables based on Python job-level differences between Teradata and BigQuery.
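A table-level validation like the one described can be sketched as follows. This is a simplified, hypothetical version: the actual jobs ran against Teradata and BigQuery, whereas here each table is a list of row dicts and the check compares row counts and order-insensitive per-column checksums.

```python
import hashlib

def table_fingerprint(rows, columns):
    """Row count plus an order-insensitive checksum per column."""
    checksums = {}
    for col in columns:
        digest = 0
        for row in rows:
            h = hashlib.sha256(repr(row.get(col)).encode()).hexdigest()
            digest ^= int(h[:16], 16)   # XOR so row order doesn't matter
        checksums[col] = digest
    return len(rows), checksums

def validate_migration(source_rows, target_rows, columns):
    """Return a list of human-readable mismatches (empty = tables match)."""
    src_count, src_sums = table_fingerprint(source_rows, columns)
    tgt_count, tgt_sums = table_fingerprint(target_rows, columns)
    issues = []
    if src_count != tgt_count:
        issues.append(f"row count: {src_count} vs {tgt_count}")
    for col in columns:
        if src_sums[col] != tgt_sums[col]:
            issues.append(f"checksum mismatch in column {col}")
    return issues
```

XOR-ing per-row hashes makes the fingerprint insensitive to row ordering, which matters because the two warehouses return rows in different orders.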

MedAllies (Healthcare):

  • Designed the solution architecture to migrate an on-premise MySQL server to Cloud SQL on GCP.
  • Designed the solution architecture to migrate from Cloud SQL to BigQuery for client reporting through Looker dashboards.

AutoZone 7 Apps Migration (Automotive Parts Retailer):

  • Modernized 7 core AutoZone applications, performing production-level migration from an on-premises Oracle database to GCP Cloud SQL.
  • Enabled these migrations by designing and developing automation scripts such as data validation and data migration assessment tools.

Senior Data Engineer

HashedIn by Deloitte
10.2021 - 12.2022

Vanguard Retail Stock Market Modernization (United States):

  • Verified requirements, defined architecture, developed implementation plans, and provided thought leadership for all data solutions, designing solutions that meet and exceed customer expectations.
  • Architected and built CI/CD pipelines using Bamboo and AWS CloudFormation.
  • Built data pipelines for seamless orchestration of batch processing using data-driven AWS services, adhering to best practices for reliability.
  • Implemented transformations and validations for large daily client data loads using Spark with Python.
  • Guided data engineering groups/pods across Deloitte and Vanguard on data strategy and architecture on AWS.
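The transform-and-validate pattern in the batch pipelines above can be sketched as follows. This is a hypothetical, simplified stand-in for the Spark jobs: a "pipeline" here is just an ordered list of transform functions applied to a batch of records, followed by a validation gate that rejects bad rows before load.

```python
def run_batch(records, transforms, validators):
    """Apply transforms in order, then split records into valid/rejected."""
    for transform in transforms:
        records = [transform(r) for r in records]
    valid, rejected = [], []
    for record in records:
        if all(check(record) for check in validators):
            valid.append(record)
        else:
            rejected.append(record)   # quarantined for inspection, not loaded
    return valid, rejected
```

For example, a normalization transform plus a price sanity check (both hypothetical) would split a batch into loadable rows and quarantined rows:

```python
normalize = lambda r: {**r, "symbol": r["symbol"].upper()}
has_price = lambda r: r.get("price") is not None and r["price"] > 0

valid, rejected = run_batch(
    [{"symbol": "voo", "price": 412.5}, {"symbol": "bnd", "price": None}],
    transforms=[normalize],
    validators=[has_price],
)
```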

Senior Data Engineer

Huron Consulting Group
11.2020 - 09.2021
  • Worked with the Corporate IT team to analyze complex data and identify anomalies, trends, and risks, providing insights that improved internal controls.
  • Performed data engineering over Workday, Salesforce, and public healthcare data to analyze, process, and cleanse data across multiple stages.
  • Automated real-time data pipeline tasks in Python with the help of AWS services.
  • Contributed to internal initiatives for process improvement, efficiency, and innovation.

Software Development Engineer

Accenture Digital
08.2018 - 10.2020

Nippon Logistics (IoT platform and Cloud Infrastructure) - Japan

  • Developed an end-to-end solution for Nippon's logistics businesses worldwide, making every shipment of high-value medicines IoT-enabled to capture crucial real-time data on each shipment's/pallet's characteristics, such as temperature, jerk, and shock.
  • Worked with AWS IoT, AWS Lambda, AWS DynamoDB, and other AWS services alongside Azure cloud platform components to process real-time IoT data from Intel's ICLP devices tagged on Nippon shipments; built AWS infrastructure for real-time data synchronization with the Accenture Visibility portal.
  • Developed REST APIs with the Spring Boot framework for Accenture's dashboard in collaboration with Nippon Express and other logistics partners.
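The real-time shipment monitoring above can be sketched as a Lambda-style handler. This is a hypothetical, simplified version: the real events came from Intel ICLP devices via AWS IoT, and the thresholds below are illustrative, not production values.

```python
# Illustrative thresholds only, not the production values.
TEMPERATURE_RANGE_C = (2.0, 8.0)   # typical cold-chain range for medicines
SHOCK_LIMIT_G = 5.0

def handle_telemetry(event):
    """Flag out-of-range readings from a shipment telemetry event.

    `event` mimics an AWS IoT message payload:
      {"pallet_id": ..., "temperature_c": ..., "shock_g": ...}
    Returns a list of alert strings (empty = shipment nominal).
    """
    alerts = []
    low, high = TEMPERATURE_RANGE_C
    temp = event["temperature_c"]
    if not (low <= temp <= high):
        alerts.append(f"pallet {event['pallet_id']}: temperature {temp} C out of range")
    if event["shock_g"] > SHOCK_LIMIT_G:
        alerts.append(f"pallet {event['pallet_id']}: shock {event['shock_g']} g exceeds limit")
    return alerts
```

In the real system a handler like this would publish alerts onward (e.g. to the visibility portal) rather than return them, but the per-event threshold check is the core of the pattern.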

Education

Bachelor of Engineering - Information Science

BNM Institute Of Technology
Bangalore, Karnataka, India
08.2014 - 07.2018

Senior Secondary - Science Education

St Andrews School
Agra, Uttar Pradesh, India
01.2012 - 03.2013

Skills

  • Solutions architecture and development on Google Cloud Platform and AWS
  • Batch processing - Apache Hadoop & Apache Spark
  • Relational databases - MySQL / PostgreSQL / Aurora / Cloud SQL
  • NoSQL databases - DynamoDB / MongoDB
  • Python, PySpark, REST APIs, SQL
  • Data warehousing (BigQuery and Redshift), data analysis, data modelling & design
  • Jira, Bamboo, Git, Azure DevOps, Bitbucket, PyCharm, Anaconda, DbViewer, Jupyter Notebook, CI/CD
  • IaC - AWS CloudFormation, Terraform (basic knowledge), Google Cloud CLI

Certifications

  • GCP - Professional Cloud Database Engineer - Credential Id: 92562993 (01-2024)
  • GCP - Professional Data Engineer - Credential Id: 92191597 (01-2024)
  • AWS Certified Developer - Credential Id: LBTSZRN1HNF4QV5W (08-2020)