Ayaskanta Ratha

Big Data Architect & Engineering Manager

Summary

A professional who builds data pipelines using big data, programming, and cloud technologies.

Overview

14 years of professional experience

Work History

Senior Manager, Big Data Architect (IC)

Prudential PLC
09.2021 - Current
    • Mentor engineers and design and build complex data pipelines in Azure and GCP cloud environments.
    • Used AWS for the Asset Master project.
    • Created and enhanced an envelope encryption library in Java that supports multi-cloud encryption, introducing envelope encryption at Prudential to protect PII data.
    • Created a Java framework to ingest data from Kafka into BigQuery raw and structured layers, with features such as data flattening, encryption, and schema-evolution handling.
    • Coded extensively in Python and Scala to build a data lake in Azure, then migrated it to GCP a few years later.
    • Used Dataflow, Dataproc, Cloud Functions, and Composer extensively in GCP.
    • Designed and created multiple Jenkins pipelines to maintain Azure and GCP components for the Campaign Master project.
    • Helped run applications in GKE and coordinated the dockerization of multiple applications at Prudential.
    • Currently learning and working in MLOps, building AI-integrated pipelines using generative and responsible AI.
    • Tech stack: Scala, Java, Python, GCP, Azure, Jenkins, Kubernetes, Docker, Databricks, dbt.

Senior Big Data Architect and Manager

Standard Chartered Global Business Services
09.2016 - 09.2019
    • Migrated a legacy multithreaded Java system to Spark and Scala.
    • Coded extensively in Scala, Java, and Python.
    • Built multiple report pipelines end-to-end, including DevOps activities and metrics.
    • Designed all big data pipelines from scratch, mentored BAs in Singapore on designing new data pipelines, and grew the data ingestion and reporting team from 3 to 12.
    • Processed gigabytes of data every day.
    • Worked on both real-time and batch pipelines using Java and Scala; later used Azure extensively to migrate from on-premises to the cloud.
    • Tech stack: Java, Scala, Python, Hadoop, Hive, Jenkins, Azure, Spark, Kafka.

Senior Software Engineer 3 (IC)

Walmart Labs
09.2016 - 09.2019
    • Processed and maintained data pipelines handling millions of events every day.
    • Worked with Spark and Java on both real-time and batch workloads.
    • Worked extensively on ingesting real-time events into the data lake.
    • Used Storm to process event data from Kafka in real time and push it back into Kafka as a sink.
    • Used Flume to push customer PII data from Kafka into an encrypted HDFS zone built using Apache Atlas.
    • Built a Java validation framework that compares data between legacy Teradata POS systems and the HDFS data lake.
    • Created a Java data quality (DQ) framework that converts user statements written in YAML into Spark SQL, runs them against the regional data lake to compute quality metrics, and displays the results on a Grafana dashboard.
    • Used the Teradata connector and Sqoop to push data into CTH from various data sources.
    • Tech stack: Hadoop, Hive, Storm, Flink, Java, Scala, Python, Jenkins, Cassandra, Flume, ELK.

Senior Software Engineer

Tesco
02.2015 - 09.2016
    • Performed ETL on various data sources using Teradata, Java, and Informatica.
    • Worked on various data patterns, both streaming and batch, using Spark and Java.
    • Involved in data modelling and building an effective data warehouse for the RDF (retail demand forecasting) project.
    • Optimized various data pipelines and Teradata SQL queries.
    • Led the creation of a blueprint for the RDF forecasting project during a month-long engagement in the UK.
    • Involved in migrating projects from Teradata to Spark 1.6 and Hive.
    • Entered the cloud space with Azure at Tesco.
    • Tech stack: Spark, Teradata, Hive, HDFS, Informatica, Oozie, Pig, Java, Scala.

System Engineer

TCS
09.2010 - 02.2015
    • Worked extensively on data processing.
    • Worked predominantly in SQL, PL/SQL, and Teradata.
    • Built ETL pipelines using Informatica.
    • Worked in Java to report various metrics of the data pipelines.
    • Optimized and automated many long-running ETL reporting jobs using Java/SQL, reducing the effort required of support teams.

Education

Bachelor of Technology, Applied Electronics and Instrumentation

National Institute of Science And Technology
Odisha
04.2001 -

Skills

  • Java

Accomplishments

    • Best team award at Tesco for the RDF project.
    • Bravo award at Walmart Labs for best individual performance in a quarter.
    • TCS Gems award for creating a reusable knowledge base.
    • Best team award in Prime Services at Standard Chartered for 2020.
