Akhilesh Singh

Lead - Technology
New Delhi, DL

Summary

Experienced technology lead with 12+ years in designing and building data solutions, dedicated to creating innovative processes. Expert in developing workflows and frameworks to streamline operations. Skilled in migrating on-premises data to AWS and GCP for optimal performance. Strong focus on Object-Oriented Programming for high-quality solutions. Committed to continuous learning and professional growth.

Overview

12 years of professional experience
5 certifications

Work History

Lead - Technology

IRIS Software, Inc.
04.2022 - Current
  • Migrated on-prem data to the AWS cloud, applying expertise in data warehousing and cloud computing.
  • Playing a key role in improving the existing Data Quality (DQ) framework, keeping data accurate and reliable.
  • Developed DQ Power BI reports tailored to business needs, enabling users to quickly identify data-quality issues and act on them.
  • Migrated the on-prem DaaS to the AWS cloud, driving significant improvements in data processing speed, scalability, and elasticity.
  • Resolve workflow blockers so the team can use its time efficiently and effectively.
  • Contribute to code reviews to keep code quality high.
  • Regularly conduct Python training sessions for beginner and intermediate learners.
  • Write company blog posts on data warehousing, cloud computing, and related topics.

Senior BI Developer

QL2 Software India Pvt Ltd.
05.2017 - 04.2022
  • Built a streaming service for Expedia crawling data, ensuring the data was processed accurately and efficiently.
  • As an individual contributor, played a key role in building the Opti-Price back end, using database-management knowledge to optimize performance and ensure scalability.
  • Developed a text-matching service for car-rental listings using the Jaro-Winkler algorithm (see the sketch after this list).
  • Designed and developed an end-to-end ETL process using AWS services and Snowflake.
  • Created a data-sharing pipeline between two Snowflake accounts, driving significant improvements in data processing speed and efficiency.
  • Regularly participated in monthly sprint presentations/demos.
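
For illustration, a minimal sketch of Jaro-Winkler fuzzy matching, assuming the jellyfish library; the threshold and sample strings are hypothetical, not the production service.

# Minimal sketch of Jaro-Winkler fuzzy matching, assuming the jellyfish
# library (>= 0.8). The threshold and sample strings are illustrative.
import jellyfish

def best_match(query, candidates, threshold=0.85):
    # Score every candidate against the query and keep the best one.
    scored = [(c, jellyfish.jaro_winkler_similarity(query.lower(), c.lower()))
              for c in candidates]
    best, score = max(scored, key=lambda pair: pair[1])
    return best if score >= threshold else None

# Tolerates misspellings in crawled listings:
print(best_match("Toyota Corolla LE", ["Toyota Corola", "Honda Civic", "Full-Size SUV"]))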

CoE IT Engineer

QSix India Pvt Ltd. C/O Motorola India
03.2013 - 05.2017
  • Developed a Struts website that integrates Tableau reports via the Tableau JavaScript API.
  • Developed Tableau reports with extract and incremental refresh.
  • Developed various Tableau reports, such as Gray Market and regional reports.
  • Wrote complex custom SQL to prepare datasets for the reports.
  • Automated SAP report extraction using a VB macro.
  • Proficient in Microsoft Excel.
  • Set up regular calls with users to understand their requirements, data sources, complexity, and data flow.

Education

Bachelor of Commerce - Commerce

School of Open Learning Delhi University
04.2001 -

Skills

Additional Information

1. SQL to Redshift Migration using AWS DMS & Glue Orchestration, Iris Software, Inc.
This project involved end-to-end migration of an on-premises SQL-based data warehouse to Amazon Redshift. The primary objective was to modernize the data platform by leveraging AWS cloud-native services to improve scalability, performance, and reliability.

Responsibilities & Key Highlights:

  • Solely designed and developed the entire ETL orchestration framework in Python/PySpark, following robust Object-Oriented Programming (OOP) principles for flexibility, reusability, and modular design.
  • Designed and implemented a multi-layered ETL architecture consisting of Raw (Bronze), Consumed (Silver), and Engineered (Gold) zones, using Amazon S3 as the landing zone and staging area.
  • Utilized AWS DMS (Database Migration Service) to perform Change Data Capture (CDC) replication from the source SQL database to S3 in Parquet format, ensuring minimal downtime during cutover.
  • Built scalable AWS Glue jobs to process and transform data between the layers (a minimal sketch follows the tech stack below):
      • Bronze → Silver: applied basic cleansing, validation, and partitioning.
      • Silver → Gold: applied business logic, schema harmonization, and column-level transformations.
  • Loaded curated data into Amazon Redshift for downstream analytics and dashboarding with Power BI and QuickSight.
  • Orchestrated the entire workflow using AWS Step Functions, providing visibility and failover control across:
      • DMS task triggers
      • Glue job sequencing
      • S3 data checks and success/failure logging
  • Implemented reusable scripts and parameterized configurations to support new pipelines with minimal code changes.
  • Enabled role-based access to S3 and Redshift using IAM policies, Lake Formation tags, and column-level permissions.
  • Maintained documentation and conducted knowledge-sharing sessions with team members.

Tech Stack:
AWS DMS, Glue, Redshift, Step Functions, S3, IAM, Lake Formation, PySpark, Python, SQL Server
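
For illustration, a minimal PySpark sketch of the Bronze → Silver step described above; bucket paths, column names, and the partition key are hypothetical placeholders, and a production Glue job may use GlueContext/DynamicFrames rather than plain Spark.

# Minimal Bronze -> Silver Glue job sketch (plain PySpark for brevity;
# a real Glue job may use GlueContext/DynamicFrames). Paths, columns, and
# the partition key are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("bronze_to_silver").getOrCreate()

# Read the raw CDC Parquet files landed by DMS in the Bronze (raw) zone.
bronze = spark.read.parquet("s3://example-lake/bronze/orders/")

silver = (
    bronze
    .dropDuplicates(["order_id"])                     # basic cleansing
    .filter(F.col("order_id").isNotNull())            # validation
    .withColumn("order_date", F.to_date("order_ts"))  # derive partition key
)

# Partitioned write into the Silver (consumed) zone.
(silver.write.mode("overwrite")
       .partitionBy("order_date")
       .parquet("s3://example-lake/silver/orders/"))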


2. DaaS Server: On-Prem to AWS Migration, Iris Software, Inc.

The DaaS server is developed in Java and the DaaS client in C#. Users request data from the server through the client, and the server returns a Kafka topic from which the user consumes the data. The client implements both sync and async APIs.

The server exposes 100+ APIs. It needed to be deployed on AWS, with the client APIs converted to AWS API Gateway. To avoid disrupting the user experience, I integrated the APIs with API Gateway gradually by creating a proxy as a Lambda service. The proxy checks whether a request targets an API already migrated to AWS or one still on-prem, routes it accordingly, and returns the response to the user (a minimal sketch of this routing appears after the responsibilities below).

Responsibilities:

● Built AWS API Gateway endpoints for 50+ APIs.

● Updated the C# client to send an additional api_name parameter.

● Created a proxy as an AWS Lambda that redirects each request to the on-prem or AWS DaaS server.

● Tested all the APIs.
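
For illustration, a minimal sketch of the proxy's routing logic, assuming the list of migrated APIs lives in an environment variable; the endpoint URLs and variable names are placeholders, not the actual implementation.

# Minimal sketch of the Lambda proxy routing (illustrative only).
# MIGRATED_APIS, AWS_API_BASE, and ONPREM_BASE are hypothetical settings.
import json
import os
import urllib.request

MIGRATED = {name for name in os.environ.get("MIGRATED_APIS", "").split(",") if name}
AWS_BASE = os.environ.get("AWS_API_BASE", "https://example.execute-api.amazonaws.com/prod")
ONPREM_BASE = os.environ.get("ONPREM_BASE", "https://daas.example.internal")

def handler(event, context):
    body = json.loads(event.get("body") or "{}")
    api_name = body.get("api_name", "")
    # Route to AWS if this API has been migrated, otherwise to on-prem.
    base = AWS_BASE if api_name in MIGRATED else ONPREM_BASE
    req = urllib.request.Request(
        f"{base}/{api_name}",
        data=json.dumps(body).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return {"statusCode": resp.status, "body": resp.read().decode()}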


3. DQ Framework, Iris Software, Inc.

This framework validates data quality against rules defined by the user. There are 50+ rules the user can select and run on the data, such as is_distinct, is_number, check_length, check_file_size, is_blank, and ignore_blank.

Input data can be S3 files, DB tables, or SQL scripts. Best of all, users can create their own rules in the form of Python scripts.
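
For illustration, a minimal sketch of how such pluggable rules could be registered in Python/PySpark; the rule names come from the list above, but the registry design is an assumption, not the framework's actual code.

# Minimal sketch of a pluggable DQ rule registry (illustrative only; the
# rule names come from the framework, the registry shape is an assumption).
from pyspark.sql import DataFrame, functions as F

RULES = {}

def rule(name):
    """Register a rule; a user-supplied Python script could do the same."""
    def wrap(fn):
        RULES[name] = fn
        return fn
    return wrap

@rule("is_distinct")
def is_distinct(df: DataFrame, column: str) -> bool:
    # Passes when the column has no duplicate values.
    return df.select(column).distinct().count() == df.count()

@rule("is_blank")
def is_blank(df: DataFrame, column: str) -> bool:
    # Passes when every value in the column is null or empty.
    non_blank = df.filter(F.col(column).isNotNull() & (F.trim(F.col(column)) != ""))
    return non_blank.count() == 0

def run_rule(name: str, df: DataFrame, column: str) -> bool:
    return RULES[name](df, column)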

Responsibilities:

● Designed the code for flexibility and scalability.

● Converted rules into Python/PySpark.

● Tested different use cases, such as single files and groups of files.

● Chose the best-suited data structures.

● Reviewed the Glue job scripts.

● Held regular discussions with the business owner and users about new features and improvements.


4. Opti-Price, QL2 Software

Opti-Price uses proprietary auto-matching technology to help customers examine their price competitiveness, identify pricing opportunities and outliers for their product catalog, and manage their product matches. It boosts revenue growth by enabling data-driven decisions aligned with clients' pricing strategies and the real-time market landscape.

Responsibilities:

● Redesigned the code to OOP standards.

● Implemented new features, such as region-based pricing.

● Optimized SQL queries by correcting the order of joins and ORDER BY clauses.

● Created indexes on columns that were impacting performance.

● Implemented multi-threading to process multiple customers of the same vertical/domain in parallel (see the sketch after this list).

● Used Pandas to make aggregations such as min/max, median, and number of matches ready to consume.
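
For illustration, a minimal sketch combining the last two points: one thread per customer, with Pandas producing the ready-to-consume aggregations; the helper and column names are hypothetical placeholders.

# Minimal sketch: parallel per-customer processing plus Pandas aggregations.
# load_matches and the column names are hypothetical placeholders.
from concurrent.futures import ThreadPoolExecutor
import pandas as pd

def load_matches(customer_id: str) -> pd.DataFrame:
    # Stand-in for the real data-access layer.
    return pd.DataFrame({"product_id": ["A", "A", "B"], "price": [9.5, 10.0, 20.0]})

def aggregate_matches(df: pd.DataFrame) -> pd.DataFrame:
    # Ready-to-consume stats per product: min/max, median, number of matches.
    return (df.groupby("product_id")["price"]
              .agg(min_price="min", max_price="max",
                   median_price="median", matches="count")
              .reset_index())

def process_vertical(customer_ids: list[str]) -> dict[str, pd.DataFrame]:
    # One worker per customer of the same vertical/domain.
    with ThreadPoolExecutor(max_workers=8) as pool:
        results = pool.map(lambda c: aggregate_matches(load_matches(c)), customer_ids)
    return dict(zip(customer_ids, results))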


5. QL2 Python Package, QL2 Software

No credentials are hard-coded or stored statically in code. This package is responsible for returning different connection objects, such as Postgres, Snowflake, SFTP/FTP, and AWS (a minimal sketch follows the responsibilities below).

Responsibilities:

● Created different modules and configurations for the different connection types.

● Maintenance and support.

● Upgraded the package's features and utilities as needed.
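
For illustration, a minimal sketch of the factory idea, assuming credentials come from environment variables rather than code; the environment-variable names are placeholders, not the package's actual configuration.

# Minimal sketch of a credential-free connection factory. Credentials are
# read from the environment, never hard-coded; all names are placeholders.
import os

def get_connection(kind: str):
    if kind == "postgres":
        import psycopg2
        return psycopg2.connect(
            host=os.environ["PG_HOST"],
            user=os.environ["PG_USER"],
            password=os.environ["PG_PASSWORD"],
            dbname=os.environ["PG_DB"],
        )
    if kind == "snowflake":
        import snowflake.connector
        return snowflake.connector.connect(
            account=os.environ["SF_ACCOUNT"],
            user=os.environ["SF_USER"],
            password=os.environ["SF_PASSWORD"],
        )
    if kind == "s3":
        import boto3
        return boto3.client("s3")  # resolves AWS credentials from the environment
    raise ValueError(f"unknown connection kind: {kind}")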

Accomplishments

  • Accolade for Q2-2025
  • Python Trainer with 4.5/5 Rating Q3-2024
  • Development Environment Setup for Glue Jobs, 04/23
  • 2nd Position Award: Cloud Hack, 12/22
  • Most Valued Person Award, 01/22
  • Bravo Award, 08/15
  • Bravo Award, 02/15

Interests

Exploring the best practices of programming
Blogging
Code Review
Listening to Music
Hanging Out with Friends

Certification

AWS Knowledge: Object Storage

Timeline

Academy Accreditation - Databricks Lakehouse Fundamentals

12-2023

AWS Knowledge: Object Storage

10-2023

AWS Knowledge: Serverless

08-2023

AWS Cloud Quest: Cloud Practitioner

03-2023

AWS Knowledge: Architecting

03-2023

Lead - Technology

IRIS Software, Inc.
04.2022 - Current

Senior BI Developer

QL2 Software India Pvt Ltd.
05.2017 - 04.2022

CoE IT Engineer

QSix India Pvt Ltd. C/O Motorola India
03.2013 - 05.2017

Bachelor of Commerce - Commerce

School of Open Learning Delhi University
04.2001 -