Saravanakumar S

Mogappair West

Summary

Dynamic Principal Engineer with a proven track record at Opsera, excelling in data architecture and CI/CD implementation. Adept at leading cross-functional teams, optimizing data pipelines, and driving project success. Skilled in Python programming and fostering collaboration, delivering impactful solutions that enhance performance and efficiency in cloud infrastructure.

Overview

10 years of professional experience

Work History

Principal Engineer

Opsera
06.2024 - Current
  • Managing a team of six Data Engineers: guiding requirement intake, workload management, and on-time project delivery.
  • Managing a team of three DevOps and SRE engineers; gathering requirements from various technical teams to set up DevOps CI/CD features for deployments.
  • Involved in architecture and system design for Data, DevOps, and AI projects.
  • Coordinating with the Product team to gather requirements and design architecture.
  • Developing ETLs for data ingestion and transformation.
  • Query performance tuning and optimization for dashboards.
  • Maintaining and improving the data architecture for better performance, including data pipelines, database, and cloud architecture.
  • Managing a team of four frontend and backend developers: gathering requirements for the DevOps and Insights products, workload management, and on-time delivery.
  • Selecting tools for the DevOps and data architecture and maintaining best practices.
  • Managing the system automation required for monitoring, testing, and product features.
  • Maintained scrum and release planning for a couple of projects during emergencies.
  • Part of the decision-making team for tools and services, branching strategies, deployment practices, the development life cycle, and AI practices.
  • Team Size: 6
  • Tools: Databricks, PySpark, Python, AWS CodePipeline, dbt, Snowflake, DevOps, Infrastructure as Code, Terraform, Databricks Asset Bundles, Opsera CI/CD, Azure DevOps, Copilot, Generative AI, LLMs, RAG, AI Agents, prompt engineering, DLT, Auto Loader, Unity Catalog, JFrog, GitLab, Databricks Apps, MongoDB, Bitbucket, Jenkins, JIRA, Confluence, Zendesk, Kafka, Databricks Dashboards, Airflow, SonarQube, system design, FastAPI, Django, Flask, Lakebase, Lakehouse, Cursor, LangChain, Databricks ML, MCP
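The incremental data loads mentioned above can be illustrated with a minimal sketch, assuming a plain list-of-dicts store; the real pipelines run on Databricks with Delta MERGE, and the record shape and key name here are illustrative only:

```python
# Minimal illustration of an incremental (upsert-by-key) load.
# Production pipelines use Delta MERGE on Databricks; this sketch
# only shows the idea: update matching records, insert new ones.
def incremental_load(target, batch, key="id"):
    """Upsert each batch row into target by key; insert if new."""
    index = {row[key]: i for i, row in enumerate(target)}
    for row in batch:
        if row[key] in index:
            target[index[row[key]]] = row  # update existing record
        else:
            target.append(row)             # insert new record
    return target

state = [{"id": 1, "runs": 5}]
state = incremental_load(state, [{"id": 1, "runs": 6}, {"id": 2, "runs": 1}])
```

The same merge applies whether the batch arrives from a scheduled batch pull or a streaming micro-batch.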

Azure Data Engineer

ChampionX
06.2023 - 06.2024
  • Responsible for migrating SAP BW objects to Azure Synapse Analytics, including SAP extractor and table data migration.
  • SAP BW report migration to Azure Synapse Analytics.
  • SAP ABAP logic conversion to T-SQL scripts in Azure Synapse.
  • Responsible for migrating data for various fields from an Oracle system to Azure Synapse Analytics.
  • Responsible for creating ETL/orchestration setups using Azure Data Factory and through PySpark/Python coding in Microsoft Fabric.
  • Responsible for setting up DevOps CI/CD for database/Azure Data Factory deployments.
  • Responsible for AI/ML projects for customers.
  • Team Size: 6
  • Tools: Azure Data Factory, Azure Synapse Analytics (Dedicated SQL Pool, Analytics), Azure Key Vault, Azure Data Lake Gen2, Azure Logic Apps, Azure Event Hubs, Azure Databricks, Azure DevOps, Microsoft Fabric, Azure OpenAI, Kusto Query Language, Log Analytics, Pytest, Azure Document AI, Azure Cognitive Search, Python, PySpark, big data technologies, AI Document Intelligence, Large Language Models.

Senior Data Engineer

Gen Digital (Norton LifeLock)
10.2020 - 06.2023
  • Responsible for constructing the data ingestion layer from various sources, including databases, web, file stores, and streaming/messaging tools.
  • Performing transformations on the ingested data as per business requirements for analysis.
  • Orchestrating data ingestion and transformation activities through Azure Data Factory.
  • Azure Data Factory administration, covering: identity and access control; integration runtime setups (auto-resolve/self-hosted); linked-service connection establishment; and secret management in Azure Key Vault, storing the connection strings of source/sink platforms and mapping them to linked services for secure connections from Azure Data Factory.
  • Team Size: 7
  • Tools: Azure Data Factory, Azure Synapse Analytics (Dedicated SQL Pool, Analytics), Azure Key Vault, Azure Data Lake Gen2, Azure Logic Apps, Azure Event Hubs, Azure Databricks, Azure DevOps.

Azure Cloud Engineer

Atos Syntel
11.2019 - 09.2020
  • Involved in setting up the orchestration suite using VDC to form the GitHub repos containing the templates and YAML files used to deploy and modify the required Azure resources inside the provided tenant.
  • Virtual network installation across multiple subscriptions, with peering set up between them for cross-VNet communication.
  • Creating ExpressRoute circuits for on-prem data center connectivity.
  • Creating Azure DevOps pipelines using YAML scripts.
  • Team Size: 10
  • Tools: Azure VMs (Windows and Linux), VM scale sets, VM encryption, Automation Accounts, Application Gateway, virtual networks, ExpressRoute, virtual network gateway, NSG, Azure Firewall, Key Vault, storage accounts, Log Analytics workspace, monitoring solutions, Azure Data Factory, self-hosted integration runtime, HDInsight cluster, Network Watcher, Privileged Identity Management, management scope layer, Azure DevOps, YAML pipeline scripting, PowerShell, ARM templates, Event Hubs, user-defined routes for custom routing, RBAC, VDC orchestration package, WVD jump server, GitHub.

DevOps Engineer

Cognizant Technology Solutions
03.2018 - 11.2019
  • Dealing with source control, both public and private: sync, pull, and integration into VSTS builds.
  • Creating pipelines and setting up builds with suitable templates and agents according to process requirements.
  • Implementing CI/CD across various deployment projects: databases, Azure resources, dress-rehearsal Azure subscriptions, and Docker images to Kubernetes pools on Alibaba Cloud.
  • Team Size: 4

Azure Cloud Support

Cognizant Technology Solutions
06.2016 - 11.2019
  • Working on the Azure cloud platform (PaaS) and its core components.
  • Implementing secure cloud infrastructure and managing the applications hosted on Azure.
  • Monitoring applications end-to-end using Azure Monitor, Application Insights, and Log Analytics (OMS).
  • Creating monitoring dashboards integrated with Application Insights and Log Analytics.
  • Team Size: 10
  • Tools: Azure Portal, Azure Monitor, Azure Resource Manager, Security Center, Log Analytics, Application Insights, Application Gateway, Key Vault, NSG, VPN, ARM template deployment, custom log setup using connectors to a workspace, building monitoring scenarios through KQL, Functions, SQL DB maintenance and data storage activities, maintaining storage accounts.

Application Support Engineer

Cognizant Technology Solutions
05.2015 - 05.2016
  • Main tasks included RFDs for deployments, migrations/upgrades, and imports/exports; user incidents; running batch jobs; troubleshooting batch job failures and overrunning jobs; and working on application-related issues.
  • Handling prod and non-prod environments (stress, train, and development) for different BUs.
  • Checking active sessions in the background, ensuring no critical process is running, and terminating queries that run in parallel for a long time while accessing the same data.
  • Team Size: 28
  • Tools: Kibana, WLP, TDWC (Tivoli Dynamic Workload Console 3.1.0), Beta92, Informatica PowerCenter Workflow Monitor 9.6, LogViewer, ServiceNow, PuTTY, WinSCP, BSE Login, Toad, UC4.

Education

B.E -

Sri Manakula Vinayagar Engineering College
01.2014

Higher Secondary -

Petit Seminaire Higher Secondary School
01.2010

Matriculation -

Petit Seminaire Higher Secondary School
01.2009

Skills

  • Data architecture
  • ETL development
  • CI/CD implementation
  • Cloud infrastructure
  • System design
  • Data modeling
  • Cloud architecture
  • Data pipelines
  • Infrastructure as code
  • Project management
  • Team leadership
  • Cross-functional collaboration
  • Change management
  • Problem solving
  • Agile methodologies
  • Machine learning
  • Data ingestion
  • DevOps practices
  • Database management
  • Big data technologies
  • Data pipeline orchestration
  • Requirement gathering
  • Scikit-learn
  • Distributed computing
  • Natural language processing
  • Deep learning
  • Reinforcement learning
  • Apache Spark
  • Data science
  • Feature engineering
  • Model evaluation
  • Algorithm development
  • Python programming
  • Cloud computing
  • Data visualization
  • Working with unstructured data

Personal Information

  • Date of Birth: 11/12/92
  • Gender: Male
  • Nationality: Indian
  • Marital Status: Married
  • Place of Birth: Pondicherry

Projects

Project: Insights product – DORA metrics

Description: Building the Insights product, a dashboard containing DORA metrics.

Project Objective:
  • Collecting data from sources (batch and streaming, depending on the customer).
  • Source systems (CI/CD tools): Opsera pipelines, GitHub Actions, Octopus, GitLab, Jenkins, AWS CodePipeline.
  • ETL development covering raw data pull, transformation for the silver and gold layers, and incremental data loads for both batch and streaming.
  • Dashboard queries for KPIs such as deployment frequency, lead time for changes, change failure rate, mean time to resolve, and cycle time for changes.
  • Automation scripts for unit testing ETLs, incremental data load testing, data quality checks, etc.
  • Connecting with customers for requirements gathering, giving demos, and resolving issues.
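Two of the KPIs above, deployment frequency and change failure rate, reduce to simple aggregations; a minimal sketch in plain Python, assuming a hypothetical record shape (the production dashboards query the gold-layer tables instead):

```python
from datetime import date

# Hypothetical deployment records; in the product these would come
# from the gold-layer tables fed by the CI/CD source systems.
deployments = [
    {"deployed_on": date(2024, 7, 1), "failed": False},
    {"deployed_on": date(2024, 7, 3), "failed": True},
    {"deployed_on": date(2024, 7, 8), "failed": False},
    {"deployed_on": date(2024, 7, 9), "failed": False},
]

def deployment_frequency(records, start, end):
    """Deployments per day over the inclusive window [start, end]."""
    days = (end - start).days + 1
    count = sum(1 for r in records if start <= r["deployed_on"] <= end)
    return count / days

def change_failure_rate(records):
    """Share of deployments that resulted in a failure."""
    if not records:
        return 0.0
    return sum(1 for r in records if r["failed"]) / len(records)

freq = deployment_frequency(deployments, date(2024, 7, 1), date(2024, 7, 10))
cfr = change_failure_rate(deployments)
```

The remaining DORA KPIs (lead time for changes, mean time to resolve) follow the same pattern, aggregating timestamp differences instead of counts.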

Project: Insights product – Devex metrics

Description: Building the Insights product, a dashboard containing Devex metrics.

Project Objective:
  • Collecting data from sources (batch and streaming, depending on the customer).
  • Source systems (source control and ITSM tools): GitHub Enterprise and on-prem, GitLab, Bitbucket, Azure Repos, JIRA, Zendesk, ServiceNow.
  • ETL development covering raw data pull, transformation for the silver and gold layers, and incremental data loads for both batch and streaming.
  • Dashboard queries for KPIs such as commit statistics, pipeline statistics, PR size, PR stats, change request view, and developer activity.
  • Automation scripts for unit testing ETLs, incremental data load testing, data quality checks, etc.
  • Connecting with customers for requirements gathering, giving demos, and resolving issues.

Project: Insights product – AI adoptability

Description: Building the Insights product, a dashboard containing AI adoptability metrics.

Project Objective:
  • Collecting data from sources (batch and streaming, depending on the customer).
  • Source systems (AI tools): GitHub Copilot, Cursor, Windsurf, Sourcegraph.
  • ETL development covering raw data pull, transformation for the silver and gold layers, and incremental data loads for both batch and streaming.
  • Dashboard queries for KPIs such as developer usage, developer activity, adoption, impact, acceptance rate, suggestion retention rate, and the AI usage report.
  • Automation scripts for unit testing ETLs, incremental data load testing, data quality checks, etc.
  • Connecting with customers for requirements gathering, giving demos, and resolving issues.

Project: DevOps for DataOps

Description: Building code-generation features for the DevOps product, generating IaC for Databricks resources (Databricks Asset Bundles).

Project Objective:
  • Establishing the frontend and backend that generate configuration for Databricks resources.
  • Modules: forms to create a connection to Databricks; fetching pipelines and workflows from the Databricks workspace for the user to select; editor options for the generated configuration, used for editing and parameterizing fields; changes pushed automatically to GitHub.
  • Understanding and maintaining the configurations library.
  • Giving demos to customers and gathering requirements.
  • Taking care of POCs for new customers.
  • Coordinating and partnering with Databricks account teams on new enhancements and roadmaps.

Project: Vulnerability Remediation through AI

Description: Generating remediation for vulnerabilities.

Project Objective:
  • Datasets containing security scans from various security tools, such as Sonar, are enriched with remediation generated by the AI setup.
  • AI setup: LangChain connecting to multiple LLMs, Lakehouse tables containing the security data, and prompts to the LLMs generating remediation.
  • Security data ETL consuming the AI setup: parsing inputs from the security data into the AI setup to generate remediation, and filtering the records that need it (identifying the unique set of records to include in remediation generation).
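The filtering step, sending only one representative of each unique finding to the LLM, can be sketched as follows; the record shape and the (tool, rule_id) deduplication key are illustrative assumptions, not the product's actual schema:

```python
# Hypothetical scan findings; real records live in the Lakehouse
# security tables. One remediation is generated per unique
# (tool, rule_id) pair rather than per finding, reducing LLM calls.
findings = [
    {"tool": "sonar", "rule_id": "S1481", "file": "a.py"},
    {"tool": "sonar", "rule_id": "S1481", "file": "b.py"},
    {"tool": "sonar", "rule_id": "S2068", "file": "c.py"},
]

def unique_for_remediation(records):
    """Return one representative finding per (tool, rule_id)."""
    seen, unique = set(), []
    for r in records:
        key = (r["tool"], r["rule_id"])
        if key not in seen:
            seen.add(key)
            unique.append(r)
    return unique

to_remediate = unique_for_remediation(findings)
```

The generated remediation is then joined back to every finding sharing the same key.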

Project: Insights product – Artifact metrics

Description: Building the Insights product, a dashboard containing artifact metrics.

Project Objective:
  • Collecting data from sources (batch and streaming, depending on the customer).
  • Source systems (artifact tools): JFrog.
  • ETL development covering raw data pull, transformation for the silver and gold layers, and incremental data loads for both batch and streaming.
  • Dashboard queries for KPIs such as artifact stats (downloads and uploads) and pipeline stats (pipelines using artifacts).
  • Automation scripts for unit testing ETLs, incremental data load testing, data quality checks, etc.
  • Connecting with customers for requirements gathering, giving demos, and resolving issues.

Timeline

Principal Engineer

Opsera
06.2024 - Current

Azure Data Engineer

ChampionX
06.2023 - 06.2024

Senior Data Engineer

Gen Digital (Norton LifeLock)
10.2020 - 06.2023

Azure Cloud Engineer

Atos Syntel
11.2019 - 09.2020

DevOps Engineer

Cognizant Technology Solutions
03.2018 - 11.2019

Azure Cloud Support

Cognizant Technology Solutions
06.2016 - 11.2019

Application support engineer

Cognizant Technology Solutions
05.2015 - 05.2016

B.E -

Sri Manakula Vinayagar Engineering College

Higher Secondary -

Petit Seminaire Higher Secondary School

Matriculation -

Petit Seminaire Higher Secondary School