Principal Engineer with a proven track record at Opsera, specializing in data architecture and CI/CD implementation. Adept at leading cross-functional teams, optimizing data pipelines, and driving projects to successful delivery. Skilled in Python and in fostering collaboration, delivering solutions that improve the performance and efficiency of cloud infrastructure.
Projects:
Project: Insights product – DORA metrics
Description: Building the Insights product, a dashboard presenting DORA metrics.
Project Objective:
• Collecting data from source systems (batch or streaming, depending on the customer).
• Source systems covered (CI/CD tools): Opsera pipelines, GitHub Actions, Octopus, GitLab, Jenkins, AWS CodePipeline.
• ETL development covering raw data pulls, transformations into the silver and gold layers, and incremental data loads for both batch and streaming.
• Dashboard queries presenting KPIs such as Deployment Frequency, Lead Time for Changes, Change Failure Rate, Mean Time to Restore, and Cycle Time for Changes.
• Automation scripts for unit testing ETLs, incremental data load testing, and data quality checks.
• Engaging customers for requirements gathering, demos, and issue resolution.
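As an illustration of the KPI logic behind these dashboard queries, a minimal sketch of two DORA metrics (Deployment Frequency and Change Failure Rate) computed over deployment events; the field names are illustrative assumptions, not the product's actual schema:

```python
# Hypothetical sketch: two DORA KPIs over a list of deployment events.
# Field names ("day", "failed") are illustrative, not Opsera's schema.
from datetime import date

deployments = [
    {"day": date(2024, 5, 1), "failed": False},
    {"day": date(2024, 5, 1), "failed": True},
    {"day": date(2024, 5, 3), "failed": False},
    {"day": date(2024, 5, 7), "failed": False},
]

def deployment_frequency(events, days_in_window):
    """Average number of deployments per day over the reporting window."""
    return len(events) / days_in_window

def change_failure_rate(events):
    """Share of deployments that resulted in a failure."""
    if not events:
        return 0.0
    return sum(1 for e in events if e["failed"]) / len(events)

print(deployment_frequency(deployments, 7))  # 4 deployments over a 7-day window
print(change_failure_rate(deployments))      # 1 failure out of 4 deployments = 0.25
```

In the real pipeline these aggregations run as incremental queries over the gold-layer tables rather than over in-memory lists.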
Project: Insights product – Devex metrics
Description: Building the Insights product, a dashboard presenting Devex metrics.
Project Objective:
• Collecting data from source systems (batch or streaming, depending on the customer).
• Source systems covered (source control and ITSM tools): GitHub Enterprise and on-prem, GitLab, Bitbucket, Azure Repos, Jira, Zendesk, ServiceNow.
• ETL development covering raw data pulls, transformations into the silver and gold layers, and incremental data loads for both batch and streaming.
• Dashboard queries presenting KPIs such as commit statistics, pipeline statistics, PR size, PR stats, change request views, and developer activity.
• Automation scripts for unit testing ETLs, incremental data load testing, and data quality checks.
• Engaging customers for requirements gathering, demos, and issue resolution.
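To illustrate one of the Devex KPIs above, a sketch of the "PR size" statistic: bucketing pull requests by lines changed. The field names and bucket boundaries are illustrative assumptions, not the dashboard's actual definitions:

```python
# Hypothetical sketch of the "PR size" KPI: bucket pull requests by
# total lines changed. Schema and thresholds are illustrative assumptions.
from collections import Counter

pull_requests = [
    {"id": 1, "additions": 10, "deletions": 2},
    {"id": 2, "additions": 250, "deletions": 40},
    {"id": 3, "additions": 900, "deletions": 300},
]

def size_bucket(pr):
    """Classify a PR as small/medium/large by total lines changed."""
    changed = pr["additions"] + pr["deletions"]
    if changed <= 50:
        return "small"
    if changed <= 500:
        return "medium"
    return "large"

pr_size_stats = Counter(size_bucket(pr) for pr in pull_requests)
print(pr_size_stats)  # one PR in each bucket for this sample
```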
Project: Insights product – AI adoptability
Description: Building the Insights product, a dashboard presenting AI Adoptability metrics.
Project Objective:
• Collecting data from source systems (batch or streaming, depending on the customer).
• Source systems covered (AI tools): GitHub Copilot, Cursor, Windsurf, Sourcegraph.
• ETL development covering raw data pulls, transformations into the silver and gold layers, and incremental data loads for both batch and streaming.
• Dashboard queries presenting KPIs such as Developer Usage, Developer Activity, Adoption, Impact, Acceptance Rate, Suggestion Retention Rate, and AI usage reports.
• Automation scripts for unit testing ETLs, incremental data load testing, and data quality checks.
• Engaging customers for requirements gathering, demos, and issue resolution.
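As one concrete example of these KPIs, a sketch of the Acceptance Rate computation for AI coding assistants (accepted suggestions over suggestions shown); the event schema is an illustrative assumption, not an actual Copilot or Cursor export format:

```python
# Hypothetical sketch of the Acceptance Rate KPI: accepted suggestions
# divided by suggestions shown. The row schema is an illustrative assumption.
usage_rows = [
    {"developer": "dev_a", "shown": 120, "accepted": 48},
    {"developer": "dev_b", "shown": 80, "accepted": 20},
]

def acceptance_rate(rows):
    """Overall share of shown AI suggestions that were accepted."""
    shown = sum(r["shown"] for r in rows)
    accepted = sum(r["accepted"] for r in rows)
    return accepted / shown if shown else 0.0

print(acceptance_rate(usage_rows))  # 68 accepted out of 200 shown
```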
Project: DevOps for DataOps
Description: Building code-generation features for the DevOps product that generate IaC for Databricks resources (Databricks Asset Bundles).
Project Objective:
• Building the frontend and backend that generate configurations for Databricks resources.
• Modules: forms to create a connection to Databricks; fetching pipelines and workflows from the Databricks workspace for the user to select from; editor options so users can edit the retrieved configuration; parameterization of configuration fields; automatic push of changes to GitHub.
• Understanding and maintaining the configurations library.
• Giving demos to customers and gathering requirements.
• Running POCs for new customers.
• Coordinating and partnering with Databricks account teams on new enhancements and roadmaps.
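To give a flavor of the code-generation step, a minimal sketch that renders a Databricks Asset Bundle configuration (`databricks.yml`) from user-selected fields. The bundle layout follows the general DAB shape; all names and values here are illustrative placeholders, not Opsera's actual generator:

```python
# Hypothetical sketch: render a minimal databricks.yml for a Databricks
# Asset Bundle from user-selected fields. Values are placeholders; the
# real generator covers many more resource types and parameters.

def render_bundle(bundle_name, job_name, notebook_path):
    """Render a minimal Databricks Asset Bundle configuration as YAML text."""
    return (
        f"bundle:\n"
        f"  name: {bundle_name}\n"
        f"resources:\n"
        f"  jobs:\n"
        f"    {job_name}:\n"
        f"      name: {job_name}\n"
        f"      tasks:\n"
        f"        - task_key: main\n"
        f"          notebook_task:\n"
        f"            notebook_path: {notebook_path}\n"
    )

config = render_bundle("insights_etl", "nightly_load", "./etl/main")
print(config)
```

In the product, the generated configuration is surfaced in the editor for parameterization and then pushed to GitHub automatically.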
Project: Vulnerability Remediation through AI
Description: Generating remediations for vulnerabilities using AI.
Project Objective:
• Datasets containing security scans from tools such as Sonar are enriched with remediations generated by the AI setup.
• AI setup: LangChain connecting to multiple LLMs; lakehouse tables containing security data; prompts to the LLMs that generate remediations.
• Security data ETL consuming the AI setup: parsing inputs from the security data into the AI setup to generate remediations, and filtering the records that need it (identifying the unique set of records eligible for remediation generation).
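The filtering step above can be sketched as follows: deduplicating findings by a fingerprint of rule and location, and selecting only those without a remediation to send to the LLM prompt. Field names are illustrative assumptions, not the actual scan schema:

```python
# Hypothetical sketch of the filtering step: pick the unique set of
# vulnerability findings that still need an AI-generated remediation.
# Field names are illustrative, not an actual Sonar export schema.
import hashlib

findings = [
    {"rule": "S1234", "file": "app.py", "line": 10, "remediation": None},
    {"rule": "S1234", "file": "app.py", "line": 10, "remediation": None},  # duplicate
    {"rule": "S9999", "file": "db.py", "line": 3, "remediation": "use parameterized queries"},
]

def fingerprint(finding):
    """Stable key for deduplication: rule + location."""
    key = f"{finding['rule']}|{finding['file']}|{finding['line']}"
    return hashlib.sha256(key.encode()).hexdigest()

def needs_remediation(rows):
    """Unique findings without a remediation, ready for the LLM prompt."""
    seen, pending = set(), []
    for f in rows:
        fp = fingerprint(f)
        if f["remediation"] is None and fp not in seen:
            seen.add(fp)
            pending.append(f)
    return pending

print(len(needs_remediation(findings)))  # 1: the duplicate and the already-remediated row are skipped
```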
Project: Insights product – Artifact metrics
Description: Building the Insights product, a dashboard presenting artifact metrics.
Project Objective:
• Collecting data from source systems (batch or streaming, depending on the customer).
• Source systems covered (artifact tools): JFrog.
• ETL development covering raw data pulls, transformations into the silver and gold layers, and incremental data loads for both batch and streaming.
• Dashboard queries presenting KPIs such as artifact stats (downloads and uploads) and pipeline stats (pipelines using artifacts).
• Automation scripts for unit testing ETLs, incremental data load testing, and data quality checks.
• Engaging customers for requirements gathering, demos, and issue resolution.
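As a final illustration, a sketch of the artifact-stats KPI: counting downloads and uploads per repository from JFrog-style events. The event schema is an illustrative assumption:

```python
# Hypothetical sketch of the artifact-stats KPI: per-repository download
# and upload counts. The event schema is an illustrative assumption.
from collections import defaultdict

artifact_events = [
    {"repo": "libs-release", "action": "download"},
    {"repo": "libs-release", "action": "download"},
    {"repo": "libs-release", "action": "upload"},
    {"repo": "docker-local", "action": "upload"},
]

artifact_stats = defaultdict(lambda: {"download": 0, "upload": 0})
for event in artifact_events:
    artifact_stats[event["repo"]][event["action"]] += 1

print(dict(artifact_stats))
```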