Vinay Kumar Kamineni

Bangalore

Summary

Dynamic Engineering Lead with over 5 years of experience in data migration and ETL processes, specializing in Azure Synapse and BigQuery SQL. Demonstrated success in enhancing data quality through performance tuning and optimization. Proficient in stakeholder collaboration to deliver solutions that uphold data integrity and governance, contributing to business growth through advanced analytics.

Overview

6 years of professional experience

Work History

Engineering Lead

Persistent
Bangalore
11.2024 - 03.2025

Description:

Migration of data from Google BigQuery to an Azure Synapse database, covering user web-behavior data captured across different marketing campaigns.

Responsibilities:

  • Interacted directly with the client to gather the requirements for the product to be delivered; analyzed the requirements and designed the Requirements Page.
  • Linked Google Analytics with Google BigQuery.
  • Developed scripts to purge data on Azure ADLS.
  • Created tables and developed T-SQL scripts in the Azure Synapse database, applying optimization best practices.
  • Built PySpark code to automate QA checks.
  • Performed data conversion, data quality checks, data profiling, performance tuning, and system testing.
  • Worked with Agile, Waterfall, and hybrid methodologies for data modeling.
  • Monitored the data quality of daily processes and maintained data integrity to keep dependent departments functioning effectively.
  • Carried out data transformation, source-to-target data mapping across database schemas, and data cleansing.
  • Performed performance analysis and created partitions, indexes, and aggregate tables where necessary.
  • Defined data governance rules and standards to keep business element names consistent across the different data layers.

BI Consultant

Canan Technologies
Bangalore
06.2022 - 11.2024

Description:

On-premises customer data from different source systems is ingested into GCP (Google Cloud Platform) through Juniper, an ingestion framework. ETL is performed on the GCP-loaded data for business analytics that supports organizational growth. In this project, I developed BigQuery SQL scripts based on the requirements.

Responsibilities:

  • Followed the Agile methodology.
  • Analyzed the requirements and designed the Requirements Page.
  • Responsible for creating GCP buckets, datasets, and BigQuery tables in different layers across BQ projects.
  • Responsible for ingesting data from one dataset into other datasets through BQ queries as part of the ELT and ETL processes.
  • Wrote DDL and DML statements for creating and altering tables and for converting character data into numeric values.
  • Responsible for unit testing, integration testing, and capturing evidence.
  • Responsible for walkthroughs with the business.
  • Facilitated data requirement meetings with business and technical stakeholders, and resolved conflicts to drive decisions.
  • Developed SQL joins and queries for data retrieval, accessed data for analysis, and exported the data to CSV and Excel files.
  • Provided support and knowledge transfer (KT) to the team as and when required.

Consultant

polestarllp
Bangalore
12.2021 - 05.2022

Description:

On-premises customer data from different source systems is ingested into GCP (Google Cloud Platform) through Juniper, an ingestion framework. ETL is performed on the GCP-loaded data for business analytics that supports organizational growth. In this project, I performed end-to-end data validations.

Responsibilities:

  • Followed the Agile methodology.
  • Responsible for analyzing the requirements and designing the test plan.
  • Responsible for designing the test cases.
  • Responsible for System Integration Testing (SIT) and capturing the evidence.
  • Responsible for SIT walkthroughs with the business to obtain sign-offs.
  • Responsible for generating test reports through a Python automation framework.
  • Responsible for data quality assurance.
  • Executed functional and regression testing when required.
  • Participated in reviews and client calls.

Data Analyst

Ivangel Sales & Services Pvt. Ltd.
Bangalore
01.2020 - 11.2021

Responsibilities:

  • Good knowledge of creating views of tables and importing them into Power BI Desktop.
  • Experience publishing Power BI reports to the Power BI service and connecting to data from SQL Server.
  • Working with gateways, scheduling refreshes, and creating reports and dashboards.
  • Good at sharing dashboards, securing Power BI reports, and exporting data to Excel for analysis.
  • Creating drill-down and drill-through reports.
  • Worked on creating bookmarks, buttons, and page navigation.
  • Have good experience in implementing RLS as part of security.
  • Working on DAX functions such as Date and Time, Information, Filter and Value, Logical, and Text functions.
  • Extensively used the Derived Column, Data Conversion, Conditional Split, and Aggregate transformations in SSIS.
  • Understood the Report Specification document and designed reports based on it to meet the client requirements.
  • Deploying the generated reports on the production server.
  • Created Power BI reports and published them to the Power BI service.

Data Analyst

Demandnxt
Bangalore
08.2019 - 01.2020

Responsibilities:

  • Involved in creating Power BI reports in Import and DirectQuery modes.
  • Power BI development and administration.
  • Accessed various data sources to pull data for reporting and analysis; great experience with Power Query for cleaning the data.
  • Good knowledge of creating calculations, advanced calculations, parameters, slicers, and interactions.
  • Created Highlight and Filter action interactions to add interactivity to the visualizations.
  • Created calculated columns and measures using DAX expressions.
  • Involved in creating data visualizations like bar charts, tree maps, line charts, scatter plots, and maps.
  • Accessed and used various custom visuals available in Power BI, able to implement row-level security on data, and have an understanding of application security layer models in Power BI.
  • Sharing of Power BI reports and managing workspaces in the Power BI service.
  • Developing visual reports, dashboards, and KPI scorecards using Power BI Desktop.

Education

Computer Science

College of Engineering Guindy
Chennai
04.2017

Skills

  • Data migration
  • Azure Synapse
  • BigQuery SQL
  • Data quality
  • ETL processes
  • Agile methodology
  • Performance tuning
  • Power BI
  • DAX
  • SSIS
  • Tableau
  • Python
  • SQL Data Management
  • Database Querying Skills

Languages

Telugu
First Language
Hindi
Upper Intermediate (B2)
English
Advanced (C1)

Timeline

Engineering Lead

Persistent
11.2024 - 03.2025

BI Consultant

Canan Technologies
06.2022 - 11.2024

Consultant

polestarllp
12.2021 - 05.2022

Data Analyst

Ivangel Sales & Services Pvt. Ltd.
01.2020 - 11.2021

Data Analyst

Demandnxt
08.2019 - 01.2020

Computer Science

College of Engineering Guindy