Business Analyst

VIKAS GUSAIN

Dehradun

Summary

1. Project: PremiumLossDW

  • Client: LAMB
  • Description: Led the successful development and implementation of the PremiumLossDW project for client LAMB. The project involved retrieving data from three distinct sources - Convelo, Epic, and Aclaiment. While Convelo and Epic were traditional data warehouses in SQL Server, the Aclaiment source was hosted on Snowflake.
    Integrated scripting to establish connections across Convelo, Epic, and Aclaiment data sources.
    Implemented a robust ETL framework, seamlessly managing data flow from source to staging, and further into dimension and fact tables.
    Utilized multiple lookup transformations to efficiently identify and handle updated matching records during data processing, significantly improving accuracy.
    Created email notifications for success and exception handling, along with logging mechanisms, ensuring timely communication of job outcomes.
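
The lookup-driven routing of incoming rows into inserts versus updates described above can be sketched in plain Python (a minimal sketch of the SSIS Lookup pattern; the table key and column names below are illustrative, not taken from the actual project):

```python
def merge_into_dimension(existing, incoming, key="policy_id"):
    """Emulate an SSIS Lookup transformation: route incoming rows to
    inserts (no match on the key) or updates (match with changed attributes)."""
    index = {row[key]: row for row in existing}  # full-cache lookup
    inserts, updates = [], []
    for row in incoming:
        match = index.get(row[key])
        if match is None:
            inserts.append(row)       # lookup no-match output -> insert
        elif match != row:
            updates.append(row)       # matched but attributes changed -> update
    return inserts, updates

# Hypothetical sample rows
existing = [{"policy_id": 1, "premium": 100}, {"policy_id": 2, "premium": 200}]
incoming = [{"policy_id": 2, "premium": 250}, {"policy_id": 3, "premium": 300}]
ins, upd = merge_into_dimension(existing, incoming)
```

In SSIS the same split is done by sending the Lookup no-match output to an OLE DB Destination and the match output through a conditional split before an update step.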

2. Project: HubspotDataLoad

  • Client: AAA Western & Central New York
  • Description: Integrated data from multiple APIs into SQL Server tables using SSIS packages and scripts. The primary objective was to optimize the data transfer process from Nav and Smartlogix to HubSpot tables, achieved through the development of five distinct packages.
    Designed and implemented SSIS packages and scripts for efficient integration of data from diverse APIs into SQL Server tables.
    Streamlined the data transfer process from Nav and Smartlogix to Hubspot tables, enhancing overall data accessibility and usability.
    Created five distinct packages, each employing stored procedures to extract and transform required data into CSV files, improving data quality and usability.
    Implemented API validation using Postman to ensure seamless and accurate data transfer between systems.
    Deployed SSIS packages to an AWS S3 bucket, leveraging cloud infrastructure for efficient storage and accessibility.
    Configured destination file placement on SFTP server using WinSCP with appropriate arguments in the command script for secure and automated data transfer.
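
The extract-to-CSV step described above can be sketched in Python (a minimal sketch; the record fields shown are hypothetical, not the actual Nav or Smartlogix schema):

```python
import csv
import io

def records_to_csv(records, fieldnames):
    """Flatten API records (a list of dicts) into CSV text -- the shape of
    the stored-procedure-to-CSV step that precedes the SFTP upload."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=fieldnames, extrasaction="ignore")
    writer.writeheader()
    writer.writerows(records)
    return buf.getvalue()

# Hypothetical records as they might arrive from an API response
records = [{"customer": "Acme", "gallons": 120}, {"customer": "Beta", "gallons": 75}]
csv_text = records_to_csv(records, ["customer", "gallons"])
```

In the actual pipeline an SSIS Script Task would write the file to disk for WinSCP to pick up, rather than returning a string.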

3. Project: Gallons by Territory

  • Client: NocoEnergy Corp
  • Description: Developed a comprehensive dashboard for NocoEnergy Corp with a primary focus on summarizing gallons data and delivering insights across various dimensions.
    Utilized Data Analysis Expressions (DAX) extensively to craft complex calculations and measures, extracting meaningful insights from the dataset.
    Modeled the data across multiple tables, improving the overall structure and strengthening relationships within the dataset.
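
The core aggregation behind the dashboard, summing gallons per territory, can be sketched in Python (an illustrative sketch of what a DAX measure such as a SUM over a territory filter context computes; the row fields are hypothetical):

```python
from collections import defaultdict

def gallons_by_territory(rows):
    """Aggregate delivered gallons per territory -- the kind of summary a
    DAX SUM measure produces for each territory in the filter context."""
    totals = defaultdict(float)
    for row in rows:
        totals[row["territory"]] += row["gallons"]
    return dict(totals)

# Hypothetical delivery rows
rows = [
    {"territory": "North", "gallons": 120.0},
    {"territory": "South", "gallons": 80.0},
    {"territory": "North", "gallons": 30.0},
]
totals = gallons_by_territory(rows)
```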

4. Project: HEAP Historical Report

  • Client: Noco Energy Corp
  • Description: Developed an SSRS report named "HEAP Historical Report" for Noco Energy Corp, focusing on providing detailed historical information related to the Home Energy Assistance Program (HEAP).
    The report includes header fields for filtering: StartDate, EndDate, and SearchTicket (utilizing fields from the [hdIssues] table).
    Body fields include essential information presented in HTML format: Ticket Submission Date (IssueDate), Issue ID (IssueID), and Ticket Information (Body, rendered as HTML), all drawn from the [hdIssues] table.
    HTML code includes cautionary information and key details such as the program year, SSD worker information, vendor details, client information, and specific actions required.
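
Assembling the HTML ticket body from those fields can be sketched in Python (a minimal sketch; the dictionary keys and paragraph layout are illustrative, not the actual report definition):

```python
from html import escape

def build_ticket_html(issue):
    """Render the key ticket details as HTML paragraphs, escaping values
    so user-entered text cannot break the markup."""
    return (
        f"<p><b>Program Year:</b> {escape(issue['program_year'])}</p>"
        f"<p><b>SSD Worker:</b> {escape(issue['ssd_worker'])}</p>"
        f"<p><b>Vendor:</b> {escape(issue['vendor'])}</p>"
        f"<p><b>Client:</b> {escape(issue['client'])}</p>"
    )

# Hypothetical ticket values
html_body = build_ticket_html({
    "program_year": "2023-2024", "ssd_worker": "J. Doe",
    "vendor": "NOCO", "client": "A. Smith",
})
```

In SSRS the equivalent is a placeholder with its markup type set to HTML so the Body field renders as formatted text.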

5. Project: CustomerContactPremiumPrediction

  • Client: Fortegra Insurance
  • Description: Conducted Exploratory Data Analysis (EDA) to gain insights into the dataset, identifying patterns and trends relevant to customer contact and premium prediction.
    Applied feature engineering techniques to enhance the dataset, creating new informative features for improved model performance.
    Utilized feature selection methods to identify and retain the most impactful variables for predicting customer contact and premium values.
    Implemented a comprehensive model training process, incorporating machine learning algorithms to develop an accurate predictive model.
    Conducted model hyperparameter tuning to optimize the performance and generalization of the predictive model.
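
The hyperparameter tuning step above is typically done with a tool like scikit-learn's GridSearchCV; the core idea can be sketched dependency-free (the parameter names and toy scoring function below are illustrative only):

```python
from itertools import product

def grid_search(train_fn, score_fn, grid):
    """Try every hyperparameter combination and keep the best-scoring one --
    the exhaustive-search idea behind tools like GridSearchCV."""
    best_params, best_score = None, float("-inf")
    keys = list(grid)
    for values in product(*grid.values()):
        params = dict(zip(keys, values))
        model = train_fn(**params)          # fit a model with these params
        score = score_fn(model)             # evaluate on validation data
        if score > best_score:
            best_params, best_score = params, score
    return best_params, best_score

# Toy stand-ins for a real model and validation score
train = lambda depth, lr: {"depth": depth, "lr": lr}
score = lambda m: -abs(m["depth"] - 4) - abs(m["lr"] - 0.1)
best, s = grid_search(train, score, {"depth": [2, 4, 8], "lr": [0.01, 0.1]})
```

In practice the scoring function would be cross-validated model accuracy or another validation metric rather than a closed-form expression.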

Overview

3 years of professional experience
2 Certifications

Work History

Data Analyst

KMG IT SERVICES PRIVATE LIMITED
07.2021 - Current

  • Developed and implemented data pipelines for efficient ETL using SSIS, handling large volumes of data.
  • Deployed and managed projects on AWS and Snowflake, ensuring scalable and cost-effective cloud solutions.
  • Designed and optimized data models for streamlined retrieval and analysis in Power BI, facilitating enhanced reporting.
  • Created and implemented solutions for efficient data processing and reporting in SSRS, ensuring timely and accurate insights.
  • Leveraged Python for machine learning applications, implementing models for data-driven decision-making.
  • Proficient in advanced SQL querying, data exploration, and cleansing for improved accuracy.
  • Utilized version control and collaborative development practices with GitHub, Sourcetree, and Bitbucket.
  • Experienced in project management tools like Jira for efficient project tracking and coordination.

Education

B.TECH - Mechanical Engineering

THDC-IHET
TEHRI GARHWAL, India
07.2017

Skills

  • SSIS
  • SQL
  • POWERBI
  • SSRS
  • Python
  • Statistics
  • Machine Learning
  • AWS
  • Bitbucket
  • GitHub
  • Sourcetree
  • Postman
  • Atlassian Jira

Certification

Google Data Analyst Certificate

1-Year Data Science Masters by PW Skills
