
Sunith Kumar Samudrala

Sr. Data Analyst
Bangalore, KA

Summary

Sr. Data/BI Analyst with over 4.5 years of successful experience in business intelligence, ETL, data warehousing, and data mining. Recognized consistently for performance excellence and contributions to success in the financial industry. Strengths in creating dashboards, data architecture, and reporting, backed by training in big data analytics and optimization.

Overview

4 Languages
1 Certification
5 years of post-secondary education
5 years of professional experience

Work History

BI Data Analyst

HSBC
Bangalore, Karnataka
07.2016 - Current
  • 4 years 6 months; 4 years in HSBC Analytics.

Roles and Responsibilities:

  • End-to-end ETL processing from diverse databases, resulting in the creation of reports and dashboards.
  • Used SQL, SAS, and Python to collect and analyze data for multiple ad-hoc projects.
  • Worked consistently with various onshore teams, including Fraud, Finance, Collections, Campaigns, and Business Management.
  • Built a data analytics architecture covering HSBC products such as savings, mortgages, mutual funds, insurance, and primarily MasterCard credit cards; it is used to track, aggregate, and analyze sales and feed the results to business users.
  • Implemented data transformation/processing standards on projects to avoid redundancy and remove unwanted fields.

Accomplishments/Duties:

  • Collect, report, and analyze data in the financial industry; plan, create, and maintain large-scale data architectures to meet business goals and requirements.
  • Automated more than 120 reports (coding, formatting, QC, delivery, and operational trend analysis) for various business teams.
  • Saved ~10 hours of daily execution time on sales and performance reports by creating intermediate tables, limiting table columns, and optimizing code.
  • Established a QC framework on data flowing from upstream sources through downstream systems to the reporting layer.
  • For all enhancements to existing projects and for new projects, final reports are delivered to users only after QC passes. The QC compares results against previous runs: for a weekly report, a week-over-week (WOW) trend QC is built over the past three months, along with other statistical checks whose thresholds are derived from the business to ensure output values stay within range; the same applies to month-over-month (MOM) and daily reports.
  • Established and administered database-level role access in SQL Server, mostly within the data analytics domain.
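The week-over-week trend QC described above can be sketched roughly as below; the three-month (13-week) window and the ±2 standard deviation tolerance are illustrative assumptions, not the actual business-derived thresholds.

```python
from statistics import mean, stdev

def wow_trend_qc(weekly_totals, tolerance_sd=2.0):
    """Week-over-week trend QC sketch: flag the latest week if its
    WOW % change falls outside mean +/- tolerance_sd * stdev of the
    historical WOW changes (e.g. the past ~13 weeks). The tolerance
    here is illustrative, not a business-derived standard."""
    # Percent change between each pair of consecutive weeks.
    changes = [
        (curr - prev) / prev * 100
        for prev, curr in zip(weekly_totals, weekly_totals[1:])
    ]
    history, latest = changes[:-1], changes[-1]
    mu, sd = mean(history), stdev(history)
    passed = (mu - tolerance_sd * sd) <= latest <= (mu + tolerance_sd * sd)
    return passed, latest

# Example: 13 weeks of report totals; the last week spikes sharply,
# so the trend QC fails and the report would be held for review.
totals = [100, 102, 101, 103, 104, 102, 105, 106, 104, 107, 108, 106, 160]
ok, change = wow_trend_qc(totals)
```

The same check generalizes to MOM and daily cadences by swapping the window of historical periods.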

Data Engineer

Celebal Corp
10.2015 - 05.2016
  • Analyzed real-time Apache logs from a travel company's servers: raw logs were converted into structured data, and the ELK stack (Elasticsearch, Logstash, and Kibana) was used to build instant visualization reports. In parallel, the real-time data was processed continuously with Kafka and Spark Streaming and stored in Hadoop, where small queries could be run with Hive.
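The raw-to-structured step above can be sketched as a parser for Apache combined-log lines; the regex below is a simplified assumption for illustration, not the production Logstash grok pattern.

```python
import re

# Simplified pattern for the Apache combined log format; a production
# pipeline would typically use a Logstash grok filter instead.
APACHE_LOG = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<ts>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) \S+" '
    r'(?P<status>\d{3}) (?P<bytes>\d+|-)'
)

def parse_line(line):
    """Turn one raw Apache log line into a structured dict, or None."""
    m = APACHE_LOG.match(line)
    if not m:
        return None
    rec = m.groupdict()
    rec["status"] = int(rec["status"])
    rec["bytes"] = 0 if rec["bytes"] == "-" else int(rec["bytes"])
    return rec

line = ('203.0.113.7 - - [10/Oct/2015:13:55:36 +0000] '
        '"GET /flights/search HTTP/1.1" 200 2326')
record = parse_line(line)
```

Each structured record can then be indexed into Elasticsearch for Kibana dashboards or published to Kafka for the streaming path.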

Education

Engineering School - INSOFE

INTERNATIONAL SCHOOL OF ENGINEERING
Hyderabad
02.2015 - 07.2015

B.Tech - Mechanical Engineering

Kakatiya Institute Of Technology & Science
Warangal
01.2010 - 04.2014

Skills

Data warehousing

Certification

Certified GCP Business Professional

Interests

Playing Badminton

Additional Information

Basic-level logic/metrics involved:

  • Standard deviation for products, customer prediction for product acquisition, sales forecasting, employee/customer ranking, # and $ of new products, rewards points, changes in different balance types, uplift % for digital metrics, net client growth, total relationship balance, and types of fraud and their losses.

Tables/Structures:

Worked on a wide range of tables like those below, understanding the structure behind their creation:

  • Account level
  • Transaction level
  • Card level
  • Customer level
  • Statement level
  • Product level
  • Employee level
  • Customer demographics
  • Rewards, etc.

Many processes are built on these tables; understanding and using them involves considerable conceptual depth and complexity.
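As an illustration of how these table levels relate, transaction-level rows can be rolled up and joined back onto the account-level table; the column names and rows below are hypothetical, not an actual schema.

```python
from collections import defaultdict

# Hypothetical rows; real account/transaction tables carry many more fields.
accounts = [
    {"account_id": "A1", "product": "Credit Card"},
    {"account_id": "A2", "product": "Savings"},
]
transactions = [
    {"account_id": "A1", "amount": 120.0},
    {"account_id": "A1", "amount": 80.0},
    {"account_id": "A2", "amount": 45.5},
]

def rollup_spend(accounts, transactions):
    """Aggregate transaction-level rows to account level, then join
    the totals back onto the account-level table (a left join: every
    account appears even with zero transactions)."""
    totals = defaultdict(float)
    for txn in transactions:
        totals[txn["account_id"]] += txn["amount"]
    return [
        {**acct, "total_spend": totals.get(acct["account_id"], 0.0)}
        for acct in accounts
    ]

summary = rollup_spend(accounts, transactions)
```

In practice the same rollup-and-join pattern is expressed in SQL (GROUP BY plus LEFT JOIN) or SAS rather than in-memory Python.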

Software

Unix

Linux

SQL Server Management Studio

Spyder

SAS EG

RStudio

Anaconda

AWS Services

Accomplishments

    Built the architecture below:

      • Input sources: card transactions, accounts, digital activities
      • Amazon Redshift: stores the daily batch data
      • Amazon S3: batch data transferred to a bucket named "BI analysis"; streaming data stored in another bucket for fraud analytics
      • Amazon Athena: interactive query engine
      • Amazon QuickSight: for real-time analytics
      • AWS Lambda: λ functions invoked to filter transactions with block codes
      • Amazon Kinesis Data Firehose: receives, stores, and delivers data
      • Amazon Kinesis Data Streams: captures near-real-time transactions
      • Amazon Kinesis Data Analytics: collects, processes, and transfers the input app stream data
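The Lambda filtering step in a pipeline like this might look roughly like the handler below; the Kinesis event shape follows AWS's standard base64-encoded record format, but the block-code list and transaction fields are assumptions for illustration, not the production function.

```python
import base64
import json

# Hypothetical block codes that mark a transaction for fraud review.
BLOCK_CODES = {"F", "L", "U"}

def lambda_handler(event, context):
    """Filter a Kinesis-style record batch, keeping only transactions
    whose block_code is in BLOCK_CODES. Field names are illustrative."""
    flagged = []
    for record in event.get("Records", []):
        # Kinesis delivers each record's payload base64-encoded.
        payload = base64.b64decode(record["kinesis"]["data"])
        txn = json.loads(payload)
        if txn.get("block_code") in BLOCK_CODES:
            flagged.append(txn)
    return {"flagged_count": len(flagged), "flagged": flagged}
```

The flagged transactions would then be written onward (e.g. via Firehose) to the fraud-analytics S3 bucket described above.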

Timeline

Certified GCP Business Professional

03.2020

BI Data Analyst

HSBC
07.2016 - Current

Data Engineer

Celebal Corp
10.2015 - 05.2016

Engineering School - INSOFE

INTERNATIONAL SCHOOL OF ENGINEERING
02.2015 - 07.2015

B.tech - Mechanical Engineering

Kakatiya Institute Of Technology & Science
01.2010 - 04.2014