
Jeevan TM

Engineering Lead
Bangalore

Summary

A software professional with 8 years of experience in software design and development. Experienced in big data, data warehousing, analytics, KPI report modelling, and Tableau. Proficient in the Apache Hadoop ecosystem, Amazon Redshift, Python, and UNIX environments. Knowledgeable in the design, development, and testing of ETL systems.

Overview

8 years of professional experience

Work History

Engineering Lead

Zenoti
Bangalore
03.2022 - Current
  • Expanding on previous experience at the same company, developing, integrating, and enabling new technologies.
  • Taking on new Tableau challenges, including full automation of customer onboarding and offboarding using the Tableau REST APIs.
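The Tableau REST API automation mentioned above starts with a sign-in request and, for onboarding, a site-creation request. A minimal sketch of building those request bodies is shown below; the token, secret, and site names are hypothetical placeholders, not values from this role.

```python
import xml.etree.ElementTree as ET

def build_signin_payload(token_name: str, token_secret: str, site: str) -> bytes:
    """Build the XML body for the Tableau REST API sign-in call
    (POST /api/<version>/auth/signin) using a personal access token."""
    ts_request = ET.Element("tsRequest")
    credentials = ET.SubElement(ts_request, "credentials", {
        "personalAccessTokenName": token_name,      # hypothetical PAT name
        "personalAccessTokenSecret": token_secret,  # hypothetical PAT secret
    })
    ET.SubElement(credentials, "site", {"contentUrl": site})
    return ET.tostring(ts_request)

def build_create_site_payload(site_name: str, content_url: str) -> bytes:
    """Build the XML body for creating a new site
    (POST /api/<version>/sites), the first step of customer onboarding."""
    ts_request = ET.Element("tsRequest")
    ET.SubElement(ts_request, "site", {
        "name": site_name,
        "contentUrl": content_url,
    })
    return ET.tostring(ts_request)
```

Offboarding would be the mirror image: a DELETE against the site's ID after the same sign-in step.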

Principal Engineer

CaaStle
Bangalore
02.2018 - 02.2022
  • Designed and developed ETL pipelines for new business metrics.
  • Implemented new end-to-end systems for multi-tenant reporting, improved ETL performance, automated Tableau content publishing, and wrote new libraries and an API interface for easier reporting management.
  • Daily transactional data is captured in MySQL server tables; an ETL system written on the Java-based Cascading platform then transforms this data and loads it into a fact/dimension schema hosted on Apache Hadoop systems such as Impala and HBase.
  • Built new reporting automations based on Hyper and microservices, reducing the database footprint by 80%.
  • Built a new real-time data ingestion pipeline using a Maxwell-Kafka setup.
  • Used Tableau as the reporting tool.
  • Reviewed plans, documents, and related materials to assess projected actions and advise on changes.
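In a Maxwell-Kafka pipeline like the one above, Maxwell publishes each MySQL row change to Kafka as a JSON event with `database`, `table`, `type`, and `data` fields. A minimal sketch of the consumer-side routing step is below; the routing categories are illustrative, not the production pipeline's logic.

```python
import json

def route_maxwell_event(raw: str):
    """Parse a Maxwell change event and decide how to apply it downstream.

    Returns a (action, target_table, row) tuple; the action names here
    ("append", "upsert", "delete", "skip") are hypothetical labels.
    """
    event = json.loads(raw)
    target = f"{event['database']}.{event['table']}"
    if event["type"] == "insert":
        return ("append", target, event["data"])
    if event["type"] == "update":
        # Maxwell also carries the prior values of changed columns in "old".
        return ("upsert", target, event["data"])
    if event["type"] == "delete":
        return ("delete", target, event["data"])
    return ("skip", target, None)  # e.g. DDL or heartbeat events
```

In production this function would sit inside a Kafka consumer loop, with each decoded message value passed in as `raw`.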

BI Engineer

Zenoti
Hyderabad
02.2016 - 02.2018
  • Designed KPI metrics, implemented new ETL-supporting systems such as Redshift and Tableau, improved ETL performance, and automated Tableau content publishing, including creation of Sites, Projects, Data Sources, Workbooks, Users, and Groups, and permission management.
  • Daily transactional data is captured in MS SQL Server tables. A Python-based ETL system then transforms this data and loads it into a fact/dimension schema, from which the KPI reports are served. For other reports, the data is loaded into Redshift using Amazon S3 storage as an intermediary. Since Redshift is a columnar database, these reports rendered quickly given the correct choice of distribution and sort keys.
  • Tableau and BIME reporting tools consume the final Redshift tables and cater to specific reporting needs.
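The S3-to-Redshift step described above is typically done with Redshift's COPY command, which bulk-loads a staged extract from S3. A minimal sketch of generating that statement is below; the table, bucket, and IAM role names are hypothetical placeholders, and the CSV/GZIP format is an assumption.

```python
def build_copy_statement(table: str, s3_path: str, iam_role: str) -> str:
    """Generate a Redshift COPY command that loads a gzipped CSV extract
    staged in S3 into the given fact/dimension table."""
    return (
        f"COPY {table}\n"
        f"FROM '{s3_path}'\n"          # staged extract, e.g. s3://bucket/prefix/
        f"IAM_ROLE '{iam_role}'\n"     # role granting Redshift read access to S3
        "FORMAT AS CSV GZIP\n"
        "TIMEFORMAT 'auto';"
    )
```

Distribution and sort keys, mentioned above as the main performance lever, are set on the target table's DDL rather than in the COPY command itself.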

Assistant Systems Engineer

Tata Consultancy Services
Hyderabad
02.2014 - 02.2016
  • Improved the continuous integration process, strengthened testing, and implemented new functionality for code review and version control.
  • Developer Experience (DX) provides tools for the installation, automation, common operations, and maintenance of the components comprising CBA.
  • The Automatic Installation Toolkit (AIT) is one such DX tool, used to install CBA components on network elements. Installation involves bootstrapping, remote and local installation, and preparation of ISO and flash images for maiden installation.
  • AIT is built on Python, and Jenkins is used for CI activities.

Education

M.Tech - Computer Science

SJB Institute of Technology
Bangalore

Skills

    Python


Interests

Travel, Trekking, Sports

Timeline

Engineering Lead

Zenoti
03.2022 - Current

Principal Engineer

CaaStle
02.2018 - 02.2022

BI Engineer

Zenoti
02.2016 - 02.2018

Assistant Systems Engineer

Tata Consultancy Services
02.2014 - 02.2016

M.Tech - Computer Science

SJB Institute of Technology