
Mallesha Mahadeva

Technical Architect
Mysuru

Summary

Big data professional with cross-functional competencies in all phases of Data Engineering and Full Stack Development; targeting assignments in Big Data Analytics and Cloud Computing with a reputable organization.

  • Strategic professional offering 10+ years of rich experience in the IT industry, with more than 5 years in Big Data Analytics and more than 2 years in Cloud Computing.
  • Good knowledge of design patterns, event-driven architecture, microservices session management, and SSO implementation.
  • Expertise in data mining, cleansing, and transformation workflows.
  • Hands-on experience in Hadoop ecosystems and ETL operations using Spark Core, Spark SQL, Spark Structured Streaming, and event-driven data flows.
  • Implemented zero data loss in streaming applications by reading and updating Kafka offsets in Delta Lake in combination with Spark Structured Streaming.
  • Turned various proofs of concept into enterprise solutions, both on the Hadoop stack and in cloud-native and cloud-agnostic environments.
  • Experience in cluster management: Apache Livy, YARN, Spark clusters, Dataproc, and Databricks.
  • Built a metadata governance tool using Apache Atlas.
  • Created custom schedulers for ETL pipelines using Quartz.
  • Provided enterprise solutions to Middle East banks for searching data in compressed file systems.
  • Implemented proofs of concept on GCP Dataproc and educated the team on the advantages of moving to cloud-managed Hadoop clusters.
  • Experienced in all phases of the SDLC, including requirements gathering, design, development, configuration management, build management, testing, and deployment; implemented projects in Agile and Waterfall methodologies.
  • Adhered to policies, norms, compliance requirements, the Quality Assurance process, project management reviews, and audits.
  • Provide precise, effective, and robust solutions to meet business expectations; validate that proposed solutions are tracked, managed, and reported so that they are completed profitably with high client satisfaction.
  • Experienced in configuring monorepo CI/CD pipelines in CircleCI/Jenkins with JUnit, Snyk, and SonarCloud integration.
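The zero-data-loss pattern mentioned above — committing Kafka offsets transactionally alongside the data they describe — can be sketched in plain Python. This is a simplified illustration only: SQLite stands in for Delta Lake, tuples stand in for Kafka records, and every name here is hypothetical rather than taken from the original project.

```python
import sqlite3

# SQLite stands in for Delta Lake: offsets are committed in the same
# transaction as the data, so a crash can never lose or double-count records.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (partition INTEGER, off INTEGER, payload TEXT)")
conn.execute("CREATE TABLE offsets (partition INTEGER PRIMARY KEY, next_off INTEGER)")

def last_committed_offset(partition: int) -> int:
    """Read the high-water mark left by the last successful commit."""
    row = conn.execute("SELECT next_off FROM offsets WHERE partition = ?",
                       (partition,)).fetchone()
    return row[0] if row else 0

def process_batch(partition: int, records: list) -> None:
    """Write a micro-batch and its next offset in one atomic transaction."""
    start = last_committed_offset(partition)
    new = [(partition, off, payload) for off, payload in records if off >= start]
    if not new:
        return
    with conn:  # single transaction: data and offset commit together
        conn.executemany("INSERT INTO events VALUES (?, ?, ?)", new)
        conn.execute("INSERT OR REPLACE INTO offsets VALUES (?, ?)",
                     (partition, new[-1][1] + 1))

# A replayed batch (e.g. after a restart) is filtered by the stored offset,
# so records 0 and 1 are not written twice.
process_batch(0, [(0, "a"), (1, "b")])
process_batch(0, [(0, "a"), (1, "b"), (2, "c")])  # replay with one new record
```

Because the offset update and the data write share one transaction, a restart resumes from the stored offset and the replayed prefix is filtered out — the effect Structured Streaming achieves with a transactional sink.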

Overview

10 years of professional experience
6 years of post-secondary education
3 certifications

Work History

Technical Architect

Kine Labs Pvt Ltd
Bengaluru
11.2020 - Current

Kine is a startup that provides a SaaS solution for enterprises in the field of Employee Value Management. Kine consists of four modules: Capsule, Throttle, Raise, and Doc Bot. As part of the implementation, we picked up the Kine Capsule and Kine Raise modules. The Capsule module provides an integration platform that connects to the tools used in the software development process, human resource management systems, and collaboration and meeting tools, automatically capturing their data into the Kine Data Hub.


Duties and Responsibilities

  • Took complete ownership of the product technology roadmap, which is linked to the company's strategic plan.
  • Simulated, designed, developed, and deployed computationally complex, practical big data solutions; built and delivered a comprehensive data strategy roadmap; ensured final deliverables were of the highest quality.
  • Serve as a leading internal voice on which technologies and partnerships the company should lean into to execute its product roadmap and cloud transformation.
  • Defined the architecture for the end-to-end application, involving event-driven, streaming, rule-engine, and ETL workflows (Spark and Kafka).
  • Containerized microservices for both frontend and backend applications.
  • Migrated an on-premises Kubernetes cluster to AWS EKS.
  • Evaluated cost, performance, and service availability on the GCP and AWS cloud environments.

Principal Specialist

Temenos India PVT LTD
Bengaluru
04.2018 - 10.2019

Joined Temenos as a Principal Specialist through its acquisition of Htrunk Software Solutions, a start-up company. Temenos, headquartered in Geneva, Switzerland, is the largest banking software company and focuses on providing software solutions to the core banking industry (the T24 product).


Duties and Responsibilities

  • Migrated a relational replication project into the big data ecosystem, divided into three modules: ODS (Operational Data Store), SDS (Snapshot Data Store), and ADS (Analytical Data Store).
  • Took complete ownership of the ODS and was partially involved in ADS and SDS architectural discussions.
  • Suggested that the team adopt the Lambda architecture for the ETL process.
  • Performed data mining, cleansing, and transformation of T24 system data into the ODS and SDS data lakes.
  • Implemented Spark Structured Streaming with Kafka/Azure Event Hubs to read data in real time and push it through pipelines to make predictions using Spark ML (ADS).
  • Worked on Apache Atlas for metadata management and data lineage.
  • Created a single-context, multi-session API service for Spark.
  • Implemented a custom sink in Spark for data consumption from Structured Streaming.
  • Containerized Spark applications to run on a Kubernetes cluster for better autoscaling and maintenance.
  • Integrated an Apache Livy server for submitting and monitoring Spark batch jobs.
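The custom-sink item above follows the lifecycle that Spark's ForeachWriter contract defines for Structured Streaming sinks: open once per partition and epoch, process per row, close at the end. As a hedged illustration only — this is plain Python mimicking that contract, not Spark code, and every name is hypothetical — epoch tracking in open() is what lets a sink turn replayed micro-batches into no-ops:

```python
class CollectingWriter:
    """Toy sink following the ForeachWriter-style lifecycle:
    open() once per partition/epoch, process() per row, close() at the end.
    Tracking seen epochs makes a replayed epoch a no-op."""

    def __init__(self):
        self.rows = []
        self._seen_epochs = set()
        self._accept = False

    def open(self, partition_id: int, epoch_id: int) -> bool:
        # Returning False tells the engine to skip this partition/epoch.
        key = (partition_id, epoch_id)
        self._accept = key not in self._seen_epochs
        self._seen_epochs.add(key)
        return self._accept

    def process(self, row) -> None:
        if self._accept:
            self.rows.append(row)

    def close(self, error=None) -> None:
        self._accept = False

def run_epoch(writer, partition_id, epoch_id, rows):
    # Driver-side stand-in for what the streaming engine does per micro-batch.
    if writer.open(partition_id, epoch_id):
        for r in rows:
            writer.process(r)
    writer.close()

sink = CollectingWriter()
run_epoch(sink, 0, 0, ["a", "b"])
run_epoch(sink, 0, 0, ["a", "b"])  # replayed epoch is skipped
run_epoch(sink, 0, 1, ["c"])
```

The design choice is that deduplication lives in open() rather than process(): rejecting a whole replayed epoch up front is what upgrades the engine's at-least-once delivery to effectively-once output.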

Big Data Solution Engineer

Htrunk Software Solutions
Bengaluru
09.2016 - 03.2019

Htrunk was a startup product company that worked primarily on the in-house hTRUNK Data Engineering product, similar to Talend; its goal was to provide a common platform for data engineering solutions (ETL, data cleansing (ML), and data visualization through the hTRUNKBI business intelligence tool).


Duties and Responsibilities

  • Fully involved in core product development of all four modules (Designer, Deployer, Scheduler, and Admin).
  • Implemented POCs on new Hadoop ecosystem components in order to integrate them into Htrunk products.
  • Wrote connectors to third-party tools, RDBMSs, the Hadoop ecosystem, etc.; was the core owner of the Scheduler module, where groups of Designer jobs ran as workflows.
  • Played a major role in the PR Savings Bank application and the T24 Archival application; the project automated the reports approval process between PR Savings Bank and the Philippine central bank.
  • Helped the team create batch and real-time extraction jobs based on COB process completion time.
  • Established quality procedures for the team, continuously worked toward them, and monitored to ensure the team met its quality goals.
  • Played a vital role in interacting with clients on-site for issue management and solution delivery.
  • Initiated, led, and delivered a BI tool for the big data ecosystem.

Application Programmer

IBM India Pvt Ltd
Bengaluru
07.2011 - 09.2016

Global Integrated Order Manager - ESS project - AT&T Global Integrated Order Manager (GIOM) was a web-based, user-friendly software application for cross-service ordering of AT&T Voice and Data services. Using AT&T Integrated Order Manager, Voice and Data customers could create requests for new components and changes to and/or disconnection of existing components, as well as monitor the end-to-end status of an existing order and run various reports. The same project was later implemented with advanced technologies and processes (RTP, CSI, and OCX).


Duties and Responsibilities

  • Implemented projects in both Waterfall (IT/UP process) and Agile methodologies.
  • Worked on multiple areas of this project, including PL/SQL, ESB, the GUI, the Sterling MCS framework, build scripts, and WebSphere configuration.
  • Acted as a bridge between the team and management, reporting project status, issues, and risks to ensure the quality of the delivered product.
  • Automated some of the test cases by writing JUnit tests.
  • Worked very closely with the requirements team to check the feasibility of changes, and suggested changes to requirements and behavior based on my experience with the application.
  • Verified project estimations and provided Function Point counts for the projects.

Education

Bachelor of Engineering - Computer Science

SJCE College of Engineering
07.2008 - 08.2011

Masters - Computer Science and Engineering

Reva University
Bangalore
06.2017 - 05.2020

Skills

    Languages: Java and J2EE, Shell scripting, JavaScript (ReactJS basics), PL/SQL, Scala, and Python

Additional Information

  • Received Spot award in Htrunk Software Solutions for PR Savings Bank Project 2018.
  • Received Hall of Fame award for my contribution to the AT&T account.
  • Received Best of IBM award for GIOM-ESS, RTP projects.
  • Awarded as Employee of the Month for successfully leading the RTP project.
  • Was placed in the top 10 teams out of 200 for the hackathon conducted by Temenos.
  • Traveled to Indonesia, Cambodia, the Philippines, and Egypt for product deployment and installation, and to provide technical guidance and expertise.

Certification

Google Cloud Fundamentals: Core Infrastructure - 2020
AWS Certified Cloud Practitioner - 2019
Certified Scrum Master - 2016

Interests

Containerisation, Cloud, ML, and Deep Learning

Timeline

Technical Architect

Kine Labs Pvt Ltd
11.2020 - Current

Google Cloud Fundamentals: Core Infrastructure - 2020.

07-2020

AWS Certified Cloud Practitioner - 2019

12-2019

Principal Specialist

Temenos India PVT LTD
04.2018 - 10.2019

Masters - Computer Science and Engineering

Reva University
06.2017 - 05.2020

Big Data Solution Engineer

Htrunk Software Solutions
09.2016 - 03.2019

Certified Scrum Master - 2016.

06-2016

Application Programmer

IBM India Pvt Ltd
07.2011 - 09.2016

Bachelor of Engineering - Computer Science

SJCE College of Engineering
07.2008 - 08.2011