
Gowthami Gelasam

Bengaluru

Summary

Kafka Data Streaming Engineer with strong experience in designing, deploying, and optimizing Confluent Kafka solutions across cloud platforms (GCP, AWS, OpenShift). Skilled in customer engagement, requirements gathering, and delivering robust data streaming architectures for enterprise banking clients. Proven ability to troubleshoot complex systems, lead technical discussions, and align Kafka-based solutions with business goals. Passionate about enabling customers to succeed with real-time data solutions.

Overview

6 years of professional experience
1 Certification

Work History

APPLICATION DEVELOPER

IBM
Bengaluru
08.2021 - Current
  • Designed, deployed, and managed enterprise-scale Confluent Kafka clusters on Google Cloud Platform (GCP), OpenShift, and AWS to support real-time event streaming use cases for critical banking applications.
  • Collaborated with cross-functional teams at ANZ to gather requirements, design scalable Kafka-based data pipelines, and deliver robust event-driven solutions aligned with business objectives.
  • Created and maintained Kafka topics, partitions, and schemas to optimize throughput and ensure high availability for MOD and e-commerce APIs, contributing to seamless customer banking experiences.
  • Enabled secure data communication by implementing Role-Based Access Control (RBAC) with Active Directory integration and ensured compliance with security and governance policies.
  • Configured and maintained the Confluent Schema Registry to support Avro, JSON, and Protobuf serialization formats, reducing integration errors and improving data validation.
  • Utilized Kafka Connect and custom connectors to integrate legacy systems, databases, and APIs into the streaming ecosystem, ensuring low-latency data ingestion and transformation.
  • Implemented Kafka Replicator across multi-region deployments to enable business continuity and disaster recovery.
  • Troubleshot and resolved critical production issues, providing high-quality support to ANZ stakeholders and minimizing downtime.
  • Led performance tuning efforts by optimizing Kafka configurations (e.g., log segment sizes, retention policies, replication factors) to ensure low latency and high throughput under peak loads.
  • Worked in Agile/Scrum teams, often serving as Scrum Master, facilitating sprint planning, daily stand-ups, and removing blockers for smooth delivery.
  • Authored integration design documents, production deployment plans, and operations checklists to standardize delivery across projects.
  • Developed and deployed microservices for Open Banking APIs using Node.js and JavaScript, leveraging API Mesh and IBM DataPower for security and API lifecycle management.
  • Delivered CI/CD automation using Codefresh, Bitbucket, Bamboo, and uDeploy to streamline builds, testing, and cloud deployments on GCP.
  • Proactively engaged with business analysts and architects to ensure streaming platform design met compliance, reliability, and scalability standards across banking domains.
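The topic and performance-tuning work above can be sketched as a minimal topic-creation command using the standard Kafka CLI. The topic name and configuration values here are illustrative assumptions only, not actual production settings:

```shell
# Create a topic with explicit partition count, replication factor,
# and tuned retention/segment settings (illustrative values).
kafka-topics --bootstrap-server localhost:9092 \
  --create --topic payments.events \
  --partitions 12 \
  --replication-factor 3 \
  --config retention.ms=604800000 \
  --config segment.bytes=536870912 \
  --config min.insync.replicas=2
```

Settings like `min.insync.replicas=2` combined with a replication factor of 3 are a common durability baseline; retention and segment sizes depend on workload and storage budget.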

INTERN

KPIT Technologies
Bengaluru
06.2019 - 04.2020
  • Optimized validation methods on the CARLA simulation platform to support safer autonomous navigation.
  • Integrated a system into a realistic driving simulator to detect, recognize, and track road objects, making decisions that help the user drive more safely and conveniently.
  • Developed project code in Python.
  • Worked with large datasets using Hadoop, HBase, Docker, and Apache Spark.
  • Applied the YOLO deep-learning algorithm to detect and classify objects and estimate their positions around the vehicle.
  • Developed and tested the algorithm on a camera dataset collected via ROS bags.
  • Evaluated performance in controlled scenarios of increasing difficulty using metrics provided by CARLA.

Education

Master of Engineering - Big Data & Data Analytics

Manipal Academy of Higher Education
Manipal
05.2020

B.Tech - Computer Science Engineering

Gitam University
Bengaluru
04.2018

Skills

  • Kafka (Confluent / Apache), Kafka Connect, ksqlDB, Schema Registry
  • Event-Driven Architecture Implementation
  • CI/CD: Codefresh, Bamboo, Bitbucket, GitHub
  • Cloud Platforms: GCP, AWS, OpenShift
  • Languages: Node.js, JavaScript, and basics of Java
  • Microservices and APIs
  • Tools: API Mesh, IBM DataPower, uDeploy
  • Monitoring and Logging: Splunk, GCP

Certification

Confluent Certified Data Streaming Engineer

Confluent | Issued August 2025

Credential URL: https://certificates.confluent.io/69e27850-7193-4f9e-bca5-86d21088812d#acc.SVM7xgAv

Timeline

APPLICATION DEVELOPER

IBM
08.2021 - Current

INTERN

KPIT Technologies
06.2019 - 04.2020

Master of Engineering - Big Data & Data Analytics

Manipal Academy of Higher Education

B.Tech - Computer Science Engineering

Gitam University