Venkateswarlu Avula

Kafka Engineer
Hyderabad

Summary

Kafka Engineer with 4 years of experience working with global banking and telecom B2B clients. Skilled in understanding business use cases, providing solutions, and implementing Confluent Kafka, Flink, and middleware integrations. Proven track record in cloud migrations, real-time CDC pipelines, and secure data streaming architectures. Adept at client engagement and delivering enterprise-grade streaming solutions that align with business goals.

Overview

4 years of professional experience
1 certification

Work History

Kafka Engineer

Ascendion Digital Solutions Private Limited
05.2025 - Current
  • Worked with TSB Bank to design and implement a Confluent Kafka solution for their business use case of capturing CDC events and streaming them into MongoDB Atlas for real-time analytics.
  • Designed and implemented low-latency Flink SQL jobs for transformations based on business requirements, improving the client’s customer reporting.
  • Implemented secure role-based access by integrating Confluent Cloud with Azure AD and AWS IAM Identity Center, ensuring compliance with banking regulations.
  • Automated API key and credential rotation using GitHub Actions, eliminating manual steps and strengthening the client’s security posture.
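A minimal sketch of the CDC handling described above, assuming a Debezium-style change-event envelope; the field names and event shape are illustrative, not taken from the TSB implementation:

```python
import json

def cdc_to_document(event):
    """Flatten a Debezium-style CDC envelope into a document suitable
    for upsert into MongoDB Atlas. Returns None for delete events,
    which a real sink would handle as a deletion instead."""
    payload = event.get("payload", event)
    op = payload.get("op")  # "c" = create, "u" = update, "d" = delete
    if op == "d":
        return None
    doc = dict(payload["after"])  # row state after the change
    # Carry the source timestamp so downstream analytics can order events.
    doc["_cdc_ts_ms"] = payload.get("ts_ms")
    return doc

# Example: a simplified create event as it might arrive on a Kafka topic.
raw = json.dumps({
    "payload": {
        "op": "c",
        "ts_ms": 1700000000000,
        "after": {"account_id": 42, "balance": 105.5},
    }
})
print(cdc_to_document(json.loads(raw)))
# → {'account_id': 42, 'balance': 105.5, '_cdc_ts_ms': 1700000000000}
```

In practice the MongoDB Atlas sink connector performs this mapping; the sketch only shows the envelope-flattening idea.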

Kafka Admin

Volante Technologies
12.2024 - 05.2025
  • Worked with USAA Bank to design and implement high-availability Kafka clusters, ensuring reliable support for critical financial systems.
  • Based on client requirements, implemented a proof of concept (POC) demonstrating how Kafka can deliver both low latency and high throughput for financial workloads.
  • Monitored and optimized Kafka performance using Confluent Control Center and New Relic, preventing outages and ensuring SLA compliance.
  • Based on business requirements, designed and implemented RabbitMQ (RMQ) solutions within the client’s architecture, enabling secure and reliable communication between enterprise applications.
  • Provided incident management and risk mitigation solutions, reducing downtime and strengthening the client’s operational resilience.
  • Worked with other B2B clients to design and implement Kafka clusters tailored to their business requirements, ensuring scalability and performance for diverse use cases.
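The latency-versus-throughput trade-off in the POC above comes down largely to producer tuning. A sketch of the two configurations, using real librdkafka/Confluent client property names; the broker address and exact values are placeholders:

```python
# Shared settings; "acks": "all" favors durability for financial workloads.
BASE = {"bootstrap.servers": "broker:9092", "acks": "all"}

# Low latency: send each record immediately, skip compression overhead.
LOW_LATENCY = {**BASE,
               "linger.ms": 0,
               "compression.type": "none"}

# High throughput: wait briefly to fill large, compressed batches.
HIGH_THROUGHPUT = {**BASE,
                   "linger.ms": 50,
                   "batch.size": 1 << 20,  # ~1 MiB batches
                   "compression.type": "lz4"}

print(LOW_LATENCY["linger.ms"], HIGH_THROUGHT := HIGH_THROUGHPUT["linger.ms"])
```

Either dict can be passed to `confluent_kafka.Producer(...)` as-is; the POC measured the resulting end-to-end latencies under each profile.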

Kafka Engineer

IBM India
10.2023 - 12.2024
  • Kafka Cluster Migration: Successfully led the migration of on-premises Kafka clusters to AWS Confluent Cloud, using Cluster Linking and MirrorMaker to ensure minimal downtime and seamless data transfer.
  • Stakeholder Collaboration: Collaborated with ANZ stakeholders to gather business requirements and align future planning, ensuring the streaming architecture supported both current and long-term business goals.
  • Designed scalable Kafka-based pipelines and event-driven solutions to meet enterprise banking objectives and compliance standards.
  • Daily Troubleshooting: Identified and resolved Kafka cluster issues to maintain optimal performance and reliability for mission-critical workloads.
  • Monitoring & Alerting: Implemented Prometheus, Grafana, and Kafka Manager dashboards to proactively detect and resolve performance bottlenecks.
  • Automation & Deployment: Worked with DevOps to streamline deployments using Ansible and Codefresh, accelerating delivery and reducing manual effort.
  • Kubernetes Integration: Assisted in setting up Kafka clusters on Kubernetes for scalability and high availability; supported pod management, service configuration, and Helm chart usage.
  • Kafka Connect: Developed and deployed Kafka Connectors for databases, AWS S3, Salesforce, and MQ systems. Implemented custom SMTs for enrichment and filtering, and conducted POCs to validate feasibility.
  • Access Control: Implemented RBAC to enforce secure topic management and granular access across business teams.
  • Production Deployments: Executed Kafka cluster and connector rollouts following best practices, ensuring stability and compliance.
  • KSQL and Data Streaming: Delivered ksqlDB-based real-time data streaming solutions supporting customer-facing banking applications.
  • Cloud Migration Impact: Deployed Kafka clusters on Confluent Cloud, reducing infra management overhead by 50%. Migrated workloads with 30% cost savings and seamless cutover.
  • Production Support: Provided end-to-end incident troubleshooting for ANZ’s Kafka ecosystem, ensuring high availability and minimal downtime.
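Kafka Connect SMTs are written in Java, but the enrichment-and-filtering logic they apply per record can be sketched in Python; the field names and transform steps here are hypothetical, not the actual SMTs deployed for ANZ:

```python
def apply_smt_chain(record, drop_field="ssn", topic_prefix="enriched."):
    """Mimic two common Kafka Connect SMT steps on a record dict:
    drop a sensitive field (like ReplaceField's blacklist), then
    reroute the record to a prefixed topic (like RegexRouter)."""
    value = {k: v for k, v in record["value"].items() if k != drop_field}
    value["pipeline"] = "connect-smt"  # enrichment marker (illustrative)
    return {"topic": topic_prefix + record["topic"], "value": value}

rec = {"topic": "customers", "value": {"id": 1, "ssn": "123-45-6789"}}
print(apply_smt_chain(rec))
# → {'topic': 'enriched.customers', 'value': {'id': 1, 'pipeline': 'connect-smt'}}
```

In a real connector these steps would be configured declaratively in the connector JSON rather than coded by hand.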

Kafka Engineer

Torry Harris Business Solutions
10.2021 - 10.2023
  • Designed and implemented BT’s migration from on-premises Kafka to Confluent Cloud, ensuring secure data transfer and minimal downtime.
  • Designed and implemented Kafka-based data pipelines to process large volumes of telecom data in real time, enabling operational insights.
  • Developed and maintained Kafka Connectors to integrate Kafka with enterprise systems and applications, based on business requirements.
  • Implemented monitoring and alerting solutions using Dynatrace to proactively identify and resolve performance issues.
  • Worked on Confluent Kafka connector POCs to establish system-to-system data flows as per client business requirements.
  • Implemented KSQL transformations and streams to tune and enrich data according to business use cases.
  • Designed and implemented a solution using Oracle GoldenGate Big Data Handler to convert trail files into Kafka messages for downstream analytics.
  • Configured and optimized Kafka brokers, topics, partitions, and replication to ensure high availability and fault tolerance.
  • Provided secure implementations by enabling SSL/TLS encryption, SASL authentication, and RBAC for compliance with telecom regulations.
  • Developed and executed disaster recovery strategies including replication and backup for business continuity.
  • Collaborated with BT’s cross-functional teams to design and implement Kafka integration solutions for multiple enterprise applications.
  • Conducted periodic health checks, capacity planning, and scaling exercises to ensure the infrastructure aligned with projected workloads.
  • Provided solutions for incident troubleshooting, working with development and operations teams to restore services quickly.
  • Supported and implemented Confluent Kafka upgrades and patching, aligning the client’s environment with the latest stable releases.
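The capacity-planning exercises above typically reduce to a rule of thumb for sizing topic partitions. A sketch, assuming throughput numbers measured per partition; all figures are illustrative:

```python
import math

def partitions_needed(target_mb_s, per_partition_mb_s, consumers):
    """Rule-of-thumb partition count: enough partitions to reach the
    target throughput at the measured per-partition rate, and at least
    one partition per consumer in the largest consumer group (extra
    consumers beyond the partition count would sit idle)."""
    by_throughput = math.ceil(target_mb_s / per_partition_mb_s)
    return max(by_throughput, consumers)

# E.g. a 100 MB/s target at ~8 MB/s per partition, with 6 consumers:
print(partitions_needed(target_mb_s=100, per_partition_mb_s=8, consumers=6))
# → 13
```

Real sizing also weighs broker count, replication factor, and rebalance cost; this only captures the first-order arithmetic.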

Education

B.Tech - Computer Science and Engineering

Rajiv Gandhi University of Knowledge Technologies, Nuzividu
01.2021

Pre-University Course (PUC)

Rajiv Gandhi University of Knowledge Technologies, Nuzividu
01.2017

Secondary School Certificate (SSC)

ZPHS Mogallur
01.2015

Skills

  • Kafka (Confluent / Apache), Confluent Cloud, Kafka Connect, Schema Registry, KSQL, Flink
  • Event-Driven Architecture & Data Pipelines
  • CI/CD & Automation: Ansible, Codefresh, GitHub Actions, Shell
  • Monitoring & Observability: Grafana, Prometheus, Dynatrace, Confluent Control Center
  • Cloud Platforms: AWS, GCP
  • Others: SQL, Java, HTML

Certification

Confluent Certified Data Streaming Engineer
