
Kalyan Bollam

Hyderabad

Summary

Innovative problem solver with 7+ years of experience in Information Technology, with proven success evaluating requirements for software development projects and designing effective solutions. Out-of-the-box thinker dedicated to improving performance as efficiently as possible. Works well in teams and consistently delivers to deadlines by leveraging the underlying technology stack as per business needs.

Overview

9 years of professional experience
1 Certification

Work History

Developer - PWC

TCS - Hyderabad, India
01.2024 - Current


  • Developed PySpark logic to process internal PWC employee data retrieved from multiple endpoints via API calls.
  • Extracted the desired attributes from the JSON responses into a list, built a DataFrame from that list, and after data manipulations wrote the DataFrame to a CSV file.
  • Implemented both full-load and delta-load scripts; the delta load captures only notification-triggered data from the AL (Abstract Layer) via a Service Bus queue.
  • Good experience using PySpark with Jupyter Notebook and Databricks.
  • Troubleshooting and debugging.
  • Provided weekly detailed project reports to keep the manager informed of milestones and updates.
  • Performed data transformation and cleansing.
  • Read, wrote, and partitioned DataFrames using the Spark DataFrame API.
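The JSON-to-CSV flow described above can be sketched in plain Python (a minimal illustration only: the attribute names and sample payload are invented, and the stdlib csv module stands in for a Spark DataFrame CSV write):

```python
import csv
import io
import json

def extract_rows(response_text, attributes):
    """Pull only the desired attributes from each record in a JSON response."""
    records = json.loads(response_text)
    rows = []
    for record in records:
        rows.append({attr: record.get(attr) for attr in attributes})
    return rows

def write_csv(rows, attributes):
    """Render the extracted rows as CSV text (stand-in for a DataFrame write)."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=attributes)
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()

# Hypothetical API payload, for illustration only.
payload = json.dumps([
    {"id": 1, "name": "A. Employee", "office": "Hyderabad", "internal": True},
    {"id": 2, "name": "B. Employee", "office": "Mumbai", "internal": True},
])
rows = extract_rows(payload, ["id", "name", "office"])
csv_text = write_csv(rows, ["id", "name", "office"])
```

In the real pipeline each endpoint's response would be appended to one list before building a single DataFrame, but the extract-then-write shape is the same.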

Developer - ASML

TCS - Hyderabad, India
03.2023 - 12.2023
  • Developed and managed data pipelines, leveraging cloud-based services for seamless data integration.
  • Experience moving data from Hive to AWS S3 and from AWS S3 to a Snowflake database.
  • Good knowledge of developing bash scripts and triggering jobs through the cron scheduler.
  • Experience developing DAGs to move data from the source (NiFi) to AWS S3 as the destination.
  • Developed PySpark logic to process data from a Kafka producer and store it in Apache Kudu.
  • Worked on migrating MapReduce programs into Spark transformations using Spark and Scala.
  • Troubleshooting and debugging.
  • Provided weekly detailed project reports to keep the manager informed of milestones and updates.
  • Designed and coordinated with the Data Science team to implement advanced analytical models on Compute Engine using large datasets.
  • Performed data transformation and cleansing.
  • Read, wrote, and partitioned DataFrames using the Spark DataFrame API.
  • Wrote well-tested code for various software projects.
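One stage of such a pipeline, staging exported files from one store into the next, can be sketched with local directories standing in for the Hive export and the S3 bucket (directory names and file layout are hypothetical; a real job would use the AWS CLI or boto3 and be triggered by cron or a DAG):

```python
import shutil
import tempfile
from pathlib import Path

def stage_files(src_dir, dest_dir, suffix=".csv"):
    """Copy new data files from the source store to the destination,
    skipping files already present so that reruns are idempotent."""
    dest = Path(dest_dir)
    dest.mkdir(parents=True, exist_ok=True)
    copied = []
    for path in sorted(Path(src_dir).glob(f"*{suffix}")):
        target = dest / path.name
        if not target.exists():  # already staged on a previous run
            shutil.copy2(path, target)
            copied.append(path.name)
    return copied

# Hypothetical layout: "hive_export" plays Hive's side, "s3_bucket" plays S3.
root = Path(tempfile.mkdtemp())
(root / "hive_export").mkdir()
(root / "hive_export" / "part-0000.csv").write_text("id,amount\n1,10\n")
first = stage_files(root / "hive_export", root / "s3_bucket")
second = stage_files(root / "hive_export", root / "s3_bucket")  # rerun: nothing new
```

The idempotency check is what lets a cron-scheduled script be rerun safely after a partial failure.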

IT Analyst - Marriott International

TCS - Hyderabad, India
01.2021 - 03.2023
  • Worked on production tickets and enhancements.
  • For each request, performed analysis to identify the root cause of the issue and documented it in the known-error database.
  • Monitored jobs running in Airflow and Jenkins.
  • Acted immediately on failed jobs.
  • Triggered the DAG manually after a fix and monitored it to completion.
  • Wrote HQL for processing the data.
  • Queried data copied from EC2 to S3 and from S3 to Snowflake.
  • Performed aggregations such as count, min, max, and distinct, to name a few.
  • Performed join operations such as left, right, and outer joins.
  • Investigated batch jobs scheduled in the bash shell environment.

Software Engineer - Warner Music Group

Tech Mahindra - Hyderabad, India
03.2019 - 08.2020
  • Analyzed metadata as per the requirements.
  • Loaded data into HDFS from the local system.
  • Applied transformations and removed unwanted data.
  • Derived and sorted useful data by applying actions.
  • Created DataFrames and exported them.
  • Wrote HiveQL for processing the data.
  • Created Hive tables with schemas and loaded the data using Sqoop.
  • Wrote Hive queries to map data residing in HDFS.
  • Performed aggregations such as count, distinct, max, and min, to name a few.
  • Performed join operations such as outer, left, right, and semi joins.
  • Wrote clean, dynamic code, leveraging expertise across multiple programming languages to meet diverse requirements.
  • Assisted the quality assurance team with testing software, investigating bugs, and developing test cases.
  • Partnered with the engineering team to check and optimize interfaces between hardware and software applications.
  • Developed testable software using agile methodologies to create high-quality deliverables.
  • Contributed to sprint planning, prioritizing backlogs to meet new demands.
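The transform step above (remove unwanted data, drop duplicates, sort) can be sketched in plain Python as a stand-in for the equivalent Spark filter/distinct/sort chain (the record fields and "deleted" status value are invented for the example):

```python
def transform(records):
    """Drop unwanted rows and duplicates, then sort by key - a stdlib
    stand-in for Spark-style filter/distinct/sort transformations."""
    seen = set()
    kept = []
    for r in records:
        key = (r["track_id"], r["title"])
        if r["status"] == "deleted" or key in seen:  # unwanted or duplicate
            continue
        seen.add(key)
        kept.append(r)
    return sorted(kept, key=lambda r: r["track_id"])

# Hypothetical track metadata, including one unwanted row and one duplicate.
records = [
    {"track_id": 3, "title": "Song C", "status": "active"},
    {"track_id": 1, "title": "Song A", "status": "deleted"},
    {"track_id": 2, "title": "Song B", "status": "active"},
    {"track_id": 2, "title": "Song B", "status": "active"},
]
cleaned = transform(records)
```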

Hadoop Developer - Australia & New Zealand Banking

Infosys, Hyderabad, India
11.2017 - 03.2019
  • Analyzed metadata as per the requirements.
  • Loaded data into HDFS from the local system.
  • Extracted data using Spark and applied transformations/actions.
  • Wrote HiveQL for processing the data.
  • Extensively used hierarchies, derived tables, functions, and cascading.
  • Created Hive tables with schemas and loaded the data using Sqoop.
  • Performed aggregations such as count, distinct, max, and min, to name a few.
  • Created tables with partitioning and bucketing for optimization.
  • Performed various join operations.
  • Performed vectorized query operations.
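Partitioning and bucketing, as used above for optimization, can be sketched in plain Python: rows are grouped by a partition column, and within each partition spread across a fixed number of buckets by hashing a key (in Hive this is `PARTITIONED BY` plus `CLUSTERED BY ... INTO n BUCKETS`; the column names and bucket count here are illustrative):

```python
from collections import defaultdict

def partition_and_bucket(rows, partition_col, bucket_col, n_buckets):
    """Group rows by partition value, then spread each partition's rows
    across n_buckets by hashing the bucket key (Hive-style layout)."""
    layout = defaultdict(lambda: defaultdict(list))
    for row in rows:
        bucket = row[bucket_col] % n_buckets  # simple deterministic hash
        layout[row[partition_col]][bucket].append(row)
    return layout

# Hypothetical banking transactions, partitioned by date, bucketed by account.
rows = [
    {"txn_date": "2018-01-01", "account_id": 7, "amount": 10},
    {"txn_date": "2018-01-01", "account_id": 8, "amount": 20},
    {"txn_date": "2018-01-02", "account_id": 7, "amount": 30},
]
layout = partition_and_bucket(rows, "txn_date", "account_id", 4)
```

A query filtered on the partition column reads only that partition's files, and bucketing the join key lets matching buckets be joined directly.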

Software Engineer - Management Mentor

Wipro - Hyderabad, India
03.2016 - 11.2017
  • Determined what data must be returned to the application and wrote queries for it.
  • Designed mappings and resolved performance bottlenecks in source-to-target mappings.
  • Created and designed tables when new modules were added to a software product.
  • Extracted data from various MySQL tables into Hadoop HDFS using Sqoop incremental loads.
  • Created table indexes to optimize query speed.
  • Developed stored procedures to test the ETL load per batch and provided a performance-optimized solution to eliminate duplicate records.
  • Defined triggers on the necessary tables.
  • Determined the right stored procedures, views, and functions for the application.
  • Optimized existing queries to speed up performance.
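The index and trigger work described above can be sketched with SQLite in place of MySQL (the schema and names are made up; the MySQL DDL is analogous):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE mentors (id INTEGER PRIMARY KEY, name TEXT,
                          updated INTEGER DEFAULT 0);
    -- Index to speed up lookups by name.
    CREATE INDEX idx_mentors_name ON mentors(name);
    -- Trigger that bumps a counter whenever a row is updated.
    CREATE TRIGGER trg_mentors_updated AFTER UPDATE ON mentors
    BEGIN
        UPDATE mentors SET updated = updated + 1 WHERE id = NEW.id;
    END;
""")
conn.execute("INSERT INTO mentors (id, name) VALUES (1, 'Asha')")
conn.execute("UPDATE mentors SET name = 'Asha K' WHERE id = 1")
row = conn.execute("SELECT name, updated FROM mentors WHERE id = 1").fetchone()
```

(SQLite's recursive triggers are off by default, so the trigger's own UPDATE does not re-fire it.)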

Education

Diploma of Higher Education - Electronics & Communication

SRM University
2015

Certificate of Higher Education -

Narayana Junior College
2011

Skills

  • PySpark
  • Spark
  • Hadoop Ecosystem
  • Python
  • Kafka
  • NiFi
  • Snowflake
  • Airflow
  • AWS Cloud
  • Jenkins
  • DevOps
  • SQL
  • Shell Scripting
  • PostgreSQL
  • Grafana

Accomplishments

  • Received an appreciation letter from the client for delivering time-bound deliverables.
  • Named Hero of the Week multiple times.

Certification

  • Databricks Certified Associate Developer
  • PCAP
  • AWS Certified Developer - Associate

Languages

Tamil
Intermediate (B1)
English
Bilingual or Proficient (C2)
Telugu
Bilingual or Proficient (C2)
Hindi
Bilingual or Proficient (C2)
