Reshab Sharma

Senior Big Data Developer
Greater Noida

Summary

Technically adept Software & Data Engineer with 6 years of experience in designing and optimizing big data pipelines using Spark (Python, Scala), Hive, Airflow, AWS, HDFS, and Apache NiFi. Skilled in workflow automation, data orchestration, and scalable ETL solutions, with hands-on expertise in Git, CI/CD (GitLab), Grafana, and cloud-based data management.

Overview

6
years of professional experience
4
years of post-secondary education

Work History

Senior Big Data Developer

Net Connect Global
Noida
02.2024 - Current

Experienced Data Engineer with over 6 years of proven expertise in dynamic settings, dedicated to pioneering advancements in technology. In my current role as a data engineer, my key responsibilities include:

  • Crafting efficient Spark programs in Scala for enhanced data processing.
  • Architecting comprehensive end-to-end data pipelines to ensure seamless data flow.
  • Harnessing the power of real-time streaming data for actionable insights.
  • Designing and executing batch jobs and adeptly managing scheduling tasks.
  • Orchestrating workflows using Apache Airflow (Python).
  • Seamlessly integrating batch and real-time Kafka streaming data to deliver precise outcomes.
  • Proficiently troubleshooting Scala job-related issues to maintain optimal performance.
  • Collaborating closely with stakeholders to grasp business requirements and drive effective implementation strategies.
  • Implementing robust data quality checks and validation processes to ensure data integrity.
  • Collaborating with cross-functional teams to design and implement scalable data solutions that meet evolving business needs.
  • Actively participating in code reviews and knowledge-sharing sessions to foster a culture of continuous learning and improvement.
  • Proven track record of delivering high-quality solutions on time while adhering to best practices and standards.

Senior Data Engineer

Tata Consultancy Services (TCS)
Noida
07.2022 - 11.2023

I worked as a Data Engineer, where my core responsibilities included writing Spark programs in PySpark and Scala. I have hands-on knowledge of developer tools such as IntelliJ and PyCharm, data loading tools such as Sqoop, workflow management tools such as Airflow schedulers and Apache NiFi, versioning tools such as GitLab and GitHub, continuous deployment with GitLab CI/CD, data visualization tools such as Grafana, and supporting tools such as PuTTY and WinSCP.

  • Communicated updated data requirements to the global team.
  • Generated detailed studies on potential third-party data handling solutions, verifying compliance with internal needs and stakeholder requirements.
  • Automated Spark jobs using Python scripts in Airflow.
  • Wrote complex SQL queries to fetch the required output.
  • Analyzed complex data and identified anomalies, trends, and risks to provide useful insights to improve internal controls.
  • Developed and delivered business information solutions using AWS Glue.
  • Developed, implemented, and maintained data analytics protocols, standards, and documentation.
  • Explained data results and discussed how best to use data to support project objectives.
  • Compiled, cleaned, and manipulated data for proper handling.
  • Analyzed large datasets to identify trends and patterns in customer watch time.

Data Engineer

Wipro Limited
Gurgaon
08.2019 - 06.2022

As a Data Engineer at Wipro, I worked on the Bharti Airtel project (Airtel), an Indian multinational telecommunications services company that operates in 18 countries across South Asia, Africa, and the Channel Islands.

  • Collected, stored, processed, and analyzed huge sets of data, and integrated them with the architecture used across the company.
  • Designed, constructed, and maintained large-scale data processing systems that collect data from various sources, structured or unstructured.
  • Stored data in a data warehouse or data lake repository.
  • Worked with different data transformation tools, techniques, and algorithms.
  • Implemented technical processes and business logic to transform the collected data into meaningful and valuable information.
  • Generated detailed studies on potential third-party data handling solutions, verifying compliance with internal needs and stakeholder requirements.
  • Designed and implemented effective database solutions and models to store and retrieve data.
  • Prepared documentation and analytic reports, delivering summarized results, analysis and conclusions to stakeholders.
  • Contributed to internal activities for overall process improvements, efficiencies and innovation.

Education

Bachelor of Technology - Computer Science

Dr. A.P.J. Abdul Kalam Technical University
08.2015 - 04.2019

Skills

Spark (Python & Scala), Hive, Airflow, AWS, HDFS, Apache NiFi, Sqoop, Kafka, SQL, Git, CI/CD (GitLab), Grafana

Timeline

Senior Big Data Developer

Net Connect Global
02.2024 - Current

Senior Data Engineer

Tata Consultancy Services (TCS)
07.2022 - 11.2023

Bachelor of Technology - Computer Science

Dr. A.P.J. Abdul Kalam Technical University
08.2015 - 04.2019

Data Engineer

Wipro Limited
08.2019 - 06.2022