Rajat Dhawale

Senior Software Engineer

Pune

Summary

I am a highly skilled Data Engineer with expertise in Python, ETL pipeline development, and cloud technologies. With a strong track record of delivering successful projects, I bring a holistic approach to data management and transformation. I am experienced in working on multiple client site projects, showcasing my adaptability and collaboration abilities. Proficient in AWS services, SQL databases, and data analysis libraries, I possess the technical skills to drive impactful solutions. Committed to delivering high-quality results.

Overview

6 years of professional experience
2 Certifications

Work History

Senior Software Engineer

Genpact
09.2023 - Current
  • Experienced Senior GCP Data Engineer with a proven track record in building scalable systems, leading technical discussions, and delivering high-quality software.
  • Expertise in Python, GCP, AWS, Snowflake, CI/CD, and distributed data platforms.
  • Strong focus on code quality, performance, and system reliability while balancing production stability and feature delivery.

Senior Software Developer

Persistent Systems
07.2021 - 09.2023

Curelight Medication Search Engine (Client Confidential)
Technologies: Python, AWS, Jenkins, PostgreSQL, MySQL, Lambda, S3, data migration, data pipeline

Role: Python Automation Engineer

Key responsibilities:

  • ETL Pipeline Development: I create, manage, and develop ETL pipelines using Python and MySQL to handle healthcare data on the backend. These pipelines are crucial for data handling and serve as the foundation for generating various reports that are later used to create dashboards and visualizations.
  • CI/CD Delivery for Multiple Environments: I have hands-on experience in setting up Jenkins ETL pipelines and delivering CI/CD for both Unix/Linux and Windows Server environments. This ensures smooth deployment and continuous integration of code changes.
  • Scripting and Automation Skills: I possess expertise in regex, shell scripting, running batch scripts from Python, and bash scripting. These skills are essential for automating tasks and enhancing data processing capabilities.
  • Cloud Technology Expertise: I am well-versed in various Cloud technologies such as AWS, GCP, and Oracle, enabling me to work efficiently in different cloud environments.
  • Version Control and Collaboration: I utilize Git for version control, collaborating effectively with team members through GitHub. This ensures a smooth workflow and facilitates seamless code integration.
  • Jenkins Dashboard Creation: I design and implement Jenkins dashboards to monitor and manage the ETL pipeline efficiently. This allows for easy scheduling, running, and log tracking of data processing tasks.
  • AWS Data Management: I load data from AWS S3 buckets, using AWS Lambda to copy data as soon as files arrive (see the sketch after this list). I have extensive experience in setting up and managing data pipelines on AWS to ensure efficient and secure data flow.
  • Database Design and Optimization: I design and optimize database schemas and queries, focusing on PostgreSQL and MySQL, to achieve efficient data retrieval and manipulation. This optimization enhances application performance and data handling.
  • Kubernetes Orchestration: I have experience working with Kubernetes to orchestrate and manage containerized applications. I am proficient in creating Kubernetes manifests, managing deployments, services, pods, and utilizing features like auto-scaling and rolling updates to ensure high availability and efficient resource utilization.
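A minimal sketch of the S3-to-Lambda copy step described in the AWS Data Management bullet above, assuming an S3 "ObjectCreated" notification trigger; the bucket name and return value are hypothetical placeholders, not the client's actual configuration.

```python
# Sketch only: a Lambda handler that copies a newly arrived S3 object into a
# staging bucket for downstream ETL. Names here are hypothetical placeholders.
import boto3

s3 = boto3.client("s3")

STAGING_BUCKET = "etl-staging-bucket"  # hypothetical destination bucket

def lambda_handler(event, context):
    # S3 "ObjectCreated" notifications deliver one or more records per event.
    records = event.get("Records", [])
    for record in records:
        source_bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        # Server-side copy: the object never passes through the Lambda itself.
        s3.copy_object(
            Bucket=STAGING_BUCKET,
            Key=key,
            CopySource={"Bucket": source_bucket, "Key": key},
        )
    return {"copied": len(records)}
```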

Python Developer

Go Digit General Insurance
10.2019 - 07.2021

Data Pipeline for Finance Domain Project

Technologies: Python, AWS, Jenkins, PostgreSQL, S3, data migration, data pipeline

Role: Backend Python Developer

Key responsibilities:

  • Worked on a web scraping project (see the sketch after this list):
  • Used Beautiful Soup to parse the HTML data scraped from websites
  • Used Scrapy and the requests module to scrape data from various web pages in the background
  • Automated various web pages, including government portals, using Selenium and the requests module, and handled CAPTCHA-protected sites using paid Google CAPTCHA services
  • Resolved IP-blocking issues using the paid IPVanish software
  • Used PostgreSQL for storing data and reporting
  • Used AWS services: S3, Lambda, DynamoDB, EC2
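A minimal sketch of the requests + Beautiful Soup pattern used in the scraping work above; the URL and CSS selector are hypothetical placeholders, not the actual sources scraped.

```python
# Sketch only: fetch a page and extract text with requests + Beautiful Soup.
# The URL and selector below are hypothetical placeholders.
import requests
from bs4 import BeautifulSoup

def scrape_titles(url: str) -> list[str]:
    response = requests.get(url, timeout=30)
    response.raise_for_status()
    soup = BeautifulSoup(response.text, "html.parser")
    # Collect the text of every element matching the (assumed) selector.
    return [tag.get_text(strip=True) for tag in soup.select("h2.title")]

if __name__ == "__main__":
    for title in scrape_titles("https://example.com/articles"):
        print(title)
```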

Python Developer

E Trading Pvt Ltd
05.2019 - 05.2019
  • Worked on a project to customize targets for individual customers to boost sales
  • Used Python and pandas to read, understand, summarize, and visualize the data (see the sketch after this list)
  • Identified potential customers who could buy more and assigned them incremental targets
  • Built a simple UI with HTML and JavaScript
  • Used a SQL database for storage
  • Used the Django framework to build the complete application
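A minimal sketch of the pandas step described above for summarizing purchases and flagging customers for incremental targets; the CSV layout, column names, and 20% uplift are hypothetical placeholders, not the project's actual logic.

```python
# Sketch only: summarize purchases per customer and flag above-median buyers
# for an incremental target. Column names and the 20% uplift are assumptions.
import pandas as pd

def assign_targets(path: str) -> pd.DataFrame:
    sales = pd.read_csv(path)  # expected columns: customer_id, amount
    summary = (
        sales.groupby("customer_id", as_index=False)["amount"]
        .sum()
        .rename(columns={"amount": "total_purchases"})
    )
    # Customers above the median get an incremental target of 20% of spend.
    median = summary["total_purchases"].median()
    summary["incremental_target"] = (
        summary["total_purchases"].where(summary["total_purchases"] > median, 0) * 0.2
    )
    return summary
```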

Education

Diploma - Big Data Analytics

Center for Development
2019

Bachelor of Engineering - Mechanical Engineering

Priyadarshani Institute of Engineering & Technology
2019

H.S.C.

State Board
2017

State Board
2013

2011

Skills

  • Python
  • AWS
  • Jenkins
  • PostgreSQL
  • MySQL
  • Snowflake
  • GCP
  • Data migration
  • Data pipeline
  • Data warehouse

Certification

  • AWS Certified Cloud Practitioner
  • Google Cloud Certified Cloud Digital Leader

Timeline

Senior Software Engineer

Genpact
09.2023 - Current

Senior Software Developer

Persistent Systems
07.2021 - 09.2023

Python Developer

Go Digit General Insurance
10.2019 - 07.2021

Python Developer

E Trading Pvt Ltd
05.2019 - 05.2019

Bachelor of Engineering - Mechanical Engineering

Priyadarshani Institute of Engineering & Technology

H.S.C.

State Board

State Board

Diploma - Big Data Analytics

Center for Development