Pooja Wattamwar

Pune

Summary

Accomplished Sr. Big Data Engineer at Wipro Technologies Limited, with 4+ years of experience in ETL development, data migration, and designing orchestration pipelines using Databricks Workflows. Demonstrated expertise in PySpark, Databricks, and Microsoft Azure, delivering high-impact projects that significantly improved data processing efficiency and performance. Known for effective client communication, consistently fostering collaboration, and driving success in advanced analytics and cloud migration initiatives.

Overview

4 years of professional experience
1 Certification

Work History

Sr Big Data Engineer

Wipro Technologies Limited
Pune
03.2025 - Current
  • Client: TD Bank.
  • Project Title: Advanced Analytics.
  • Platform Used: Databricks, Bitbucket, ADF Pipelines, and Synapse.
  • Description: Worked with the client to gather requirements and designed the refactoring of the code base accordingly. Created test cases for SonarQube code-coverage analysis and helped the team set up VS Code for unit testing with the Pytest framework. Used ADF pipelines for job onboarding, moved data from raw to target using Databricks, and stored the final data in Synapse for further modeling.

Sr. Data Engineer

Wipro Technologies Limited
Pune
12.2024 - 02.2025

  • Project Title: Migration of AI Products to the Databricks Platform.
  • Platform Used: Databricks, AWS Buckets, Bitbucket.
  • Description: Worked directly with the client to gather requirements and designed the refactoring of the existing code base to Databricks standards. Set up a dummy Databricks platform and educated developers on it, as the platform was in its initial phase. Produced the high- and low-level design of the product, decoded the existing on-premises code, and rewrote it to Databricks standards. Supported the project on Databricks use cases and challenges, worked with newer Databricks features such as Unity Catalog, Volumes, and Workflows, and helped the team set up VS Code with Databricks Compute.

Sr Software Developer (Big Data Engineer)

Wipro Technologies Limited
Pune
11.2023 - 12.2024
  • Client: Mastercard
  • Project Title: Mastercard Cloud Foundation
  • Platform Used: Databricks, ADF, Synapse SQL Server
  • Description: This is a data foundation project. Data is generated within Databricks by a synthetic data generator, using on-premises data as a reference, and is handled by DLT pipelines on platforms including Azure Databricks and ADF with DevOps. Once generated, the data is forwarded to the AI/ML team for further algorithm development.

Big Data Engineer

Celebal Technologies
Jaipur
04.2021 - 10.2023
  • Project Title: Netezza Decommission
  • Platform Used: Databricks, ADF, Synapse SQL Server
  • Description: This is a data migration project. Data is migrated from the Netezza warehouse to a cloud environment using platforms including Azure Databricks, ADF with DevOps, and Synapse SQL Server.
  • Roles & Responsibilities:
  • Interacted with project management to understand objectives and how they wanted to achieve them.
  • Participated actively in data migration development by interacting with project teams.
  • Designed and implemented requirements in Databricks; provided coding, maintenance, and enhancement support.
  • Analyzed mapping documents, gathered requirements, and adapted the SQL architecture for Synapse compatibility.
  • Moved files from FTP/SFTP into blob storage using the Stonebranch tool.
  • Automated all Stonebranch jobs so the files are copied automatically.
  • Ingested batch data sources by creating pipelines in Azure Data Factory.
  • Used ADF to ingest on-premises data into Azure Data Lake Storage.
  • Wrote the SQL code for ingestion in Databricks notebooks.
  • Deployed distributed data processing systems on the Azure ecosystem, such as Azure Databricks.
  • Used Azure Key Vault to store the secrets and keys used for storage encryption, and a Service Principal for authentication to Azure resources.
  • Created an ADF pipeline for each source and scheduled it to automate monthly/weekly jobs.
  • Handled a team of seven interns and trainees, responsible for their training and project deployments.
  • Volunteered actively in the interns' weekly technical improvement programs.

Education

PG-Diploma - Big Data Analytics

CDAC
Bangalore, India
01.2020

Bachelor of Engineering

University of Pune
Pune, India
01.2019

Skills

  • ETL development
  • Big data processing
  • Pyspark development
  • Python programming
  • Data migration
  • Microsoft Azure
  • Data engineering
  • Data warehousing
  • Orchestration pipeline design
  • MS SQL Server
  • Data analysis
  • Data warehouse
  • Unit testing
  • Version control
  • Performance tuning
  • Git version control
  • Query optimization
  • Team leadership
  • Client communication
  • Technical documentation

Certification

  • Microsoft Certified: Azure AI Fundamentals (AI-900)
  • Databricks Certified Data Engineer Professional
  • Databricks Certified Data Engineer Associate
  • Databricks Certified Solution Architecture

Languages

  • English
  • Hindi
  • Marathi

Timeline

Sr Big Data Engineer

Wipro Technologies Limited
03.2025 - Current

Sr. Data Engineer

Wipro Technologies Limited
12.2024 - 02.2025

Sr Software Developer (Big Data Engineer)

Wipro Technologies Limited
11.2023 - 12.2024

Big Data Engineer

Celebal Technologies
04.2021 - 10.2023

PG-Diploma - Big Data Analytics

CDAC

Bachelor of Engineering

University of Pune
University of Pune