DIVI MOUNICA

Hyderabad

Summary

Results-driven Data Engineer with eight years of experience designing and building scalable, cloud-native data pipelines. Proven expertise in transforming raw data into reliable, production-ready datasets using modern data engineering tools and cloud platforms. Adept at leveraging Python, AWS (S3, Lambda, SQS), Snowflake, DBT, and Airflow to create efficient, automated data workflows. Passionate about continuous learning and quick to apply new technologies and techniques to real-world data challenges. Known for combining technical depth with a pragmatic, hands-on approach to deliver high-impact, end-to-end data solutions.

Overview

8 years of professional experience

Work History

Data Engineer

Sanofi
Hyderabad
09.2024 - Current
  • Designed and implemented robust data pipelines to transform raw source data into structured formats using Python, and store it in Amazon S3.
  • Built modular and scalable DBT models for downstream data transformations, enabling maintainable and production-ready data marts, orchestrated through Apache Airflow.
  • Engineered an advanced data pipeline to replace DBT seeds, leveraging Snowpark for dynamic data loading and transformation workflows, fully orchestrated within Airflow for end-to-end automation.

Senior Software Engineer

Ivy Comptech
Hyderabad
11.2021 - 09.2024
  • Led the migration of the cashier module from Teradata to Snowflake for a major U.S. betting firm, integrating data from multiple source systems into a centralized Enterprise Data Warehouse, with a deep understanding of cashier lifecycle processes.
  • Designed modular data transformation pipelines using DBT (models, tests, macros, snapshots), and orchestrated DBT runs via Airflow, replacing legacy ETL tools like IICS for scalable, version-controlled workflows.
  • Implemented key Snowflake features, including stages, stored procedures, and Snowpark, to optimize ingestion, transformation, and secure data access.

Software Engineer

Carelon Global Solutions
Bangalore
02.2019 - 11.2021
  • Collaborated with Business Analysts to analyze requirements, profile data flows, and design new data pipeline tasks tailored to business needs.
  • Worked in a 15-member team to successfully migrate the existing codebase from on-premise systems to cloud-based platforms, ensuring scalability and performance.
  • Partnered with the middleware team to understand complex healthcare business use cases and delivered the necessary data extraction and transformation queries to support analytics and reporting.

Associate Software Engineer

SLK Software Services
Bangalore
08.2017 - 02.2019
  • Designed and developed Informatica mappings, sessions, and workflows to load data from files and relational databases, incorporating complex source qualifier queries, pre/post SQL, and indirect file loading using shell scripts.
  • Optimized SQL and BTEQ scripts for performance tuning and efficient data loading into Teradata, ensuring high throughput and adherence to business rules.
  • Participated in code reviews, unit testing, and QA support, following naming conventions and best practices to ensure data quality and resolve issues across development and testing phases.

Education

B.Tech - Electrical and Electronic Engineering

JNTUA
04.2017

Skills

  • Snowflake
  • Python
  • SQL
  • ETL
  • Airflow
  • DBT
  • Git workflows
  • Data analysis
  • Pandas
  • pdfplumber
  • AWS Lambda
  • AWS Glue
  • AWS Step Functions
  • SQL performance tuning
  • Data pipeline design
  • ETL orchestration
  • Data integration
  • Big data processing
  • Spark framework

Technical Proficiencies

  • Skilled in data wrangling and transformation, with a strong ability to structure raw, unorganized data into clean, analytical formats.
  • Extensive experience in designing and building scalable, end-to-end ETL pipelines — from data extraction to publishing refined datasets to final reporting layers.
  • Automated repetitive and ad-hoc manual tasks, significantly improving efficiency and reducing operational overhead.
  • Developed custom Git workflows and ensured seamless CI/CD integration for data pipelines and dbt deployments.
  • Implemented key-pair based authentication across platforms like Snowflake, dbt, and Airflow to enhance data pipeline security.
  • Strong understanding of the modern data ecosystem, with hands-on knowledge of Apache Spark, Kafka, and distributed data processing frameworks.

Timeline

Data Engineer

Sanofi
09.2024 - Current

Senior Software Engineer

Ivy Comptech
11.2021 - 09.2024

Software Engineer

Carelon Global Solutions
02.2019 - 11.2021

Associate Software Engineer

SLK Software Services
08.2017 - 02.2019

B.Tech - Electrical and Electronic Engineering

JNTUA