
Shuvamoy Mondal

Bangalore

Summary

Senior engineering professional with deep expertise in data architecture, pipeline development, and big data technologies. Proven track record in optimizing data workflows, enhancing system efficiency, and driving business intelligence initiatives. Strong collaborator, adaptable to evolving project demands, with a focus on delivering impactful results through teamwork and innovation. Skilled in SQL, Python, Spark, and cloud platforms, with a strategic approach to data management and problem-solving.

Overview

16 years of professional experience
1 Certification

Work History

Senior Data Engineer

Telstra
08.2023 - Current
  • Designed the architecture for complex ETL/ELT pipelines that ingest and transform data from diverse sources (Azure Storage, SFTP servers, databases) into data lakes using Azure Data Factory, integrated with Azure Databricks and Delta Lake
  • Led the end-to-end development of a modernized data analytics platform that integrated data across disparate sources, enabling real-time and batch processing for analytics and reporting
  • Architected and implemented Spark/Scala-based ETL pipelines within Azure Databricks for high-performance data transformation and processing
  • Developed a suite of complex transformations in the Snowflake data warehouse using Snowpark
  • Built ETL pipelines spanning ADLS, Event Hubs, Databricks, Synapse, and Cosmos DB for both real-time and scheduled batch data ingestion
  • Managed raw and curated data layers in ADLS, establishing data governance policies and best practices for data ingestion, security, and access controls
  • Implemented encryption and auditing to meet enterprise security standards
  • Designed and built ADF pipelines to orchestrate end-to-end ETL workflows, manage complex data dependencies, and enforce data quality checkpoints, integrating with Databricks to load data into multiple downstream applications
  • Partnered with the DevOps team to build CI/CD pipelines in Azure DevOps for application integration and deployment
  • Mentored junior team members in best practices for software development, code optimization, and troubleshooting techniques.
  • Established standard procedures for version control, code review, deployment, and documentation to ensure consistency across the team's work products.
  • Collaborated with cross-functional teams to define requirements and develop end-to-end solutions for complex data engineering projects.
  • Reengineered existing ETL workflows to improve performance by identifying bottlenecks and optimizing code accordingly.

Senior Data Engineer (Manager)

Deloitte
04.2019 - 08.2023
  • Designed a data pipeline on GCP using Docker containers on Kubernetes (GKE) that reads events from Kafka and writes raw data to GCS, from where it is further processed and loaded into Snowflake
  • Designed and implemented serverless ETL pipelines using Google Cloud Dataflow (Apache Beam), Cloud Functions, and BigQuery
  • Developed and optimized data ingestion, transformation, and orchestration using Cloud Composer (Airflow) and Pub/Sub for near-real-time, event-driven processing
  • Designed and implemented scalable ETL pipelines using AWS Glue, Lambda, and Step Functions for batch and real-time data ingestion
  • Built a Spark framework in Scala and implemented the ETL logic within it
  • Automated code deployment using Jenkins pipelines and Terraform
  • Developed AWS Lambda functions to pull data from S3 and load it into DynamoDB tables whenever an event arrived in S3
  • Automated infrastructure deployment and resource management using Terraform and CloudFormation
  • Ensured data governance, security, and compliance with IAM roles and encryption (KMS); optimized performance and cost through partitioning, clustering, and query optimization techniques in BigQuery
  • Built and deployed ML models using AWS SageMaker for predictive analytics
  • Developed deep learning models for image classification
  • Ensured data quality through rigorous testing, validation, and monitoring of all data assets, minimizing inaccuracies and inconsistencies.
  • Participated in strategic planning sessions with stakeholders to assess business needs related to data engineering initiatives.
  • Enhanced customer satisfaction by resolving disputes promptly, maintaining open lines of communication, and ensuring high-quality service delivery.

Big Data Developer | Database Developer

Cognizant
08.2011 - 03.2019
  • Handled large datasets during the ingestion process itself using partitioning, Spark in-memory capabilities, broadcast variables, and effective, efficient joins and transformations
  • Created Hive tables and loaded and analyzed data using Hive queries
  • Migrated data from an on-premises data lake to S3 storage using AWS DMS
  • Ingested data from Kafka using Spark Streaming on EMR, landed it in S3, then transformed it from the raw layer and loaded it into Redshift
  • Developed databases by creating PL/SQL functions, procedures, triggers, packages, and materialized views
  • Designed and developed a project on the HP Vertica columnar database
  • Expert-level experience on Teradata, including cursors, procedures, functions, partitioned tables, triggers, and dynamic SQL
  • Contributed to the continuous improvement of big data infrastructure by monitoring system performance, identifying bottlenecks, and implementing necessary optimizations.
  • Created MultiLoad, FastLoad, and BTEQ scripts in Teradata
  • Enhanced data processing efficiency by designing and implementing big data solutions using Hadoop ecosystem tools.

PL/SQL Developer

Mitra Systems Inc
10.2009 - 07.2011
  • Created custom packages, stored procedures, functions, and SQL scripts to load data into a pharmacy warehouse from different sources
  • Wrote new PL/SQL packages and modified existing code to perform specialized functions and enhancements in Oracle Applications
  • Collaborated with cross-functional teams to integrate various systems, streamlining business processes.
  • Implemented advanced error-handling techniques in PL/SQL code, resulting in robust applications with minimal crashes.
  • Worked closely with business analysts to translate requirements into functional specifications, ensuring alignment between technology solutions and business needs.

Education

Bachelor of Science - Electrical, Electronics and Communications Engineering

JIS College Of Engineering
Kalyani, West Bengal
07.2006

Skills

  • Git version control
  • ETL development
  • Azure Cloud
  • AWS Cloud
  • GCP Cloud
  • Big data processing
  • Python programming
  • Kafka streaming
  • NoSQL databases
  • Spark
  • Scala
  • Data pipeline design
  • Performance tuning
  • API development
  • Big data technologies
  • Data governance
  • Azure Databricks
  • Snowflake

Certification

  • Amazon Web Services (AWS) Certified Solutions Architect

Languages

English
Advanced (C1)
