DHEERAJ KUMAR VERMA

Bangalore

Summary

Ambitious Data Engineer with over 7 years of experience building scalable data intelligence solutions. In the dynamic field of data management and analytics, I have honed my expertise in leveraging tools and platforms such as Snowflake, Git, and SAP BW on HANA to design, implement, and optimize robust data solutions. Throughout my career, I have developed a deep understanding of business processes, enabling me to bridge the gap between complex technical systems and the operational needs of businesses. My approach is rooted in transforming raw data into actionable insights that empower organizations to make informed, strategic decisions. I am passionate about the power of data to drive innovation and solve complex challenges, and in the ever-evolving landscape of data technology I continually seek opportunities to learn, grow, and apply new methodologies that maximize the value of data. Whether transforming data into strategic insights, leading cross-functional projects, or implementing scalable data systems, I am driven by a commitment to excellence and to delivering impactful outcomes.

Overview

8 years of professional experience
1 Certification

Work History

Application Development Team Lead

Accenture
Bangalore
05.2017 - Current
  • Led Data Engineering Projects: Spearheaded multiple end-to-end implementations of Snowflake-based data platforms for enterprise clients, delivering high-performance, scalable solutions.
  • Data Modeling and Pipelines: Designed dimensional and normalized data models, and developed optimized ELT pipelines using tools like dbt and Python for seamless integration of structured and semi-structured data.
  • Cloud Data Platform Implementation: Migrated legacy systems to Snowflake, ensuring minimal downtime, improved performance, and significant cost optimization.
  • Governance and Security: Implemented robust data governance frameworks using Immuta and Monte Carlo to ensure data quality, security, and compliance with regulations.
  • Team Leadership: Managed cross-functional teams, overseeing project delivery, mentoring junior engineers, and ensuring adherence to best practices in data engineering and DataOps.
  • Pipeline Monitoring and Maintenance: Deployed monitoring solutions for seamless daily pipeline operations and resolved production issues, ensuring a 99.9% uptime for critical systems.
  • Led the development of scalable Snowflake-based data warehouses, implementing Data Vault 2.0 architecture to improve data scalability and adaptability.
  • Designed and optimized data pipelines using Talend, dbt, and GitLab to integrate multiple external systems with Snowflake.
  • Enhanced data governance frameworks using Collibra and implemented data quality monitoring tools like Monte Carlo.
  • Streamlined reporting processes by leveraging Snowflake-specific features, like Streams, Tasks, and Time Travel.
  • Deep knowledge of Snowflake features, including virtual warehouses, micro-partitions, clustering, Snowpipe for real-time data ingestion, Time Travel, and Fail-safe, as well as security features such as roles, access controls, and masking policies.
  • Collaborated with cross-functional teams to align data solutions with business objectives, delivering measurable efficiency gains.
  • Migrated legacy SAP BW systems to SAP BW on HANA, improving system performance and reducing latency.
  • Designed composite providers, open ODS views, and advanced ADSO objects to streamline reporting workflows.
  • Developed SAP HANA information models (Calculation Views, Analytical Views) to support real-time analytics and reporting.
  • Created and maintained complex BEx queries for finance, procurement, and HR domains, enhancing decision-making capabilities.
  • Automated daily, weekly, and monthly process chains, reducing manual intervention by 30%.
  • Provided production support, resolved critical incidents, and ensured seamless data pipeline operations.
  • Developed and maintained advanced HANA Information Models, including Calculation Views, Attribute Views, and Analytical Views.
  • Migrated legacy systems to BW on HANA, improving performance and enabling real-time analytics.
  • Collaborated with cross-functional teams to implement LO Cockpit and Generic Extractors for seamless data extraction from SAP ECC systems.
  • Conducted performance tuning activities, like partitioning, aggregation, and index optimization, to enhance system efficiency.

Education

Bachelor of Engineering - Information Technology

KIIT UNIVERSITY
Bhubaneshwar, India
03.2017

Skills

  • Snowflake Expertise
  • Data Warehouse Development
  • Experienced in dbt
  • Data Analysis with SAP BW
  • HANA Application Development
  • Collibra Data Governance
  • Monte Carlo
  • Agile development
  • DevOps practices
  • Data Transformation
  • SQL Proficiency
  • Git

Project : Snowflake Data Warehouse Development

Role: Senior Data Engineer

Duration: 2023 – Present

Client: Leading Pharmaceutical Company

Project Description:

Designed and developed a robust Snowflake-based data warehouse to centralize, streamline, and optimize data integration, reporting, and analytics for a global pharmaceutical client. The solution enabled real-time analytics, improved operational efficiency, and supported data-driven decision-making across departments.

Responsibilities:

  • Data Pipeline Design and Implementation:

Designed and implemented data pipelines to integrate data from multiple external systems into Snowflake using Talend and dbt.

Automated data ingestion workflows, ensuring seamless integration of structured and semi-structured data (e.g., JSON, CSV).

  • Scalable Data Modeling with Data Vault 2.0:

Utilized Data Vault 2.0 methodology to build a flexible and scalable data warehouse architecture.

Enabled easy adaptability to changes in business rules and ensured long-term scalability for growing data volumes.
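As an illustration of the Data Vault 2.0 pattern described above, the sketch below shows how deterministic hash keys are typically derived for hubs and satellites; the helper names and the MD5 choice are illustrative conventions, not details taken from the project.

```python
import hashlib

def hub_hash_key(business_key: str) -> str:
    """Derive a deterministic hub hash key by hashing the normalized
    (trimmed, upper-cased) business key, a common Data Vault 2.0 convention."""
    normalized = business_key.strip().upper()
    return hashlib.md5(normalized.encode("utf-8")).hexdigest()

def satellite_hash_diff(attributes: dict) -> str:
    """Hash a satellite row's descriptive attributes so changed records
    can be detected by comparing a single HASH_DIFF column."""
    payload = "|".join(f"{key}={attributes[key]}" for key in sorted(attributes))
    return hashlib.md5(payload.encode("utf-8")).hexdigest()
```

Because the keys are deterministic functions of the data itself, hubs, links, and satellites can be loaded in parallel without coordinating sequence generators.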

  • Governance and Data Quality:

Enhanced data governance processes by integrating Collibra for metadata management and compliance tracking.

Implemented data quality monitoring and lineage tracking using Monte Carlo, reducing errors and improving trust in the data.

  • Snowflake-Specific Development:

Developed and optimized complex SQL queries leveraging Snowflake features such as Streams, Tasks, Time Travel, and Zero-Copy Cloning for efficient data operations.

Designed and implemented scalable solutions for near real-time data processing and analytics.
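Snowflake Streams expose exactly the rows changed since the stream was last consumed; the toy Python model below illustrates that consume-once offset semantics (real Streams track table versions via Time Travel metadata, not a Python list, and the usage shown is hypothetical).

```python
class ChangeStream:
    """Toy model of a Snowflake Stream: remembers an offset into a
    table's change log and returns only rows added since the last read."""

    def __init__(self, change_log: list):
        self.change_log = change_log  # shared, append-only change log
        self.offset = 0               # position after the last consumed row

    def read(self) -> list:
        """Return unconsumed changes and advance the offset."""
        new_rows = self.change_log[self.offset:]
        self.offset = len(self.change_log)
        return new_rows

# Hypothetical usage: a downstream task sees each change exactly once.
log = []
stream = ChangeStream(log)
log.append({"id": 1, "op": "INSERT"})
first = stream.read()   # the newly inserted row
second = stream.read()  # nothing new yet
```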

  • Workflow Automation and Optimization:

Orchestrated workflows using GitLab, automating ETL processes and reducing execution time by 30%.

Implemented CI/CD pipelines for efficient development, testing, and deployment of data workflows.

  • Real-Time Analytics and Reporting:

Delivered real-time analytics solutions and automated reporting dashboards to enable faster, data-driven decision-making across various departments.

Designed custom Snowflake-based data marts to support self-service BI for finance, operations, and marketing teams.

Impact:

  • Reduced data processing time by 40%, improving operational efficiency.
  • Enhanced data accuracy and reliability by implementing advanced validation, governance, and monitoring techniques.
  • Enabled quicker decision-making by delivering real-time data insights and automated reporting workflows.
  • Supported compliance with regulatory standards in the pharmaceutical industry through robust governance frameworks.

Tools and Technologies Used:

  • Data Warehousing: Snowflake
  • ETL/ELT Tools: Talend, dbt
  • Governance and Monitoring: Collibra, Monte Carlo
  • Version Control and Workflow Automation: GitLab
  • Snowflake Features: Streams, Tasks, Time Travel, Zero-Copy Cloning
  • Data Visualization: Tableau, Power BI

Snowflake Data Migration Engineer

Role: Data Engineer

Duration: 2022-2023

Client: Diagnostics Industry

Responsibilities and Contributions:

  • Data Migration Expertise:

Spearheaded the migration of five SQL Server data warehouses to Snowflake, achieving an impressive 99% data match and ensuring business continuity with minimal downtime.

Designed and implemented migration workflows, ensuring data integrity, performance optimization, and smooth transition to the Snowflake platform.

  • Snowflake Specialization:

Mastered Snowflake’s data loading techniques by utilizing S3 buckets, stages, and pipes to enable efficient data movement and transformations.

Built and optimized virtual warehouses, managing compute resources to ensure high performance for diverse teams and workloads.

Administered Snowflake environments, including creating roles, databases, and schemas tailored to development, support, and operational needs.

  • Advanced Modeling and Development:

Leveraged dbt (Data Build Tool) to create and maintain robust data models, significantly reducing development time by 50%.

Developed and implemented SQL procedures and transformations, ensuring optimal database performance and accurate analytics.

Added comprehensive data quality checks (freshness, uniqueness, schema changes, volume, null checks) to ensure reliability and robustness of DataMart models.
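The checks listed above map to dbt tests (not_null, unique, and custom volume/freshness tests) declared in a model's YAML; the Python sketch below shows the underlying logic with hypothetical column names.

```python
def check_not_null(rows, column):
    """Pass only if no row has a NULL (None) in the given column."""
    return all(row.get(column) is not None for row in rows)

def check_unique(rows, column):
    """Pass only if the column contains no duplicate values."""
    values = [row.get(column) for row in rows]
    return len(values) == len(set(values))

def check_volume(rows, minimum):
    """Pass only if at least `minimum` rows arrived in this load."""
    return len(rows) >= minimum

# Hypothetical DataMart load: run every check and collect the results.
rows = [{"order_id": 1, "amount": 10.0}, {"order_id": 2, "amount": 12.5}]
results = {
    "not_null_order_id": check_not_null(rows, "order_id"),
    "unique_order_id": check_unique(rows, "order_id"),
    "minimum_volume": check_volume(rows, minimum=1),
}
```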

  • Automation and Optimization:

Automated data ingestion and transformation workflows using Python and dbt, reducing manual intervention and improving pipeline efficiency.

Enhanced database performance by crafting complex SQL queries and optimizing query execution plans to handle large datasets seamlessly.

  • Collaboration and Support:

Worked closely with cross-functional teams to gather requirements, provide technical guidance, and deliver scalable solutions tailored to business needs.

Provided production support for Snowflake implementations, resolving critical issues and ensuring smooth daily operations.

Achievements:

  • Successfully migrated five SQL Server data warehouses to Snowflake, enabling advanced analytics and cost savings through Snowflake's pay-as-you-use model.
  • Reduced development time by 50% through automation and efficient data modeling practices using dbt.
  • Implemented robust data quality checks, significantly reducing data issues and improving trust in analytics.

Technologies and Tools Used:

  • Data Warehouse: Snowflake
  • Data Integration: dbt, Python, SQL
  • Cloud Storage: AWS S3
  • Administration: Snowflake Virtual Warehouses, Roles, Stages, Pipes

Data Pipeline Modernization for Multiple Dashboards

Role: Data Engineer

Duration: 2020 – 2022

Client: Healthcare Technology Firm

Objective:

The project aimed to modernize and optimize data pipelines to support the development and maintenance of comprehensive dashboards, providing real-time insights to healthcare professionals. The solution focused on improving data availability, accuracy, and pipeline efficiency while adhering to stringent security and compliance standards.

Responsibilities and Contributions:

  • Pipeline Design and Development:

Designed and implemented robust, scalable ETL/ELT pipelines to ingest and process data from diverse sources, including APIs, databases, and flat files.

Created and optimized Snowflake tables tailored to meet the analytical and visualization needs of various dashboards.

Leveraged Snowflake's capabilities to handle structured and semi-structured data (e.g., JSON, Parquet) efficiently.
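In Snowflake, semi-structured data of this kind is typically landed in a VARIANT column and unpacked with LATERAL FLATTEN; the Python sketch below shows the equivalent flattening step for a nested JSON record, using hypothetical field names.

```python
import json

def flatten(record: dict, prefix: str = "") -> dict:
    """Flatten one nested JSON record into a single-level dict whose keys
    mirror dotted paths (e.g. patient.address.city -> 'patient_address_city'),
    ready to load into a relational table."""
    flat = {}
    for key, value in record.items():
        name = f"{prefix}{key}"
        if isinstance(value, dict):
            flat.update(flatten(value, prefix=f"{name}_"))
        else:
            flat[name] = value
    return flat

# Hypothetical payload from an upstream healthcare API.
raw = json.loads('{"patient": {"id": 42, "address": {"city": "Pune"}}, "visits": 3}')
row = flatten(raw)
```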

  • Data Integration and Transformation:

Integrated data from multiple sources, ensuring seamless ingestion and transformation into Snowflake for further analysis.

Automated data pipelines using modern tools like dbt, reducing manual intervention and errors.

Applied transformation logic to standardize and enrich data, ensuring high-quality inputs for downstream reporting.

  • Performance Tuning and Optimization:

Conducted pipeline performance tuning to reduce data latency and ensure real-time availability of critical healthcare data.

Implemented Snowflake features like clustering, materialized views, and query pruning to enhance pipeline and query performance.

Reduced pipeline processing time by 35% through optimized data flow and resource management.

  • Governance and Security:

Ensured compliance with healthcare data regulations (e.g., HIPAA) by implementing security best practices, including data encryption and Role-Based Access Control (RBAC).

Utilized DataOps practices for continuous monitoring, alerting, and resolving data pipeline issues proactively.

  • Stakeholder Collaboration:

Collaborated closely with data analysts and dashboard developers to gather requirements and ensure that the pipelines supported their analytical needs effectively.

Provided support for ad-hoc requests and enhancements to dashboards by rapidly adapting data pipelines to changing business requirements.

Impact:

  • Enhanced data pipeline efficiency by 35%, ensuring timely delivery of critical healthcare data.
  • Improved the accuracy and reliability of insights delivered through dashboards, enabling healthcare professionals to make informed decisions in real-time.
  • Reduced data latency from hours to minutes, enabling near real-time analytics and reporting.
  • Provided a scalable and future-proof data pipeline architecture to accommodate growing data and analytics demands.

Tools and Technologies Used:

  • Data Warehousing: Snowflake
  • Pipeline Development: dbt, Python, SQL
  • Data Integration: APIs, JSON, CSV
  • Governance and Security: DataOps practices, Role-Based Access Control (RBAC)
  • Visualization Tools: Tableau, Power BI

SAP BW on HANA Migration and Optimization

Role: SAP BW Consultant

Duration: 2017 – 2020

Client: A Multinational Mining and Resources Giant

Objective:

The project aimed to migrate the client’s legacy SAP BW system to SAP BW on HANA to leverage high-performance in-memory computing, enhance reporting capabilities, and reduce system latency. This transformation was critical for enabling faster decision-making and real-time analytics across global business operations.

Scope:

The migration included designing and implementing optimized data models, developing real-time reporting capabilities, and ensuring a seamless transition with minimal downtime. Post-migration, the system was fine-tuned to improve performance, data accessibility, and operational efficiency.

Responsibilities and Contributions:

  • System Migration and Modeling:

Successfully migrated legacy SAP BW systems to SAP BW on HANA, ensuring compatibility and performance improvements.

Designed and developed advanced data models, including Composite Providers, Advanced DataStore Objects (ADSOs), and Open ODS Views, to support real-time data processing and analytics.

Implemented HANA-native information models such as Calculation Views and Analytical Views, enabling high-speed reporting and advanced visualization capabilities.

  • Performance Optimization:

Optimized reporting performance by designing BEx Queries with advanced filters, variables, and calculated key figures, catering to dynamic business needs.

Reduced report generation time by 50%, significantly enhancing user experience for global stakeholders.

  • Automation and Process Improvements:

Automated critical Process Chains for efficient data loading, housekeeping, and error handling, ensuring seamless nightly and real-time operations.

Enhanced monitoring mechanisms to proactively identify and address bottlenecks in the data processing pipeline.

  • Support and Incident Resolution:

Provided end-to-end production support, ensuring smooth daily operations and adherence to SLA timelines for incident resolution.

Resolved critical system issues and implemented preventive measures to minimize recurrence.

  • Collaboration and Knowledge Sharing:

Worked closely with business users to understand reporting requirements and deliver customized solutions.

Conducted knowledge transfer sessions and training for over 200 end-users globally, empowering them to utilize the enhanced reporting tools effectively.

Impact:

  • Faster Analytics: Reduced report generation time by 50%, enabling real-time decision-making for operational and strategic processes.
  • Improved Accessibility: Enhanced data accessibility and usability for over 200 end-users across global locations, providing a unified view of enterprise data.
  • Operational Efficiency: Streamlined data loading and housekeeping activities, ensuring minimal downtime and error-free operations.

Tools and Technologies Used:

  • SAP BW on HANA: ADSOs, Composite Providers, Open ODS Views
  • SAP HANA Modeling: Calculation Views, Analytical Views
  • Reporting and Analytics: BEx Queries
  • Automation: Process Chains
  • Other Tools: SAP GUI, HANA Studio

Accomplishments

  • Successfully led the transition from SAP BW to Snowflake for critical data pipelines, reducing processing time by 30%.
  • Automated ETL workflows, resulting in a 25% increase in operational efficiency.
  • Initiated and executed an optimization project that enhanced data processing efficiency company-wide, processing 1 million+ records daily.
  • Received the ACE Award (Q3 FY24) in Accenture's Extra Mile category.
  • Received the APEX Award (FY2023) for outstanding performance and contribution.
  • Received multiple appreciations and achievement feedback from various clients and stakeholders.

Certification

  • Certified SAFe® 5 Practitioner
  • People Leadership Credential – Chapter 1: Connect
  • Career Essentials in Generative AI by Microsoft and LinkedIn

Languages

English: Advanced (C1)
German: Beginner (A1)
Hindi: Proficient (C2)
