Hemalatha N

Chennai

Summary

Data Engineer with 3 years of experience designing and implementing data pipelines using Snowflake, AWS, and Python. Skilled in Snowflake features (Snowpipe, cloud migration), AWS services (S3, DMS, Lambda, Aurora PostgreSQL, Glue, Redshift), and data engineering tools (Spark, Airflow, Databricks). Strong expertise in SQL, Python, and ETL/ELT workflows, with proven success in optimizing pipelines, migrating databases, and enabling scalable cloud-based analytics.

Overview

3 years of professional experience

Work History

Data Engineer

Cognizant Technology Solutions
Chennai
03.2023 - Current

Salesforce CRM to Snowflake Data Pipeline (AWS + Control-M)

  • Designed and implemented end-to-end data pipeline from Salesforce to Snowflake on AWS.
  • Built raw, staging, and standardized views in Snowflake using stored procedures for data governance.
  • Automated ingestion workflows with Snowpipe and SQS triggers, boosting data availability by 30%.
  • Reduced manual effort by 40% through automation of data workflows.

AWS Data Lake & Redshift Analytics Platform

  • Created centralized enterprise Data Lake on AWS S3, establishing governance layers from raw to refined.
  • Implemented automated ETL pipelines with AWS Glue, Kinesis, and Lambda for comprehensive data ingestion.
  • Improved pipeline reliability and query performance by 45% through integration of Amazon Redshift and SageMaker.

DB2 to Aurora PostgreSQL Migration

  • Led migration of legacy DB2 workloads to Aurora PostgreSQL via AWS DMS, minimizing downtime.
  • Optimized queries, indexed data, and fine-tuned performance, achieving 50% reduction in response times.
  • Automated VACUUM and maintenance tasks with AWS Lambda, enhancing reliability and reducing manual effort.

Education

Bachelor - Computer Science

Prince Venkateshwara Padmavathy Engineering College
Chennai
06.2022

Skills

  • Snowflake
  • AWS S3, DMS, Lambda, Glue, Redshift
  • Aurora PostgreSQL
  • ETL/ELT Pipelines
  • Data Modelling
  • Query Optimization
  • Data Migration
  • Real-time Data Processing
  • Python
  • SQL
  • PySpark
  • dbt
  • Databricks
  • Performance Tuning

Projects

Real-time Data Pipeline using AWS & Snowflake

  • Designed and deployed a real-time data ingestion pipeline using Apache NiFi (Docker on AWS EC2) to collect and route raw data into Amazon S3.
  • Automated continuous data loading into Snowflake with Snowpipe, ensuring near real-time availability of data in staging tables.
  • Implemented Snowflake Streams and Tasks for incremental transformations and loading into multiple target tables, enabling automated ELT workflows.
  • Demonstrated ability to integrate cloud services, data orchestration, and ELT automation in a scalable architecture.

End-to-End ELT Pipeline using AWS, Snowflake & DBT

  • Built an ELT pipeline by extracting a Netflix dataset (CSV), loading it into Amazon S3, and ingesting the raw data into Snowflake.
  • Designed data models and transformations using dbt, with separate staging, development, and production layers for robust testing and orchestration.
  • Delivered insights by creating interactive dashboards in Tableau, Power BI, and Looker Studio, supporting analytical and business reporting use cases.
  • Showcased expertise in the modern data stack (AWS + Snowflake + dbt + BI tools) for end-to-end data engineering and analytics.
