Data Engineer with three years of experience in Snowflake and AWS, seeking a transition to a dynamic company that offers strong career growth opportunities.
Overview
4 years of professional experience
1 Certification
Work History
Data Engineer
BOOTLABS
Bangalore
08.2023 - 02.2025
Designed a scalable and automated pipeline to move data from OLTP systems (AWS RDS, Salesforce SFDC) to Snowflake via an S3-based Data Lake.
Performed ETL transformations on raw S3 data using AWS Glue, then moved curated outputs to a new S3 layer for Snowflake ingestion.
Enabled real-time ingestion with Snowpipe and used External Tables for querying large S3 datasets without full loading.
Automated batch loads using COPY INTO, Stored Procedures, and Tasks based on data volume and frequency (batch-load pattern sketched after this list).
Used staging tables in Snowflake to validate, deduplicate, and transform data before applying SCD logic.
Implemented SCD Type 1 and Type 2 using MERGE, Streams, and Tasks for versioned data loads (see the SCD Type 2 sketch after this list).
Reduced processing time by loading only new or changed data instead of full refreshes.
Optimized data access and performance using clustering keys, staging layers, and Materialized Views.
Used transient tables for intermediate processing and external stages for S3 integration, reducing storage costs without losing historical access.
Automated schema updates and pipeline deployments using GitHub Actions and CI/CD workflows.
Enforced data governance with RBAC and column-level masking to protect sensitive data.
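A minimal Snowflake SQL sketch of the batch-load pattern described above; the stage, integration, warehouse, table, and task names (raw_s3_stage, s3_int, etl_wh, stg_orders, load_orders_task) and the column list are hypothetical placeholders, not the project's actual objects.

    -- External stage over the curated S3 layer (storage integration assumed to exist).
    CREATE STAGE IF NOT EXISTS raw_s3_stage
      URL = 's3://example-curated-bucket/orders/'
      STORAGE_INTEGRATION = s3_int
      FILE_FORMAT = (TYPE = PARQUET);

    -- Transient staging table: skips Fail-safe, keeping intermediate storage cheap.
    CREATE TRANSIENT TABLE IF NOT EXISTS stg_orders (
      order_id    NUMBER,
      customer_id NUMBER,
      amount      NUMBER(12,2),
      updated_at  TIMESTAMP_NTZ
    );

    -- Scheduled Task: COPY INTO picks up only files not yet loaded from the stage.
    CREATE OR REPLACE TASK load_orders_task
      WAREHOUSE = etl_wh
      SCHEDULE  = '60 MINUTE'
    AS
      COPY INTO stg_orders
      FROM @raw_s3_stage
      MATCH_BY_COLUMN_NAME = CASE_INSENSITIVE;

    ALTER TASK load_orders_task RESUME;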
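A minimal sketch of the SCD Type 2 load using a Stream and a Task; the table names (stg_customers, dim_customers) and the single tracked attribute (email) are hypothetical, and a real implementation would compare more attributes, typically via a hash.

    -- Hypothetical staging and dimension tables.
    CREATE TABLE IF NOT EXISTS stg_customers (
      customer_id NUMBER,
      email       STRING,
      updated_at  TIMESTAMP_NTZ
    );

    CREATE TABLE IF NOT EXISTS dim_customers (
      customer_id NUMBER,
      email       STRING,
      valid_from  TIMESTAMP_NTZ,
      valid_to    TIMESTAMP_NTZ,
      is_current  BOOLEAN
    );

    -- Stream captures rows newly landed in the staging table.
    CREATE OR REPLACE STREAM stg_customers_stream ON TABLE stg_customers;

    -- Task runs only when the stream has data; one MERGE covers both SCD 2 actions:
    -- changed rows are expired, and new/changed rows fall through to the INSERT branch.
    CREATE OR REPLACE TASK scd2_customers_task
      WAREHOUSE = etl_wh
      SCHEDULE  = '30 MINUTE'
      WHEN SYSTEM$STREAM_HAS_DATA('STG_CUSTOMERS_STREAM')
    AS
      MERGE INTO dim_customers d
      USING (
          -- branch 1: every staged row keyed on customer_id (expires or no-ops)
          SELECT s.customer_id AS join_key, s.customer_id, s.email
          FROM stg_customers_stream s
          WHERE s.METADATA$ACTION = 'INSERT'
          UNION ALL
          -- branch 2: changed customers again with a NULL key, so they insert as new versions
          SELECT NULL, s.customer_id, s.email
          FROM stg_customers_stream s
          JOIN dim_customers cur
            ON cur.customer_id = s.customer_id
           AND cur.is_current  = TRUE
           AND cur.email      <> s.email
          WHERE s.METADATA$ACTION = 'INSERT'
      ) src
      ON d.customer_id = src.join_key AND d.is_current = TRUE
      WHEN MATCHED AND d.email <> src.email THEN
        UPDATE SET d.is_current = FALSE, d.valid_to = CURRENT_TIMESTAMP()
      WHEN NOT MATCHED THEN
        INSERT (customer_id, email, valid_from, valid_to, is_current)
        VALUES (src.customer_id, src.email, CURRENT_TIMESTAMP(), NULL, TRUE);

    ALTER TASK scd2_customers_task RESUME;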
Data Engineer
WIPRO
Chennai
08.2021 - 02.2023
Ingested real-time and batch data from external APIs using AWS Lambda and stored it in Amazon S3.
Automated data loading into Snowflake using Snowpipe with event-driven triggers for real-time ingestion (see the Snowpipe sketch after this list).
Processed and transformed incoming JSON using Python (Lambda) and SQL logic within Snowflake.
Optimized API-based ingestion pipeline by designing efficient event-driven data flow and transient staging layers in Snowflake for validation and processing.
Secured cross-service access with IAM roles and Snowflake storage integrations.
Implemented row-level security and data masking policies to safeguard sensitive PII (see the policy sketch after this list).
Monitored data pipeline health and performance using AWS CloudWatch and Snowflake pipe notifications.
Automated schema updates and pipeline changes using GitHub Actions CI/CD workflows.
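A minimal sketch of the event-driven Snowpipe setup; the bucket path and the stage, integration, pipe, and table names (api_events_stage, s3_int, api_events_pipe, raw_api_events) are hypothetical.

    -- External stage over the S3 prefix where the Lambda writes raw JSON.
    CREATE STAGE IF NOT EXISTS api_events_stage
      URL = 's3://example-api-landing/events/'
      STORAGE_INTEGRATION = s3_int;

    CREATE TABLE IF NOT EXISTS raw_api_events (
      payload     VARIANT,
      source_file STRING
    );

    -- AUTO_INGEST = TRUE lets Snowflake load files as S3 event notifications arrive,
    -- instead of waiting for a scheduled batch.
    CREATE OR REPLACE PIPE api_events_pipe
      AUTO_INGEST = TRUE
    AS
      COPY INTO raw_api_events
      FROM (
        SELECT $1, METADATA$FILENAME
        FROM @api_events_stage
      )
      FILE_FORMAT = (TYPE = JSON);

    -- notification_channel from SHOW PIPES is the queue ARN to target in the
    -- S3 bucket's event notification configuration.
    SHOW PIPES LIKE 'api_events_pipe';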
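A minimal sketch of the column masking and row access policies; the role, table, and mapping-table names (PII_ADMIN, customers, role_region_map) are hypothetical.

    -- Hypothetical table holding PII, plus a role-to-region mapping used by the row policy.
    CREATE TABLE IF NOT EXISTS customers (
      customer_id NUMBER,
      email       STRING,
      region      STRING
    );

    CREATE TABLE IF NOT EXISTS role_region_map (
      role_name STRING,
      region    STRING
    );

    -- Column-level masking: only a privileged role sees raw e-mail addresses.
    CREATE OR REPLACE MASKING POLICY mask_email AS (val STRING) RETURNS STRING ->
      CASE
        WHEN CURRENT_ROLE() = 'PII_ADMIN' THEN val
        ELSE '***MASKED***'
      END;

    ALTER TABLE customers MODIFY COLUMN email SET MASKING POLICY mask_email;

    -- Row-level security: a role only sees rows for regions it is mapped to.
    CREATE OR REPLACE ROW ACCESS POLICY region_rap AS (row_region STRING) RETURNS BOOLEAN ->
      CURRENT_ROLE() = 'PII_ADMIN'
      OR EXISTS (
        SELECT 1
        FROM role_region_map m
        WHERE m.role_name = CURRENT_ROLE()
          AND m.region    = row_region
      );

    ALTER TABLE customers ADD ROW ACCESS POLICY region_rap ON (region);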
Education
Bachelor of Engineering - Electrical, Electronics and Communications Engineering
Saveetha School of Engineering
Chennai
05.2021
Skills
Snowflake
Data Modeling
Data Pipeline
SQL
CI/CD
Amazon EventBridge
dbt
ETL
AWS
GitHub
AWS Lambda
Amazon RDS
Python
AWS Athena
Data Warehousing
Power BI
AWS Glue
Certification
AWS Certified Cloud Practitioner