Proficient in designing and developing processes to Extract, Transform, and Load (ETL) data into the data warehouse using the Snowflake cloud database and AWS S3.
Expertise in building and migrating data warehouses on the Snowflake cloud database.
Played a key role in migrating Oracle objects into the Snowflake environment using AWS services.
Expertise in working with Snowflake Snowpipe, internal/external stages, cloning, Tasks, and Streams.
Performed zero-copy cloning of databases for Dev and QA environments.
Evaluated storage considerations for staging and permanent databases and tables.
Experience in Agile methodology.
Created Internal and External stages and transformed data during load.
Created integration objects, file formats, and stages, and used COPY INTO or Snowpipe to continuously ingest CSV/TXT/JSON data from AWS S3 buckets.
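A minimal sketch of this ingestion pattern; all stage, integration, bucket, and table names below are hypothetical:

```sql
-- Hypothetical names throughout; a sketch of S3 ingestion into Snowflake.
CREATE FILE FORMAT csv_ff TYPE = 'CSV' FIELD_DELIMITER = ',' SKIP_HEADER = 1;

CREATE STAGE my_s3_stage
  URL = 's3://my-bucket/data/'
  STORAGE_INTEGRATION = my_s3_int   -- pre-created storage integration object
  FILE_FORMAT = csv_ff;

-- One-off bulk load:
COPY INTO raw.customers FROM @my_s3_stage FILE_FORMAT = (FORMAT_NAME = csv_ff);

-- Continuous ingestion via Snowpipe (auto-triggered by S3 event notifications):
CREATE PIPE customer_pipe AUTO_INGEST = TRUE AS
  COPY INTO raw.customers FROM @my_s3_stage FILE_FORMAT = (FORMAT_NAME = csv_ff);
```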
Experienced in loading and unloading structured and semi-structured data in the Snowflake data warehouse.
Experienced in the Continuous Data Protection lifecycle: Time Travel and the Fail-safe zone.
Experienced in data migration from traditional on-premises systems to the cloud.
Queried historical data using Time Travel based on timestamp, offset, and query ID.
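The three Time Travel access methods can be sketched as follows (table name and query ID are hypothetical):

```sql
-- Time Travel sketches; raw.customers and the statement ID are hypothetical.
SELECT * FROM raw.customers
  AT (TIMESTAMP => '2024-01-15 08:00:00'::TIMESTAMP_LTZ);  -- by timestamp

SELECT * FROM raw.customers
  AT (OFFSET => -3600);                                    -- one hour ago

SELECT * FROM raw.customers
  BEFORE (STATEMENT => '01a2b3c4-0000-0000-0000-000000000000');  -- before a query ID
```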
Worked with streams, secure views, and materialized views.
Validated each end file before client delivery, ensuring accuracy and preventing bug leakage.
Thoroughly analyzed the FIA document to ensure its appropriateness.
Collaborated with the client team to provide demonstrations of end files as part of the transition process.
Addressed client concerns regarding deliverable files through effective communication.
Overview
4 years of professional experience
Work History
Associate
Cognizant Technology Solutions, CTS
03.2021 - 05.2024
Bulk loaded data from the external stage (AWS S3) and the internal stage into the Snowflake cloud using the COPY command.
Loaded data into Snowflake tables from the internal stage using SnowSQL.
Played a key role in migrating Oracle database objects into the Snowflake environment using AWS services.
Used COPY, LIST, PUT, and GET commands to validate internal stage files.
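A sketch of this stage workflow from the SnowSQL CLI; file paths and table names are hypothetical:

```sql
-- Run from SnowSQL; /tmp paths and the customers table are hypothetical.
PUT file:///tmp/customers.csv @%customers;   -- upload to the table's internal stage
LIST @%customers;                            -- verify the staged file is present
COPY INTO customers FROM @%customers
  FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1);
GET @%customers file:///tmp/downloads/;      -- download staged files locally
```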
Imported and exported data between the internal stage (Snowflake) and the external stage (AWS S3).
Wrote complex SnowSQL scripts in the Snowflake cloud data warehouse for business analysis and reporting.
Used Snowpipe for continuous data ingestion from the S3 bucket.
Created zero-copy clones of database objects.
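Zero-copy cloning can be sketched as below; clones share the underlying micro-partitions until data diverges, so no storage is duplicated at creation time (all names are hypothetical):

```sql
-- Hypothetical database/table names.
CREATE DATABASE dev_db CLONE prod_db;              -- clone a whole database
CREATE TABLE qa.customers CLONE prod.customers;    -- or a single table

-- Optionally clone as of a Time Travel point (here: one day ago):
CREATE TABLE qa.customers_snap CLONE prod.customers AT (OFFSET => -86400);
```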
Performed data validations using the Information Schema.
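Typical validation queries of this kind might look like the following (database, schema, and table names are hypothetical):

```sql
-- Hypothetical names; row-count and size checks via the Information Schema.
SELECT table_name, row_count, bytes
FROM prod_db.INFORMATION_SCHEMA.TABLES
WHERE table_schema = 'RAW';

-- Per-file load results for recent COPY/Snowpipe loads into a table:
SELECT *
FROM TABLE(prod_db.INFORMATION_SCHEMA.COPY_HISTORY(
  TABLE_NAME => 'CUSTOMERS',
  START_TIME => DATEADD(hour, -24, CURRENT_TIMESTAMP())));
```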
Performed data quality issue analysis using SnowSQL by building analytical warehouses on Snowflake.
Experience with AWS cloud services: S3, IAM roles, and SQS.
Cloned Production data for code modifications and testing.
ETL Tester
Accenture
01.2020 - 02.2021
Documented testing procedures for developers and future testing use.
Authored and maintained well-organized, efficient manual test cases for the entire team.
Documented and maintained details of known issues.
Prepared and executed test cases for business flow understanding while raising defects in HP-QC.
Engaged in test case review meetings with the BA team for better clarity.
Executed Informatica workflows/jobs to ensure data loading from source to target.
Monitored Informatica workflow status in Workflow Monitor.
Performed audit field validation as part of SCD Type 2 testing.
Validated data according to the mapping document requirements.
Verified insert, update, and delete scenarios during incremental load testing.
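Checks like these can be sketched in SQL; the source/target tables, business key, and audit columns below are hypothetical:

```sql
-- Hypothetical names; row-count reconciliation between source and target.
SELECT (SELECT COUNT(*) FROM src.customers) AS src_cnt,
       (SELECT COUNT(*) FROM tgt.dim_customer WHERE is_current = 'Y') AS tgt_cnt;

-- SCD Type 2 audit-field check: each business key should have
-- exactly one current row; any rows returned here indicate a defect.
SELECT customer_id, COUNT(*) AS current_rows
FROM tgt.dim_customer
WHERE is_current = 'Y'
GROUP BY customer_id
HAVING COUNT(*) <> 1;
```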
Developed, reviewed, and executed test scripts efficiently in HP-QC.
Built automated test scripts to handle repetitive software testing work.