Sabath Anto

Chennai

Summary

  • 7.8 years of overall experience, with 5.4 years specializing in Snowflake data warehousing and development.
  • Configured and monitored Snowflake Streams for seamless real-time data integration with AWS S3.
  • Created Snowflake Stream objects to back up audit table data whenever DML is performed on base tables.
  • Experienced in data ingestion, transformation, and aggregation using Snowpipe, Streams, Tasks, Sequences, Cursors, Time Travel, audit tables, and task trees, and in writing stored procedures and UDFs.
  • Knowledgeable in Time Travel, Fail-safe, Zero-Copy Cloning, Data Masking Policies, and data sharing between Snowflake accounts. Created stages for reading structured and semi-structured data into the Snowflake database.
  • Wrote procedures for handling complex business logic, data transformation, error handling, reusable code, and dynamic SQL execution.
  • Used Sequences to generate data sequence numbers for primary key serials. Worked with Oracle SQL along with data warehousing concepts like Star Schema, Snowflake Schema, OLTP, and OLAP.
  • Demonstrated ability to enhance query performance and optimize Snowflake Warehouse operations.
  • Used Zero-Copy Cloning in Snowflake to take data backups as needed.
  • Developed secure views and materialized views to share data with partner applications.
  • Knowledgeable in Snowflake Architecture, Pruning, and Clustering Keys to improve query performance and minimize data processing time.
  • Expertise includes techniques such as window functions and common table expressions (CTEs) to tackle more complex analytical requirements.

Overview

7 years of professional experience
1 Certification

Work History

Snowflake Developer

Virtusa Consulting Services Private Limited
Chennai
12.2021 - Current
  • Integrated nested JSON data from AWS S3 buckets into Snowflake tables, creating external stages and loading into VARIANT columns.
  • Developed Snowflake objects such as warehouses, schemas, databases, tables, views, pipes, stages.
  • Handled error files using COPY options and identified bad records through validation mode.
  • Developed ETL pipelines to load data from various sources, including AWS S3, into Snowflake.
  • Developed Snowflake stored procedures to automate repetitive tasks and complex workflows within Snowflake.
  • Tested and debugged stored procedures to ensure efficient, accurate execution, minimizing errors and optimizing run times.
  • Developed a Time Travel solution for accessing and restoring historical data.
  • Generated primary key serial numbers for data using Sequences.
  • Knowledgeable in data transformation and analysis with Power BI.

Snowflake Developer

Datamatics Technology Solutions
Chennai
05.2017 - 12.2021
  • Performed data migration from Oracle to Snowflake using the ETL tool Informatica Intelligent Cloud Services (IICS).
  • Uploaded data into Snowflake tables from an internal staging area using SnowSQL.
  • Utilized SQS queues for managing data ingestion events from AWS S3 to Snowflake, ensuring reliable and efficient data processing.
  • Wrote stored procedures to load data into Snowflake from external/internal stages.
  • Documented stored procedures with detailed instructions and usage examples for other developers and users, ensuring easy maintenance and future enhancements.
  • Created and maintained sequences for generating unique identifiers in database operations.
  • Bulk-loaded data from external AWS S3 stages into Snowflake using the COPY command.
  • Managed historical data with Snowflake's Time Travel feature for audit purposes.
  • Developed and maintained Snowflake Tasks for scheduling and automating data workflows.
  • Utilized Snowflake Streams to capture data changes.
  • Developed a series of Snowflake Tasks organized as a task tree. These Tasks read data from Snowflake Stream objects associated with the main tables, ensuring continuous data backup during DML operations in the target Snowflake database.
  • Fixed outstanding bugs by backtracking issues and debugging errors.
  • Gathered and analyzed customer requirements, then handled design, development, and implementation in the system.
  • Prepared test-case documentation and maintained it on the team's Confluence page.

Education

B.E - Mechanical Engineering

Rajiv Gandhi College of Engineering
Chennai
06.2016

Skills

  • Snowflake DB
  • Redshift / Oracle / MySQL / Databricks SQL / PostgreSQL / Microsoft SQL Server
  • AWS S3
  • ETL/ELT concepts
  • DBT
  • SSMS
  • Python
  • IICS
  • Jira
  • AutoSys
  • Git
  • Informatica
  • Control-M
  • SnowSQL

Certification

  • PCEP-Certified Entry-Level Python Programmer
