• 8.8 years of IT experience, including 5+ years in software development and implementation; looking for a challenging role that supports the growth of the organization and enhances my knowledge in the IT sector.
• Detail-oriented data professional with experience in data cleansing and data enrichment, ensuring data quality and accuracy to drive informed decision-making and support business objectives.
• Successfully led the team in automating live data loads using Snowflake data pipelines.
• Imported data from heterogeneous sources such as S3, flat files, and databases.
• Good exposure to dbt and Informatica.
• Experience using bug-tracking systems such as JIRA.
• Good exposure to Snowflake stored procedures and automation features such as Streams and Tasks (a minimal Streams and Tasks sketch follows this list).
• Knowledge of implementing Snowpipe for continuous data ingestion (a Snowpipe sketch follows this list).
• Utilized Snowflake features such as Cloning and Time Travel during development and was involved in bulk loading activities (a Cloning/Time Travel example follows this list).
• Used the Snowflake CLI (SnowSQL) with PUT and COPY statements to load data into tables (a PUT/COPY example follows this list).
• Proficient at converting business processes/needs into technology requirements.
• Worked with semi-structured data alongside routine structured data (a VARIANT query sketch follows this list).
• Involved in performance tuning and scaling of Snowflake virtual warehouses.
• Experience with AWS resources such as IAM, EC2, S3, SNS, SQS, CloudWatch, VPC, RDS (MySQL), CloudTrail, CloudFormation, Auto Scaling, Application Load Balancer, EKS, etc.
• Experience across the software development life cycle (SDLC) with tools such as Git, Maven, Nexus, Jenkins, and Docker.
• Ability to work independently and as part of a team, with a sense of responsibility, dedication, commitment, and an eagerness to learn new technologies.
• Good communication and problem-solving skills, with the ability to analyze a problem quickly and come up with an efficient, industry-standard solution.
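A minimal sketch of the Streams and Tasks pattern referenced above; the table, stream, task, and warehouse names are illustrative assumptions, not actual project objects.

    -- Stream captures row-level changes on the raw table
    CREATE OR REPLACE STREAM raw_orders_stream ON TABLE raw_orders;

    -- Scheduled task moves captured inserts into the core table
    CREATE OR REPLACE TASK load_core_orders
      WAREHOUSE = etl_wh
      SCHEDULE = '5 MINUTE'
      WHEN SYSTEM$STREAM_HAS_DATA('raw_orders_stream')
    AS
      INSERT INTO core_orders (order_id, customer_id, amount)
      SELECT order_id, customer_id, amount
      FROM raw_orders_stream
      WHERE METADATA$ACTION = 'INSERT';

    -- Tasks are created suspended and must be resumed
    ALTER TASK load_core_orders RESUME;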
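A hedged sketch of Snowpipe continuous ingestion as referenced above; the stage, pipe, and bucket names are placeholders, and a storage integration or credentials would be required in practice.

    -- External stage over the S3 landing location (path is illustrative)
    CREATE OR REPLACE STAGE landing_stage
      URL = 's3://example-bucket/landing/'
      FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1);

    -- Pipe auto-ingests new files as they arrive on the stage
    CREATE OR REPLACE PIPE orders_pipe AUTO_INGEST = TRUE AS
      COPY INTO raw_orders
      FROM @landing_stage;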
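A brief example of the Cloning and Time Travel features mentioned above; object names are illustrative.

    -- Zero-copy clone for development or testing
    CREATE TABLE orders_dev CLONE orders;

    -- Time Travel: query the table as it looked one hour ago
    SELECT * FROM orders AT (OFFSET => -3600);

    -- Recover a dropped table within the Time Travel retention period
    UNDROP TABLE orders;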
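A small SnowSQL example of the PUT and COPY flow mentioned above, staging to the table's internal stage; the file path and table name are assumptions.

    -- Upload a local file to the table stage (run from SnowSQL)
    PUT file:///tmp/orders.csv @%orders;

    -- Load the staged file into the table
    COPY INTO orders
      FROM @%orders
      FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1);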
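A short sketch of querying semi-structured data as mentioned above; the table name and JSON paths are hypothetical.

    -- JSON documents land in a VARIANT column and are queried with path notation
    CREATE OR REPLACE TABLE raw_events (payload VARIANT);

    SELECT payload:customer.id::STRING  AS customer_id,
           payload:order.amount::NUMBER AS amount
    FROM raw_events;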
Executed a migration project from Teradata to Snowflake, ensuring a seamless data transition.
Utilized the Talend ETL tool for extracting, transforming, and loading data into the Snowflake raw layer.
Built DDLs aligned with data model specifications, incorporating essential business logic.
Developed stored procedures in Snowflake to manage data flow from the raw layer to the core layer (a stored procedure sketch follows this list).
Automated processes using Streams and Tasks for real-time change data capture.
Conducted unit testing after each data load to ensure integrity between layers.
Performed bulk loading from AWS S3 into Snowflake using the COPY command (a COPY example follows this list).
Generated status reports to communicate team achievements to stakeholders.
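A minimal Snowflake Scripting sketch of the kind of raw-to-core stored procedure described above; the schema, table, and column names are assumptions.

    CREATE OR REPLACE PROCEDURE load_core_orders_sp()
    RETURNS STRING
    LANGUAGE SQL
    AS
    $$
    BEGIN
      -- Append rows loaded since the last core load
      INSERT INTO core.orders (order_id, customer_id, amount, load_ts)
      SELECT order_id, customer_id, amount, CURRENT_TIMESTAMP()
      FROM raw.orders r
      WHERE r.load_ts > (SELECT COALESCE(MAX(load_ts), '1900-01-01'::TIMESTAMP)
                         FROM core.orders);
      RETURN 'core load complete';
    END;
    $$;

    CALL load_core_orders_sp();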
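A hedged example of a bulk COPY from an S3 external stage as described above; the stage name, path, and file format options are placeholders.

    -- Bulk load files from the external S3 stage into the raw layer
    COPY INTO raw.orders
      FROM @s3_raw_stage/orders/
      FILE_FORMAT = (TYPE = CSV FIELD_OPTIONALLY_ENCLOSED_BY = '"' SKIP_HEADER = 1)
      ON_ERROR = 'CONTINUE';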
SnowPro Core certification