CURRENT PROJECT
DOMAIN: Investment Banking
CLIENT: BNP Paribas
ROLE: Snowflake Developer
ENVIRONMENT: Snowflake DB, Oracle, AWS S3
ROLES AND RESPONSIBILITIES
- Converted business requirements into complex SQL queries and subqueries using analytical functions, joins, and operators.
- Developed and maintained ETL processes to extract, transform, and load data from Oracle databases, AWS S3, and other sources into Snowflake.
- Designed and implemented procedures, functions, and user-defined table functions in Snowflake to support data validation processes.
- Developed Snowflake stored procedures to handle branching and looping logic for complex data transformations.
- Created and optimized database objects (schemas, tables, stages) to store structured and semi-structured data.
- Implemented Snowpipe to automate continuous data loading from external sources such as AWS S3 storage.
- Created Snowflake tasks arranged in a task tree to read change data from stream objects, preserving backups during DML operations on the target Snowflake database.
- Developed and maintained data pipelines using Snowpipe and streams to automate investment banking workflows, including trading operations, compliance reporting, and client portfolio management.
- Loaded bulk data from external stages (AWS S3) and internal stages into Snowflake using the COPY INTO command.
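The bulk-loading work described above can be sketched as follows. This is a minimal illustrative example, not production code: the stage, table, file-format, and storage-integration names (trades_stage, raw.trades, csv_fmt, s3_int) and the bucket URL are hypothetical placeholders.

```sql
-- Illustrative file format and external stage over S3 (all names are placeholders)
CREATE OR REPLACE FILE FORMAT csv_fmt
  TYPE = 'CSV' FIELD_OPTIONALLY_ENCLOSED_BY = '"' SKIP_HEADER = 1;

CREATE OR REPLACE STAGE trades_stage
  URL = 's3://example-bucket/trades/'
  STORAGE_INTEGRATION = s3_int
  FILE_FORMAT = csv_fmt;

-- Bulk-load all matching CSV files from the stage into the target table
COPY INTO raw.trades
  FROM @trades_stage
  PATTERN = '.*[.]csv'
  ON_ERROR = 'CONTINUE';
```

ON_ERROR = 'CONTINUE' skips bad rows rather than aborting the load; stricter settings such as 'ABORT_STATEMENT' may be preferable where data quality checks run downstream.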
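Continuous ingestion with Snowpipe, as mentioned in the bullets above, can be illustrated with a sketch like this (pipe and table names are hypothetical; the COPY statement assumes a stage such as the one described in the S3 bullet):

```sql
-- Illustrative auto-ingest pipe: new files landing in the stage are loaded
-- automatically once S3 event notifications are wired to the pipe's queue
CREATE OR REPLACE PIPE trades_pipe
  AUTO_INGEST = TRUE
AS
  COPY INTO raw.trades
  FROM @trades_stage
  FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1);
```

Note that AUTO_INGEST = TRUE relies on configuring S3 event notifications against the pipe's notification channel; without that, files can still be loaded on demand via the Snowpipe REST API.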
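The stream-plus-task-tree pattern from the bullets above can be sketched as a root task that drains a stream into a backup table, with a dependent child task forming the tree. All object names (trades_stream, backup_trades, etl_wh, refresh_portfolio_summary) are hypothetical:

```sql
-- Stream captures DML changes on the source table
CREATE OR REPLACE STREAM trades_stream ON TABLE raw.trades;

-- Root task: runs on a schedule only when the stream has pending changes,
-- copying them into a backup table
CREATE OR REPLACE TASK backup_trades
  WAREHOUSE = etl_wh
  SCHEDULE = '5 MINUTE'
  WHEN SYSTEM$STREAM_HAS_DATA('trades_stream')
AS
  INSERT INTO backup.trades (trade_id, amount, change_type)
  SELECT trade_id, amount, METADATA$ACTION FROM trades_stream;

-- Child task in the tree: runs after the root task completes
CREATE OR REPLACE TASK refresh_summary
  WAREHOUSE = etl_wh
  AFTER backup_trades
AS
  CALL refresh_portfolio_summary();

-- Tasks are created suspended; resume children before the root
ALTER TASK refresh_summary RESUME;
ALTER TASK backup_trades RESUME;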
PREVIOUS PROJECT
DOMAIN: Investment Banking
CLIENT: UBS Bank
ENVIRONMENT: Snowflake DB, Oracle, AWS S3
ROLES AND RESPONSIBILITIES
- Migrated data from Oracle and SQL Server to Snowflake using the IICS tool.
- Developed Snowflake stored procedures to automate complex data transformation and validation processes, streamlining investment banking workflows.
- Monitored and tuned the performance of ETL pipelines and the data warehouse using the Snowflake Query Profile.
- Developed complex PL/SQL scripts to support ETL processes and enforce data validation and quality checks.
- Used Snowflake Time Travel to access and restore past versions of data, improving data recovery and accuracy.
- Collaborated with cross-functional teams to identify data migration requirements and ensure proper schema mapping and transformation rules.
- Conducted data validation and integrity checks post-migration, ensuring data accuracy and consistency.
- Developed Snowflake procedures to handle branching and looping operations for complex data transformations.
- Automated data pipelines with Snowflake Tasks and Streams for real-time data synchronization between multiple systems, enhancing operational efficiency.
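The branching-and-looping stored procedures mentioned above can be sketched in Snowflake Scripting. This is an illustrative example only; the procedure, tables, and columns (validate_batches, staging.trades, audit.validation_errors) are hypothetical:

```sql
-- Illustrative validation procedure: loops over batches, branches on the
-- result of a quality check, and logs any failures to an audit table
CREATE OR REPLACE PROCEDURE validate_batches(max_batch INTEGER)
RETURNS VARCHAR
LANGUAGE SQL
AS
$$
DECLARE
  bad_rows INTEGER DEFAULT 0;
BEGIN
  FOR i IN 1 TO max_batch DO
    -- count rows in this batch that fail a simple null check
    SELECT COUNT(*) INTO :bad_rows
    FROM staging.trades
    WHERE batch_id = :i AND amount IS NULL;

    IF (bad_rows > 0) THEN
      INSERT INTO audit.validation_errors (batch_id, error_count)
      VALUES (:i, :bad_rows);
    END IF;
  END FOR;
  RETURN 'validation complete';
END;
$$;
```

A caller would invoke it with, e.g., CALL validate_batches(10);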
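The Time Travel usage described above typically takes one of two forms: querying a table as of a past point, or cloning it from before a bad DML run. A minimal sketch (table name and statement ID are placeholders):

```sql
-- Query the table as it looked one hour ago
SELECT * FROM positions AT(OFFSET => -60*60);

-- Recover state from before a specific (placeholder) statement ID
CREATE OR REPLACE TABLE positions_restored
  CLONE positions BEFORE(STATEMENT => '<query-id-of-bad-dml>');
```

The retention window for such queries depends on the table's DATA_RETENTION_TIME_IN_DAYS setting.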