Senior Data Engineer with 12+ years of experience and technical and functional knowledge spanning requirement gathering, analysis, design, development, and team management, delivering business requirements with high quality and customer satisfaction. Able to lead, communicate with, and manage teams in the implementation and support of enterprise BI solutions. A strong communicator and presenter who can drive technical discussions with large groups.
Snowflake:
· Developed highly optimized stored procedures, functions, and variables to extract data from different sources and feed the downstream pipelines.
· Used Snowflake tasks to trigger stored procedures on a CRON schedule, and used Streams for change data capture (CDC).
· Bulk-loaded data from external stages (AWS S3) and internal stages into Snowflake using the COPY command.
· Imported and exported data between internal and external stages (AWS S3 to Snowflake, CSV/Excel to Snowflake).
· Used the FLATTEN table function to produce a lateral view of VARIANT, OBJECT, and ARRAY columns.
· Performed Snowflake administration: granting access, roles, and permissions, and calculating warehouse and storage costs.
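As an illustration, the lateral-view behavior of Snowflake's FLATTEN described above can be sketched in plain Python (the data and function are hypothetical; this mimics the idea, not Snowflake's implementation):

```python
def flatten(rows, array_key):
    """Mimic Snowflake's FLATTEN table function: for each input row,
    emit one output row per element of the nested array (a lateral view)."""
    out = []
    for row in rows:
        for i, value in enumerate(row[array_key]):
            flat = {k: v for k, v in row.items() if k != array_key}
            flat["index"] = i
            flat["value"] = value
            out.append(flat)
    return out

# Hypothetical VARIANT-style rows, as they might land from a JSON load.
orders = [{"order_id": 1, "items": ["pen", "pad"]},
          {"order_id": 2, "items": ["ink"]}]
flat_rows = flatten(orders, "items")
# Each order now yields one row per item, carrying the element's index and value.
```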
Amazon Web Service (AWS):
· Designed and managed public/private cloud infrastructure on AWS, including EC2 instances, S3 buckets, RDS, CloudWatch, and IAM, enabling automated operations.
· Created different types of EC2 instances (Windows and Linux) based on requirements and estimated data volumes.
· Created IAM policies for delegated administration within AWS and configured IAM users, roles, and policies to grant users access to AWS resources.
· Improved infrastructure design by configuring security groups and storage on S3 buckets.
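For context, an IAM policy of the kind referenced above is a JSON document; a minimal sketch built in Python (the bucket name and helper function are hypothetical):

```python
import json

def s3_read_policy(bucket):
    """Build a minimal IAM policy document granting read-only access
    to a single S3 bucket (hypothetical helper; sketch only)."""
    return {
        "Version": "2012-10-17",
        "Statement": [{
            "Effect": "Allow",
            "Action": ["s3:GetObject", "s3:ListBucket"],
            # Bucket-level ARN for ListBucket, object-level ARN for GetObject.
            "Resource": [f"arn:aws:s3:::{bucket}",
                         f"arn:aws:s3:::{bucket}/*"],
        }],
    }

print(json.dumps(s3_read_policy("example-data-bucket"), indent=2))
```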
Qlik Replicate and Enterprise Manager:
· Installed Qlik Replicate and Enterprise Manager on AWS EC2 machines along with other required interfaces (e.g., ODBC).
· Set up Qlik connectivity and tasks to extract data from Oracle and SQL Server sources and load it into Snowflake tables.
· Set up change data capture (CDC) to capture real-time changes; also performed full loads, historical loads, soft deletes, and data anonymization during migration.
· Created separate tasks to handle purged, non-purged, and large-object (JSON) data tables.
· Used Log Stream and an S3 bucket to stage data files before loading the target tables.
· Used global rules to create additional fields in target tables, and used parallel loading for faster data loads.
SQL Server Integration Services (SSIS):
· Created multiple parameterized stored procedures in SQL Server to extract data from sources and load targets through data pipelines.
· Used SSIS transformations such as Conditional Split, Multicast, Fuzzy Lookup, and Slowly Changing Dimension to scrub and validate data during staging, before loading it into the data warehouse from flat files, Excel, and CSV files.
· Imported/exported data between sources such as Snowflake, Teradata, SQL Server, Excel, and CSV using the SSIS/DTS utility.
· Created reports that retrieve data through parameterized stored procedures.
· Used Business Intelligence Development Studio (BIDS) to design SSIS packages for Data Management applications.
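The Conditional Split transform mentioned above routes each row to one of several outputs based on a condition; a minimal Python sketch of that idea (the rows and validation rule are hypothetical):

```python
def conditional_split(rows, predicate):
    """Mimic the SSIS Conditional Split transform: route each row to
    one of two outputs depending on whether it satisfies the predicate."""
    matched, unmatched = [], []
    for row in rows:
        (matched if predicate(row) else unmatched).append(row)
    return matched, unmatched

# Hypothetical staging rows; negative amounts fail the validation check.
staging = [{"id": 1, "amount": 250},
           {"id": 2, "amount": -5},
           {"id": 3, "amount": 90}]
valid, rejected = conditional_split(staging, lambda r: r["amount"] >= 0)
# 'valid' continues to the warehouse load; 'rejected' goes to an error output.
```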
Cognos:
· Developed complex reports implementing concepts such as Master-Detail relationships, report bursting, conditional formatting, and tables of contents.
· Worked on Report Studio, Query Studio, Cognos Connection, job scheduling, and dashboard preparation in IBM Cognos Analytics 11.1.7 and IBM Cognos 10.2.2.
· Scheduled reports using both event-based and time-based Cognos jobs.
· Used multiple sources (Teradata, SQL Server, Oracle, and Snowflake) and features such as Join, Union, Intersect, and Except to develop multi-tab reports.
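The Union, Intersect, and Except features mentioned above follow SQL set semantics (duplicates removed); a small Python sketch with hypothetical result rows from two sources:

```python
def union(a, b):
    """SQL UNION: rows from either input, duplicates removed."""
    return sorted(set(a) | set(b))

def intersect(a, b):
    """SQL INTERSECT: rows present in both inputs."""
    return sorted(set(a) & set(b))

def except_(a, b):
    """SQL EXCEPT: rows in the first input but not the second."""
    return sorted(set(a) - set(b))

# Hypothetical result rows from two sources, as (region, total) tuples.
teradata_rows = [("east", 100), ("west", 200)]
snowflake_rows = [("west", 200), ("north", 50)]
combined = union(teradata_rows, snowflake_rows)
```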
Others:
· Experienced in Agile practices such as story creation, epic elaboration, planning, and effort estimation.
· Performed job scheduling in Databricks, Jenkins, and SQL Server, and triggered SQL Server jobs and SSIS packages through Excel macros as per business needs.
· Proficient in SQL across multiple databases (SQL Server, Teradata, Snowflake, Oracle, etc.).
· Strong knowledge of MS Excel (Power Query, Power Pivot, macros, pivot tables and charts, functions and formulas, conditional formatting, database connections).
· Basic knowledge of GCP, BigQuery, and Airflow.
Cloud Technologies