Ambitious, self-assertive, and goal-oriented Azure Data Engineer seeking to hone my competency and knowledge in an organization that provides structured, comprehensive career growth while accomplishing company goals, and to obtain the position of Microsoft Azure Data Specialist, applying my professional experience toward the growth and development of the organization.
⮚ Design and implementation of different blob types (block, append, and page blobs) to store and manage data on a cloud computing platform.
⮚ Utilization of Azure Storage Explorer for content management across the Blob storage, File Share, and ADLS Gen2 services, using different access approaches such as storage connection strings, storage account keys, and shared access signatures (SAS).
⮚ Managing soft delete for the Blob and File Share storage services to retain data for different enterprise applications.
⮚ Copying data from a variety of sources (.csv, .zip, and .xls files) to destination storage accounts (Standard, Premium, and ADLS Gen2) using Azure Data Factory (ADF).
⮚ Copying data from multiple files of different formats into ADLS Gen2 storage accounts with ADF pipelines.
⮚ Designing and implementing ADF pipelines and data flows using different activities/controls (Get Metadata, Validation, If Condition, nested activities, Copy Data, Filter, ForEach, Set Variable, Lookup, etc.).
⮚ Copying data from the GitHub portal to ADLS Gen2 via ADF; implementing dynamic and parameterized pipelines for various ADF controls/activities.
⮚ Copying data from one source type and loading it into a destination of another type using ADF pipelines.
⮚ Deployment of standalone databases and elastic pools as per business requirements.
⮚ Designing data flows that fetch data from SQL DB as the source and apply transformations with an additional inline dataset.
⮚ Designing Apache Spark clusters in Azure Databricks with driver and worker nodes, using both single-node and multi-node cluster configurations.
⮚ Managing multiple Python notebooks for mounting source data and executing Python scripts to fulfill business requirements.
⮚ Connecting to an Azure Blob storage account from an Azure Databricks cluster to mount the directory (see the mount sketch after this list).
⮚ Visualizing data in different charts and metrics in the Azure Databricks cluster to add value for business stakeholders and senior management.
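A minimal sketch of the Databricks mount step mentioned above, assuming the account key is kept in a Databricks secret scope; the storage account, container, and scope names are hypothetical placeholders, not the project's actual resources:

```python
# Minimal sketch: mounting an Azure Blob Storage container in a Databricks
# notebook over the WASBS driver. dbutils and display are provided by the
# Databricks notebook runtime; all names below are hypothetical placeholders.
storage_account = "salesdatastore"   # placeholder storage account name
container = "raw-sales"              # placeholder container name

# Assumption: the account key is stored in a Databricks secret scope.
account_key = dbutils.secrets.get(scope="demo-scope", key="storage-account-key")

dbutils.fs.mount(
    source=f"wasbs://{container}@{storage_account}.blob.core.windows.net",
    mount_point=f"/mnt/{container}",
    extra_configs={
        f"fs.azure.account.key.{storage_account}.blob.core.windows.net": account_key
    },
)

# Verify the mount by listing the mounted directory.
display(dbutils.fs.ls(f"/mnt/{container}"))
```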
Project:
Blob and File Share storage services, ADF, Apache Spark, GitHub
Description:
The Dell Sales Enterprise project is a Business Intelligence system that enables end users to analyze data in MOLAP cubes and reports. In this project, we pull data from multiple sources and load it into data warehouse databases using Azure cloud services. Additionally, a MOLAP cube was created to allow end users and high-level management to view the data. The project followed the Agile methodology.
Responsibilities:
● Provisioning of Standard, Premium, and Azure Data Lake Gen2 storage accounts to fetch and land data from a variety of sources and destinations, in different formats, sizes, and shapes.
● Design and implementation of different blob types (block, append, and page blobs) to store and manage data on the cloud computing platform.
● Copying data from a variety of sources (.csv, .zip, and .xls files) to destination storage accounts (Standard, Premium, and ADLS Gen2) using Azure Data Factory (ADF).
● Designing and implementing ADF pipelines and data flows using different activities/controls (Get Metadata, Validation, If Condition, nested activities, Copy Data, Filter, ForEach, Set Variable, Lookup, etc.).
● Performing failover (with RPO and RTO objectives) for data replication of Azure Blob storage services.
● Copying data from the GitHub portal to ADLS Gen2 via ADF; implementing dynamic and parameterized pipelines for various ADF controls/activities.
● Copying data from multiple files of different formats into ADLS Gen2 storage accounts with ADF pipelines.
● Copying data from one source type and loading it into a destination of another type using ADF pipelines.
● Developing MS-SQL queries in Python notebooks to analyze, visualize, and derive business statistics from source streams (see the read/query sketch after this list).
● Migration of data from private cloud to public cloud (forward migration), public cloud to public cloud (platform migration), and public cloud to private cloud (reverse migration).
● Management and maintenance of MS-SQL database backups: on-demand backups, scheduled backups, restoring databases on demand, and importing databases into Azure Blob storage accounts.
● Implementation of storage-based, event-based, schedule-based, and tumbling-window triggers for different sources and destinations using ADF.
● Copying data from Azure SQL DB to ADLS Gen2 storage accounts, and loading data from Blob storage accounts into single or multiple SQL DB tables from different source files.
● Implementation of Azure Key Vault and integration with Azure Data Factory, storing secrets such as SQL DB connection strings in Key Vault (see the Key Vault sketch after this list).
● Managing multiple Python notebooks for mounting source data and executing Python scripts to fulfill business requirements.
● Fetching and reading data from .csv sources in an Azure Databricks Apache Spark cluster using the WASBS protocol (see the read/query sketch after this list).
● Connecting to an Azure Blob storage account from an Azure Databricks cluster to mount the directory.
● Visualizing data in different charts and metrics in the Azure Databricks cluster to add value for business stakeholders and senior management.
● Importing Apache Spark libraries and writing code in the Databricks cluster in %scala cells as per business needs.
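A minimal sketch of the WASBS read and notebook SQL steps referenced above; the storage account, container, file, and column names are hypothetical placeholders:

```python
# Minimal sketch: reading a .csv file from Azure Blob Storage over WASBS and
# querying it with SQL inside a Databricks Python notebook. spark, dbutils,
# and display are provided by the Databricks runtime; all resource and column
# names below are hypothetical placeholders.
storage_account = "salesdatastore"
container = "raw-sales"
account_key = dbutils.secrets.get(scope="demo-scope", key="storage-account-key")

# Make the account key available to this Spark session for WASBS access.
spark.conf.set(
    f"fs.azure.account.key.{storage_account}.blob.core.windows.net", account_key
)

orders = (
    spark.read.format("csv")
    .option("header", "true")
    .option("inferSchema", "true")
    .load(f"wasbs://{container}@{storage_account}.blob.core.windows.net/orders.csv")
)

# Register a temp view so the data can be analyzed with SQL from Python.
orders.createOrReplaceTempView("orders")
stats = spark.sql(
    "SELECT region, COUNT(*) AS order_count, SUM(amount) AS revenue "
    "FROM orders GROUP BY region ORDER BY revenue DESC"
)
display(stats)  # renders as a table or chart in the Databricks UI
```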
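And a minimal sketch of reading a SQL DB connection string from Azure Key Vault using the azure-identity and azure-keyvault-secrets packages; the vault URL and secret name are hypothetical placeholders (inside ADF itself this is typically configured through a Key Vault linked service rather than code):

```python
# Minimal sketch: fetching a SQL DB connection string stored as a Key Vault
# secret. The vault URL and secret name are hypothetical placeholders.
from azure.identity import DefaultAzureCredential
from azure.keyvault.secrets import SecretClient

vault_url = "https://demo-vault.vault.azure.net"  # placeholder vault URL
credential = DefaultAzureCredential()             # uses the ambient Azure identity
client = SecretClient(vault_url=vault_url, credential=credential)

# Assumption: the connection string was stored under this secret name.
conn_string = client.get_secret("sql-db-connection-string").value
print("Fetched connection string of length", len(conn_string))
```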
PROJECTS HANDLED:
Title: SalesQuest Metrics
The goal of SalesQuest Metrics is to track, monitor, and report on the SalesQuest application and to maintain sales operations users' information. Developers can use the report to understand application usage and provide standard solutions to users, and it helps the SOS support team monitor and escalate issues.
BI Tools and Technologies: Tableau, SQL Server, PGSQL, SSIS

Title: SOS Support Report
The goal of the SOS Support Report is to track and monitor SOS support request handling, provide more insight into request flow and the volume of requests handled by the support team, and track each request from the time of creation to resolved status.
Successfully completed the Data Science basic-level certification at DELL from the DECISION SCIENCES ACADEMY.