Sridharan Parthiban

Azure Data Engineer
Chennai, TN

Summary

  • Around 6 years of work experience in the banking and life sciences domains, covering analysis, design, development, and testing on SQL Server and the Azure platform.
  • Writing and maintaining code to meet system requirements, system design, and technical specifications, in accordance with accredited quality standards.
  • Extensive experience with SQL Server databases, developing T-SQL queries, stored procedures, other database objects, and SSIS packages.
  • Providing guidance to the testing team and ensuring all the business/technical functionality stated in the requirement document and the Functional Specification document is tested before delivery to the client.
  • Deploying applications to development, testing, and production environments, following the change and release management process.
  • Expertise in MS SQL Server, SSIS, Azure Data Factory, Databricks, and Data Lake.
  • Experience with version control tools such as Git/Stash and TFS.

Overview

6 years of professional experience
2 certifications

Work History

Azure Data Engineer

COGNIZANT-GSK
2020.03 - Current

CEDAR was a project built to enable the business to view reports in Power BI from the Azure data warehouse. It contains the overall sales details of GSK Consumer Health products across different stores.

Responsibilities:

  • Requirement gathering from the Product Owner and design discussions with the architect.
  • Providing estimates for Sprint tasks.
  • Updating the Sprint task board on a daily basis and providing daily status updates to the client, making it easier to showcase the effort and predict Sprint spillover in advance.
  • Developed a Common Data Model (Azure DWH) that holds the data of all modules and countries in the same set of tables, reducing the number of tables created for each module.
  • Created linked services, datasets, Key Vault secrets, and schedule-based triggers.
  • Developed pipelines in Data Factory to copy files from SFTP to the data lake, archive processed files, and process multiple files at the same time.
  • Developed notebooks in Databricks using PySpark and Scala to ingest data from the data lake into staging tables after performing data quality checks and transformations.
  • Created stored procedures to handle incremental data in the dimension and fact tables of the CDM.
  • Integrated all the notebook and stored procedure activities into the pipeline.
  • Created views with row-level security, restricting users' data access across modules.
  • Supported the testing team by fixing defects and helping them achieve test completion on time.
  • Supported the business team by guiding them through UAT and getting it signed off.
  • Provided service transition to the production support team and handled issues during warranty support.
  • Environment/Tools: Azure Data Factory, Databricks, Data Lake, Logic Apps, Key Vault, SSMS.
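The ingest pattern described above (data-lake files passing quality checks and transformations before reaching staging) can be sketched in plain Python. This is a simplified illustration only: the actual notebooks used PySpark and Scala in Databricks, and the column names and quality rules below are hypothetical.

```python
# Simplified sketch of a data-quality gate before a staging load.
# Column names and rules are hypothetical examples, not the real CEDAR schema.

REQUIRED_COLUMNS = {"store_id", "product_id", "units_sold"}

def passes_quality_checks(row: dict) -> bool:
    """A row must carry every required column, with no nulls,
    and a non-negative numeric units_sold."""
    if not REQUIRED_COLUMNS.issubset(row):
        return False
    if any(row[c] is None for c in REQUIRED_COLUMNS):
        return False
    return isinstance(row["units_sold"], (int, float)) and row["units_sold"] >= 0

def transform(row: dict) -> dict:
    """Example transformation: normalize the store identifier."""
    return {**row, "store_id": str(row["store_id"]).strip().upper()}

def ingest_to_staging(rows):
    """Split rows into (staged, rejected): valid rows are transformed
    and staged, invalid rows are routed aside for inspection."""
    staged, rejected = [], []
    for row in rows:
        if passes_quality_checks(row):
            staged.append(transform(row))
        else:
            rejected.append(row)
    return staged, rejected
```

In the real pipeline this logic would run on Spark DataFrames partitioned across the cluster, not on Python lists.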

Azure Data Engineer

COGNIZANT-GSK
2019.09 - 2020.03

BigDecision is an in-house ETL tool developed by Cognizant.

This project aimed to migrate all the BigDecision components used in CEDAR to Azure components.

Responsibilities:

  • Requirement gathering from the BigDecision support team and design discussions with the architect.
  • Created linked services, datasets, Key Vault secrets, and schedule-based triggers.
  • Developed pipelines in Data Factory to copy files from SFTP to the data lake, archive processed files, and process multiple files at the same time.
  • Developed notebooks in Databricks using PySpark and Scala to ingest data from the data lake into staging tables after performing data quality checks and transformations.
  • Integrated all the notebook and stored procedure activities into the pipeline.
  • Migrated all the BigDecision components to Azure Data Factory and Databricks.
  • Supported the testing team by fixing defects and helping them achieve test completion on time.
  • Provided service transition to the production support team and handled issues during warranty support.
  • Environment/Tools: Azure Data Factory, Databricks, Data Lake, Logic Apps, Key Vault, SSMS.

BI Developer

HCL-CBA
2017.05 - 2019.08

All the business information of the Commonwealth Bank of Australia (CBA) is stored in FMS, a mainframe system. Accessing that information from the mainframe is very expensive, so BIS is used to store the information from FMS. Data is fed from FMS to the BIS server on an hourly basis, where it is filtered and flows to different applications through SSIS packages scheduled in SQL Server Agent.

BIS is a SQL Server representation of FMS and non-FMS data; it manipulates the raw data and distributes it to different applications through SSIS packages.

Responsibilities:

  • Analyzing business requirements, preparing design documents, and designing screens based on the requirements.
  • Involved in development and unit testing.
  • Client interaction for business requirement specifications.
  • Developing T-SQL queries, stored procedures, database objects, SSIS packages, and SSAS cubes using MS SQL Server as per business requirements.
  • Providing guidance to the testing team and ensuring all the business/technical functionality stated in the requirement document and the Functional Specification document is tested before delivery to the client.
  • Optimizing stored procedures and performing activities such as table partitioning, database purging, and cube partitioning.
  • Created reusable SSIS packages to extract data from SQL Server and flat-file sources and load it into various business entities.
  • Extensively used SSIS transformations such as Lookup, Derived Column, Data Conversion, Aggregate, and Conditional Split, as well as SQL, Script, and Send Mail tasks.
  • Used error and event handling, precedence constraints, breakpoints, checkpoints, and logging for SSIS packages.
  • Involved in SSIS package deployment and job creation through SQL scripts.
  • Involved in automating SSIS packages using SQL Server Agent jobs.
  • Involved in monitoring and troubleshooting complex packages using data viewer grids.
  • A 3-member team developed the project using SQL Server 2012, C#, and SQL Server Data Tools.
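As a rough illustration of what the Lookup, Derived Column, and Conditional Split transformations named above actually do to a row, the same flow can be expressed in plain Python. In the project itself this was configured declaratively inside SSIS; the reference data, column names, and fee rule below are hypothetical.

```python
# Plain-Python analogue of three common SSIS transformations.
# Lookup: enrich each row from reference data.
# Derived Column: compute a new column from existing ones.
# Conditional Split: route rows to different outputs by a predicate.

branch_lookup = {"001": "Sydney", "002": "Melbourne"}  # hypothetical reference data

def lookup(row):
    # SSIS Lookup: join against reference data; unmatched rows get None.
    return {**row, "branch_name": branch_lookup.get(row["branch_code"])}

def derived_column(row):
    # SSIS Derived Column: e.g. amount including a flat 10% fee.
    return {**row, "amount_with_fee": round(row["amount"] * 1.10, 2)}

def conditional_split(rows):
    # SSIS Conditional Split: matched rows flow on; unmatched go to an error output.
    matched = [r for r in rows if r["branch_name"] is not None]
    errors = [r for r in rows if r["branch_name"] is None]
    return matched, errors

def run_flow(rows):
    enriched = [derived_column(lookup(r)) for r in rows]
    return conditional_split(enriched)
```

SSIS wires these steps together in a data-flow graph, so rows stream through the transformations instead of being materialized in lists as they are here.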

BI Developer

HCL-CBA
2016.11 - 2017.05

ODS was a project to build a near-live data warehouse that reduces the load on the mainframe systems. The semantic layer built on top of the ODS would be used by the data analytics and Spotfire reporting teams for customer reports.

Responsibilities:

  • Requirement gathering from the client.
  • Designing the most suitable solution through analysis and continuous discussions with the client, and creating documents.
  • Developing the SSIS packages and SQL stored procedures required to load data into the data warehouse from the mainframe.
  • Creating Excel and text reports using SQL stored procedures and SSIS packages.
  • Developing stored procedures, functions, and views for the reporting team based on their requirements.
  • Creating and scheduling jobs for the packages.
  • Performance tuning of stored procedures.
  • Developing automated test procedures and packages.
  • Performing peer reviews.
  • Testing and deploying the packages.
  • Environment/Tools: T-SQL, SSIS 2012, SSMS, Stash.

BI Developer

HCL-CBA
2015.01 - 2016.10

The Adviser Early Warning System (AEWS) is used to flag and report advisers at risk according to a set of metrics.

Responsibilities:

  • Analyzing business requirements, preparing design documents, and designing screens based on the requirements.
  • Involved in development and unit testing.
  • Client interaction for business requirement specifications.
  • Developing T-SQL queries, stored procedures, database objects, and SSIS packages using MS SQL Server as per business requirements.
  • Providing guidance to the testing team and ensuring all the business/technical functionality stated in the requirement document and the Functional Specification document is tested before delivery to the client.
  • Optimizing stored procedures and performing activities such as table partitioning, database purging, and cube partitioning.
  • A 2-member team developed the project using SQL Server 2012, C#, and SQL Server Data Tools.

Education

B.E. (CSE) - Computer Science

SMIT
CHENNAI
2010.06 - 2014.04

12th - Computer Science

St. Vincent HSS
Chennai
2009.06 - 2010.04

10th - Computer Science

St. Vincent HSS
Chennai
2007.06 - 2008.04

Skills

Azure Data Factory
Databricks
Azure Data Lake
Azure Key Vault
SQL Server / T-SQL
SSIS
SSAS
PySpark
Scala
Git/Stash, TFS

Certification

Microsoft Certification - DP-200: Implementing an Azure Data Solution (2020-10)
Microsoft Certification - AI-900: Microsoft Azure AI Fundamentals (2020-12)