
PRACHETOSH GHOSH

Senior Data Engineer
Bengaluru

Summary

Technical Lead in the Data Engineering and Data Services practice, with almost 10 years of overall experience and 5+ years in Markit EDM and investment and asset management in the banking/asset-management domain for large European and US customers. Instrumental in implementing large centralization projects covering security, portfolio, price, rating, holding, analytics and market data, among the larger implementations of the Markit EDM product. Strong experience across all areas of the system development life cycle for Markit EDM applications, including design, development and database performance tuning. In-depth knowledge of creating BRDs, technical specifications, system requirement documents and technical design documents for asset management and investment management systems. Skilled in investment banking products such as Charles River and BISAM, with strong SQL skills for generating data models, performing ETL validation and generating reports. Around 5 years of experience in ETL design: creating data models and loading data into the system via stored procedures after data mapping and data profiling. Experienced in upgrading from on-prem SQL to cloud SaaS platforms such as Snowflake.

Profile Summary

Markit EDM development, Charles River backend data validation, ETL development, SQL development and cloud data engineering professional with excellence in:

  • Markit EDM Development
  • ETL Architecture Design
  • On-Prem to Cloud Migration/Upgrades
  • CADIS Development
  • CI/CD Pipelines and Automated Deployment
  • Python and EDA for Data Analysis
  • Charles River Backend Validation
  • Snowflake, SnowSQL
  • Recruitment/Staffing
  • SQL, PL/SQL Development
  • Automated Testing Using Selenium (Java/Python)
  • Project Management

Markit EDM Experience

  • 5 years of industry experience with CADIS and Markit EDM, developing and analyzing complex stored procedures and functions.
  • Developed critical Markit EDM components such as the Security Matcher and Core Matcher; led a team of CADIS professionals to deliver a quality product.
  • Strong experience developing and analyzing core Markit EDM components: Data Porter, Data Constructor, Data Inspector, Rule Builder, Data Flow and Sequence.
  • Coordinated closely with the Business Analyst team to develop EDM solutions for new requirements; worked on development and enhancement of the MEDM application, comprising Markit EDM 10.2 components such as solutions, data porter and data matcher.
  • Good exposure to MEDM-specific processes such as loading data up to the matcher, CADIS ID generation, updating CADIS IDs back to final-type tables, post-matcher business inspection and master table creation.
  • Good exposure to London CADIS version 7.6, creating components through script, and to Markit EDM version 19 with an end-to-end implementation through the MEDM UI.
  • Good knowledge of packaging Markit components for deployment and of Markit UI and thick client connectivity; strong knowledge of Autosys and JIL for deploying Markit EDM solutions.
  • Actively involved in discussions and analysis for new requirements and modifications and in resolving issues in the MEDM application; strong knowledge of the Markit EDM UI, including developing Master Security dashboards.

Testing Experience

  • Over 5 years of experience in manual and automated testing of web-based, client/server and data warehousing applications, covering integration, functional, regression, system, load and UAT testing.
  • Created ETL test data for all ETL mapping rules to test functionality; sound knowledge of and experience with metadata and star/snowflake schemas.
  • Analyzed source systems, staging areas, and fact and dimension tables in the target data warehouse; tested ETL Informatica mappings and other ETL processes (data warehouse testing).
  • Experienced in using ALM and Jira for bug reporting and tracking; expertise in querying and testing RDBMSs such as Oracle and MS SQL Server using SQL and PL/SQL for data integrity.
  • 3 years of solid QA/development experience in investment banking proprietary tools such as Markit EDM and CRIMS (Charles River), plus data automation using Python (pandas).
  • Strong experience automating web service testing using SoapUI with a custom framework; exposure to web automation tools such as Selenium WebDriver and Cucumber.

Leadership

An effective leader of 5-8 member teams with excellent communication, negotiation and relationship-building skills. Well versed in Agile release management practices, with a keen eye for estimates and for identifying design deficits.

Overview

10 years of professional experience
2 Certifications
2 Languages

Work History

Technical Lead

Mindtree India Ltd
10.2019 - Current
  • Project Abstract: Texas Capital Bank is a commercial bank headquartered in Dallas, Texas. FDM is a platform designed to migrate from an on-prem SQL database to the SaaS-based Snowflake platform.
  • Significant Accomplishments:
  • Develop facts and dimensions to load data into tables, based on the existing data sheets provided in Excel.
  • Prepare SQL scripts and stored procedures to load the data, with the required changes, into the internal DB.
  • Stage the data loaded via DataFrames (via PySpark) as files, first in the internal and later in the external stage.
  • Use the GET and PUT commands to load data into the Snowflake DB (see the sketch after this list).
  • Perform data validations through information_schema.
  • Perform data quality issue analysis using SnowSQL by building analytical warehouses on Snowflake.
  • Clone production data for code modifications and testing.
  • Perform troubleshooting, analysis and resolution of critical issues.
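
A minimal sketch of the staging-and-load flow described above, assuming hypothetical object names (a STG_TRADES table and an internal stage @FDM_STAGE) and the standard PySpark and snowflake-connector-python APIs:

    # Minimal sketch; table/stage names (STG_TRADES, @FDM_STAGE) are assumptions.
    from pyspark.sql import SparkSession
    import snowflake.connector

    spark = SparkSession.builder.appName("fdm_load").getOrCreate()

    # Prepare the source extract as a DataFrame and write it out as
    # local CSV files that can be staged.
    df = spark.read.option("header", True).csv("/data/source_extract.csv")
    df.write.mode("overwrite").option("header", True).csv("/data/staged/trades")

    conn = snowflake.connector.connect(
        account="my_account", user="etl_user", password="***",  # from config in practice
        warehouse="ETL_WH", database="FDM", schema="STAGING",
    )
    cur = conn.cursor()

    # PUT uploads the local files into the named internal stage.
    cur.execute("PUT file:///data/staged/trades/*.csv @FDM_STAGE AUTO_COMPRESS=TRUE")

    # COPY INTO loads the staged files into the target table.
    cur.execute("""
        COPY INTO STG_TRADES
        FROM @FDM_STAGE
        FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
    """)

    # Basic row-count validation through INFORMATION_SCHEMA.
    cur.execute("SELECT ROW_COUNT FROM INFORMATION_SCHEMA.TABLES WHERE TABLE_NAME = 'STG_TRADES'")
    print(cur.fetchone())
    cur.close()
    conn.close()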

Associate

Cognizant Technology Solutions
Hyderabad
11.2012 - 10.2019
  • Significant Accomplishments:
  • Worked extensively across the Investment Banking, Healthcare and Mortgage Banking domains.
  • Led a high-performing CADIS development team of 10 through all phases of application development; ensured that information systems, products and services met or exceeded organization/industry quality standards and end-user requirements.
  • Actively involved in discussions and analysis for new requirements and modifications and in resolving issues in the MEDM application; coordinated with the Business Analyst team to develop EDM solutions for new requirements.
  • Developed a Java Swing-based data comparator tool to run all EDM-based regression testing in an automated manner; it served as the data automation tool until FitNesse was implemented (a simplified Python analogue is sketched after this list).
  • Managed offshore resources to ensure timely project completion; developed quality standards by participating in the initial software development stages; validated and enhanced the existing QA plan.
  • Rewarded with "Pillar of the Month" for successfully delivering the MEDM project for a new Cognizant engagement.
  • Worked in the Data Engineering and Data Services practice on Markit EDM and investment and asset management in the banking/asset-management domain for large European and US customers.
  • Instrumental in implementing large centralization projects covering security, portfolio, price, rating, holding, analytics and market data, among the larger implementations of the Markit EDM product.
  • Skilled in investment banking products such as Charles River and BISAM, with strong SQL skills for generating data models, performing ETL validation and generating reports.
  • Solid understanding of data models for facts and dimensions; able to provide logical and physical designs for the given business requirements.
  • Well acquainted with all phases of the SDLC and STLC (Agile and Waterfall).
  • Designed a Selenium framework using TestNG, integrated it with Jenkins as part of continuous integration, and implemented the automated Selenium test suites for Oppenheimer Funds.
  • Worked on a seamless upgrade from 10.2.4 to 11.4 and led the team for the upgrade from 11.4 to 18.2.
  • Introduced Markit EDM best practices by designing 1000-, 2000-, 4000- and 6000-level solutions and integrating them successfully with Autosys and with continuous integration and deployment of new and updated MEDM components.
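
The data comparator mentioned above was a Java Swing tool; a minimal Python/pandas analogue of its core idea, diffing before/after extracts of an EDM table keyed by CADIS ID, could look like the sketch below. File and column names are illustrative assumptions, and the extracts are assumed to share a schema.

    # Sketch of a regression data comparator; file/column names are assumptions.
    import pandas as pd

    def compare_extracts(before_path, after_path, key="CADIS_ID"):
        before = pd.read_csv(before_path).set_index(key).sort_index()
        after = pd.read_csv(after_path).set_index(key).sort_index()

        # Rows present on only one side are regressions in themselves.
        print("missing after release:", list(before.index.difference(after.index)))
        print("new after release:   ", list(after.index.difference(before.index)))

        # For shared keys, report cell-level differences.
        shared = before.index.intersection(after.index)
        return before.loc[shared].compare(after.loc[shared])

    # Any non-empty result is a difference to investigate.
    print(compare_extracts("security_master_before.csv", "security_master_after.csv"))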

Education

BTech - Electrical Engineering

Future Institute of Engineering and Management

Post Graduate Diploma (PGDDS) - Data Science

IIIT-B

MS - Data Science

Liverpool John Moores University

Skills

PL/SQL, SnowSQL, Python

Accomplishments

  • Skilled in SQL development across complex solutions and upgrades, designing tables, stored procedures, views and functions.
  • Built stored procedures to extract fields from CSV data and load them into tables, with aggregation and grouping of data.
  • Supported the Investment Data Platform with SQL fixes.
  • Analyzed existing SQL queries and applied tuning techniques such as execution plans, indexes and hints.
  • Handled standardization and cleansing of data before loading into MEDM.
  • Experienced in automated testing tools such as FitNesse and Selenium.
  • Experienced in schedulers such as Autosys and Control-M.
  • Strong understanding of data models for facts and dimensions; able to provide logical and physical designs for the given business requirements.
  • Experienced with CI/CD pipelines to integrate SQL code via Redgate.

Cloud Data Platform

  • Bulk loading from the external stage (AWS S3) and the internal stage into the Snowflake cloud using the COPY and PUT commands.
  • Loading data into Snowflake tables from the internal stage using SnowSQL.
  • Used the COPY, LIST, PUT and GET commands to validate internal stage files.
  • Wrote complex SnowSQL scripts in the Snowflake cloud data warehouse for business analysis and reporting.
  • Used the FLATTEN table function to produce lateral views of VARIANT, OBJECT and ARRAY columns.
  • Used Snowpipe for continuous data ingestion from an S3 bucket.
  • Developed Snowflake procedures for executing branching and looping.
  • Created clone objects using zero-copy cloning (a few of these statements are sketched after this list).
  • Performed data validations through information_schema.
  • Performed data quality issue analysis using SnowSQL by building analytical warehouses on Snowflake.
  • Experienced with AWS cloud services: EC2, S3, etc.
  • Cloned production data for code modifications and testing.
  • Performed troubleshooting, analysis and resolution of critical issues.
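
A few of the statements above, shown as illustrative SnowSQL run through snowflake-connector-python; all object names (RAW_EVENTS, EVENTS_PIPE, @S3_STAGE, SALES) are hypothetical.

    # Illustrative SnowSQL for the bullets above; object names are assumptions.
    import snowflake.connector

    conn = snowflake.connector.connect(
        account="my_account", user="etl_user", password="***",
        warehouse="ETL_WH", database="FDM", schema="STAGING",
    )
    cur = conn.cursor()

    # FLATTEN: lateral view over a VARIANT column holding a JSON array.
    cur.execute("""
        SELECT r.id, f.value:name::STRING AS item_name
        FROM RAW_EVENTS r,
             LATERAL FLATTEN(input => r.payload:items) f
    """)

    # Snowpipe: continuous ingestion from an existing S3 external stage.
    cur.execute("""
        CREATE PIPE IF NOT EXISTS EVENTS_PIPE AUTO_INGEST = TRUE AS
        COPY INTO RAW_EVENTS FROM @S3_STAGE FILE_FORMAT = (TYPE = JSON)
    """)

    # Zero-copy cloning: a metadata-only copy of production data for testing.
    cur.execute("CREATE TABLE IF NOT EXISTS SALES_TEST CLONE SALES")

    cur.close()
    conn.close()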

Data Analysis

  • Experienced in extracting analytics data using web scraping and in exploratory data analysis using pandas, NumPy, Seaborn and Matplotlib (a short example follows this list).
  • Data analysis using Redgate for regression phases, and also using SQL functions.
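
A minimal EDA sketch of the kind described above; the dataset path and column names are hypothetical.

    # Minimal EDA sketch; prices.csv and its columns are assumptions.
    import pandas as pd
    import seaborn as sns
    import matplotlib.pyplot as plt

    df = pd.read_csv("prices.csv", parse_dates=["price_date"])

    # Profile the data: shape, dtypes, missing values, summary statistics.
    df.info()
    print(df.describe())
    print(df.isna().sum())

    # Distribution of prices per asset class.
    sns.boxplot(data=df, x="asset_class", y="price")
    plt.tight_layout()
    plt.savefig("price_distribution.png")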

Charles River and ETL Testing Experience

  • Experienced in the trading tool Charles River Development (CRD), with strong domain knowledge of the trading lifecycle.
  • Performed ETL development and testing on the Charles River upgrades.
  • Hands-on experience with ETL using Markit EDM, Informatica, Cognos BI reports and CRIMS (Charles River IMS).

Additional Information

  • Personal Details: Date of Birth: 24th May 1988; Present Address: 001 Vaastu Serenity Apartment, Bengaluru - 560060; Permanent Address: 18A Bikramgarh, Flat 103, Kolkata - 700032; Gender: Male; Marital Status: Married; Nationality: Indian.
  • Developed fixes for the Investment Data Platform, preparing real-time SQL scripts, and prepared SQL scripts to back-populate missing data onto the platform. Built new ETL packages using Microsoft SSIS and developed an ETL database model to carry out new procedures, creating various data warehouse dimensions to accept ETL outputs. Prepared the data model for facts and dimensions and provided logical and physical designs for the given business requirements. Worked on the backend upgrade of Charles River from 18R1 to 22R2. Gathered all necessary requirements from business analysts, then developed physical data models using transformations and created DDL scripts to design the database schema and database objects. Integrated the DWH code with Redgate for automated deployment. Awarded the Panelist Award for freshers' hiring from Mindtree in Q2.
  • Developed an END FLAG-based solution to end client contracts without performing a soft delete on the business entity, and integrated it into the MEDM UI from the thick client. Developed solutions to check entities with Deletion and Contract End to resolve data conflicts. Worked on BAU tickets and enhancements on Markit EDM v19.1.12.
  • Migrated all EDM components from version 7.6 to version 19.2. Developed new solutions for loading data from Portia and RBC, and Rules and Data Flows to classify Cash and Equity asset classes. Integrated the load solution with the Security Matcher to generate security CADIS IDs for both RBC and Portia; securities from both sources were enriched on the Transaction Matcher to prepare the dataset for reconciliation. Good exposure to MEDM-specific processes such as loading data up to the matcher, CADIS ID generation, updating CADIS IDs back to final-type tables, post-matcher business inspection and master table creation. Created MEDM Load (1000), Security Matcher (2000), Master (4000) and Domain (6000) level solutions. Prepared a reconciliation screen to bring together the Position, Security and Transaction dashboards, and a Matcher Realignment UI to handle CADIS realignment from the UI. Awarded Team-A Leader in January 2021 in recognition of successful delivery of the project and a high level of customer satisfaction.
  • Project #5: Standard Life Aberdeen plc is one of the world's largest investment companies, created in 2017 from the merger of Standard Life plc and Aberdeen Asset Management PLC and operating under the brand Aberdeen Standard Investments. Significant accomplishments: migrated all solutions from version 10.2 to version 18.2; developed new solutions for ASI and new UI screens per the business requirements; coordinated closely with the Business Analyst team to develop solutions for new requirements. Worked on development of the MEDM application, comprising various Markit EDM 18.2 components such as solutions, data porter and data matcher. Good exposure to MEDM-specific processes such as loading data up to the matcher, CADIS ID generation, updating CADIS IDs back to final-type tables, post-matcher business inspection and master table creation. Created MEDM Load (1000), Master (4000) and Domain (6000) level solutions. Actively involved in discussions and analysis for new requirements and modifications and in resolving issues in the MEDM application. Awarded Team-A Leader and Best Partnership in June and July 2020 in recognition of successful delivery and driving a critical deliverable.

Software

Markit EDM

SQL

Snowflake

Certification

Python (HackerRank)

Hobbies

Gardening

Timeline

Python (HackerRank)

04-2023

SQL (HackerRank)

03-2023

Technical Lead

Mindtree India Ltd
10.2019 - Current

Associate

Cognizant Technology Solutions
11.2012 - 10.2019

BTech - Electrical Engineering

Future Institute of Engineering and Management

Post Graduate Diploma (PGDDS) - Data Science

IIIT-B

MS - Data Science

Liverpool John Moores University