Project: Common Egress Framework
Client: Tesco
Duration: Jul-2022 to Present
Project Description: The Common Egress Framework (CEF) is a Tesco Analytical Platform (TAP) standardized capability for egressing data to internal or external users and systems in a secure, standardized, and monitored way.
We egress data in two primary ways:
- Real-time: data is streamed out through a Kafka connector.
- Batch: data is egressed as batch files.
CEF is used mainly by the Data Engineer persona, but a user interface is planned to accommodate the needs of all TAP user personas.
Data egressed through CEF is consumed for analytical purposes.
Role: Involved in architecture design and development for various use cases. Working on Spark with Scala and Java, integrated with Kafka connectors and Kubernetes, to transform and process different types of data. Working closely with business stakeholders to establish configuration parameters, and handling development, testing, and debugging of applications.
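For illustration, a minimal sketch of the real-time egress path in Scala, assuming a Spark Structured Streaming job that relays curated records to a consumer-facing Kafka topic (broker addresses, topic names, and the checkpoint path are placeholders, not actual project values):

```scala
import org.apache.spark.sql.SparkSession

object RealtimeEgressSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("cef-realtime-egress")
      .getOrCreate()

    // Read from an internal TAP topic (name is illustrative).
    val source = spark.readStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "broker:9092")
      .option("subscribe", "tap.curated.events")
      .load()

    // The Kafka sink expects string or binary key and value columns.
    val egress = source.selectExpr(
      "CAST(key AS STRING) AS key",
      "CAST(value AS STRING) AS value")

    // Relay records to the consumer-facing egress topic in a checkpointed, monitored stream.
    egress.writeStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "broker:9092")
      .option("topic", "cef.egress.events")
      .option("checkpointLocation", "/tmp/cef-egress-checkpoint")
      .start()
      .awaitTermination()
  }
}
```

The batch path follows the same pattern with a bounded read and a file sink in place of the Kafka sink.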
Project: Support Assist Cloud
Client: Dell
Duration: Mar-2020 to Jul-2022
Project Description: The goal of the project is to migrate ongoing Support Assist applications to Azure Cloud and CloudIQ. Dell Support Assist is software installed on Dell devices that collects system stats on a regular, periodic basis and sends them to the Support Assist Intelligence Engine cloud in the form of collections. These collections are processed for different Dell Support use cases (SAIE, SATD, SATC, DTC), where the support team analyzes them and assists clients with issues, updates, and changes.
Role: Involved in architecture design and development on the Azure platform for various use cases. Working on Spark with Scala and Java on Azure HDInsight and Databricks, integrated with Event Hubs, to transform and process different types of collection data. Working closely with business stakeholders to establish configuration parameters, and handling development, testing, and debugging of applications. Managing a technical team of 2.
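For illustration, a minimal sketch of the collection-ingest path in Scala on Spark, reading from Event Hubs through its Kafka-compatible endpoint (namespace, topic, secret, and storage paths are placeholders; the Delta sink assumes Databricks):

```scala
import org.apache.spark.sql.SparkSession

object CollectionIngestSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("supportassist-collection-ingest")
      .getOrCreate()

    // Event Hubs exposes a Kafka-compatible endpoint on port 9093;
    // the connection string is supplied as the SASL password.
    val connStr = sys.env("EVENTHUB_CONNECTION_STRING")
    val collections = spark.readStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "mynamespace.servicebus.windows.net:9093")
      .option("subscribe", "device-collections")
      .option("kafka.security.protocol", "SASL_SSL")
      .option("kafka.sasl.mechanism", "PLAIN")
      .option("kafka.sasl.jaas.config",
        s"""org.apache.kafka.common.security.plain.PlainLoginModule required username="$$ConnectionString" password="$connStr";""")
      .load()

    // Land raw collection payloads for downstream use-case processing (SAIE, SATD, etc.).
    collections.selectExpr("CAST(value AS STRING) AS payload")
      .writeStream
      .format("delta") // assumes Databricks; swap for parquet on plain Spark
      .option("checkpointLocation", "/mnt/checkpoints/collections")
      .start("/mnt/raw/collections")
      .awaitTermination()
  }
}
```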
Project: Local Regulatory Report - Banking
Client: Global Bank (a client of Tech Mahindra)
Duration: Sep-2017 to Jan-2020
Project Description: As part of the regulatory and supervisory functions bestowed on it, the Reserve Bank of India collects various fixed format data (called 'Returns') from commercial banks, financial institutions, authorized dealers, and non-banking financial institutions.
Role: As a Big Data developer, migrating data from various data sources to HAAS and making the configuration changes required for implementation across portfolios. Involved in architecture design and automation of the current dataflow. Developing the reporting system in Java/J2EE to meet business requirements. Writing Hive queries for report development and data points, and writing shell scripts and PySpark to automate the data flow through Spark.
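The automation in this project used shell scripts and PySpark; for consistency with the sketches above, the same pattern is shown in Scala, with hypothetical Hive database, table, and column names:

```scala
import org.apache.spark.sql.SparkSession

object ReturnsReportSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("rbi-returns-report")
      .enableHiveSupport() // read the Hive tables the returns are built from
      .getOrCreate()

    // Aggregate portfolio-level data points for a fixed-format return (names illustrative).
    val report = spark.sql(
      """SELECT portfolio_id, reporting_date, SUM(exposure_amt) AS total_exposure
        |FROM haas.loan_portfolio
        |GROUP BY portfolio_id, reporting_date""".stripMargin)

    // Persist as a Hive table for the Java/J2EE reporting layer to pick up.
    report.write.mode("overwrite").saveAsTable("haas.rbi_return_summary")
  }
}
```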
Project: eBusiness Analytics
Client: US Insurance client of Accenture
Duration: Jun-2015 to Jan-2017
Project Description: Data lake development and reporting.
Role: As a developer, involved in front-office application development in Java/J2EE to load and process large data sets coming from a variety of portfolios. Loading data into HDFS from other systems using Sqoop. Writing MapReduce jobs to process and clean the data. Writing custom UDFs for data processing and cleaning in Hive and Pig. Involved in creating Hive tables, loading them with data, and writing Hive queries to generate scorecards and reports.
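As an illustration of the custom UDFs mentioned above (the originals were written in Java; a Scala equivalent is sketched here for consistency, using Hive's classic UDF API, with hypothetical class and logic):

```scala
import org.apache.hadoop.hive.ql.exec.UDF
import org.apache.hadoop.io.Text

// Normalizes free-text fields before they feed scorecards:
// trims whitespace, lower-cases, and collapses internal spaces.
class CleanText extends UDF {
  def evaluate(input: Text): Text = {
    if (input == null) null
    else new Text(input.toString.trim.toLowerCase.replaceAll("\\s+", " "))
  }
}
```

A UDF like this would be packaged as a JAR and registered in Hive via ADD JAR and CREATE TEMPORARY FUNCTION before use in queries.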
Project: ShopIBM.com Common Commerce Engine
Client: IBM (US Retail)
Duration: May-2013 to Jun-2015
Project Description: Common Commerce Engine is a commerce engine for the worldwide IBM.com e-commerce websites.
Role: Designed the user interface with HTML, CSS, and JSP, with validation using JavaScript and jQuery. Prepared SQL for data manipulation in the application. Provided inputs to the team for application design and implementation. Extended existing code to implement new business logic in Java. Involved in bug fixing, unit testing, and deployments.