Prakash Kumar Sahu

SAP ANALYTICS LEAD CONSULTANT
Bengaluru

Summary

Techno-functional consultant with 15+ years of experience in consulting and business product development, including a significant role at Shell UK. Specializing in Analytics and Reporting (A&R), with expertise in SAP BW/4HANA, SAP ABAP, SAP HANA, SAP S/4HANA Embedded Analytics, SAP IP, SAP BPC, SAP SAC, SAP BusinessObjects, SAP AfO, Microsoft Azure, Databricks, AWS S3, Power BI, Python, and PySpark. Experienced in monitoring database performance, troubleshooting issues, and optimizing database environments. Strong analytical and problem-solving skills and a deep understanding of database technologies and systems; equally confident working independently or collaboratively, with excellent communication skills.

Overview

16
years of professional experience
4
Certifications

Work History

Senior Cloud Data Engineer

Shell
Bengaluru
03.2024 - Current
  • Implementation of an end-to-end analytics solution to improve charging success rate (CSR) and uptime for all EV chargers operated by Shell Recharge in Tier 2 EU markets and by Ubitricity in NL, which use Driivz as the charge point management system (CPMS)
  • Helping the business with actionable data insights via a Power BI dashboard to improve CSR; a 1% improvement in CSR results in $180,000 additional revenue per market
  • Data analysis of EV data – OCPP (Open Charge Point Protocol) messages and charger master data
  • End-to-end design, build, and implementation of a Databricks lakehouse using the medallion architecture for EV charging OCPP data across 6 European markets
  • Design and development of the OCPP schema and OCPP parsing function
  • Developing Databricks notebooks with optimized, efficient PySpark code for complex transformation of highly nested JSON data and implementation of business logic
  • Using Databricks SQL and Genie for data analysis and testing
  • Code version control, code reviews, conflict resolution, pull requests, and merges using GitHub
  • Deploying Databricks code to the pre-prod and production environments using GitHub Actions
  • Code scanning with SonarQube and fixing bugs and vulnerabilities
  • Coordinating with the EVEC (charger downtime alerting) team to help them deploy the alerting solution for Driivz
  • Design and development of Databricks workflows in batch and streaming mode for data orchestration
  • Continuous improvement and process optimization in the Way of Working (WoW)
  • Mentoring junior data engineers in the team
  • Working collaboratively and communicating effectively with a small, motivated team of data engineers, data scientists, data analysts, the product technical lead, and the product owner
  • Working on PoCs and recommending new technologies/features that simplify or improve the tech stack
  • Working with the Databricks product support team on any product-related bugs
  • Helping organize a quarterly hackathon so the team can collaborate and learn by working on innovative items.
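The OCPP parsing work above can be illustrated with a minimal, self-contained sketch. The function name and output fields are hypothetical, not the production schema, but the frame layout follows the OCPP-J wire format, where every message is a JSON array led by a message-type code:

```python
import json

# OCPP-J message-type codes: 2 = CALL, 3 = CALLRESULT, 4 = CALLERROR.
MESSAGE_TYPES = {2: "CALL", 3: "CALLRESULT", 4: "CALLERROR"}

def parse_ocpp(raw: str) -> dict:
    """Parse a raw OCPP-J frame into a flat dict (illustrative schema)."""
    frame = json.loads(raw)
    msg_type, unique_id = frame[0], frame[1]
    parsed = {"message_type": MESSAGE_TYPES[msg_type], "unique_id": unique_id}
    if msg_type == 2:    # CALL carries an action name and a payload
        parsed["action"], parsed["payload"] = frame[2], frame[3]
    elif msg_type == 3:  # CALLRESULT carries only a payload
        parsed["payload"] = frame[2]
    else:                # CALLERROR carries an error code and description
        parsed["error_code"], parsed["error_description"] = frame[2], frame[3]
    return parsed

msg = '[2, "19223201", "StartTransaction", {"connectorId": 1, "idTag": "ABC123"}]'
print(parse_ocpp(msg)["action"])  # StartTransaction
```

In the lakehouse itself, a function like this would typically be registered as a PySpark UDF (or replaced by `from_json` with an explicit schema) so the flattening runs distributed over the bronze-layer message stream.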

Senior Cloud Data Engineer

Shell
Bengaluru
04.2022 - 02.2024
  • Design, develop, and maintain scalable, reliable, compliant, and highly available data pipelines and back-end services for real-time decisioning, reporting, optimization, data collection, and related functions
  • Work collaboratively and communicate effectively with a small, motivated team of engineers and product managers
  • Experiment with and recommend new technologies that simplify or improve the tech stack
  • Implementing scalable data models, data pipelines, data storage, management, and transformation solutions
  • Responsible for managing the data architecture, developing the next-gen data platform, conducting deep data analysis, and ensuring optimal performance throughout the data lifecycle
  • Extracting, transforming, and loading (ETL) data from various sources (AWS Redshift for EV data, SMG_VOC for customer feedback, ER Data Lake for social media data, Adobe Analytics for campaign data, CRM Hub data, etc.) into ADLS Gen2 storage in Parquet and Delta format
  • Developing and implementing data integration processes using Azure Synapse Analytics pipelines
  • Ensuring that data is efficiently cleaned, transformed, and loaded
  • Writing optimized, efficient code using PySpark and Spark SQL in Databricks for implementation of business logic
  • Optimized infrastructure cost by implementing best practices and processes, e.g., VACUUM on Delta Lake, data purging, Databricks cluster management, etc.
  • Implementation of GDPR requirements – ROE (Right of Erasure) and Inactivity
  • Configuring the GitHub code repository for Azure services and developing CI/CD pipelines on Azure DevOps and GitHub Actions
  • Provisioning various Azure resources such as Synapse, Databricks, ADLS Gen2, Key Vault, SQL Database, Virtual Network (VNet), etc., for the dev, acc, and prd environments
  • Managing access control for various Azure resources using service principals (SPN), Azure security groups (ASG), Privileged Identity Management (PIM), etc.
  • Implementing audit logging and diagnostic analysis for Azure resources
  • Implementation of end-to-end (E2E) monitoring for Synapse/ADF data pipelines
  • Deployment of code and artifacts to the Acceptance (ACC) and Production (PRD) environments using GitHub Actions
  • Working on and contributing to the Continuous Improvement (CI) initiative
  • Facilitating internal and external knowledge-sharing sessions
  • Mentoring and guiding junior team members
  • Preparing project documentation such as the Technical Design Document, unit test document, knowledge-sharing document, how-to guides, etc.
  • Raising service requests to Microsoft to resolve problems that could not be solved within Shell
  • Coordinating closely with Microsoft support engineers to get issues resolved
  • Working closely with the Azure security support team to resolve security and access issues
  • Design and build of various Dremio objects (PDS/VDS) as per business requirements
  • Performance tuning of Dremio PDS and VDS by optimizing SQL code and using reflections (raw and aggregate).
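The GDPR work above (Right of Erasure and inactivity) amounts to retention filtering over user records. A minimal sketch, assuming a hypothetical record shape and a two-year inactivity window (the real pipeline would apply the same predicate in PySpark over Delta tables):

```python
from datetime import datetime, timedelta

# Hypothetical record shape; field names are illustrative, not the real schema.
records = [
    {"user_id": "u1", "last_active": datetime(2024, 1, 10)},
    {"user_id": "u2", "last_active": datetime(2021, 5, 1)},
    {"user_id": "u3", "last_active": datetime(2024, 3, 2)},
]

def gdpr_purge(records, erasure_requests, now, inactivity_days=730):
    """Keep only records whose user has neither requested erasure (ROE)
    nor been inactive longer than the retention window."""
    cutoff = now - timedelta(days=inactivity_days)
    return [
        r for r in records
        if r["user_id"] not in erasure_requests and r["last_active"] >= cutoff
    ]

kept = gdpr_purge(records, erasure_requests={"u1"}, now=datetime(2024, 4, 1))
print([r["user_id"] for r in kept])  # ['u3']
```

On Delta Lake the deletes only become physically unrecoverable after VACUUM removes the superseded data files, which is why the purge and the VACUUM best practice above go together.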

Lead Designer and Lead Technical Analyst – Analytics & Reporting

Shell
Bengaluru
04.2021 - 10.2022
  • As Lead Designer and Lead Technical Analyst, responsible for analysis, design, development, testing, and implementation of SAP HANA Analytics and Reporting solutions (new implementation) for Shell Trading Canada (STC)
  • Participating in business requirement gathering and gap analysis calls with various stakeholders
  • Working closely with Business Analysts (BA) to translate business requirements into Functional Specification (FS) documents
  • Understanding business requirements and preparing the High-Level Design (HLD) and Technical Design Document (TDD)
  • Designing and prototyping complex business requirements in the SAP HANA sandbox system to test the feasibility of the solution, then implementing the same in the development system
  • Attending project demand and design review meetings with the ARCC team to keep them updated on project status and to get design documents approved so the build can start
  • Keeping the project manager, implementation lead, BA, and other stakeholders informed about project design, build, and testing status
  • Coordinating with developers to complete the build and testing as per design
  • Helping developers with complex builds where they need technical help
  • Build review and performance optimization of SAP HANA modelling and SAP BusinessObjects reporting
  • Coordinating with the Endur team to obtain transactional data, and with the SLT team to replicate new tables into HANA from SAP systems (GSAP, TSAP)
  • As part of this project, delivering end to end (data ingestion, data transformation, data modelling, and reporting) 6 new MIT (Indirect Tax) reports, 1 enhancement of an existing MIT report, and 1 enhancement of an existing MDT (Direct Tax) report
  • The solution is based on SAP BODS and SAP SLT for data ingestion, native HANA for data transformation and modelling, and SAP BusinessObjects for reporting.

Team Lead and Senior Developer

Shell
Bengaluru
02.2021 - 09.2021
  • As Team Lead and senior developer, responsible for analysis, design, development, testing, and implementation of customized SAP BW (back-end) and SAP Analytics Cloud (front-end) solutions for the DSPAR (Downstream Performance, Analytics and Reporting) Finance team
  • Participating in business requirement gathering calls with the DSPAR team
  • Conducting the daily project stand-up call with developers (SAP BW – 2, SAP SAC – 2)
  • Preparing various project documents such as the design document, change summary, test script, unit testing document, ITC test document, SOM KT document, etc.
  • Planning RT releases in the weekly and quarterly release bundles
  • Design and build of business data warehouse (SAP BW) data modelling and data transformation using SAP ABAP, SAP BEx query design and build, and writing customer exit variable code in ABAP
  • Worked on complex SAP BW hierarchies (DRAC SFS hierarchy and DRAC Governance hierarchy)
  • Building an SAP Integrated Planning (SAP IP) solution to enable file upload into a planning ADSO (cube type) by business users
  • Building models in SAP Analytics Cloud by consuming BW BEx queries, and developing stories and analytic applications based on these models
  • Responsible for development review and code review of technical deliverables
  • Optimized the back-end business data lake, which improved the performance of BEx queries and reports by 75%
  • Developed BEx queries for Downstream Flash (reporting at COB levels) and Detailed P&L (reporting at SFS hierarchy nodes)
  • Developed reporting solutions for many important KPIs such as Downstream Clean, Identified Items, Downstream Reported, Adjusted Earnings, CFFO, Clean Total Opex, Cash Capex, Working Capital, EBITDA, Operational Availability (OA), Utilization, and Volumes.

Analytics – Order to Cash (OTC) Lead

IBM
Kolkata
02.2020 - 02.2021
  • As an Analytics Solution Advisory Consultant, responsible for analysis, design, development, testing, and implementation of SAP BW/4HANA and S/4HANA solutions (new implementation) for IBM's US customer Axalta Coating Systems
  • Development standards and naming convention documentation for BW/4HANA, S/4HANA, native HANA, and Embedded Analytics
  • Installation of business content for datasources in the S/4HANA system, activating extract structures in LBWE, filling setup tables in SBIW, and testing in RSA3
  • Designed the framework for implementing datasource enhancements using BAdI and AMDP
  • Designed the framework for implementing customer exit variables using BAdI and AMDP
  • Implemented transformation logic in SAP BW/4HANA using AMDP code to push data-intensive logic down into the HANA database
  • CDS-based generic delta extractor development for the Sales & Distribution flow
  • Replacing 2LIS_11, 2LIS_12, and 2LIS_13 datasources with custom-developed CDS extractors
  • HANA view-based datasource development to extract data from other HANA DBs and stage it into BW/4HANA
  • Generating custom hierarchy datasources for functional area, business area, etc., in S/4HANA using t-code BW07 and loading that hierarchy data into BW/4HANA
  • Enhancement of existing standard SAP S/4HANA Fiori apps, with analysis and enhancement of the back-end CDS views and OData services
  • Development of CDS views in the SAP S/4HANA system for consumption in Fiori apps
  • Enhancement of datasources in the S/4HANA system using the BAdI and AMDP method
  • Development of the SAP BW/4HANA data flow for "Sales Lifecycle Reporting", delivering 10 KPIs in the area of Supply Chain Management (SCM)
  • Developed a virtual data model using CDS views for OTC – CPU dashboard development
  • Walkthroughs with the business of standard BW/4HANA and Fiori reports to fulfill the business KPIs
  • Performing fit-gap analysis of standard reports
  • Developed a BW query using the 0FI_ACDOCA_10 data flow to replace the Fiori app Material Inventory Values – Summary; the app had no value help (F4) for the business area hierarchy, so we provided F4 help for it in the BW query. This flow was based on the Material Ledger.

SAP BW and BO Developer

Cognizant
Bengaluru
07.2014 - 04.2015
  • Developer at offshore
  • Preparing the Technical Design Document
  • Writing business logic in ABAP (CMOD) and WebI context formulas
  • Designing and developing BEx reports and Web Intelligence reports
  • Testing reports for data reconciliation between WebI and BEx reports
  • Performance optimization of BEx and WebI reports
  • Preparing test scripts for unit testing of reports
  • Coordinating with the onsite team on report development and status updates.

SAP BW Developer

02.2014 - 06.2014
  • Working with Business Analysts to translate business requirements into Functional Requirements Documents and Detailed Design Documents
  • Leading analysis sessions, gathering requirements, and writing specification and functional design documents for enhancements and customizations; analyzing product impact
  • Communicating activities/progress to project managers, business development, business analysts, and clients
  • Developing implementation and test plans, building software acceptance criteria, and coordinating and working with clients to oversee the acceptance and dissemination process
  • Preparing Technical Design Documents
  • Designing and developing data models, data flows, and process chains
  • Writing and testing SAP ABAP logic at the transformation level (start, end, field, and expert routines), at the DTP and InfoPackage filter level for dynamic data filtering, ABAP lookups for reading data from master data InfoObjects and DSO active tables, and logic derived from variable values maintained in table TVARVC
  • Coordinating with the onsite team for proper requirement gathering, analyzing the business process, identifying and analyzing the solution, implementation, design, configuration, and customization
  • Coordinating with the QlikView reporting team and making minor adjustments to BW data models based on their suggestions.

SAP BW Developer

04.2013 - 01.2014
  • Developer at offshore
  • Coordinating with the onsite team for proper requirement gathering, analyzing the business process, identifying and analyzing the solution, implementation, design, configuration, customization, and production support
  • Impact analysis for any new changes, enhancements, and developments
  • BW-specific ABAP development.

SAP BW and BPC Developer

05.2012 - 03.2013
  • Lead SAP BW developer at offshore
  • SAP BW–BPC integration developer
  • Coordinating with the onsite team for proper requirement gathering, analyzing the business process, identifying and analyzing the solution, implementation, design, configuration, customization, unit/integration testing, go-live, and post-go-live support
  • Impact analysis for any new changes, enhancements, and developments
  • Preparing technical design documents
  • Mentoring other team members.

SAP BW and BO Developer

02.2011 - 04.2012
  • Consultant at the client location
  • Interacting with the client for proper requirement gathering, analyzing the business process, identifying and analyzing the solution, implementation, design, configuration, customization, unit/integration testing, go-live, and post-go-live support
  • Bringing innovative ideas and implementing them for the benefit of the business users
  • Mentoring other team members
  • Organizing knowledge-sharing sessions
  • Chief Knowledge Officer of the project.

SAP BW Developer

02.2009 - 01.2011
  • Developer at offshore
  • Coordinating with the onsite team for proper requirement gathering and providing timely, quality solutions
  • Interacting with end users to resolve incidents of different severity levels and to enhance the current development.

Education

Bachelor of Technology - Electronics & Communication Engineering

Netaji Subhash Engineering College
Kolkata, India
04.2001 -

Skills

Client Relationships

Certification

SAP BW4HANA

Timeline

Senior Cloud Data Engineer

Shell
03.2024 - Current

Senior Cloud Data Engineer

Shell
04.2022 - 02.2024

Lead Designer and Lead Technical Analyst – Analytics & Reporting

Shell
04.2021 - 10.2022

Team Lead and Senior Developer

Shell
02.2021 - 09.2021

Analytics – Order to Cash (OTC) Lead

IBM
02.2020 - 02.2021

SAP BW and BO Developer

Cognizant
07.2014 - 04.2015

SAP BW Developer

02.2014 - 06.2014

SAP BW Developer

04.2013 - 01.2014

SAP BW and BPC Developer

05.2012 - 03.2013

SAP BW and BO Developer

02.2011 - 04.2012

SAP BW Developer

02.2009 - 01.2011

Bachelor of Technology - Electronics & Communication Engineering

Netaji Subhash Engineering College
04.2001 -