Bikash Kumar Behera

Summary

A competent professional with 12.5 years of diversified experience in the data space, spanning both development and AMS portfolios. Currently working as a Reconciliation Developer. Experienced in the Telecom, Electronics, Hi-Tech & Media, Consumer Packaged Goods (CPG), and Banking, Financial Services and Insurance (BFSI) industry domains. Excellent communication, interpersonal and analytical skills, and a strong ability to perform as part of a team.

Overview

13 years of professional experience
7 Certifications
1 Language

Work History

Reconciliation Consultant

Capgemini
07.2023 - Current

GRT is a bank-wide horizontal function providing strategic, optimized reconciliation services to various business lines within Barclays such as Markets, Wholesale Lending, Finance, Corporate & Retail, Barclaycard, UKRB and Wealth. The GRT team plays a pivotal role in helping the business identify and prevent the firm's exposure to market risk, financial risk and counterparty risk by reconciling the bank's books and records against those of external parties. GRT provides a scalable, enhanced and standardized platform for reconciliations, breaks management and reporting. The GRT global team delivers reconciliation solutions across clusters, covering platform, product, process and technology management.


  • Worked on Cash (Nostro), Stock (Depot) and Exchange/Broker reconciliations.
  • Strong understanding of financial concepts such as general ledger (GL), accounts payable (AP) and accounts receivable (AR).
  • Ensure that financial transactions are accurately reconciled.
  • Troubleshoot and resolve technical problems that arise when working with the applications.
  • Represent user-story progress in Scrum, DA and CAB meetings.
  • Good verbal and written communication skills when working with other developers, business analysts and end users.
  • Provide root-cause analysis for data issues raised by business users.
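As an illustration of the matching logic behind these reconciliations, the sketch below pairs a hypothetical internal ledger extract against an external statement on reference and amount, then reports the breaks on each side. All file names, layouts and sample records are invented for the example and are not taken from the GRT platform.

```shell
#!/bin/sh
# Minimal cash (Nostro) reconciliation sketch -- illustrative only.
# Matches internal ledger entries against an external statement on
# "reference,amount" and reports unmatched records (breaks) per side.
set -e
workdir=$(mktemp -d)
cd "$workdir"

# Hypothetical sample extracts (real feeds come from source systems).
printf 'TXN001,100.00\nTXN002,250.00\nTXN003,75.50\n' > ledger.csv
printf 'TXN001,100.00\nTXN002,200.00\nTXN004,10.00\n' > statement.csv

# comm(1) requires lexicographically sorted input.
sort ledger.csv    > ledger.sorted
sort statement.csv > statement.sorted

# Lines only in the ledger (our-side breaks) vs. only on the statement.
comm -23 ledger.sorted statement.sorted > breaks_ledger_only.txt
comm -13 ledger.sorted statement.sorted > breaks_statement_only.txt

echo "ledger-only breaks: $(wc -l < breaks_ledger_only.txt | tr -d ' ')"
echo "statement-only breaks: $(wc -l < breaks_statement_only.txt | tr -d ' ')"
```

Because the amount is part of the compared line, a same-reference, different-amount record (TXN002 above) surfaces as a break on both sides, which is the behaviour a reconciliation wants.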

Technical Lead

Wipro
04.2019 - 07.2023

TLM is used to reconcile bank, finance and capital data from multiple source systems and AMP applications (e.g., GL, Westpac, HSBC, Heritage). Using TLM, data from the various source systems is formatted into the standard layout TLM requires. As part of the Reconciliation team, resolved daily operational processing challenges to build a centralized database providing the best version of data for the business and all downstream applications.


  • Worked on the migration of TLM 2.7 to RP 3.0.3.
  • Worked on the TLM RP version upgrade from 3.0.3 to 3.0.9.
  • Worked on the TLM application server migration from on-premise (BizCloud) to AWS CloudX.
  • Worked on the IBM WebSphere Application Server version upgrade from v8 to v9.
  • Engaged in vulnerability/MS/Linux patching, outage and security upgrade activities.
  • Raised access requests for users in Tivoli Identity Manager / Tivoli Access Manager.
  • Replicated data in the UAT environment to identify the root cause of data issues.
  • Restarted ActiveMQ, ServiceMix, Matching and TLM engines on the TLM app servers whenever required, and validated the corresponding logs to confirm application service availability.
  • Restarted JVM servers from the IBM WAS console and validated the corresponding application logs.
  • Imported new matching rules to Production through Config Transfer (CT) and validated them in SmartStudio.
  • Performed TLM deployments using the TLM Universal Controller (installer) as part of the change management process.
  • Used TLM Control Process Designer to schedule jobs that archive data.
  • Performed health checks for TLM WebConnect, TLM View and SmartRecs as part of daily BAU work.
  • Coordinated with the SmartStream vendor on any issues related to the TLM applications (AFS, ACI, COAC).
  • Resolved FSM enquiries raised by the business for data requests, data issues or data loads.
  • Prepared both WSR and MSR reports for the business.
  • Coordinated with the Production support team to ensure all BAU jobs scheduled in production completed within the stipulated SLA.
  • Provided root-cause analysis for data issues raised by business users.
  • Chaired Disaster Recovery (DR) activities on the TLM servers.
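The restart-and-validate routine above can be sketched as a small post-restart log check. The log format, path and the "Service available" marker here are hypothetical stand-ins, not actual TLM engine output.

```shell
#!/bin/sh
# Sketch of a post-restart service log validation -- illustrative only.
set -e
logdir=$(mktemp -d)

# Simulated engine log written after a service restart.
cat > "$logdir/matching_engine.log" <<'EOF'
2023-05-01 06:00:01 INFO  Matching engine starting
2023-05-01 06:00:05 INFO  Connected to message broker
2023-05-01 06:00:06 INFO  Service available
EOF

# Pass only if the log shows availability and no ERROR/FATAL entries.
check_service_log() {
    log=$1
    if ! grep -q 'Service available' "$log"; then
        echo "DOWN"; return 1
    fi
    if grep -Eq 'ERROR|FATAL' "$log"; then
        echo "DEGRADED"; return 1
    fi
    echo "OK"
}

check_service_log "$logdir/matching_engine.log"
```

A non-zero exit status from the check would be the cue to escalate or retry the restart rather than declare the service available.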

Module Lead

Mindtree
08.2018 - 04.2019

The purpose of the WK CTLS strategic track is to build a master layer by creating a golden copy of records and a centralized DB to provide the best version of truth for all downstream applications.


  • Serving as the data management SME to stakeholders, providing expertise on all data deliverables and resource requirements.
  • Working closely with Support/QA to reconcile client data challenges and provide continuous improvement to the process.
  • Framing data strategy and data governance guidelines and monitoring incoming data for quality and consistency.
  • Performing data validation to ensure data is in sync between the legacy system and the new system.

Application Developer

IBM
02.2015 - 08.2018

The purpose of MCKB is to build a centralized system of customers, equipment, revenues, service, marketplace, print shop and accounts for the Xerox business in order to provide a core platform for the organization. MCKB gives a 360-degree view of Xerox sales & marketing data. The support team ensures smooth operations and timely availability of all reports to the business.


  • Coordinated with the front-end design team to provide them with the necessary stored procedures and packages and the necessary insight into the data.
  • Worked on SQL*Loader to load data from flat files obtained from various facilities every day.
  • Monitored the batch jobs scheduled in cron as part of BAU loading.
  • Received MKS tickets and analyzed them to establish the root cause of data issues found by end users.
  • Involved in deployment activities for both major and dot releases.
  • Wrote UNIX shell scripts to process files daily: renaming files, extracting the date from the file, unzipping files and removing junk characters before loading them into the base tables.
  • Involved in continuous enhancements and fixing of production problems.
  • Created PL/SQL stored procedures, functions and packages for moving the data from staging area to data mart.
  • Created scripts to create new tables, views, queries for new enhancement in the application using TOAD.
  • Created indexes on the tables for faster retrieval of the data to enhance database performance.
  • Debugging of API methods using Soap UI/Eclipse application to find out the database object call to do further analysis to establish the root cause of data issue.
  • Used optimizer hints extensively to steer the optimizer toward an optimal query execution plan.
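The daily file-preparation step described above (unzip, strip junk characters, date-stamp, rename before the SQL*Loader run) might look roughly like the sketch below. The feed name, delimiter and junk-character handling are hypothetical, chosen only to make the example self-contained.

```shell
#!/bin/sh
# Sketch of daily flat-file preparation before a SQL*Loader run --
# illustrative only; real feed names and layouts differ.
set -e
stage=$(mktemp -d)
cd "$stage"

# Simulated inbound feed: zipped flat file with stray carriage returns.
printf 'ACC1|100\r\nACC2|200\r\n' > facility_feed.dat
gzip facility_feed.dat                        # arrives as facility_feed.dat.gz

gunzip facility_feed.dat.gz                   # 1. unzip the inbound file
tr -d '\r' < facility_feed.dat > clean.dat    # 2. strip junk characters
loaddate=$(date +%Y%m%d)                      # 3. stamp with the load date
mv clean.dat "facility_feed_${loaddate}.dat"  # 4. rename for the loader

echo "prepared: facility_feed_${loaddate}.dat"
```

The resulting date-stamped file is what a SQL*Loader control file would then reference as its input.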


The purpose of this project is to support various legacy applications running in the production environment, maintaining claims, contracts, vehicles, freelancers and various ad types for L’Oréal product campaigns, employees working on a regular and contract basis, and addresses, divisions, territories, accounts, products, billings and budgets for L’Oréal cosmetics, in order to provide a legacy application management system for the L’Oréal business.


  • Developed Advance PL/SQL packages, procedures, triggers, functions, Indexes to implement business logic using SQL Navigator. Generated server-side PL/SQL scripts for data manipulation and validation and materialized views for remote instances.
  • Involved in creating UNIX shell scripts. Performed defragmentation of tables, partitioning, compression and indexing for improved performance and efficiency.
  • Involved in table redesigning with implementation of Partitions Table and Partition Indexes to make Database Faster and easier to maintain.
  • Experience in Database Application Development, Query Optimization, Performance Tuning and implementation experience in complete System Development Life Cycle.
  • Performed modifications on existing reports as per change request and maintained it.
  • Used Crystal Reports to track logins, mouse overs, click-through, session durations and demographical comparisons with SQL database of customer information.
  • Worked on SQL*Loader to load data from flat files obtained from various facilities every day. Used standard packages like UTL_FILE, DBMS_SQL and PL/SQL Collections, and used BULK binding when writing database procedures, functions and packages for the front-end module.
  • Used principles of normalization to improve performance. Wrote ETL code in PL/SQL to meet requirements for extraction, transformation, cleansing and loading of data from source to target data structures.
  • Involved in continuous enhancements and fixing of production problems. Designed, implemented and tuned interfaces and batch jobs using PL/SQL. Involved in data replication and high-availability design scenarios with Oracle Streams. Developed UNIX shell scripts to automate repetitive database processes.
  • Used Bulk Collections for better performance and easy retrieval of data, by reducing context switching between SQL and PL/SQL engines.
  • Created PL/SQL scripts to extract the data from the operational database into simple flat text files using UTL_FILE package.
  • Partitioned the fact tables and materialized views to enhance the performance.
  • Created records, tables, collections (nested tables and arrays) for improving Query performance by reducing context switching.
  • Used Pragma Autonomous Transaction to avoid mutating problem in database trigger.
  • Extensively used the advanced features of PL/SQL like Records, Tables, Object types and Dynamic SQL.
  • Handled errors using Exception Handling extensively for the ease of debugging and displaying the error messages in the application.

Systems Engineer

TCS
09.2011 - 01.2015

The purpose of EDW is to build a centralized repository of Customer, Address, Contact, Employee, Product, Asset, Billing, Account, Orders, Service Requests, CDRs (call detail records), Broadband, Wi-Fi, Usage, Incidents, Interactions, Payments and Campaigns for BT Retail in order to provide a central MIS reporting platform for the organization. EDW gives a 360-degree view of BT Retail data. The support team ensures smooth operations and timely availability of all reports to the business.


  • Monitored the batch jobs scheduled in the BMC Control-M workload scheduler.
  • Supported applications on EDW to ensure business-critical reports were available on time and as expected.
  • Understood BT's process flow and operating model as relevant to my roles and responsibilities.
  • Created projects, modules and repositories for upcoming application deployments into production using the OWB client.
  • Generated maps for procedures and packages using the OWB ETL tool; extracted maps and procedures using OMB Plus.
  • E2E knowledge of BT broadband data from various source systems (Phoenix, CISP, WiFi).
  • Performed data-level investigation to understand and analyze faults and advise accordingly.
  • Deployed the RPD in OBIEE for new dashboard/report builds and releases in the production environment.
  • Debugged code in the test environment to identify the root cause of data issues.
  • Interacted with the business on resolving various issues and provided daily progress updates.
  • Ensured that business requirements could be translated into data requirements.
  • Attended calls with end customers/users to understand their requirements in case of issues and, where required, passed them to the concerned team.
  • Deployed new code and bug fixes in accordance with the BT change management process.
  • Ensured mandatory compliance across BT services.
  • Analyzed and troubleshot operational issues.
  • Managed incidents to ensure they were worked within the SLA.
  • Completed customer service requests at the earliest.

Education

Bachelor of Technology - Computer Science And Engineering

BIJU PATNAIK UNIVERSITY OF TECHNOLOGY
04.2001 -

Skills

SmartStream TLM


Certification

IBM Agile Explorer

Training

  • Wipro Project Management Program (WPMP) - Foundation
  • Oracle Cloud Infrastructure Foundation 2021 Associate
  • Oracle Cloud Platform Application Integration 2021 Specialist
  • Oracle Cloud Database Migration and Integration 2021 Specialist
  • Oracle Cloud Infrastructure 2021 Architect Associate
  • AWS Solutions Architect Associate 2022

Timeline

Technical Lead

Wipro
04.2019 - 07.2023

Module Lead

Mindtree
08.2018 - 04.2019

Application Developer

IBM
02.2015 - 08.2018

Systems Engineer

TCS
09.2011 - 01.2015

Bachelor of Technology - Computer Science And Engineering

BIJU PATNAIK UNIVERSITY OF TECHNOLOGY
04.2001 -

Reconciliation Consultant

Capgemini
07.2023 - Current