Abhishek Kumar Tiwari

Pune

Summary

Technical Lead with extensive experience in big data solutions and fraud detection at LTIMindtree. Expertise in Hadoop, SparkSQL, and data migration strategies, driving operational efficiency. Successfully optimized batch processes for digital transformation in the banking sector. Currently focused on Azure services, Power BI, and JARVIS Dashboard implementation, with an emphasis on telemetry data architecture.

Overview

15 years of professional experience

Work History

Technical Lead LTIMindtree

Microsoft
Pune
04.2024 - Current

  • Created EV2 environment, completing setup and rollout processes.
  • Executed Python scripts within the Azure Synapse CI/CD pipeline.
  • Managed SQL pool operations, including query execution, Spark pool creation, and notebook development.
  • Deployed and executed Python and Linux shell scripts via the shell extension on the EV2 portal.
  • Developed the App registration process in Azure services to facilitate service management.
  • Transitioned applications to new telemetry 360 certificates, enhancing coverage across clusters.
  • Implemented Power BI dashboard modifications for CPER and cluster status, publishing updates to designated workspaces.

Technical Lead LTIMindtree

ABSA Bank
Pune
07.2022 - 03.2024
  • Led technical initiatives for ARO insurance entities in the banking sector, enhancing product offerings for retail and business clients.
  • Managed distribution channels including Direct, Broker, and Agent to optimize product reach.
  • Oversaw integration of KIT systems for life insurance products, addressing manual processes with Excel spreadsheets.
  • Facilitated administration and reporting improvements for complex product management workflows.
  • Directed ingestion and migration efforts for Project Zafira, ensuring seamless implementation.
  • Engaged in client-facing role while balancing functional and technical responsibilities throughout project lifecycle.
  • Implemented Azentio Suite Group Life module, streamlining underwriting, claims, reinsurance, and financial functions.
  • Supported transition to digital strategies by modernizing insurance product delivery mechanisms.

Tech Lead and Developer TCS

Morgan Stanley
Pune
01.2022 - 07.2022
  • Worked on batch optimization for Hadoop and Spark scripts: analyzed successor and predecessor dependencies across jobs, identified long-running jobs, and optimized batches to meet SLAs.
  • Project: Batch Optimization
  • Customer: Morgan Stanley

Hadoop and Spark developer TCS

PNC
Pune
12.2017 - 12.2021
  • Developed Hadoop and Spark solutions for fraud detection in transactional data from Mentas source system.
  • Analyzed customer transactions in Cash Vault to identify trends displayed in Tableau.
  • Investigated suspicious transactions in CCOP, addressing credit card overpayment issues.
  • Executed HTCC analysis to detect human trafficking through transaction logic checks.
  • Created new databases for Data Segregation, providing restricted data access to third parties.
  • Optimized SparkSQL framework to improve job performance and efficiency.
  • Collaborated with Agile team on portfolio management and balance reconciliation projects.
  • Engaged with clients in both functional and technical capacities during project delivery.

Hadoop Developer TCS

PNC
Pune
06.2015 - 11.2017
  • Involved in development and post-production support of Nielsen's digital products, including Digital Ad Rating (DAR) US, Trackstar, Watch by Warehouse (WBW), and MediaView. WBW was a newly developed product, while DAR US was a conversion from a legacy system (Netezza to Hadoop). DAR China and WBW were developed under a waterfall approach; DAR US was completed under Agile methodology. DAR provides aggregated trend analysis of advertisements posted on the web.
  • Trackstar is the registration and master data warehouse for campaign owners; the Trackstar UI manages all campaign details, e.g., brand, market, and demography. Files received from Facebook and census sources are processed in Hadoop and loaded into the appropriate tables.
  • MediaView hosts the aggregated reports, which the in-memory database Hazelcast publishes to end users.
  • Wrote Impala/Hive query scripts for different job sequences.
  • Analyzed data and code related to census or Facebook impressions whenever a day's trend showed a mismatch affecting end users.
  • Wrote Linux shell scripts for functions such as prompting other databases and Hazelcast servers.
  • Developed shell scripts and crontab entries where an Oozie workflow could not be run.
  • Converted Netezza queries into Impala/Hive queries.
  • Developed prompting-system refresh queries.
  • Developed Impala DQA queries to filter out bad data before release.
  • Wrote Oozie workflows to schedule Hadoop jobs.
  • Maintained the development cluster for trainees and POC development.
  • Resolved Facebook file issues, such as bad characters or junk values in campaign names that caused daily processing failures.
  • Analyzed history-server logs to diagnose and resolve issues.
  • Handled issues raised by clients and external customers, e.g., failures populating the latest data in the history table by load ID, current timestamp, or processing flag.
  • Extracted impression reports (census, Facebook) per campaign on client request.
  • Loaded universal estimate factors and adjustment factors into Hadoop tables and performed validations.
  • Resolved server space issues, clearing space so jobs could run without failure.
  • Updated the Impala node (dayrhewbwp009.enterprisenet.org) in case of failure, connected to the updated node, and tested queries in Hue.
  • Resolved MediaView issues when clients could not see their data.
  • Monitored jobs (hold, cancel, resume, and order) in the ops console and fixed failures.
  • Handled job eventing and job scheduling.
  • Project: Digital Ad Rating US
  • Customer: Nielsen

Verification and BI Engineer TCS

INFOR
Pune
03.2013 - 05.2015
  • Maintained data warehouse for Infor customers using various products, enabling data retrieval for performance management activities.
  • Developed data warehouse solutions to address complex queries from multiple data sources including Access, CSV, and SQL.
  • Verified Infor product functionality on Lawson S3 and Landmark systems.
  • Logged discrepancies between Lawson S3 and Landmark in Jira for tracking and resolution.
  • Conducted retesting of issues upon new build releases to ensure quality compliance.
  • Performed system testing to validate ERP process flows within finance operations.
  • Facilitated training sessions to enhance resource skills and knowledge.
  • Engaged with clients regularly to maintain strong communication and collaboration.

Test Engineer TCS

General Electric
Pune
06.2012 - 02.2013
  • Conducted detailed analysis of project specifications to ensure accurate requirements gathering.
  • Prepared comprehensive requirements and design documents based on specifications.
  • Executed GUI testing of locomotive screens for functionality verification.
  • Developed test plans aligned with specifications and source code.
  • Performed thorough execution of test plans, focusing on GUI testing effectiveness.
  • Migrated Tilcon Screen architecture to Graphical framework architecture, enhancing performance.
  • Verified graphical framework screens against original specifications to ensure compliance.

Test Engineer TCS

NISSAN
Pune
05.2010 - 05.2012
  • Enhanced battery life for electric and hybrid vehicles through optimized BMS design.
  • Conducted cell balancing to improve efficiency and shelf life of battery packs.
  • Designed firmware for Lithium-ion battery controller ECU using embedded C language.
  • Executed testing on development board and previously developed battery simulator.
  • Implemented communication protocols (CAN, LIN) for ECU and inter-cell data exchange.
  • Utilized DDT Tool for precise fault detection and diagnostics within the system.
  • Performed detailed module specification analysis and created design flow diagrams.
  • Developed comprehensive test plans, executed test cases, and generated reports.

Education

Bachelor of Technology - ECE

Cochin University Of Science And Technology
Kochi
07-2009

Skills

  • Unix and Apache Hadoop
  • Spark SQL and Hive
  • HDFS and Hue
  • Cloudera and Impala
  • Oozie and Sqoop
  • Python programming
  • Jira and MediaView tools
  • Big data expertise
  • Data migration strategies
  • Batch optimization techniques
  • Azure Services
  • Power BI
  • Azure DevOps

Languages

English, Hindi

Experience Details

  • Current organisation: TCS
  • Full name - Abhishek Kumar Tiwari
  • Total Experience – 12 Yrs
  • Relevant Exp in Big data – 7 Yrs
  • Relevant Exp in (Spark): 4 yr (pyspark)
  • Highest education: BTECH

Hobbies and Interests

Listening to music, watching movies, swimming, riding, reading novels

Personal strengths

  • Optimistic
  • Good at working in a group
  • Adaptive to different work environments
  • Team building

Disclaimer

I hereby declare that the above information is true to the best of my knowledge and belief.

Place: Pune

Timeline

Technical Lead LTIMindtree

Microsoft
04.2024 - Current

Technical Lead LTIMindtree

ABSA Bank
07.2022 - 03.2024

Tech Lead and Developer TCS

Morgan Stanley
01.2022 - 07.2022

Hadoop and Spark developer TCS

PNC
12.2017 - 12.2021

Hadoop Developer TCS

PNC
06.2015 - 11.2017

Verification and BI Engineer TCS

INFOR
03.2013 - 05.2015

Test Engineer TCS

General Electric
06.2012 - 02.2013

Test Engineer TCS

NISSAN
05.2010 - 05.2012

Bachelor of Technology - ECE

Cochin University Of Science And Technology