
Nikhil Valsaraj

Bangalore

Summary

To associate with a progressive organization that offers me opportunities to share my knowledge, enhance my skills, and develop my career, while being part of a team that contributes dynamically to the organization's growth.

Possess strong skills in database management systems, Big Data processing frameworks, data modeling, and warehousing. Have successfully led teams in creating innovative data solutions that improve system efficiency and business decision-making. Demonstrated impact through enhanced data availability and accuracy in previous roles.

Overview

11 years of professional experience

Work History

Senior Data Engineer

Publicis Sapient Pvt. Ltd.
Bangalore
10.2021 - Current

Data Engineer

IBM India Pvt. Ltd.
Bangalore
07.2018 - 09.2021

System Engineer

IBM India Pvt. Ltd.
Bangalore
07.2014 - 06.2018

Education

B.Tech - Electrical and Electronics

Amrita School of Engineering
01.2013

Intermediate

Veda Vyasa Vidyalayam, Calicut
01.2009

Matriculation

Amrita Vidyalayam, Calicut
01.2007

Skills

  • Spark
  • GCP
  • Core Java
  • Dynatrace
  • BigQuery
  • PySpark
  • Hive
  • Spanner
  • Python
  • DBT
  • Scala
  • Jenkins
  • MS Excel
  • Pandas
  • Airflow
  • Apache Beam
  • SQL
  • Kubernetes basics
  • GitHub
  • Dataflow
  • GCS
  • NumPy
  • Matplotlib

Accomplishments

  • Received the Manager's Choice Award several times for timely delivery.
  • GCP Certified Professional Cloud Architect.
  • GCP Certified Associate Cloud Engineer.
  • Received the Deep Skill Award for learning new skills and applying them in the project.
  • Received appreciation from clients and leads for my deliverables.
  • Certified in Microsoft Azure Fundamentals.
  • AWS Certified Cloud Practitioner.

Projects Summary

  • Lloyds Risk - Credit Decisioning Tool (02/2023 - Current): The project follows an Agile delivery model tuned for Big Data development. We work with the Lloyds Risk team on a credit decisioning tool: credit-decision tables held in Spanner on Google Cloud are parsed, mapped, and otherwise transformed, then written to GCP storage services such as GCS, Spanner, and BigQuery, where external applications consume the output. Stream and batch jobs are built on Apache Beam (Dataflow on GCP) and deployed via CI/CD to a Kubernetes cluster on GCP. We developed a streaming Dataflow pipeline that reads from a Spanner change stream and transforms the data continuously, and we use Data Build Tool (DBT) to move data from the raw layer to the staging layer in BigQuery. Environment: Apache Beam, GCP Dataflow, DBT, Spanner, Spanner change streams, GCS, BigQuery, Jenkins, Java, Liquibase, Kubernetes Engine. Tools: Spanner, GCS, BigQuery, IntelliJ, GCP Cloud Shell. Responsibilities: developing batch and streaming Dataflow jobs to extract, transform, and write data from different sources; implementing error-handling scenarios for Dataflow jobs; code optimization; mapping and parsing; deploying code to different environments; active participation in iteration planning, daily scrum, retrospective, and demo meetings.
  • Lloyds Consumer Servicing (11/2021 - 02/2023): The project follows an Agile delivery model tuned for Big Data development. We work with the Lloyds Consumer Servicing team to develop and maintain the notifications customers receive. Notification types include push and hybrid, covering incoming, outgoing, and credit card transactions. We also develop smart alerts for customers who opt in to specific notifications, with both real-time and batch transformations running on Spark. The data comprises customer data, transaction data, customer preferences, and device information, each stored in dedicated databases. We apply transformations and load the data into Hive, then into HBase after further transformations, before sending it to the external API responsible for delivering notifications to customers. Environment: Apache Spark, Scala, Java, Kafka, Hadoop, Hive, HBase, Jenkins, Git, Python, BDD, Cucumber. Tools: Hive, HBase, HDFS, DB2, IntelliJ, PyCharm, PuTTY, WinSCP, CyberArk, Splunk. Responsibilities: data analysis (working schema data validation), coding, and unit testing; developing alerts based on business requirements; code optimization; BDD development as part of functional testing; running code on the cluster; active participation in iteration planning, daily scrum, retrospective, and demo meetings.
  • IBM Enterprise Performance Management (06/2018 - 09/2021): The project follows an Agile delivery model tuned for Big Data development, targeting an IBM Cloud model built with Spark/Scala/Python. The data comprises finance, labor, and similar domains organized into dimensions and facts. We apply transformations and load the data into an S3 bucket in Parquet format. Dimension and fact data are then read, joined through further transformations, and loaded into a cloud database. The final enriched data is stored in IBM Cloud Storage and used to generate reports in Cognos. Data is validated at every phase as it moves between sources, and source and target data are compared with a data validation tool. The entire flow is automated with Airflow DAGs. Also involved in production support: monitoring the master DAG in Airflow and fixing issues. Environment: Apache Spark, Scala, PySpark, Hadoop, Airflow, Docker, Kubernetes, S3, Jenkins, Git, Python, Cognos. Tools: S3, DB2, IBM Cloud Storage, IntelliJ, PyCharm, DBeaver, Cyberduck. Responsibilities: data analysis (working schema data validation), coding, and unit testing; Spark/Scala/Python code to read dimension data from the source system (DB2) and write it to the cloud model after transformation; extensive implementation of data quality rules per business requirements; code optimization; Scala coding for data validation; running code on the cluster; data analysis in MS Excel; active participation in iteration planning, daily scrum, retrospective, and demo meetings; building Cognos dashboards for visualization.
  • STELLA COMMERCE - CCE (09/2014 - 05/2018): The Common Commerce Engine (CCE) is based on the WebSphere Commerce Business Engine, written in Java. CCE provides a web and telephone sales platform with catalog browsing, add-to-cart, checkout, and customer satisfaction survey functions. Initially part of the IBM-Lenovo separation activities (Partsportal), building a separate site for Lenovo; also part of migrating CCE from WCS 6 to WCS 7 (project Stella). Worked on the IBM Employee Purchase Program, an e-commerce site selling electronic goods to IBM employees, friends, family, and contractors in the US and Canada. Stella follows Agile methodology, so familiar with daily scrums, creating user stories and tasks, refinement calls, and Kanban boards. Environment: Java, Jenkins, Eclipse, MapReduce, WCS. Roles: WCS developer, Java developer, CMC, WAS admin console; responsible for overall technical delivery of the project; mentoring juniors.
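For illustration, the parse-and-map step applied to Spanner change-stream records (in the credit decisioning project) can be sketched in plain Python. All field names here are hypothetical, and in the real pipeline this logic would live inside an Apache Beam DoFn running on Dataflow:

```python
import json

def map_change_record(raw: str) -> dict:
    """Parse a change-stream record and map it to a target schema.

    Field names (accountId, creditDecision, riskScore, commitTimestamp)
    are illustrative only; the production schema is project-specific.
    """
    record = json.loads(raw)
    return {
        "account_id": record["accountId"],
        "decision": record["creditDecision"].upper(),
        "score": float(record["riskScore"]),
        "commit_ts": record["commitTimestamp"],
    }

# Example change-stream payload (hypothetical values)
sample = json.dumps({
    "accountId": "A-1001",
    "creditDecision": "approve",
    "riskScore": "712.5",
    "commitTimestamp": "2023-02-01T10:15:00Z",
})
row = map_change_record(sample)
print(row["decision"])  # APPROVE
```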
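The smart-alert logic in the consumer servicing project can likewise be sketched as a simple decision function. Transaction types, preference keys, and the threshold below are assumptions for illustration, not the production rules (which ran as Spark transformations):

```python
from typing import Optional

def smart_alert(txn: dict, prefs: dict) -> Optional[str]:
    """Return the notification type a transaction should trigger, if any.

    'credit_card', 'incoming', 'outgoing', 'alerts_enabled', and
    'cc_threshold' are hypothetical names for this sketch.
    """
    if not prefs.get("alerts_enabled", False):
        return None  # customer has not opted in
    if txn["type"] == "credit_card" and txn["amount"] >= prefs.get("cc_threshold", 100.0):
        return "push"
    if txn["type"] in ("incoming", "outgoing"):
        return "hybrid"
    return None

# A large card transaction for an opted-in customer triggers a push alert
print(smart_alert({"type": "credit_card", "amount": 250.0},
                  {"alerts_enabled": True}))  # push
```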
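The extract-transform-validate-load flow automated with Airflow DAGs in the EPM project can be sketched with the standard library's `graphlib` (a stand-in for Airflow's dependency ordering; the task bodies and sample rows are hypothetical):

```python
from graphlib import TopologicalSorter

# Toy task bodies standing in for the real Spark/Airflow tasks
def extract():
    return [{"dim": "labor", "amount": 10}, {"dim": "finance", "amount": 20}]

def transform(rows):
    return [{**r, "amount": r["amount"] * 2} for r in rows]

def validate(rows):
    assert all(r["amount"] > 0 for r in rows)  # simple data-quality rule
    return rows

def load(rows):
    return len(rows)  # pretend to load; report row count

# Task dependencies, expressed as node -> set of predecessors
order = list(TopologicalSorter({
    "transform": {"extract"},
    "validate": {"transform"},
    "load": {"validate"},
}).static_order())

rows = validate(transform(extract()))
loaded = load(rows)
print(order)  # ['extract', 'transform', 'validate', 'load']
```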

Signature

NIKHIL VALSARAJ

Languages

  • English
  • Malayalam
  • Tamil
  • Hindi

Disclaimer

I hereby affirm that all particulars mentioned in this document are true to the best of my knowledge and belief.

Professional skills summary

Result-oriented and competent professional with 9.11 years of comprehensive experience and knowledge of software engineering. GCP Certified Professional Cloud Architect (09/2022-09/2024) and GCP Associate Cloud Engineer (11/2023-11/2025). Experienced in the Agile software development methodology. Worked in the banking, IT enablement, and e-commerce domains. Track record of consistently meeting goals and delivering a high level of job performance. Excellent communicator with a talent for problem solving and the ability to handle multiple functions and activities in high-pressure environments with tight deadlines. Motivated and goal-driven with a strong work ethic, continuously striving for improvement, with excellent administrative aptitude, an eye for detail, and a commitment to quality work. Worked in various phases of the software development life cycle, including requirement analysis, development, integration, implementation, deployment, and testing. A strong team player, ensuring overall productivity stays high and goals are met on time.

Personal Information

  • Place of Birth: Kerala, India
  • Date of Birth: 01/31/1991
  • Nationality: Indian
  • Marital Status: Married
