Monu Kumar

Data Engineer
Shamli

Summary

Seeking an opportunity to build my career in an organization where I can apply all my diligence and knowledge to achieve both personal and organizational goals.


  • Profile Summary:

Dynamic IT professional with 4 years of experience and a documented track record of success, including 3 years 3 months as a Big Data Engineer.

  • Experience with Big Data and Hadoop ecosystem components such as HDFS, Spark with Scala, MapReduce, Sqoop, and Hive
  • Loaded and transformed large sets of structured and semi-structured data between Relational Database Systems and HDFS using the Sqoop tool
  • Designed and created Hive external tables, applying partitioning and bucketing techniques in Hive using Hive Query Language (HQL)
  • Experience in varied business processes including Production, Planning, and Sourcing
  • Experience using ORC, Avro, and Parquet file formats
  • Expertise in writing SQL queries to extract data from RDBMS databases (MySQL and Oracle)
  • Experience importing data from RDBMS databases (MySQL and Oracle) into the Hadoop Data Lake using Sqoop
  • Ran ETL flows orchestrated with Airflow
  • Experience extracting structured data from BigQuery (BQ) into GCS buckets using Sqoop and triggering the ETL flow (DAG) with Airflow
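The Sqoop-based RDBMS-to-Hadoop imports listed above might be invoked roughly as in the sketch below; the connection string, table, and paths are hypothetical placeholders, not the project's actual configuration:

```python
# Sketch of assembling a Sqoop import command of the kind described
# above (RDBMS -> Hadoop data lake, Parquet output). All names here
# (database host, table, target directory) are hypothetical.

def build_sqoop_import(jdbc_url, table, target_dir):
    """Return the argv for a basic Sqoop import writing Parquet files."""
    return [
        "sqoop", "import",
        "--connect", jdbc_url,       # e.g. a MySQL or Oracle JDBC URL
        "--table", table,            # source RDBMS table
        "--target-dir", target_dir,  # HDFS landing directory
        "--as-parquetfile",          # one of the formats used (ORC/Avro/Parquet)
    ]

cmd = build_sqoop_import(
    "jdbc:mysql://db-host:3306/sales",  # hypothetical source database
    "orders",
    "/data/landing/orders",
)
# subprocess.run(cmd, check=True) would execute it on an edge node
```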

Overview

5 years of professional experience
5 years of post-secondary education
1 language

Work History

Engineer-Technology

Iris Software Technologies Private Limited
Noida
02.2023 - Current

Big Data Developer

Future Focus Infotech Pvt Ltd
New Delhi
02.2022 - 01.2023
  • Description of the project: Covers all Canada-market functionality
  • Migrated workflows to the GCP cloud using the organization's internal framework
  • Performed validations per client requirements and prepared validation sheets
  • Business Domain/Function: Finance, Supply Chain, and Merchandising domains
  • Environment/Technology Stack: Sqoop, Hive, Spark-SQL, Git, JIRA, Confluence
  • Project Role: Hadoop-to-GCP data migration
  • Experience importing and exporting data between RDBMS and HDFS/GCS buckets using Sqoop and an extraction wrapper
  • Extensive experience with Hadoop Distributed File System (HDFS) commands
  • Used Sqoop to load batch and other live input data into HDFS
  • Processed data in HDFS using extensive Hive tables
  • Experience using ORC and Avro file formats
  • Created and ran Sqoop jobs with incremental loads in last-modified mode to extract data into HDFS
  • Created merge jobs to merge data already in HDFS/GCS buckets with newly extracted data before loading into Hive
  • Strong understanding of partitioning and bucketing concepts in Hive; designed both managed and external tables to optimize performance
  • Optimized existing algorithms in Hadoop using Spark Context, Spark-SQL, DataFrames, and pair RDDs
  • Developed Spark scripts using Spark-shell commands as per requirements
  • Experienced in performance tuning of Spark applications: setting the right batch interval time, the correct level of parallelism, and memory tuning
  • Used the Spark API over Cloudera Hadoop YARN to perform analytics on data in Hive
  • Experience extracting structured data from BigQuery (BQ) into GCS buckets using Sqoop/conf files and running the ETL flow (DAG) with Airflow
  • Accomplishments/Results: Received multiple client appreciations.
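The merge jobs mentioned above (combining data already landed in HDFS/GCS with a fresh incremental extract before the Hive load) follow a newest-record-wins pattern. A minimal pure-Python sketch, with hypothetical field names:

```python
# Sketch of the merge step: keep the newest version of each key when
# combining the previously landed batch with a new incremental extract.
# The field names ("id", "updated_at", "status") are hypothetical.

def merge_records(old_rows, new_rows, key="id", ts="updated_at"):
    """Merge two batches of dict records; the newest timestamp wins per key."""
    merged = {}
    for row in list(old_rows) + list(new_rows):
        k = row[key]
        if k not in merged or row[ts] > merged[k][ts]:
            merged[k] = row
    return sorted(merged.values(), key=lambda r: r[key])

old = [{"id": 1, "updated_at": "2022-05-01", "status": "open"},
       {"id": 2, "updated_at": "2022-05-01", "status": "open"}]
new = [{"id": 2, "updated_at": "2022-06-01", "status": "closed"}]
result = merge_records(old, new)  # id 2 now carries the newer record
```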

Data Engineer/DTE

LinkQuest Technologies Pvt Ltd
Noida
12.2019 - 10.2021
  • Description of the project: An existing application requiring development of new workflows and updates to existing workflows per client requests
  • Business Domain/Function: Finance domain
  • Environment/Tools: Sqoop, Hive, Spark-SQL, Git, JIRA, Confluence
  • Loaded and transformed large sets of structured and semi-structured data between Relational Database Systems and HDFS using the Sqoop tool
  • Designed and created Hive external tables, applying partitioning and bucketing techniques in Hive using Hive Query Language (HQL)
  • Experience in varied business processes including Production, Planning, and Sourcing
  • Experience using ORC, Avro, and Parquet file formats
  • Expertise in writing SQL queries to extract data from RDBMS databases (MySQL, MSSQL, BigQuery, Mainframe, DB2, and Oracle)
  • Experience importing data from RDBMS databases (MySQL and Oracle) into the Hadoop Data Lake using Sqoop
  • Ran ETL flows orchestrated with Airflow
  • Experience extracting structured data from BigQuery (BQ) into GCS buckets using Sqoop and triggering the ETL flow (DAG) with Airflow
  • Accomplishments/Results: Received multiple client appreciations.
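The Hive partitioning and bucketing mentioned above can be pictured as a two-level layout: a partition directory per date, and a fixed number of bucket files inside it. The sketch below only illustrates the idea; Hive's actual hash function differs, and the column names are hypothetical:

```python
# Illustration of Hive-style partitioning (by date) and bucketing
# (hashing a clustering column into a fixed number of buckets).
# Hive computes hash(col) % num_buckets; plain modulo stands in here.

NUM_BUCKETS = 4

def bucket_for(customer_id: int) -> int:
    """Assign a row to one of NUM_BUCKETS buckets."""
    return customer_id % NUM_BUCKETS

rows = [{"customer_id": c, "dt": "2020-01-01"} for c in range(10)]
layout = {}
for row in rows:
    # partition directory first, then the bucket file inside it
    path = f"dt={row['dt']}/bucket_{bucket_for(row['customer_id'])}"
    layout.setdefault(path, []).append(row["customer_id"])
```

Because queries that filter on the partition column or join on the bucketed column can skip whole directories or files, this layout is what makes the "optimize performance" claim above concrete.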

Telecom Engineer

Integrated Wireless Solutions
Noida
09.2018 - 03.2019
  • Description of the project: KPI monitoring based on defined thresholds
  • Compared KPIs against required values, prepared reports, and sent them to the client for validation
  • Function: Telecom domain
  • Processed logs as they arrived on the server
  • Involved in requirements analysis and testing
  • Added new upgrades and enhancements to the application
  • Used the AZQ tool for error handling and log tracing in the application
  • Used master pages, themes, and skins to give a consistent look and feel throughout the application
  • Participated in troubleshooting, fixes, and production moves
  • Accomplishments/Results: Received appreciation from client.

Education

B.Tech - Electrical, Electronics And Communications Engineering

IIET College
Samani Kurukshetra Haryana
08.2014 - 06.2018

High School Diploma

DAV Inter College Un UP Board
Un Shamli
08.2013 - 06.2014

S.S.C.

CCSSH School Un UP Board
Un, Shamli
08.2011 - 04.2012

Skills

Strategy Planning and Implementation, Minimizing Performance Bottlenecks, Quality Assurance

Additional Information

  • Postal Address: Village Rera Majara Harsana, Post Toda, Dist. Shamli, UP 247778

Software

Hadoop, Sqoop, Spark, Hive, PySpark, Airflow, Jira, Git
