Currently working as a DevOps Engineer (big data/Hadoop administration) at o9 Solutions, Bangalore, from Feb 2024 to date.
Over 8 years of professional IT experience, including 6 years designing and implementing the systems and infrastructure that support large-scale data collection and analysis.
Experience with Azure administration, Azure DevOps, GitHub, Hadoop, Cloudera, Hortonworks, and Red Hat Enterprise Linux (RHEL) administration in large enterprise environments.
Good troubleshooting skills for monitoring server performance and diagnosing hardware and network-related issues.
Able to handle multiple tasks, manage time well, and work under pressure.
Well organized, with strong problem-solving and communication skills and the ability to work both in teams and independently.
Overview
9 years of professional experience
7 years of post-secondary education
Work History
DevOps Engineer
o9 India Management Pvt. Ltd.
Bangalore
02.2024 - Current
Big Data DevOps Engineer
Arrow Electronics Pvt. Ltd.
Bangalore
10.2021 - 02.2024
Maintaining cluster health and HDFS space for better performance
Moving data efficiently between clusters using DistCp (see the sketch at the end of this section)
Migrating CDH clusters to CDP Private Cloud
Infrastructure as a Service (IaaS) and Azure DevOps operations: responsible for creating, managing, and maintaining the tooling that supports the conception, design, development, implementation, testing, and deployment of new software products.
Exposure to setting up CI/CD pipelines that build and run Terraform jobs to provision infrastructure.
Deep experience with GitHub, build/release management, and the Hadoop and Spark platforms.
Allocating name and space quotas to users in case of space problems
Experience with Azure Databricks clusters.
Responsible for maintaining and administering Azure and Hadoop ecosystems with continuous integration tooling.
Manage and review log files for troubleshooting
Knowledge of HDFS data storage and support for Azure Data Lake
Responsible for commissioning and decommissioning of cluster nodes.
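A minimal sketch of the DistCp copy and quota allocation mentioned above, assuming the hadoop/hdfs CLIs are on the PATH; the cluster URIs, paths, and quota sizes are placeholders, not actual production values:

    import subprocess

    def run(cmd):
        # Run a Hadoop CLI command and raise if it exits non-zero.
        subprocess.run(cmd, check=True)

    # Copy a dataset between clusters with DistCp (source/target URIs are placeholders).
    run(["hadoop", "distcp", "-update", "-skipcrccheck",
         "hdfs://cdh-nn.example.com:8020/data/sales",
         "hdfs://cdp-nn.example.com:8020/data/sales"])

    # Allocate a name quota (max number of files/directories) and a space quota
    # for a user directory that is running out of room.
    run(["hdfs", "dfsadmin", "-setQuota", "1000000", "/user/example_user"])
    run(["hdfs", "dfsadmin", "-setSpaceQuota", "5t", "/user/example_user"])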
Big Data Administrator
Pactolus Solutions Pvt. Ltd.
Bangalore
04.2020 - 06.2021
Installed Apache Hadoop clusters (Dev, QA, and Prod) from scratch
Maintaining cluster health and HDFS space for better performance
Moving data efficiently between clusters using DistCp
Setting up Hadoop quotas
Review the log files for troubleshooting
Involved in cluster planning to set up new clusters
Optimizing ecosystem tool jobs and YARN jobs
Configured a Presto cluster with Hive and PostgreSQL data sources
Commissioning nodes to the cluster and rebalancing the cluster
Configured HA for NameNode, ResourceManager, and HiveServer2 in Apache Hadoop
Installed Apache Ranger in the Hadoop cluster
Configured property files such as core-site.xml, hdfs-site.xml, and mapred-site.xml based on job requirements (see the sketch below).
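A minimal sketch of rendering the NameNode HA settings into hdfs-site.xml; the nameservice name, host names, ports, and JournalNode quorum below are hypothetical placeholders:

    from xml.sax.saxutils import escape

    # Standard HDFS HA properties; values are placeholders for an example cluster.
    ha_properties = {
        "dfs.nameservices": "mycluster",
        "dfs.ha.namenodes.mycluster": "nn1,nn2",
        "dfs.namenode.rpc-address.mycluster.nn1": "master1.example.com:8020",
        "dfs.namenode.rpc-address.mycluster.nn2": "master2.example.com:8020",
        "dfs.namenode.shared.edits.dir":
            "qjournal://jn1.example.com:8485;jn2.example.com:8485;jn3.example.com:8485/mycluster",
        "dfs.client.failover.proxy.provider.mycluster":
            "org.apache.hadoop.hdfs.server.namenode.ha.ConfiguredFailoverProxyProvider",
        "dfs.ha.automatic-failover.enabled": "true",
    }

    # Write the properties in the standard Hadoop configuration XML layout.
    with open("hdfs-site.xml", "w") as f:
        f.write('<?xml version="1.0"?>\n<configuration>\n')
        for name, value in ha_properties.items():
            f.write("  <property>\n")
            f.write(f"    <name>{escape(name)}</name>\n")
            f.write(f"    <value>{escape(value)}</value>\n")
            f.write("  </property>\n")
        f.write("</configuration>\n")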
Big Data Administrator
Photon Infotech
Bangalore
12.2019 - 04.2020
Installed Hadoop clusters for data ingestion through Kafka
Collaborated with multiple teams for design and implementation of Kafka cluster
Maintaining cluster health and HDFS space for better performance
Created Kafka topics and partitions with minimum in-sync replicas configured (see the sketch at the end of this section)
Rebalancing the Hadoop cluster
Involved in minor and major upgrades of Hadoop and the Hadoop ecosystem
Allocating hard and advisory quotas to volumes in case of space problems.
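A minimal sketch of creating a topic with a minimum in-sync replica setting, assuming the kafka-python client; the broker address, topic name, partition count, and replication values are placeholders:

    from kafka.admin import KafkaAdminClient, NewTopic

    admin = KafkaAdminClient(bootstrap_servers="broker1.example.com:9092",
                             client_id="topic-setup")

    # Three replicas with min.insync.replicas=2 lets one broker fail
    # while acks=all producers keep writing.
    topic = NewTopic(
        name="ingest-events",
        num_partitions=6,
        replication_factor=3,
        topic_configs={"min.insync.replicas": "2"},
    )

    admin.create_topics(new_topics=[topic])
    admin.close()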
Big Data Administrator
HCL Technologies
Bangalore
04.2017 - 12.2019
Maintaining cluster health and HDFS space for better performance
Moving data efficiently between clusters using DistCp
Migrating CDH clusters to CDP Private Cloud
Setting up Hadoop quotas and rebalancing Hadoop clusters
Managing alerts from Cloudera Manager
Allocating name and space quotas to users in case of space problems
Experience with Azure Databricks clusters
Define Hadoop cluster configurations and specifications as code to ensure consistency and reproducibility across environments
Manage and review log files for troubleshooting
Configured backup and disaster recovery
Knowledge of HDFS data storage and support for Azure Data Lake
Using Git to manage configuration files
Responsible for commissioning and decommissioning of cluster nodes (see the sketch below).
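A minimal sketch of decommissioning a worker node, assuming the exclude file path matches what dfs.hosts.exclude and yarn.resourcemanager.nodes.exclude-path point to; the path and host name are placeholders:

    import subprocess

    EXCLUDE_FILE = "/etc/hadoop/conf/dfs.exclude"    # assumed path; must match dfs.hosts.exclude
    NODE_TO_REMOVE = "worker05.example.com"          # placeholder host name

    # Add the host to the exclude list so HDFS and YARN stop using it.
    with open(EXCLUDE_FILE, "a") as f:
        f.write(NODE_TO_REMOVE + "\n")

    # Ask the NameNode and ResourceManager to re-read the include/exclude lists;
    # HDFS then re-replicates the node's blocks before marking it decommissioned.
    subprocess.run(["hdfs", "dfsadmin", "-refreshNodes"], check=True)
    subprocess.run(["yarn", "rmadmin", "-refreshNodes"], check=True)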
Database Administrator
Tata Technologies
Mumbai
10.2015 - 12.2016
Backing up databases and restoring them from source to target servers
Moving data using the db2move, EXPORT, IMPORT, and LOAD utilities (see the sketch at the end of this section)
Performing database maintenance utilities such as REORG and RUNSTATS
Experience in maintaining databases, performance testing, and troubleshooting
Implementing HADR
Checking file system utilization
Adding tablespace containers to DMS tablespaces
Managing database user privileges (granting and revoking permissions)
Generating DDL using db2look from one database to create the same objects on a different server
Providing 24x7 on-call DBA support for the production environment.
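A minimal sketch of the data movement, DDL extraction, and maintenance utilities above, assuming the DB2 command-line environment is available; the database (SAMPLE), schema, and table names are placeholders:

    import subprocess

    def run(cmd):
        # Run a DB2 command-line utility and raise if it exits non-zero.
        subprocess.run(cmd, check=True)

    # Export all table data from the source database into files for loading on the target.
    run(["db2move", "SAMPLE", "export"])

    # Extract DDL so the same objects can be recreated on a different server.
    run(["db2look", "-d", "SAMPLE", "-e", "-o", "sample_ddl.sql"])

    # Routine maintenance in one CLP session: connect, refresh statistics, reorganize a table.
    with open("maintenance.sql", "w") as f:
        f.write("CONNECT TO SAMPLE;\n")
        f.write("RUNSTATS ON TABLE DB2ADMIN.ORDERS WITH DISTRIBUTION AND DETAILED INDEXES ALL;\n")
        f.write("REORG TABLE DB2ADMIN.ORDERS;\n")
        f.write("CONNECT RESET;\n")
    run(["db2", "-tvf", "maintenance.sql"])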
Education
B. Tech - Electronics and Communication Engineering
Business Associate & MIS Executive at Altruist Customer Management India Pvt. Ltd.