Mehul Patel

Pune

Summary

IT professional with 8.5+ years of experience, including 3+ years specializing in DevOps, cloud platforms, infrastructure automation, Infrastructure as Code (IaC), and AI/ML-powered applications. Proficient in creating CI/CD pipelines, managing cloud infrastructure, and implementing automation for efficient and reliable deployments. Adept at integrating LLM applications, containerization, and cloud-native services. Skilled in using GitHub Actions, Jenkins, and TeamCity for Continuous Integration (CI) pipelines. Worked with cloud platforms such as GCP and AWS, with a solid understanding of Azure. Previously specialized as a Big Data Administrator, managing large-scale data systems with expertise in the Hadoop framework and a fair understanding of Big Data concepts, cloud computing, and Linux administration. Experienced in using Terraform, Jenkins, Docker, and cloud platforms (GCP, AWS, Azure) to streamline development and operations. Good understanding of Kubernetes, with experience in containerization, deployment strategies, and troubleshooting in cloud environments. Proficient in Unix shell scripting and Ansible for task automation and configuration management, and skilled in troubleshooting infrastructure-related issues. Effective communicator with strong analytical and coordination skills.

Overview

10 years of professional experience
2 Certifications

Work History

Senior Engineering Lead

Persistent Systems Limited
09.2021 - Current
  • Client: Charles Schwab (U.S.)
  • Designed and implemented multi-region Cloud Run deployments using Terraform
  • Automated CI/CD pipelines with GitHub Actions, reducing deployment times
  • Working knowledge of Kubernetes for container orchestration
  • Integrated and deployed GenAI Hub, an AI-driven application leveraging multiple LLM models and an evaluation framework hosted on GCP
  • Configured Cloud Composer and Dataflow for job orchestration and processing, storing results in BigQuery datasets to enhance prompt fine-tuning for evaluation frameworks.
  • Managed big data infrastructure clusters and deployed services across all environments (DEV, TEST, UAT, PROD)
  • Managed cloud resources for multiple environments (Dev, QA, Preprod, Prod) in GCP with a focus on cost optimization and performance
  • Implemented Docker-based microservices architecture
  • Integrated security practices into CI/CD pipelines, implementing vulnerability scanning and policy compliance checks
  • Managed environment secrets using Google Secret Manager for secure application deployments

Associate

Cognizant Technology Solutions
09.2018 - 09.2021
  • Client: Discover Financial Services (U.S.)
  • Managed Hadoop clusters in the Development, Pre-Production, and Production environments
  • Initiated and administered major compactions for HBase tables, monitored through Grafana
  • Troubleshot and analyzed Hadoop cluster/component failures and job failures
  • Managed the underlying file system of all Hadoop servers across environments
  • Set up Ranger policies and access control for Kafka, Hive, HDFS, and HBase
  • Worked closely with the Cloudera vendor on various Hadoop infrastructure-level cases
  • Provisioned resources using Terraform on the GCP platform for application deployments from scratch
  • Analyzed log files to identify resolutions and root causes
  • Technology/Skills: Cloudera, Hortonworks, Kafka, Kerberos, CDH, HDFS, Hue, Impala, YARN, Hive, HQL, SQL, Spark, Zookeeper, Ranger, Sentry

Associate

Defacto Veritas Pvt. Ltd.
05.2015 - 08.2018
  • Monitored and managed Hadoop clusters in the Development, Pre-Production, and Production environments
  • Worked with software developers to investigate problems and make changes to the Hadoop environment and associated applications
  • Conducted POCs and deployed a Hadoop cluster architecture with high availability using AWS cloud services
  • Worked with multiple project teams to understand infrastructure requirements and set up environments on a shared multi-cluster platform
  • Technology/Skills: Hortonworks, Cloudera, CDH, HDFS, YARN, Hive, Zookeeper, Sentry, Kerberos, AWS, S3, EC2

Education

Bachelor of Engineering - Electronics & Telecommunications

Pune University
Pune

Skills

Cloud & DevOps: GCP, Azure, AWS, Terraform, ARM Templates, Kubernetes, Docker
Serverless & PaaS Services: Cloud Run, Azure Functions, AWS Lambda
Infrastructure as Code (IaC): Terraform, ARM Templates, CloudFormation
CI/CD Pipelines: GitHub Actions, Jenkins, Bamboo
Monitoring & Logging: Prometheus, Grafana, ELK Stack
Programming & Automation: Bash, Ansible, Python
LLM & AI Technologies: Azure OpenAI, LangChain, LlamaIndex

Certification

Google Cloud Certified Professional Cloud Architect, 03-2024
IBM Cognitive Course Certificate in Big Data, 08-2018

Personal Information

Date of Birth: 12/06/93

Timeline

Google Cloud Certified Professional Cloud Architect

03-2024

Senior Engineering Lead

Persistent Systems Limited
09.2021 - Current

Associate

Cognizant Technology Solutions
09.2018 - 09.2021
IBM Cognitive Course Certificate in Big Data
08-2018

Associate

Defacto Veritas Pvt. Ltd.
05.2015 - 08.2018

Bachelor of Engineering - Electronics & Telecommunications

Pune University