DevOps Engineer with 9+ years of experience automating CI/CD pipelines, infrastructure provisioning, and
configuration management using Terraform, Ansible, Helm, AWS Lambda, and Python. Proficient with GitHub, Bitbucket,
and GitLab for source control and code-review workflows. Led cloud migrations to AWS and GCP, delivering cost savings and
scalability improvements. Implemented monitoring and logging with Prometheus, the ELK stack, and Loki. Skilled in leveraging cloud
technologies, containerization (Kubernetes), Argo CD, and Istio to build scalable, reliable solutions. Strong collaboration and
problem-solving skills, dedicated to continuous improvement and innovation.
Project-1
Paytm, Payment Gateway 2.0
Overview:
Paytm Payment Gateway 2.0 is an advanced version of Paytm’s existing payment gateway services, designed to handle a
larger volume of transactions, provide better security, and offer a seamless experience to both merchants and customers.
Technologies: AWS Cloud, Kubernetes, Istio service mesh, API Gateway, Kafka, Prometheus, Alertmanager, Grafana,
Elasticsearch, Loki, Filebeat, Vault, Argo CD, Argo Rollouts, Jenkins, Redis, Tempo
Engage collaboratively with the development team to keep the software development process smooth and efficient
Provide DevOps support to the development team and ensure the software development workflow infrastructure
is reliable and scalable
Migrated services from AMD-based EC2 instances to AWS Graviton (ARM) instances, resulting in cost savings amounting to thousands of dollars
Implemented centralized logging and tracing with Loki, Grafana, Tempo, and Promtail for Istio logs
Implemented GitOps practices using Argo CD, enabling automated application deployment, and used canary strategies with
Argo Rollouts to orchestrate progressive updates across all applications
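An illustrative sketch of such a canary rollout; the resource names, image, and step weights are hypothetical, not the actual production values:

```bash
# Minimal Argo Rollouts canary sketch -- names and weights are placeholders
cat <<'EOF' | kubectl apply -f -
apiVersion: argoproj.io/v1alpha1
kind: Rollout
metadata:
  name: payments-api            # hypothetical service name
spec:
  replicas: 4
  selector:
    matchLabels:
      app: payments-api
  template:
    metadata:
      labels:
        app: payments-api
    spec:
      containers:
      - name: payments-api
        image: example.registry/payments-api:v2   # placeholder image
  strategy:
    canary:
      steps:
      - setWeight: 20           # shift 20% of traffic to the new version
      - pause: {duration: 5m}   # observe metrics before proceeding
      - setWeight: 50
      - pause: {duration: 5m}
      # full promotion happens after the final step
EOF
```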
Executed successful migrations of several applications to the AWS cloud, adhering to best practices, and optimizing
performance in the cloud environment
Wrote numerous Python and Shell scripts, automating daily tasks and enhancing operational efficiency
Proficient in infrastructure-as-code (IaC) and configuration management practices using tools such as Terraform,
Ansible, Boto3, and the AWS CLI, ensuring consistent and reproducible infrastructure deployments
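A minimal sketch of this IaC workflow as a CI job might run it; the backend bucket, state key, region, and tag filter are assumptions:

```bash
#!/usr/bin/env bash
# Typical Terraform apply flow from CI -- backend values are placeholders
set -euo pipefail

terraform init -backend-config="bucket=example-tf-state" \
               -backend-config="key=payments/prod.tfstate"
terraform fmt -check
terraform validate
terraform plan -out=tfplan
terraform apply -auto-approve tfplan

# Post-apply sanity check with the AWS CLI (region is an assumption)
aws ec2 describe-instances --region ap-south-1 \
  --filters "Name=tag:ManagedBy,Values=terraform" \
  --query 'Reservations[].Instances[].InstanceId'
```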
Experienced in setting up AWS API Gateway solutions to manage authorization and authentication, ensuring secure
access control and identity management in microservices architectures
Experienced in managing EKS clusters, performing Istio upgrades, and patching server vulnerabilities in production
environments
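A sketch of a revision-based Istio upgrade of the kind described; the revision tags and namespace are illustrative:

```bash
# Revision-based (side-by-side) Istio upgrade sketch -- tags are placeholders
istioctl x precheck                       # verify the cluster is upgrade-ready
istioctl install --set revision=1-20 -y   # install the new control plane alongside the old

# Point a namespace at the new revision (assumes revision labels are in use),
# then restart workloads so sidecars are re-injected from the new control plane
kubectl label namespace payments istio.io/rev=1-20 --overwrite
kubectl rollout restart deployment -n payments

# Once traffic is verified, remove the old control plane revision
istioctl uninstall --revision 1-19 -y
```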
Managed Apache Kafka clusters, facilitating real-time data streaming and integration across systems, while
optimizing performance and ensuring data security
Deployed 100 microservices on Kubernetes (k8s) with the Istio service mesh, implementing traffic-based and progressive
rollout strategies for controlled deployments with GitOps practices and Helm charts
Led disaster recovery planning and implementation, including backup and restoration, to ensure business continuity
Utilized KEDA to implement event-driven autoscaling for Kubernetes workloads, optimizing resource usage and
improving application performance
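A minimal KEDA sketch, assuming a Kafka-lag trigger; the deployment, broker, topic, and consumer-group names are hypothetical:

```bash
# KEDA event-driven autoscaling sketch -- all names are placeholders
cat <<'EOF' | kubectl apply -f -
apiVersion: keda.sh/v1alpha1
kind: ScaledObject
metadata:
  name: txn-consumer-scaler
spec:
  scaleTargetRef:
    name: txn-consumer              # Deployment to scale (placeholder)
  minReplicaCount: 2
  maxReplicaCount: 20
  triggers:
  - type: kafka
    metadata:
      bootstrapServers: kafka.example.internal:9092   # placeholder broker
      consumerGroup: txn-consumer-group
      topic: transactions
      lagThreshold: "100"           # scale out when lag exceeds 100 per partition
EOF
```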
Deployed and configured Prometheus for comprehensive monitoring of system and application metrics, ensuring
real-time visibility into performance and health
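One common way to stand this up is the community kube-prometheus-stack Helm chart; this sketch assumes that chart, with illustrative release and namespace names:

```bash
# Prometheus deployment sketch via the community Helm chart
helm repo add prometheus-community https://prometheus-community.github.io/helm-charts
helm repo update
helm upgrade --install monitoring prometheus-community/kube-prometheus-stack \
  --namespace monitoring --create-namespace \
  --set prometheus.prometheusSpec.retention=15d   # retention period is an assumption
```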
Applied Kubernetes security best practices such as RBAC (Role-Based Access Control), network policies, and pod
security policies to enhance cluster security and prevent unauthorized access
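A hardening sketch combining a read-only RBAC Role with a default-deny ingress NetworkPolicy; the namespace and role names are hypothetical:

```bash
# Kubernetes hardening sketch -- namespace and role names are placeholders
cat <<'EOF' | kubectl apply -f -
apiVersion: rbac.authorization.k8s.io/v1
kind: Role
metadata:
  name: pod-reader
  namespace: payments
rules:
- apiGroups: [""]
  resources: ["pods", "pods/log"]
  verbs: ["get", "list", "watch"]   # read-only access, no mutation verbs
---
apiVersion: networking.k8s.io/v1
kind: NetworkPolicy
metadata:
  name: default-deny-ingress
  namespace: payments
spec:
  podSelector: {}        # applies to every pod in the namespace
  policyTypes:
  - Ingress              # no ingress rules listed => all inbound traffic denied
EOF
```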
Utilized Trivy for comprehensive Docker image scanning to identify and mitigate vulnerabilities, ensuring secure
container deployments
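A typical Trivy gate as it might run in CI; the image reference is a placeholder:

```bash
# Fail the build on HIGH/CRITICAL findings; skip vulnerabilities with no fix yet
trivy image --severity HIGH,CRITICAL --exit-code 1 \
  --ignore-unfixed example.registry/payments-api:v2
```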
Project-2
Gojek, Kafka Data Warehousing
Overview:
Kafka data warehousing involves using Apache Kafka as a central platform for real-time data streaming and storage,
allowing efficient data ingestion, processing, and distribution across various systems.
Technologies: Google Cloud, Google Kubernetes Engine (GKE), Kafka, Prometheus, Alertmanager, Grafana, Elasticsearch,
Filebeat, Vault, GitLab CI/CD
Utilized Terraform to automate the provisioning of infrastructure on Google Cloud Platform (GCP)
Wrote Terraform modules to create GKE clusters through a GitLab CI/CD pipeline runner
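A minimal sketch of such a provisioning step as the runner might execute it; the project, region, and cluster name are placeholders:

```bash
# GKE provisioning sketch as run from a GitLab CI/CD job -- values are placeholders
cat > main.tf <<'EOF'
provider "google" {
  project = "example-project"
  region  = "asia-southeast1"
}

resource "google_container_cluster" "primary" {
  name               = "kafka-dw"        # hypothetical cluster name
  location           = "asia-southeast1"
  initial_node_count = 3
}
EOF

terraform init
terraform plan -out=tfplan
terraform apply -auto-approve tfplan
```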
Employed Chef to automate configuration management, using it to install and configure Kafka and Kafka MirrorMaker
Implemented Kafka MirrorMaker to create Kafka mirrors, configuring data replication between Kafka clusters to ensure
high availability and fault tolerance for the data streaming and processing setup
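A sketch assuming MirrorMaker 2 (the connect-mirror-maker.sh bundled with Kafka); cluster aliases and bootstrap addresses are placeholders:

```bash
# MirrorMaker 2 cross-cluster replication sketch -- addresses are placeholders
cat > mm2.properties <<'EOF'
clusters = primary, backup
primary.bootstrap.servers = kafka-primary.example.internal:9092
backup.bootstrap.servers  = kafka-backup.example.internal:9092

# replicate every topic from primary into backup
primary->backup.enabled = true
primary->backup.topics  = .*
replication.factor = 3
EOF

bin/connect-mirror-maker.sh mm2.properties
```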
Used Filebeat to collect logs from the Kafka servers; logs are then shipped to an Elasticsearch cluster via Logstash
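A minimal Filebeat-to-Logstash sketch; the log path and Logstash host are assumptions:

```bash
# Filebeat shipping Kafka broker logs to Logstash -- path and host are placeholders
cat > /etc/filebeat/filebeat.yml <<'EOF'
filebeat.inputs:
- type: log
  paths:
    - /var/log/kafka/*.log      # Kafka broker logs (placeholder path)

output.logstash:
  hosts: ["logstash.example.internal:5044"]
EOF

systemctl restart filebeat
```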
Education
Bachelor of Technology - Computer Science
Uttar Pradesh Technical University
June 2013
Skills
Team management and performance optimization
Mentored and trained junior team members on DevOps practices and tools
Operating Systems: Linux (CentOS and Ubuntu)
Monitoring: CloudWatch, Prometheus, Grafana, and Alertmanager
Logging: ELK (Elasticsearch, Logstash, Kibana), EFK, and Loki stacks
Scripting Languages: Shell Scripting, Python
Other Tools: NGINX, Jira, Kafka, Vault, Fluentd, Grafana, DNS, Logstash, Filebeat, Kibana, boto3, and Git
Cloud: AWS (EC2, EKS, S3, Lambda, IAM, ELB, VPC, AMI, Route 53, Auto Scaling, WAF, SNS) and Google Cloud (GCP)