

Accomplished IT professional specializing in multi-cloud solutions and DevOps practices with 4 years of experience. Proficient in CI/CD methodologies, cloud provisioning, and user administration in Linux environments. Experienced in managing cloud services across AWS, Azure, and OCI, with a focus on security, monitoring, and cost-effective resource management. Strong ability to collaborate with cross-functional teams to deliver high-quality solutions.
Project 1:
CalPERS (California Public Employees' Retirement System), DevOps Engineer, Health Care

CalPERS is a web-based enterprise application that administers retirement, health benefit, death benefit, and long-term care programs dedicated to protecting the financial security of its members. We built a system that integrates these separate benefit systems under a single umbrella, so that members can access all CalPERS products from a single web page.
- Configured CI/CD using a Jenkins pipeline job with various DevOps tools to generate WAR/EAR files and deployed them through orchestration.
- Created and managed EC2 instances, configured them with containers to provide a deployment environment, and maintained AMIs of the configured instances, since most of the infrastructure is hosted in AWS.
- Published packages and OS binaries as artifacts to AWS ECR through Jenkins; maintained backups of Jenkins builds in S3; created and restored backups of EBS volumes.
- Ensured the servers could sustain load by configuring Elastic Load Balancing and Auto Scaling in AWS.
- Built and pushed container images for the Java application; wrote Dockerfiles to build customized images; created secrets and manifest files so the applications can store and consume credentials.
- Deployed the containers with Kubernetes, covering Deployments, Services, networking, and routes for external access to the application.
- Collaborated with developers and testers to achieve organizational goals and educated them about the build process.
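The Kubernetes deployment pattern described above (a container image from ECR, credentials from a secret, a Service for external access) can be sketched as a minimal manifest. All names here (benefits-portal, the image URI, the db-credentials secret) are hypothetical placeholders, not the actual application's configuration.

```yaml
# Hypothetical names throughout (benefits-portal, image URI, db-credentials).
apiVersion: apps/v1
kind: Deployment
metadata:
  name: benefits-portal
spec:
  replicas: 2
  selector:
    matchLabels:
      app: benefits-portal
  template:
    metadata:
      labels:
        app: benefits-portal
    spec:
      containers:
      - name: benefits-portal
        image: <account>.dkr.ecr.us-east-1.amazonaws.com/benefits-portal:latest
        ports:
        - containerPort: 8080
        env:
        - name: DB_PASSWORD          # credential injected from a Secret
          valueFrom:
            secretKeyRef:
              name: db-credentials
              key: password
---
apiVersion: v1
kind: Service
metadata:
  name: benefits-portal
spec:
  type: LoadBalancer               # exposes the app for external access
  selector:
    app: benefits-portal
  ports:
  - port: 80
    targetPort: 8080
```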
Project 2:
Whole Foods Market, Cloud Infra Developer (AWS Cloud), Technology, Media & Telecommunications

As a large conglomerate in the technology sector, the client faced challenges with employee onboarding and HCM activities, including duplicate identities. The goal was to build a common identity store that publishes or exposes a single identity for each employee, impacting HCM, payroll, and employee-onboarding operations.
- Built microservices for specific business requirements and exposed their features through REST APIs (AWS API Gateway, AWS Lambda); shared REST API documentation for the microservices via API Gateway.
- Provisioned AWS resources with AWS CDK: API Gateway, Lambda, VPC endpoints, KMS, Application Load Balancer, and Route 53 for the REST APIs; Postgres and DynamoDB as databases; SQS as the communication channel; S3 for bulk storage; CloudWatch for monitoring and logging.
- Implemented authentication and role-based authorization for each REST API and microservice.
- Set up CI/CD pipelines, triggered from Brazil (native build tool) and CodePipeline, with blue-green deployment checks.
- Wrote test cases in Java with approximately 95% code coverage.
- Integrated API Gateway and Lambda with CloudWatch and created a CloudWatch dashboard for monitoring and alerting.
- Worked as a cloud developer with a team of about four; reviewed coding standards and set up a coding-standard checklist with the TL to minimize logical and coding-standard defects.
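The role-based authorization mentioned above can be sketched in Python as a deny-by-default permission check of the kind a Lambda authorizer might perform. The role names and route mapping here are hypothetical examples, not the client's actual scheme.

```python
# Minimal sketch of role-based authorization for a REST API.
# Role names and route permissions are hypothetical examples.

# Which roles may call which API routes, keyed by (method, path prefix).
ROUTE_PERMISSIONS = {
    ("GET", "/identities"): {"hr-admin", "payroll", "auditor"},
    ("POST", "/identities"): {"hr-admin"},
    ("DELETE", "/identities"): {"hr-admin"},
}

def is_authorized(roles, method, path):
    """Return True if any of the caller's roles may call method+path."""
    for (m, prefix), allowed in ROUTE_PERMISSIONS.items():
        if m == method and path.startswith(prefix):
            return bool(set(roles) & allowed)
    return False  # deny by default for unknown routes

# In a real AWS Lambda authorizer, the roles would come from a verified
# JWT claim or IAM context; here they are passed in directly.
```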
Project 3:
McDonald's Japan, Boston University, Tupperware, Infra DevOps Engineer (AWS, Azure, OCI), Consumer

- Worked as an AWS administrator responsible for provisioning, managing, and automating cloud infrastructure on public cloud platforms (AWS, Azure, OCI) for large-scale retail businesses.
- Configured IPsec VPNs, port openings, and DNS bindings.
- Administered Linux: users and groups, permissions, logical volumes (LVM), multipath configuration, and NFS.
- Designed and deployed AWS resources (EC2, S3, RDS, IAM, CloudWatch, ELB, Route 53, VPC, CloudTrail) with a focus on high availability, fault tolerance, and auto scaling.
- Created VPCs with public and private subnets to accommodate web, application, and database servers.
- Expanded EBS-backed volume storage when root volumes filled up, using the EBS volume-resize feature.
- Provisioned AWS RDS and DynamoDB per client requirements.
- Created S3 backups with versioning enabled and moved objects to Amazon Glacier for archiving.
- Created IAM users and groups and assigned individual policies to each group.
- Monitored infrastructure with tools such as Datadog; created SNS and SQS notifications for the client team.
- Handled internal and external SOC 2 audits.
- Also worked as an Azure DevOps engineer on one module of this project, focusing on end-to-end CI/CD pipeline configuration through Azure Data Factory and several other Azure services.
- Created build and release pipelines in Azure DevOps; automated CI/CD and software configuration management with Terraform, Azure DevOps, and Git.
- Created ARM templates to reuse similar deployments with standard configurations and naming conventions.
- Monitored processes and produced daily reports on their performance.
- Resized Azure VMs based on on-demand client requirements and managed the entire infrastructure services.
- Executed Azure DevOps pipelines using PowerShell scripts; backed up the Terraform state file to Azure Blob Storage and published it as an artifact, so provisioning runs against the existing state file or updates it.
- Automated resource scaling through PowerShell scripts, run on a schedule via Azure DevOps pipelines.
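The Terraform state handling described above (keeping state in Azure Blob Storage so pipeline runs provision against, or update, the same state file) can be sketched as a backend configuration. The resource group, storage account, container, and key names here are hypothetical placeholders.

```hcl
# Hypothetical names: rg-tfstate, sttfstatebackup, tfstate, infra.terraform.tfstate.
terraform {
  backend "azurerm" {
    resource_group_name  = "rg-tfstate"
    storage_account_name = "sttfstatebackup"
    container_name       = "tfstate"
    key                  = "infra.terraform.tfstate"
  }
}
```

With this backend, each Azure DevOps pipeline run reads and writes the shared state in Blob Storage rather than a local file, which is what makes scheduled, repeatable provisioning possible.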