Years of Professional Experience
A seasoned IT professional with 12 years of experience at Accenture Solutions, specializing in Python development and Azure/AWS cloud services. Expertise spans multiple Azure and AWS solutions, including Infrastructure as Code (IaC) and API Gateway. Skilled in DevOps practices, with hands-on experience in CI/CD pipelines, Docker, Kubernetes, version control with Git, and automated deployments. Demonstrated ability to manage cloud infrastructure and drive project success through team collaboration and leadership.
Proficient in the full Software Development Life Cycle (SDLC), ensuring seamless project execution from requirements gathering to deployment and maintenance. A dedicated team player, consistently delivering high-quality, scalable solutions in dynamic environments.
Azure
AWS
Python Programming
FastAPI
Pandas
NumPy
SQL Server, PL/SQL
Snowflake
Big Data
Docker
Kubernetes
DevOps
Certification
Project Title: Cloud-Native Data Processing and Machine Learning Pipeline for Financial Data Analytics.
Description: Developed and deployed a scalable, cloud-native data processing pipeline for financial analytics using AWS, Docker, Kubernetes, and Python to support real-time and batch data processing for a large banking client. The pipeline collected, transformed, and stored high volumes of transaction data from various financial systems, enabling predictive analytics and machine learning (ML) models to enhance fraud detection and customer insights.
Key Responsibilities:
• Developed Python-based microservices for real-time financial data ingestion and processing.
• Utilized AWS CloudFormation to provision infrastructure, deploying services on EC2, ECS, Docker, and Kubernetes.
• Implemented data storage using S3 and DynamoDB, with event-driven communication via SQS and SNS.
• Deployed machine learning models for fraud detection using SageMaker.
• Automated model updates with Lambda triggered by new data in S3.
• Built CI/CD pipelines using Git and Jenkins, automating deployments and testing.
• Monitored application health with CloudWatch and used X-Ray for performance analysis.
• Secured the pipeline with IAM roles, KMS encryption, and VPC configurations.
Outcome: The solution reduced fraud detection time by 30%, improved customer-behavior predictions, and automated data processing at scale, reducing operational overhead and enhancing system performance.
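The automated model-update flow above (Lambda triggered by new data in S3) can be sketched as a minimal handler. This is an illustrative outline only, not the project's actual code: the bucket, key layout, and the downstream retraining call are assumptions, and the real system would invoke SageMaker via boto3 where the comment indicates.

```python
def extract_s3_objects(event):
    """Pull (bucket, key) pairs out of an S3 put-event payload."""
    objects = []
    for record in event.get("Records", []):
        s3 = record.get("s3", {})
        bucket = s3.get("bucket", {}).get("name")
        key = s3.get("object", {}).get("key")
        if bucket and key:
            objects.append((bucket, key))
    return objects

def handler(event, context):
    """Lambda entry point: kick off a model update per new S3 object."""
    objects = extract_s3_objects(event)
    for bucket, key in objects:
        # In the real pipeline this step would start a SageMaker
        # retraining/update job via boto3; printed here as a sketch.
        print(f"new training data: s3://{bucket}/{key}")
    return {"processed": len(objects)}
```

The event shape (`Records` → `s3` → `bucket`/`object`) follows the standard S3 notification payload that Lambda receives.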
Project Title: Scalable Data Processing Pipeline.
Description: Designed a scalable data processing pipeline on AWS using services such as EC2, S3, Lambda, SNS, and SQS. The system ingests large volumes of data into S3, where it is processed by Lambda functions triggered by SQS messages. EC2 instances handle heavy processing tasks, while load balancers ensure high availability and reliability. Python scripts and Linux shell scripting automate deployment and orchestration, enabling seamless data flow and real-time analytics. This architecture optimizes resource usage and reduces operational costs.
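A common shape for the SQS-triggered Lambda processing described above is a handler that reports failed messages individually, so only those return to the queue. The sketch below assumes message bodies are JSON carrying an `objectKey` field — that field name and the `process` helper are hypothetical stand-ins for the project's real transformation step; the `batchItemFailures` response format is the standard Lambda/SQS partial-batch contract.

```python
import json

def process(message):
    """Hypothetical placeholder for the real transformation step."""
    if "objectKey" not in message:
        raise ValueError("message missing objectKey")

def handler(event, context):
    """SQS-triggered Lambda: process each message, reporting failures
    per message so only failed ones become visible in the queue again."""
    failures = []
    for record in event.get("Records", []):
        try:
            body = json.loads(record["body"])
            process(body)
        except Exception:
            failures.append({"itemIdentifier": record["messageId"]})
    # Partial-batch response understood by the SQS event source mapping
    return {"batchItemFailures": failures}
```

Returning per-message failures avoids reprocessing an entire batch when a single record is malformed.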
Project Title: Trading Analytics Dashboard.
Description: Developed a Trading Analytics Dashboard using Python and MySQL to analyze and visualize trading data. The system collects real-time market data, storing it in a MySQL database for easy access and manipulation. Automated reports on trading performance and risk assessment are generated using Python scripts. Linux shell scripting is used to schedule data retrieval and processing tasks, ensuring timely updates. This project provides traders with actionable insights to optimize their trading strategies.
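The reporting side of the dashboard can be illustrated with a small aggregation routine of the kind the Python scripts above would run over rows from the MySQL trades table. This is a simplified sketch: the field names (`symbol`, `qty`, `price`) and the signed-quantity convention are assumptions, and the real reports cover richer risk metrics.

```python
from collections import defaultdict

def performance_summary(trades):
    """Aggregate trade counts and net cash flow per symbol.

    Each trade is a dict with 'symbol', 'qty' (signed: positive = buy,
    negative = sell), and 'price' -- a stand-in for a MySQL result row.
    """
    summary = defaultdict(lambda: {"trades": 0, "cash_flow": 0.0})
    for t in trades:
        s = summary[t["symbol"]]
        s["trades"] += 1
        # Selling brings cash in; buying pays cash out
        s["cash_flow"] += -t["qty"] * t["price"]
    return dict(summary)
```

In the deployed system, a cron-scheduled shell script would run the query, feed rows into a function like this, and emit the daily report.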
Awards and Accomplishments:
Community Involvement and Contributions: