
Samarendra Sahoo

Cloud Architect
Bhubaneswar

Summary

Quality-driven Cloud Architect well-versed in Cloud Platform best practices, project management requirements and IT operations. Successful at building robust solutions for changing business needs. Enthusiastic about meeting market challenges with scalable technologies.

Experience Summary:

  • With around 14 years of progressive experience, I specialize in architecting and building large-scale enterprise applications, data warehouses, business intelligence systems, and data pipelines for big data, near real-time data, and legacy data, as well as developing analytic solutions and implementing machine learning models.
  • Good experience in architecting modern applications using a microservice approach on Amazon Web Services (AWS) and Google Cloud Platform (GCP).
  • Implementation experience with identity management and networking in cloud platforms.
  • Experience in implementing scalable designs, including high availability, redundancy, failover, and load balancing.
  • Good knowledge of AWS services such as IAM, EC2, Lambda, S3, SNS, Glue, and Step Functions.
  • Strong knowledge of GCP services such as Cloud Run, Cloud Functions, GKE, Pub/Sub, Cloud Scheduler, BigQuery, and Cloud Storage.
  • Demonstrated expertise in architecting and optimizing data solutions, including extensive experience with Amazon Redshift for high-performance analytics, and a proven track record in designing and implementing Enterprise Data Warehousing (EDW) solutions for scalable and efficient data management.
  • Proficient in designing and implementing serverless architecture using AWS Lambda, demonstrating hands-on experience in building scalable and event-driven solutions for diverse business applications.
  • Skilled in orchestrating complex workflows through AWS Step Functions, coupled with expertise in data integration and ETL processes using AWS Glue, ensuring efficient and scalable data processing pipelines in cloud environments.
  • Experienced in leveraging PySpark for advanced big data computation, demonstrating proficiency in optimizing and analyzing large datasets. Additionally, well-versed in working with DataFrames to facilitate efficient data manipulation and processing within the PySpark framework.
  • In-depth understanding of GCP BigQuery, with the ability to work with complex data sets, develop creative solutions for data analysis, and integrate with BI tools such as Tableau.
  • Skilled in Python, with proven expertise in using new tools to drive improvements throughout the entire software development lifecycle.
  • Good experience in developing web applications implementing Model View Controller (MVC) architecture using Java and Spring Boot.
  • Experience in Object-Oriented Analysis and in developing back-end frameworks using various design patterns.
  • Good knowledge of web services with REST protocols.
  • Experienced in requirement gathering, use case development, Business Process flow, Business Process Modeling and extensively used UML to develop various use cases, class diagrams and sequence diagrams.
  • Documentation of Architecture, Design and Operational procedures.
  • Experience in Test Driven Development and Behavior Driven Development methodologies for enterprise projects.
  • Experience in working with various Integrated Development Environments such as IntelliJ IDEA, Visual Studio Code, Jupyter Notebook, PyCharm, and Eclipse.
  • Knowledge of continuous deployment using GitHub Actions.
  • Experience in using version control systems such as SVN and Git.
  • Well versed in all Agile ceremonies: daily stand-ups, sprint planning, backlog grooming, sprint review, and retrospective.
  • Building data pipelines between MySQL, PostgreSQL, and BigQuery, and writing data migration scripts using Python.
  • Proficient in writing SQL Queries, Stored procedures, functions, packages, tables, views, triggers using relational databases like Oracle, MySQL and PostgreSQL.
  • Experience in Docker for local development and production deployment.
  • Extensive knowledge of ML frameworks, libraries, data structures, data modeling, and software architecture.
  • Designing machine learning systems to automate predictive models.
  • Experience in supervised and unsupervised machine learning techniques (Linear Regression, Logistic Regression, SVM, Decision Tree, Random Forest, Neural Networks, K-means Clustering, PCA).
  • Experience in understanding the client's business problem, formulating analytical solutions using various statistical models and business decision-making tools, and providing insights for value addition.
  • Cross-domain experience in Retail and Utilities.

Overview

14 years of professional experience
1 Certification

Work History

Cloud Architect

The Home Depot
Atlanta, Bhubaneswar
01.2023 - Current

Project: ProCRM

Building a platform for Home Depot's Pro Associate Sales Representatives to earn sales-driven incentives, which in turn drives sales growth across the organization.

Responsibilities:

  • Worked with stakeholders to identify business requirements and provide end-to-end solutions
  • Architecting a microservice design in Amazon Web Services (AWS) using Fargate, Lambda, and AWS CloudWatch Events to implement automated weekly sales target and monthly and quarterly incentive payment calculation systems
  • Implementing end-to-end security using IAM, authentication protocols, encryption, and Virtual Private Cloud networking
  • Monitoring the service health and performance using Amazon CloudWatch
  • Building data pipelines to connect to the Enterprise Data Warehouse (Redshift) and integrating with sales target and payment calculation services
  • Integrating with HR Workday systems for data exchange and data transformation using S3, Redshift, Glue, and Step Functions
  • Leveraged PySpark and DataFrames for advanced big data computation and efficient data manipulation
  • Design and development using Java, Spring Boot, Python, and the Flask framework
  • Designed schemas in Redshift as per business needs
  • Creation of metric tables/views in Redshift for analytics
  • Used Celery as the task queue and SNS as the message broker to execute asynchronous tasks
  • Driving onshore and offshore teams for the deliverables
  • Involved in code reviews using GitHub pull requests
  • Using Docker to achieve a microservice architecture
  • Monitoring, logging and debugging in AWS
  • Worked with DevOps Teams to automate building jobs and deployment using GitHub Actions
  • Improved code reuse and performance by making effective use of Factory design pattern

Cloud Architect

The Home Depot
Atlanta
01.2022 - 12.2022

Project: ProCRM

Building a platform for Home Depot's sales executives to get real-time insight into quotes and orders, historical sales variation, and sales targets vs. actual sales.

The goal was to monitor the sales insights and drive the strategy for sales growth.

Responsibilities:

  • Worked with stakeholders to identify business requirements and provide end-to-end solutions
  • Architecting a microservice design in Amazon Web Services using Docker, Lambda, S3, Redshift, and AWS CloudWatch Events to implement data pipelines
  • Designing and developing data pipelines using PySpark to be used for analytics
  • Designing and developing integration of the Enterprise Data Warehouse (Redshift) with Salesforce
  • Designing and developing integration of the Enterprise Data Warehouse (Redshift) with Tableau
  • Creation of Tableau dashboards, providing filters and row-level security for the dashboards
  • Involved in code reviews using GitHub pull requests.
  • Monitoring, logging and debugging in AWS environment.
  • Worked with DevOps Teams to automate building jobs and deployment using GitHub Actions

Senior Cloud Engineer

The Home Depot
Atlanta
02.2019 - 12.2021

Project: Proreferral

Proreferral provides a platform that connects consumers with service providers for their home improvement needs.


Responsibilities:

  • Worked with stakeholders to identify requirements and provide end-to-end solutions
  • Architecting a microservice design in Google Cloud Platform using Cloud Run, Cloud SQL, and Cloud Scheduler to implement automated weekly sales target and monthly and quarterly incentive payment calculation systems
  • Implementing end-to-end security using IAM, the OIDC authentication protocol, encryption, and Virtual Private Cloud networking
  • Monitoring service health and performance using GCP's built-in Cloud Monitoring
  • Design and development using Java and Spring Boot to deliver a microservice architecture in Google Cloud Platform
  • Designed schema in PostgreSQL as per the business need
  • Migrating data from legacy MySQL to PostgreSQL
  • Maintaining data pipelines from MySQL to PostgreSQL and PostgreSQL to BigQuery
  • Building Database APIs to be accessed by Web services
  • Creation of Metric tables in BigQuery for analytics
  • Used Pub/Sub as the message broker to execute asynchronous tasks
  • Involved in code reviews using GitHub pull requests
  • Using Docker for local and production deployment
  • Updated and maintained GitHub Actions for automated build jobs and deployment
  • Improved code reuse and performance by making effective use of various design patterns and refactoring code base
  • Took ownership of the entire project lifecycle, including design, development, deployment, testing, implementation, and support


Senior Developer

The Home Depot
Bhubaneswar
01.2018 - 01.2019

Project: Quote Builder


The goal of the project was to provide a highly reliable, automated home improvement solution system with quotes on demand for end customers.


Responsibilities:

  • Involved in software development life cycle (SDLC) of tracking the requirements, detailed design, development, system testing and user acceptance testing
  • Involved in design, analysis and architectural meetings
  • Created architecture diagrams, flow charts, use case diagrams, class diagrams, database tables, and mappings between relational database tables
  • Followed Agile software development practices: pair programming, test-driven development, and scrum meetings
  • Developed entire backend using Java and Spring Boot
  • Implemented MVC architecture in developing the web application with the help of Spring MVC
  • Involved in code reviews using GitHub pull requests
  • Using Docker for local and production deployment
  • Deploying, Monitoring, logging and debugging in Google Cloud Environment.
  • Updated and maintained Jenkins for automatic building jobs and deployment
  • Improved code reuse and performance by making effective use of various design patterns and refactoring code base
  • Implemented various SQL queries, functions, and triggers as per client requirements
  • Building Database APIs to be accessed by Web services
  • Collaborating with team members, management, and clients to ensure projects are completed as per the requirement

Machine Learning Engineer

TNB
Bhubaneswar
06.2016 - 01.2018

Project: Customer Analytics

Increasing utilization of the myTNB app; identifying drivers of complaints using pattern mining; and identifying future enquiries likely to convert to complaints using a machine learning model.


Responsibilities:

  • Worked with Stakeholders on requirements to identify business goals
  • Performed data cleaning, data exploration, feature engineering, text vectorization, machine learning model building on vectorized data, cross-validation for better accuracy, and topic modeling and sentiment analysis on a Facebook data dump provided by the client, using Python and its libraries
  • Identifying rules for segments using a tree-based approach
  • Developed a decision tree model using scikit-learn with features extracted from Hadoop storage, and assigned scores to customers through the tree model representing the probability of the customer using the myTNB app
  • Designed an end-to-end solution through Natural Language Processing (NLP) and SVM, and presented insights and recommendations to the Client Service Team
  • Text preprocessing through noise removal, lemmatization, stemming, and standardization
  • Text to features/vectors (feature engineering on text data) using the statistical method TF-IDF
  • Applying dimensionality reduction to features using PCA
  • Cross-validation for hyperparameter tuning, balancing the bias-variance trade-off for better accuracy
  • Environment: Python 2, Hive, DBvisualizer, JIRA, PL/SQL, Logging Module, Scikit-learn, Git

Machine Learning Engineer

Nielsen
Mexico City
10.2014 - 05.2016

Project: Char Coding

Automating the Operations Team's manual work of tagging characteristics to UPCs through Natural Language Processing (NLP) and a machine learning model.


Responsibilities:

  • Translated customer requirements into design specifications and ensured the requirements were reflected in the software solution
  • Text preprocessing through noise removal, lemmatization, stemming, and standardization
  • Text to features/vectors (feature engineering on text data) using the statistical method TF-IDF
  • Applying dimensionality reduction to features
  • Built an SVM classification model on dense features to extract characteristics
  • Built an SVM classification model on dense features to identify the correct commodity
  • Cross-validation for hyperparameter tuning, balancing the bias-variance trade-off for better accuracy
  • Used Logging module for event logs
  • Unit testing using pytest
  • Environment: Python, Oracle, DBvisualizer, JIRA, PL/SQL, Scikit-learn, Git, Logging Module

Developer

Nielsen
Bhubaneswar
02.2010 - 09.2014

Project: GNSO

Through NSO, Nielsen provides market analysis reports for products to end clients at different hierarchical levels, using audited data as well as sales data.


Responsibilities:

  • Developed backend modules using Java and Spring MVC
  • Developed, tested, deployed, and maintained the website
  • Designed and developed data management system using MySQL
  • Wrote Python scripts to parse XML documents and load the data into the database
  • Involved in development of MySQL tables, stored procedures, and functions
  • Creating unit test/regression test framework for working/new code
  • Responsible for debugging and troubleshooting the web application
  • Handling the day-to-day issues and fine tuning the applications for enhanced performance
  • Tracking the Incidents, Service requests through Incident Tracking Tool (Service Now)
  • Environment: Python, Java, Linux, HTML, CSS, PL/SQL, MySQL, Apache Web Server, UNIX.

Education

Post Graduate Diploma - Applied Statistics

Bachelor Of Technology - Electronics & Communication

Skills

    Programming Languages: Java, Python, PL/SQL


Certification

Google Cloud Certified Associate Cloud Engineer

Timeline

Google Cloud Certified Associate Cloud Engineer

02.2023

Cloud Architect

The Home Depot
01.2023 - Current

Cloud Architect

The Home Depot
01.2022 - 12.2022

Senior Cloud Engineer

The Home Depot
02.2019 - 12.2021

Senior Developer

The Home Depot
01.2018 - 01.2019

Machine Learning Engineer

TNB
06.2016 - 01.2018

Machine Learning Engineer

Nielsen
10.2014 - 05.2016

Developer

Nielsen
02.2010 - 09.2014

Post Graduate Diploma - Applied Statistics

Bachelor Of Technology - Electronics & Communication
