Sukrut Sardeshpande

Bengaluru

Summary

Dynamic Solution Architect with a proven track record at EPAM, specializing in AI/ML and cloud integration. Expert in Azure and AWS; optimized data pipelines, improving performance by 30%. Strong leadership and innovative problem-solving drive impactful solutions in fast-paced environments. Passionate about leveraging technology to achieve business goals.

Overview

20 years of professional experience
1 Certification

Work History

Solution Architect (AI/Data)

EPAM
04.2024 - Current
  • Data-integration, data-first engagement: as AI/ML architect, responsible for managing and recommending solutions for data analysis and data integration across multiple cloud platforms.
  • Expertise in cloud computing platforms (Azure, AWS, GCP) for AI/ML deployments, including Databricks.
  • In-depth knowledge of generative AI models and services such as Azure OpenAI and Microsoft Copilot, plus open-source LLMs.
  • Design and architect AI and GenAI solutions on AWS and Azure.
  • Create and develop comprehensive architecture diagrams for AI solutions.
  • Worked on AWS Bedrock and SageMaker with foundation models (FMs).
  • Develop and deploy AI-based architecture designs (agentic AI applications).
  • Develop and manage high-volume, real-time, and batch data ingestion pipelines using Azure Data Factory (ADF).
  • Implement event-driven architectures for real-time data movement and processing.
  • Develop large-scale data processing solutions using Azure Databricks and Apache Spark with PySpark.
  • Optimize data partitioning, caching, and indexing for efficient performance.
  • Manage complex transformations and aggregations for structured and unstructured datasets.
  • Design and implement data models in Azure Synapse Analytics using dedicated SQL Pools and Spark Pools.
  • Optimize query performance, workload management, and cost efficiency in Synapse Analytics.
  • Implement columnstore indexes, partitioning strategies, and data caching to enhance performance.
  • Manage and preprocess large datasets to support training and evaluation of AI models.
  • Optimize existing AI solutions for performance and scalability.
  • Stay updated on the latest AI trends in AWS and Azure.
  • Provide technical guidance and support to engineering teams.
  • Engage in technical discussions and presentations with clients to understand their needs.
  • Key Skills: Azure, AWS SageMaker, Databricks, SAS, Docker, Kubernetes, GenAI, Microservices, Jenkins, Git, Kafka, GCP BigQuery, Deep Learning, Computer Vision

Solution Architect (Cloud, DevOps, ML, BigData)

Accenture
08.2022 - 04.2024
  • Data-integration, data-first engagement: as AI/ML architect, responsible for managing and recommending solutions for data analysis and data integration across multiple cloud platforms.
  • Worked with the SAS analytical team on price-change and demand checks.
  • Deployed retail applications in a Kubernetes environment.
  • Experience with data version control (Git, DVC), orchestration/DAG tools (Airflow, Kubeflow, or equivalent), and MLOps platforms (ModelOp, Seldon, or equivalent).
  • Worked on cloud-provider ML ecosystems such as AWS SageMaker and Azure ML.
  • Worked with data-in-motion tools such as Confluent Kafka.
  • Experience in containerization (Docker) and container-orchestration systems such as Kubernetes.
  • Worked on parallel distributed systems such as Apache Spark, Flink, etc.
  • Designed and implemented data ingestion, transformation, and movement using Azure Data Factory (ADF), Azure Synapse, and Data Lake.
  • Collaborate with business stakeholders, data scientists, and engineers to build robust data solutions.
  • Develop and manage high-volume, real-time, and batch data ingestion pipelines using Azure Data Factory (ADF).
  • Experience working with the AWS and Azure AI ecosystems, such as Textract, Comprehend, Kendra, Cognitive Services, etc.
  • Experience working with NLP models such as BERT and ALBERT, and deep learning architectures such as CNNs, RNNs, RBMs, autoencoders, etc.
  • Cleaned, merged and manipulated datasets and conducted feature engineering using Pandas.
  • Key Skills: Azure, AWS SageMaker, ML, SAS, Docker, Kubernetes, Microservices, Jenkins, Git, Kafka, GCP BigQuery, Deep Learning, Computer Vision
  • Highlights/Achievements: Successfully integrated 300+ applications with Confluent Kafka as producers or consumers. Performed data lake integration for 100+ applications. Automated the data ingestion process for streaming applications.

Solution Architect (Cloud, DevOps, ML)

HCL/IBM
02.2016 - 09.2022
  • Omni Pricing Solution, an IBM pricing product for US retailers (Walmart, Safeway, Kroger). This SAS-based solution allows customers to keep their product prices competitive.
  • Worked with the SAS analytical team on price-change and demand checks.
  • Performed a PoC on Azure migration for the dev environment.
  • Prepared capacity and architecture plan to create the Azure Cloud environment to host migrated IaaS VMs and PaaS role instances for refactored applications and databases.
  • Worked on DevOps components such as Jenkins CI/CD design, covering Git, Jenkins, Docker, Kubernetes, Ansible, Selenium, and Terraform.
  • Deployed Azure IaaS virtual machines (VMs) and Cloud services (PaaS role instances) into secure VNets and subnets.
  • Implemented high availability with Azure Resource Manager deployment models.
  • Designed Network Security Groups (NSGs) to control inbound and outbound access to network interfaces (NICs), VMs and subnets.
  • Built and implemented ML models (supervised/unsupervised) for business use cases.
  • Assemble large, complex data sets that meet requirements to support data science and analytics projects.
  • Performed rehosting of application modules in the Azure environment.
  • Deployed retail applications in a Kubernetes environment.
  • Cleaned, merged and manipulated datasets and conducted feature engineering using Pandas.
  • Key Skills: Azure, Migration, ML, SAS, Azure ML, Docker, Kubernetes, Microservices, Jenkins, Git
  • Highlights/Achievements: Successfully migrated the SAS application and database from VMware to Azure using an Azure migration strategy, as some application components were not cloud-ready. Prepared DevOps best practices, training, and knowledge-base documents for team members. Worked closely with US customers and the engineering team to translate business requirements into technical points, and learned tools such as ML and SAS.

IBM Global Services
02.2016 - 09.2022
  • Company Overview: NYK Lines is a global leader in shipping and logistics, based in South Korea; IBM manages their cloud datacenter.
  • Involved in designing a migration plan for client applications utilizing almost all of the AWS stack (including Elastic Beanstalk and DevOps).
  • Worked with global clients as a lead, providing guidance on IBM Cloud and virtualization.
  • Deployed Elastic Load Balancers, configured HTTPS certificates, and managed scalable, highly available systems on AWS.
  • Set up a VPC environment and designed an effective backup strategy based on client requirements.
  • Superintended production applications on AWS and initiated corrective action based on customer feedback and surveys.
  • Complied with the established software development life cycle methodology to deliver effective solutions.
  • Designed the architecture and created the CloudFormation template to facilitate deployment.
  • Managed Azure data engineering and analytics with Azure Synapse Analytics and Power BI.
  • Provided complete infrastructure solutions to 30+ clients, including configuration and BOM of server and storage components.
  • Worked with the ML team on various machine learning techniques (linear regression, logistic regression, unsupervised learning, time series analysis, recommendation engines) using Python to build dynamic models that identify shipping-container positions and provide real-time analysis to customers.
  • Worked on automation tools such as Ansible and DevOps tooling (Jenkins, Docker, Kubernetes, and Git) in a cloud environment.
  • Worked as the SPOC for all technical issues related to Codevelops and analytics.
  • Worked on predictive and descriptive analysis with the respective machine learning models for better prediction of container delivery.
  • Key Skills: AWS, Migration, ML, Docker, Kubernetes, Microservices, Kafka, Jenkins, Git, CloudFormation, DevOps, SQL, NoSQL, Java, Selenium, AWS CodeCommit
  • Highlights: Successfully migrated applications and databases from on-prem (Korea datacenter) to an IBM datacenter in the US with minimal downtime. As cloud SME/architect, was involved in all phases of data migration for NYK's shipping application deployed on Kubernetes. Created procedures and a framework to manage and deploy new changes and implemented a continuous-improvement culture.

Release/Program Manager (Asset & Security Management)

ANZ Bank
Bangalore
12.2013 - 01.2016
  • Led a team of data engineers across global locations, supporting highly complex system upgrades and Oracle data migrations from physical to virtual environments in Azure and AWS.
  • Employed Principal Component Analysis to analyze collinearity and reduce the dimensionality of datasets.
  • Designed the integration environment, incorporating industry-standard architectures.
  • Successfully managed a project collaborating with product and engineering team members to define and develop integration processes and a roadmap for upcoming data integration and data migration projects at ANZ.
  • Created statistical models (ML algorithms/regressions) and cluster groups (k-means) to boost market share.
  • Designed, deployed, and maintained enterprise-class security, network, and systems management applications within an AWS environment.
  • Experience managing and building data pipelines for real-time dashboards.
  • Ability to independently execute a project from ideation through testing to delivery, proactively interacting with other engineers to access necessary resources or data.
  • Deployed cloud stacks using AWS OpsWorks.
  • Created monitors, alarms, and notifications for EC2 hosts using CloudWatch.
  • Analyzed, executed, and streamlined DevOps practices.
  • Created various charts in Jupyter Notebook using Matplotlib to perform preliminary analysis (supervised learning, deep learning) on the collected data.
  • Facilitated the development process and operations.
  • Worked on big data technologies such as HDFS, Hive, Spark, Sqoop, Kafka, Flume, Phoenix, and HBase.
  • Highlights: Successfully managed a project to integrate 1,000+ Unix systems, along with all related financial applications, into Active Directory and remove local authentication through Dell Identity Manager. As Release Manager, managed all phases of data migration for Murex and trading applications by unifying access to accounts, external accounts, and privileged accounts. Unified security policy information from multiple sources to mitigate security risks.

Big Data & Exadata Consultant

TJX Client
09.2008 - 12.2013
  • As architect, worked in diverse domains such as Oracle Exadata with Exalogic environments.
  • Used Flume to collect, aggregate, and store web log data from sources such as web servers and mobile and network devices, and pushed it to HDFS.
  • Designed the Redshift data model; Redshift performance improvements and analysis.
  • Continuous monitoring and managing the Hadoop cluster through Cloudera Manager.
  • Developed Spark jobs to transform the data in HDFS.
  • Analyzed large amounts of data sets to determine the optimal way to aggregate and report on them.
  • Created a procedure for the P2V migration and decided with the client team which servers to virtualize.
  • Worked on setting up the Hadoop cluster and MapReduce.
  • Managed and administered Exalogic using OEM and via CLI/graphical user interfaces.
  • Performed Oracle EBS migration.
  • Highlights: Managed 12 employees across 3 locations, including large offshore and third-party contractors. Led a team of technical staff in defining requirements for supporting Exadata/data warehouse applications for the customer's retail business. Performed data migration from various sources to HDFS. Analyzed system requirements and developed a backup and restoration solution in a complex ERP environment, with a 20% increase in system efficiency.

Lead Tech (US Onsite with NYTP) Consultant

AIG Financial Product Corporation
Wilton
01.2010 - 12.2011
  • Exalogic Elastic Cloud (physical and virtual) administration.
  • Upgraded Exalogic components and ZFS; automated backup/admin processes with scripts (Bash and Perl).
  • Migrated petabyte-scale data warehouse data to HDFS.
  • Utilized AWS services with a focus on big data architecture, analytics, enterprise data warehouse, and business intelligence solutions to ensure optimal architecture, scalability, flexibility, availability, and performance, and to provide meaningful, valuable information for better decision-making.
  • Worked on tools such as Flume, Storm, and Spark.
  • Performed proofs-of-concept to determine feasibility and evaluate big data products.
  • Wrote Hive join queries to fetch information from multiple tables, and multiple MapReduce jobs to collect output from Hive.
  • Involved in migrating data from existing RDBMSs (Oracle and SQL Server) to Hadoop using Sqoop for processing.
  • Screened Hadoop cluster job performance and capacity planning, and monitored cluster connectivity and security.
  • Created templates from VMs, deployed VMs from templates, and allocated resources.
  • Highlights: Played a pivotal role in designing and implementing FP's backup infrastructure using Legato NetWorker 7.6.3. Excellent track record of carrying out Solaris 10 Live Upgrade on over 80 servers. Instrumental in providing extensive support for the Golden Source application implemented on the WebSphere platform.

Tech Analyst (US Onsite from NYTP)

CVS Caremark
Woonsocket
01.2010 - 12.2011
  • Carried out complete installation, configuration, management, and support of Solaris, AIX, and Linux servers and OE.
  • Performed administration tasks on NetBackup for Unix filesystem backup.
  • Handled complete planning and migration of data.
  • Managed a 25-node Hadoop cluster.
  • Worked on tools such as Flume, Storm, and Spark.
  • Performed proofs-of-concept to determine feasibility and evaluate big data products.
  • Wrote Hive join queries to fetch information from multiple tables, and multiple MapReduce jobs to collect output from Hive.
  • Drove collection of performance data via a predefined centralized collection methodology specified by IBM.
  • Highlight: Successfully carried out Solaris 10 Live upgrade on over 120 servers.

Sr Technical Consultant (US Onsite NYTP)

Infocrossing (a Wipro company)
Broomfield
01.2006 - 12.2008
  • Undertook complete installation, configuration, management, and support of Sun Solaris 10 (Zones) and Veritas NetBackup in a datacenter environment.
  • Rendered complete support for Veritas Cluster configuration for the AOC (Administrative Office of California); also supported Red Hat Linux 5.0, including patch upgrades, package administration, kernel upgrades, and patch installation.
  • Screened Hadoop cluster job performance and capacity planning, and monitored cluster connectivity and security.
  • Spearheaded design, capacity arrangement, cluster setup, performance fine-tuning, structure planning, and scaling.
  • Performed system refreshes from production to quality and sandbox systems.
  • Involved in big data analysis using Pig and user-defined functions (UDFs).
  • Created Hive external tables, loaded data into them, and queried the data using HQL.
  • Used Sqoop to efficiently transfer data between databases and HDFS, and used Flume to stream log data from servers.
  • Highlight: Holds the distinction of performing Solaris 10 upgrades at client sites. Undertook Solaris 10 Live upgrade on 100 servers.

Senior System Engineer

Fidelity Investment Inc.
Bangalore
11.2006 - 09.2008
  • Successfully deployed new servers in the datacenter, including Sun server hardware, storage, and software products.
  • Handled complete configuration and management of Veritas Volume Manager 4.x and Veritas Cluster 4.x.
  • Provided day-to-day support to the data engineering team on fixing issues.
  • Worked with the datacenter team on performance issues related to the big data (Hadoop) environment.
  • Worked closely with US project managers to ensure project deliverables were not impacted by technical issues.

Technical Support Engineer (Solaris)

Sun Microsystems Inc.
Bangalore
06.2005 - 10.2006
  • Successfully rendered complete Tier-2/3 technical support to Sun Spectrum contract customers for SAP and Sun server hardware, storage, and software products.
  • Handled complete configuration and management of Veritas Volume Manager 4.x and Veritas Cluster 4.x.
  • Provided support for Sun E25K and Sun Fire 6800 servers.
  • Managed service calls from start to resolution using call management tools.
  • Ensured support service levels adhered to customer satisfaction goals.
  • Highlights: Led and guided a team of 8 engineers on Solaris administration. Holds the distinction of visiting various high-profile client sites. Drove the team of field engineers on installation and troubleshooting of complex hardware and on crash recovery and performance tuning activities.

Education

Post Graduate Diploma - IT & System Management (MBA)

ICFAI
12.2017

Executive Development Program - Supply Chain

IIT Delhi
12.2014

Post Graduate Program - AI/ML

University of Texas at Austin

Diploma - Mechanical Engineering

Directorate of Technical Education

Skills

  • Machine Learning
  • Databricks
  • Snowflake
  • Python
  • Scala
  • LLMs
  • Agentic AI
  • Kafka
  • Cloud Migration
  • AWS
  • Azure
  • Cloud
  • CI/CD
  • Docker
  • Kubernetes
  • DevOps
  • REST
  • Kinesis
  • Azure Data Factory
  • Networking
  • Open-source tools
  • Apache Spark
  • AWS Redshift
  • GCP BigQuery
  • Azure Data Engineering
  • Redis
  • AWS SageMaker
  • SAS
  • GenAI
  • Microservices
  • CloudFormation
  • SQL
  • NoSQL
  • Java
  • Selenium
  • AWS CodeCommit

Certification

  • AWS Certified Machine Learning - Specialty
  • Databricks Certified Professional
  • AWS Certified Solutions Architect - Professional
  • Azure Certified Data Engineer
  • PG in AI/ML
  • Training on Big Data/DevOps/Data Science from Edureka

Voluntary Assignment

Working actively with an NGO (Saarathi Foundation), teaching kids in villages around Bangalore.

Personal Information

Date of Birth: 09/19/75

Trainings Attended

  • AI/ML Training from University of Texas
  • BigData/Scala Training
  • Data Science Training from Edureka
  • DevOps (AWS/Azure) Training from Edureka
  • Project Management from pmi.org
  • Supply Chain Management from IIT Delhi
  • AWS Configuration and Migration

Professional Snapshot

  • A visionary leader specializing in cloud migration and modernization strategy, digital cloud transformation roadmaps, and cloud-native architectures, with the experience and insight to elevate any application or computing platform infrastructure to the cloud.
  • Senior Cloud Architect (Director) with extensive experience in digital transformation and API monetization strategy via cloud platform technologies such as Platform as a Service (PaaS) and Software as a Service (SaaS), employing PCF and AWS with a microservices architecture to deliver agile, scalable solutions for financial clients/banks.
  • 20+ years of experience architecting and strategizing solutions for cloud application development; deployment, virtualization, and containerization; legacy application architecture assessment; and cloud feasibility and suitability assessments for migration to cloud platforms.
  • Certified AWS Solutions Architect, currently employed with EPAM as Solution Architect. 18+ years of hands-on experience with AWS VPC, EC2, ECS, CloudWatch, CloudTrail, Lambda (serverless architecture), and AI/ML.
  • Experienced in client-facing roles, creating RFPs, SOWs, and presentation decks to address customer requirements and deliver solutions.
  • Strong problem-solving and analytical skills, with the ability to handle multiple projects concurrently and experience in onshore and offshore development models.
  • Hands-on experience in machine learning, including Matplotlib, supervised learning, SVM, decision trees, time series analysis, Python programming, market basket analysis, XGBoost, Keras, and IoT.
  • Knowledge of the big data stack (Hortonworks) and other AWS family services, i.e., Kinesis Firehose and Redshift, plus Spark, Kafka, Flume, Sqoop, Hive, Pig, and HBase.
  • Excellent technical ability and knowledge to implement, understand, support, and maintain existing core applications and infrastructure; verifiable experience in planning, execution, and monitoring of projects in multicultural environments.
  • Strong customer-facing experience and delivery management.

Affiliations

  • Co-founded www.saarathi-foundation.org, which works to empower children to complete basic school education and works for sustainability in different social areas.
