Results-driven Technical Lead and architect specializing in Cloud Security products, with a decade of experience in Java/J2EE technologies. Proven track record in integrating Email Data Loss Prevention (DLP) solutions with enterprise cloud security, enhancing product scalability, and implementing innovative features. Expertise includes Oracle Endeca 11.1 integration with Spring-based REST services and leading Machine Learning initiatives utilizing IBM Watson. Committed to delivering high-quality solutions while ensuring exceptional team collaboration and client satisfaction.
Overview
19 years of professional experience
1 Certification
Work History
Technical Lead
Skyhigh Security
03.2022 - Current
Lead and architect for the Cloud Security product Email Data Loss Prevention (DLP): integrating the enterprise DLP product with the cloud security product, integrating an in-house malware detection product, adding new features, upgrading the cloud service API calls, and creating tools and enhancements for better maintainability and scalability of the product
Technical Architect for the search platform Oracle Endeca 11.1 and its integration with Spring based REST services, Endeca to Solr migration proposals
Led a team for a POC on Machine Learning based AI search with IBM Watson, and a POC on a Google Assistant skill: conversations using Google Home devices and the phone's Google Assistant
Technical Lead with an experience of around 10 years mainly on Java/J2EE technology
Extensive experience on Oracle Endeca Search 11.1, ATG e-commerce solution 10.2 and 11.1 versions, ATG Endeca Integration (Experience manager and Guided Search Service), Mobility with ATG applications and ATG REST Web services
Trained in microservices and Google Cloud Platform
Interest in emerging technologies such as Cloud, Machine Learning, etc.
Retail domain knowledge and experience on e-commerce, vendor management, store & warehouse management
Passionate about design and development in various SDLC models
Highly self-motivated professional, proficient in prioritizing and completing tasks on time with multitasking ability to achieve project/team goals
Excellent team leading skills in delegating tasks, guiding team members to help achieve individual goals
Experience in onsite client interaction, achieving high customer satisfaction.
Technical Architect
McAfee
Bangalore
03.2021 - Current
Project: Fail Open Re-Architecture Design & Implementation
Technologies: Spring Boot, RabbitMQ, Redis, S3 APIs
Email being a time-critical service, it is important that DLP scanning completes within the SLA and the response action is taken correctly
If an email fails processing within the SLA and a violation is exposed, it is called a fail open
Because of its criticality for customers, several architectural changes were made to minimize or address fail-open situations, thereby increasing the stability of the product
Design changes to prevent fail opens caused by short infrastructure failures, by identifying and routing impacted events through a new pipeline, along with fail-open timeout changes and others
Enabled automated and manual offline DLP for impacted events, processing them offline to remove any violating emails where possible (not applicable for external users)
Enabled an email tracing dashboard for internal use as well as for customers
About 80-90% of fail opens were eliminated by these design changes
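The fail-open decision described above can be sketched as follows. This is a minimal illustration with hypothetical names, not the production design, which per the technologies listed used Spring Boot and RabbitMQ: an event that sees a transient infrastructure failure is routed to a retry pipeline while its SLA budget lasts, and only fails open once the budget is exhausted.

```java
import java.time.Duration;
import java.time.Instant;

// Minimal sketch of the fail-open routing (illustrative names). An email that
// cannot finish DLP scanning within its SLA would otherwise be released
// unscanned ("fail open"); instead, events hit by short-lived infrastructure
// failures are routed to a separate retry pipeline while the SLA budget lasts.
public class FailOpenRouter {

    public enum Route { PROCESS, RETRY_PIPELINE, FAIL_OPEN }

    private final Duration sla;

    public FailOpenRouter(Duration sla) {
        this.sla = sla;
    }

    /**
     * Decide what to do with an event given when it arrived, the current time,
     * and whether the failure it just saw is a transient infrastructure error.
     */
    public Route route(Instant receivedAt, Instant now, boolean transientInfraFailure) {
        Duration elapsed = Duration.between(receivedAt, now);
        if (elapsed.compareTo(sla) >= 0) {
            // SLA budget exhausted: the email must be released -> fail open.
            return Route.FAIL_OPEN;
        }
        if (transientInfraFailure) {
            // Short infrastructure outage: retry in a dedicated pipeline
            // instead of failing open immediately.
            return Route.RETRY_PIPELINE;
        }
        return Route.PROCESS;
    }
}
```

The key design point is that the SLA check happens before the retry decision, so retries can never push an email past its deadline unnoticed.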
Project: New features such as Email Activity Monitoring, email domain validation, policy action queue, etc.
Assistant POC
Google
McAfee
08.2019 - 03.2022
Client: Sam's Club
Technologies: Actions on Google, Dialogflow, Firebase, Node.js
Technologies: Spring Boot microservices, RabbitMQ, Redis, Haraka email server
Several new features were added to the Email DLP product; some of these are explained below
Captured the customer's email activities and pushed them to the activity monitoring system
The design involves communication between the Haraka email server and the DLP system to capture the processing status, adding all exit-point statuses in the Haraka email server
Handled Redis failures and email server/processing server failures to make the feature resilient
Identified the right attributes to send to the activity monitoring system for different cloud service providers for easy anomaly detection
Ensured the email DLP SLA performance is unaffected by the inclusion of activity monitoring events, since the number of events runs into millions per day
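One way to keep activity-event publishing off the DLP critical path, as described above, is a bounded, non-blocking hand-off. The sketch below uses hypothetical names and a plain in-memory queue; the real system used RabbitMQ, so treat this only as an illustration of the non-blocking principle.

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

// Illustrative sketch (hypothetical names): activity-monitoring events are
// handed off on a bounded queue with a non-blocking offer, so that publishing
// millions of events per day can never stall the email DLP processing path.
public class ActivityEventPublisher {
    private final BlockingQueue<String> buffer;
    private long dropped = 0;

    public ActivityEventPublisher(int capacity) {
        buffer = new ArrayBlockingQueue<>(capacity);
    }

    /** Non-blocking hand-off; returns false (and counts a drop) if the buffer is full. */
    public boolean publish(String event) {
        boolean accepted = buffer.offer(event);
        if (!accepted) dropped++;
        return accepted;
    }

    public int pending() { return buffer.size(); }
    public long droppedCount() { return dropped; }
}
```

Monitoring events are best-effort here: dropping one under backpressure is acceptable, while delaying an email is not.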
Enabled only valid domains of the customer's Microsoft account in the tenant configuration by integrating with cloud service API calls
Seamless migration of existing configurations to remove invalid domains, showing an enhanced UI with valid domains and giving the customer the option to review invalid domains before removing them permanently
This feature enhances the security posture of the product across tenants
Added a policy action queue to address problems such as rate limits during cloud service provider API calls and other errors that fail a policy response action and, in turn, the processing of the event
Partial responses are cached temporarily so that these events are subsequently retried and the policy response action completes, creating the incident
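The partial-response caching above can be sketched as a resumable sequence of sub-actions. All names here are illustrative, and the real service cached partial state in Redis rather than in memory: completed steps are remembered, so a retry after a rate limit resumes from the failed step instead of repeating the whole response action.

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Sketch of the policy action queue idea (illustrative names; the production
// service cached partial responses in Redis). Completed sub-actions are
// cached so that when a cloud-provider API call is rate limited, the retry
// resumes from the failed step instead of repeating the whole action.
public class PolicyActionRetrier {
    public interface Step { boolean run(); } // returns false on rate limit/error

    private final Map<String, Boolean> done = new LinkedHashMap<>(); // partial-response cache

    /** Runs steps in order, skipping cached ones; returns true when all complete. */
    public boolean execute(Map<String, Step> steps) {
        for (Map.Entry<String, Step> e : steps.entrySet()) {
            if (Boolean.TRUE.equals(done.get(e.getKey()))) continue; // already done earlier
            if (!e.getValue().run()) return false;  // failed -> event re-queued, retried later
            done.put(e.getKey(), true);             // cache the completed sub-action
        }
        return true;
    }
}
```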
Several other enhancements and fixes were made to the product to make it more stable.
Technologies : Spring boot Microservices, RabbitMQ, Redis, MySQL, REST Web services
Technical Lead
McAfee
Bangalore
09.2019 - 07.2020
EPO is the enterprise security product used for several use cases at McAfee
With MVISION Cloud being acquired, we had to enable customers to apply the same classifications and rules defined in EPO to their cloud service events as well
To accomplish this, the policy engine of EPO was separated out as a scanning engine and deployed as a cloud service
MVISION Cloud creates a separate flow for the cloud service providers that are configured to use EPO classifications via the scanning engine
We made multiple additions to this integration, such as retry logic for the scanning service on specific failures that previously led to dropped events, a phase-wise evidence file feature that sends evidence back to EPO, aligning severity levels for incidents across EPO and MVISION, universal tenant ID migration, and adding IaaS services and messaging for the scanning service
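The "retry only on specific failures" idea mentioned above can be sketched as a small policy: known-transient error codes are retried up to a bound, everything else is not. Error codes and the class name are hypothetical, not taken from the actual scanning service.

```java
import java.util.Set;

// Sketch of a scanning-service retry policy (illustrative names and error
// codes): only specific, known-transient failures are retried, up to a bound,
// instead of dropping the event outright.
public class ScanRetryPolicy {
    private static final Set<String> RETRYABLE =
            Set.of("TIMEOUT", "CONNECTION_RESET", "SERVICE_UNAVAILABLE");

    private final int maxAttempts;

    public ScanRetryPolicy(int maxAttempts) { this.maxAttempts = maxAttempts; }

    /** attempt is 1-based; retry only known-transient errors within the bound. */
    public boolean shouldRetry(String errorCode, int attempt) {
        return attempt < maxAttempts && RETRYABLE.contains(errorCode);
    }
}
```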
Our team became expert in EPO within just a couple of months of taking over the EPO integration and received accolades
Technologies: Haraka mail server, Spring Boot microservices, RabbitMQ, Redis, MySQL, REST web services, Exchange Online cloud APIs
The Email DLP part of the MVISION Cloud product was taken up by our new team
We made several enhancements to the product and addressed many customer issues and tickets
Important features included: securing email domains for tenants; merging incidents and notifications for internal and external users in both inline and passive modes; relaying emails with an empty mail-from address; attaching the original email to the incident notification; retrying cloud service APIs when there is a delay in the email reaching the email server, thereby completely eliminating quarantine failures in production due to "file not found"; and ELK alerts for backlogs, dropped emails, and tenant-based Haraka node processing status
Technical Architect
McAfee
Bangalore
01.2020 - 06.2020
This project integrates the McAfee GAM in-house product with the acquired cloud security product MVISION Cloud
The current MVISION product does malware detection with GTI, which has limited detection capabilities and does not handle 0-day threat detection, which GAM can
So we had to integrate the GAM web service with MVISION Cloud to identify 0-day threats for all cloud services configured with malware policies
The design focuses on calling GAM efficiently, since the entire cloud file must be sent to the service for processing
Key questions included: which files (by file type, new files, GTI-unknown files, etc.) should go to GAM; how to parallelize GAM calls with separate queues, consumers, and processors; how to apply the current weighing strategies to prioritize a tenant's events; how to share the cloud file between the NRT flow and the malware flow, as well as between GTI and GAM, avoiding repeated downloads from the cloud service that could increase throttling errors (an S3 bucket was used in certain cases); how to merge GTI and GAM responses in the policy and incident pages of the dashboard; and how to let customers enable 0-day threat detection from the dashboard
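The "which files go to GAM" decision above can be sketched as a routing predicate. File types and verdict strings here are hypothetical examples, not the actual policy: only files worth the expensive full-file scan are forwarded, such as new files or files GTI could not classify.

```java
import java.util.Set;

// Sketch of the GAM routing decision (illustrative types and verdicts): only
// files worth the expensive full-file scan are sent to GAM, e.g. new files or
// files GTI could not classify, and never files GTI has already flagged.
public class GamRoutingDecision {
    private static final Set<String> SCANNABLE_TYPES =
            Set.of("exe", "dll", "pdf", "docx", "zip");

    public static boolean sendToGam(String fileType, boolean isNewFile, String gtiVerdict) {
        if (!SCANNABLE_TYPES.contains(fileType)) return false;   // type not worth a full scan
        if ("MALICIOUS".equals(gtiVerdict)) return false;        // GTI already caught it
        return isNewFile || "UNKNOWN".equals(gtiVerdict);        // 0-day candidates go to GAM
    }
}
```

Filtering first on cheap signals (type, GTI verdict) keeps the number of full-file uploads, and hence throttling pressure on the cloud provider, low.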
This is a much-awaited, sellable new feature of the product
The analysis, design, and estimation were completed by me; our team will work on the implementation
Technologies : Spring boot Microservices, RabbitMQ, Redis, MySQL, REST Web services
Cognizant Technologies
06.2016 - 08.2019
Google, Sam's Club
Bangalore
03.2019 - 04.2019
Technical Architect
Cognizant Technology Solutions
Bangalore
05.2018 - 02.2019
Created a skill for Sam's Club in Actions on Google to enable product search and club finder through the Google Assistant app on phones and smart displays, as well as Google Home devices
Created the Actions on Google project for the skill, the Dialogflow agent for conversations, and the Node.js fulfillment code for connecting to backend Sam's Club services and driving conversations
Deployed the webhook in Firebase and enabled testing in the simulator, on phones, and on Google Home devices
Implemented variations in the conversation flow for Google Home versus display devices such as phones and smart displays
Project: Service Continuity
The project aims at creating a new data center as a switchable alternative to the existing data center
The new DC will be enabled in two different scenarios: one in which only the storefront applications are functional, and another in which all the business-facing and internal systems are also replicated and functional
Replication, data synchronization, dynamically switchable scenarios, creation and maintenance of two different build-and-deploy flows to enable switchable scenarios in each DC, first-time setup of the new DC, and a switch-back option to the old DC are the main challenges in the project
The client has both B2B and B2C systems in two languages, along with around 20 internal applications and many third-party integrations
Endeca has three apps each for B2B and B2C
Integrations with all the internal applications, databases, host names, and server clusters were made dynamic based on the DC and scenario
Project : Search Solution Proposal for Ecommerce client
Technical Architect
Cognizant Technology Solutions
Bangalore
04.2018 - 05.2018
Project : Endeca To Solr migration & Enhanced Search Proposal
Client : North American client
Technologies: Solr, microservices
The proposal aimed at providing a solution for Endeca-to-Solr migration for an ecommerce client with multiple sites and locales
The existing search solution was very primitive, so the proposal also aimed at an enhanced search experience utilizing Solr's capabilities: near-real-time indexing with Mongo connectors, update handlers, a custom business tool, a custom deploy tool, a custom stemming filter factory, NLP, LTR, ZooKeeper, etc.
The solution also gave a roadmap for implementing it on a microservices and cloud platform in agile mode
Technologies: IBM Watson Cloud services (Discovery Service, Machine Learning Service), WCS Aurora ecommerce platform, Spring Boot REST web services, Solr
Intelligent search understands the concept behind the user query as a whole and provides relevant products based on the user's interests and preferences
Solution Highlights:
An open-ended query, with no particular mention of product specifics or the user's preferences, yields products more relevant to the user's interests
No explicit rules in code for personalization
Search is not just keyword driven but based on relevancy training and other enrichments of product data
Results are automatically filtered based on the user's preferences, predicted by ML models trained on the user's past purchase history
Discovery Service
- NLP Relevancy training
Used for NLP based document search
Can be customized for domain specific model (using Knowledge Studio)
Machine Learning
- Trained Model for identifying preferred user Category
- Trained Model for identifying preferred user Color
- Trained Model for identifying preferred user Dress size
- Trained Model for identifying preferred user Shoe size
- Trained Model for predicting 1:1 product recommendations
Used for identifying personalization characteristics based on user purchase history, preferences, user segments, etc.
We were able to showcase the difference between a regular Solr-based search and an AI-based search
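The preference-based filtering described above can be sketched as a post-processing step over the relevancy-ranked results. Attribute names and the class name are hypothetical; the real solution used trained Watson ML models to produce the predicted preferences that become the filter.

```java
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

// Sketch of preference-based result filtering (illustrative names): predicted
// preferences (e.g. color, category) from the trained models are applied as
// filters on top of the relevancy-ranked search results.
public class PreferenceFilter {
    public static List<Map<String, String>> apply(List<Map<String, String>> results,
                                                  Map<String, String> predictedPrefs) {
        return results.stream()
                // keep a product only if it matches every predicted preference
                .filter(p -> predictedPrefs.entrySet().stream()
                        .allMatch(pref -> pref.getValue().equals(p.get(pref.getKey()))))
                .collect(Collectors.toList());
    }
}
```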
Project : Endeca Digital Platform Migration & Redesign
Technologies: REST web services, Advanced REST Client, JBoss 4.3
SAP
Kraft foods
Chicago
05.2010 - 03.2011
Project: CATALYST (Vendor Data Migration)
Technologies: Cransoft, SQL Server 2000, Oracle, SQL DTS packages; data loads and verification
Application Developer
Kraft Foods
Chicago
06.2008 - 04.2010
Project: Compensation, Annual Enrollment and Payroll Systems. Technologies: Java
Onsite co-ordinator
Kraft Foods, WAVE-C
Northfield, Chicago
01.2008 - 05.2008
The Compensation 2009 project involved enhancing three applications/tools: Global Merit, GMIP/LMIP, and Direct Compensation Statements
It was a short-duration project involving fast learning of new tools and applications, requirements analysis, build, testing, installation, and production support
The Global Merit and GMIP/LMIP enhancements went on simultaneously, with two groups of enhancements each
It was executed with an offshore team of seven, mainly from China
Post-production support for Global Merit and GMIP/LMIP continued for two months, alongside reviews, testing, and installation for Direct Compensation Statements
Installation of the projects went smoothly compared to the previous year, thereby increasing client satisfaction
This project was executed in parallel with the Compensation 2009 project and involved implementing three major change requests in payroll applications
All three CRs were executed with one offshore person in China
One of the CRs significantly reduced the paperwork involved in payroll systems through our online direct deposit system for Canadian Kraft employees
All the CRs were delivered on time, and the installation was smooth without any production issues
It was an enhancement project involving changes to the 2009 Benefits Option worksheet, where employees select their contributions to medical insurance plans, dental insurance plans, health care options, dependent day care options, etc.
It involved fast learning of the application, mainframe data setup for different scenarios, on-time delivery, a smooth move to production, and no UAT/production issues
Role and Responsibilities
Requirements Gathering and Analysis
Estimation of requirements
Client Interaction
Coordination with offshore
Reviews
Unit Testing and Integration Testing
UAT Support
Installation
Production Support
Developer
BNSF (Burlington Northern Santa Fe), MKS
Chennai
12.2006 - 12.2007
Project: Planning and Reporting System (PARS); Client: BNSF (Burlington Northern and Santa Fe)
Technologies: WebSphere development environment, Java
Developer
Infosys
Chennai
09.2005 - 12.2006
It was a large development project in which the web application for AVON was customized across various country markets such as the Baltics, Hungary, and the Czech Republic
We identified the gaps between these countries and the core application, the requirements of the new markets, and the enhancements to the core application, and implemented them
It was developed in the Spring framework to make it suitable for supporting different languages
Language selection within the same market became one of the important functionalities of the web application
There were different suites in the application, and the project required coordination with the development and testing teams across the suites
Role and Responsibilities
Coding
Testing
Mentoring
Reviews
Tracking - tracked defects for all the three suites in the application
Worked in a large team
Prepared delivery documents for each delivery to the testing team as well as to onsite
Project: Planning and Reporting System (PARS)
It was a maintenance project for an existing planning and reporting system for BNSF Railway, the second-largest railway in the United States
The planning and reporting system lets managers plan and schedule work for future work orders and assign gangs to every work order
It also has a reporting system that allows workers to report back their hours of work and other details
The project involved developing additional functionality and fixing existing issues in the planning and reporting system
Role and Responsibilities
Coding
Testing
Mentoring
Reviews
Took the initiative to conduct sessions for freshers in the team; handled review tracking and defect tracking, and conducted defect prevention meetings
Work experience in the retail and transportation domains for retailer/logistics companies: enhanced forecasting and allocation systems and a planner's workbench for scheduling projects for a US transportation and logistics client; also worked in RPGLE for a gaming client
Projects involved enhancements and new development of websites and internal tools
Migration of the existing site (home page, department pages, browse and search pages) from the ATG+Endeca platform to a lightweight, scalable Spring platform
Extensive utilization of the search and navigation capabilities of the Oracle Endeca search platform, version 11.1
Experience Manager driven web pages and REST APIs for mobile and tablet versions
Endeca search engine features: dynamic triggering of content based on navigation states, content slotting, media MDEX indexing and media browsing, boost and bury, predictive search, relevancy ranking evaluator, customization of backend components for indexing, and customized editors, templates, and cartridges
Complex baseline indexing pipelines and partial pipelines
Integration of the Spring framework with Endeca using the Assembler API
Customization of the Endeca OOB handler components
Spring Batch jobs for preparing the input XML to be indexed in Endeca
POC on Solr Predictive search
Analysis of the NER model developed by client side
Developed a media portal web Application on Spring boot
Trained in Microservices
Challenges included a highly demanding client, stringent sprint timelines, and a complex system
Led a team of 13 members
Hands-on development and coordination with the business and QA teams
Project: Support & Product Details Page Redesign
Technical Lead
Infosys
Chennai
TOMS is a philanthropic company donating a pair of shoes to a child in need for every pair bought
The ecommerce site was built on ATG 10.2 and then migrated to 11.1
TOMS being a new client for the company, we had the challenge of taking over the entire support from the competitor with limited transition from the client team
We ran 24/7 support for the first time for TOMS with just one person offshore and one onsite, and resolved a backlog of around 600 tickets with a team of two within a short span of four months
Worked in an Agile framework; in addition to support activities, we simultaneously resolved as many backlog tickets as possible to build greater client confidence
Provided suggestions to the client for many manual reconciliation activities caused by poor processes or design issues in the existing setup
Implemented a major design change suggestion, the Merch Solution: a complete redesign of the product details page to fetch templates from Endeca Experience Manager, a rewrite of the processor logic for all seven product types, and page optimization, all within a month
Performed feasibility analysis and estimation for an in-store MPOS solution for the client, and helped with the demo by creating ATG REST services for the Infosys MPOS solution
Responsibilities & Challenges:
Led the entire offshore support team, being both the offshore support person and the lead of the backlog support team
Core design and development of the product detail page
Sole contributor for feasibility analysis and estimation of an MPOS solution for TOMS
Improved resolution time for SEV0 and SEV1 tickets within a short span of time
Initial project setup activities and coordination
Weekend support activities and deployments
Team management
Achieved 6/7 employee satisfaction in the first customer satisfaction survey
Darden is the world's largest full-service restaurant company, whose family of restaurants features some of the most recognizable and successful brands in full-service dining: Red Lobster, Olive Garden, LongHorn Steakhouse, The Capital Grille, Bahama Breeze, Seasons 52, and Eddie V's
It owns and operates more than 2,000 restaurants
Its new cross-channel digital platform aims to build loyalty by digitally engaging with guests anytime, anywhere, and on any device
With the existing desktop site enabled for tablets and desktops, we designed and created new sites for mobile
The new mobile site is enabled across multiple brands and multiple countries
The design gives the guest a mobile-optimized user experience comparable to the desktop site, while integrating with Darden's enterprise applications such as DASH, the Menu System, RID, etc.
Challenges included leveraging the existing business components and Experience Manager cartridges wherever possible while creating only new mobile JSPs
Device detection and location detection for mobile are enabled through Akamai and the HTML5 geolocation API, respectively
Complexities included analyzing and optimizing the modules for mobility:
the Togo module, with menu items that change based on the default restaurant selected and the time of ordering, along with multi-level configurable SKUs and their availability by pickup time
browse and navigation with auto-detection of the nearby location and its current menu, fewer clicks to navigate to a menu item, and price and nutritional calculation of the different SKUs in a menu item
Role and Responsibilities
Responsibilities included:
Leading the offshore team and handling the entire progress of the mobility project
Analyzing the mobile PSDs and wireframes and matching them against the desktop version
Technical design for enabling mobility with multisite support, device detection, and static content
Designing and coding the business components, web content, HTMLs, resource bundles, BCC changes, DB changes, Experience Manager cartridge definitions, etc.
Testing across multiple mobile devices
Managing individual modules within the project, such as Guest Management, Browse and Navigation, and the Togo module
Mentoring and providing technical guidance
Coordination with the UX support team, onsite, and the testing team
This project involved redesigning the website per the new responsive web design paradigm to provide customers a uniform experience across desktops and the rapidly evolving tablet and mobile platforms
The redesign included integrating the Oracle Endeca search engine with the ATG 10.1.2 platform, which has inbuilt OOB components for search integration
With the basic Endeca Guided Search package, we addressed the challenges of exploring the Assembler API components and queries for the new site redesign
Explored ways to implement the Assembler pipeline approach, URL redirects, a price slider with varying min/max, configuring banners for different categories and brands, dimension-value caching for brands, marketing links, SEO URLs, precedence rules in refinements, SKU-level indexing, and pricing and promotions
This was a development/enhancement project involving requirements elaboration, design, build, testing, implementation, and support
The enhancements and fixes were delivered in multiple releases
Role and Responsibilities
Technical design for ATG Endeca integration
Led a team of four, along with research and analysis of Endeca Guided Search and Assembler API behavior, coding, testing, reviews, defect fixing, QA support, post-production support, mentoring, and technical guidance
Coordination with the Endeca team and understanding how the Assembler APIs work with Endeca configurations
Coordination with the UX support team and the responsive web design team to integrate presentation content with responsive HTML and to enable extensive multi-browser and multi-device testing.
fashion retailer
Infosys
Chennai
The client set out to build a mobile commerce application for their dot-com site
The mobile app is supported on iOS and Android platforms
The dot-com website, implemented in the ATG framework, has all the functional components for the commerce application
The mobile app is designed so that it can access these ATG components via REST web services
The REST web services are configured directly from the existing components (bean, form handler, or repository), or by creating wrapper classes around the components to optimize the number of web service calls from the mobile app
We developed web services for the mobile app on iPhone and Android
Basically, we identified the key functionalities of the dot-com site and their corresponding ATG components that could be exposed as web services
In some cases, we wrote wrapper classes to optimize the number of web service calls from mobile
We integrated the mobile module with the existing dot-com module
The first release of the app had the basic functionalities required for a complete order flow
The second release had almost all the functionality a user can exercise on the dot-com site
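The wrapper-class idea above, aggregating several backend calls into one response so a mobile device makes fewer round trips, can be sketched as follows. Service and field names here are hypothetical, and the real wrappers sat over ATG components rather than plain suppliers.

```java
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.function.Supplier;

// Sketch of a mobile wrapper service (hypothetical names): instead of the
// mobile app calling the profile and cart services separately, one wrapper
// endpoint aggregates both results into a single payload, halving round trips.
public class MobileSessionWrapper {
    private final Supplier<Map<String, Object>> profileService;
    private final Supplier<Map<String, Object>> cartService;

    public MobileSessionWrapper(Supplier<Map<String, Object>> profile,
                                Supplier<Map<String, Object>> cart) {
        this.profileService = profile;
        this.cartService = cart;
    }

    /** One call from the device returns what previously took two. */
    public Map<String, Object> sessionSummary() {
        Map<String, Object> out = new LinkedHashMap<>();
        out.put("profile", profileService.get());
        out.put("cart", cartService.get());
        return out;
    }
}
```

On high-latency mobile networks, collapsing N backend calls into one endpoint is usually the single biggest win for perceived responsiveness.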
Role and Responsibilities
Created web services for various modules such as profile, product catalog, checkout, shopping cart, payment, and rewards
Changes for handling secure requests, such as Apache rewrite rules and adding URLs to the protocol switch servlet
Handling redirects on mobile
Team handling, work delegation, design and development, and close coordination with the front-end team during integration
Handled unit, integration, and system testing along with UAT and production support, ensuring quick resolution of defects
Project : CATALYST (Vendor Data migration)
Onsite co-ordinator
Kraft Foods
Glenview, Chicago
CATALYST was a multi-year engagement at a leading US food company
Under CATALYST, the data delivery services team worked on converting the existing legacy data into SAP-ready data
The system was developed over multiple releases using Cransoft as the migration tool
Cransoft was used as the data extraction, transformation, and loading tool
The tool can produce pre-load reports and post-load validation reports, which play a significant role in the data migration process
SQL Server tools were used to implement the business conversion process
Data loading scripts were developed in Cransoft, and data was loaded into SAP using the BDC and LSMW utilities
This let the migration team load the migration data easily and quickly into SAP
We migrated the whole vendor data set from three different legacy systems at Kraft for North America
This involved creating custom migration rules using complex SQL queries for both source and destination data sources, loading huge volumes of data into SAP, verifying the loaded data against the source systems, and creating reports for the same
Challenges included migrating relationship (partner) data, considered one of the most complex pieces of vendor logic, and improving the performance of complex join queries over huge data sets
Role and Responsibilities
Did estimation for data migration, requirements gathering and data analysis, solution preparation, data enrichment, and applying business rules for two releases
Business and client meetings to understand the complexities involved in migrating different vendor data sources into the single SAP system