Summary
Overview
Work History
Education
Skills
Certification
Accomplishments
Languages
Timeline

KIRAN KUMAR VELLANKI

Hyderabad

Summary

A highly accomplished and innovative Lead AI & Automation Engineer with 9.6 years of extensive experience in architecting, developing, and deploying cutting-edge, AI-powered automation solutions and intelligent applications. Expertise in leading the full project lifecycle for enterprise-grade systems, leveraging Python, Machine Learning, and advanced Generative AI techniques (including LLMs, Langchain, Prompt Engineering, and Agent Implementations) to drive significant operational efficiencies, data-driven insights, and transformative business outcomes. Seeking a challenging leadership role to spearhead AI and automation initiatives, fostering innovation and delivering impactful, scalable solutions.

Overview

11 years of professional experience
1 Certification

Work History

Technical Lead

Dell (via ObjectWin)
Hyderabad
06.2024 - Current
  • Spearheaded the design, development, and implementation of AI-powered automation solutions for Dell's internal knowledge base (LKB), auto-translation service (ATS), and knowledge orchestration bot (KOB) platforms, significantly enhancing operational efficiency and content accuracy.
  • Key Projects & Responsibilities:
    LKB (Lighting Knowledge Base) Content Integrity & ATS (Auto Translation Service):
    Architected and deployed a Python-based automation framework using Selenium and Robot Framework to proactively identify, validate, and rectify broken links within Dell's global knowledge base articles (see the link-check sketch below).
    Engineered a GenAI-powered solution (leveraging tools like Lavague and custom LLM integrations) to intelligently understand content context and implement accurate link replacements for Dell support pages.
    Integrated and optimized an Auto Translation Service (ATS) leveraging advanced NLP models and GenAI (e.g., custom-trained/fine-tuned translation LLMs, or APIs like MarianMT) to accurately translate knowledge base articles into 19+ languages, significantly enhancing global content reach.
    Impact: These AI-driven automations eliminated manual effort equivalent to more than 400 personnel and slashed article processing and translation cycle time from ~20 minutes to under 1 minute per article.
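
A minimal sketch of the kind of broken-link check described above, using only the requests library and hypothetical article URLs (the production framework drove pages through Selenium/Robot Framework and applied GenAI-based link replacement):

```python
# Illustrative sketch: flag knowledge-base article links that no longer resolve.
# The URLs below are placeholders, not real Dell article IDs.
import requests

ARTICLE_URLS = [
    "https://www.dell.com/support/kbdoc/en-us/000000001",  # hypothetical
    "https://www.dell.com/support/kbdoc/en-us/000000002",  # hypothetical
]

def link_is_alive(url: str, timeout: int = 10) -> bool:
    """Return True if the URL answers with a non-error HTTP status."""
    try:
        resp = requests.head(url, allow_redirects=True, timeout=timeout)
        if resp.status_code >= 400:
            # Some servers reject HEAD requests; retry with GET before flagging.
            resp = requests.get(url, allow_redirects=True, timeout=timeout)
        return resp.status_code < 400
    except requests.RequestException:
        return False

if __name__ == "__main__":
    for url in ARTICLE_URLS:
        if not link_is_alive(url):
            print(f"Broken link: {url}")
```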

    KOB (Knowledge Orchestration Bot) & Intelligent Agent Implementations:
    Designed and implemented sophisticated GenAI agents using Langchain and Pydantic AI for intelligent knowledge base content management, including automated content updates, summarization, and categorization.
    Developed and refined prompt engineering techniques to effectively guide LLMs for complex tasks such as contextual understanding of technical articles, keyword extraction for improved searchability, and ensuring adherence to Dell's content standards.
    Built and maintained robust APIs (FastAPI/Flask) enabling seamless data synchronization between knowledge systems (SQL and MongoDB databases), real-time process notifications, and automated generation of detailed operational reports.
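
A minimal sketch of the summarization/keyword-extraction agents described above, assuming the LangChain Expression Language (LCEL) API and an OpenAI-compatible chat model; module paths and the model name are illustrative and vary by LangChain version:

```python
# Illustrative sketch: prompt an LLM to summarize an article and extract keywords.
from langchain_core.output_parsers import StrOutputParser
from langchain_core.prompts import ChatPromptTemplate
from langchain_openai import ChatOpenAI

prompt = ChatPromptTemplate.from_template(
    "Summarize the following knowledge-base article in 3 bullet points, "
    "then list 5 search keywords.\n\nArticle:\n{article}"
)
llm = ChatOpenAI(model="gpt-4o-mini", temperature=0)  # hypothetical model choice
summarize = prompt | llm | StrOutputParser()

if __name__ == "__main__":
    article_text = "..."  # article body fetched from the knowledge base
    print(summarize.invoke({"article": article_text}))
```

The same chain pattern extends to categorization or content-standards checks by swapping in a different prompt.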

    Technical Leadership & Infrastructure Optimization:
    Provided technical leadership in defining the automation roadmap, selecting appropriate AI/GenAI tools, and architecting scalable solutions.
    Oversaw data persistence strategies using SQL (e.g., PostgreSQL) and MongoDB for storing article metadata, translation states, automation logs, and performance metrics.
    Implemented RabbitMQ (RMQ) to orchestrate and parallelize high-volume article processing and translation tasks, significantly improving throughput and resource utilization (see the queueing sketch below).
    Established automated logging and comprehensive reporting mechanisms, with data archived to AWS S3 buckets for auditing, performance analysis, and continuous improvement.
    Championed the use of Docker for consistent application environments and managed deployments on PKS/Kubernetes platforms.
    Leveraged GitLab CI/CD for robust version control, automated build/test/deployment pipelines, and fostering agile development practices.
    Mentored team members on advanced automation techniques, GenAI best practices, and efficient utilization of the AI-driven technology stack.
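
A compact sketch of the RabbitMQ fan-out pattern referenced above, using the pika client; the queue name and message fields are placeholders rather than the production schema:

```python
# Illustrative sketch: one message per article so multiple workers can process in parallel.
import json
import pika

QUEUE = "article_tasks"  # hypothetical queue name

def publish(article_ids):
    """Producer: push one task message per article."""
    conn = pika.BlockingConnection(pika.ConnectionParameters("localhost"))
    ch = conn.channel()
    ch.queue_declare(queue=QUEUE, durable=True)
    for article_id in article_ids:
        ch.basic_publish(
            exchange="",
            routing_key=QUEUE,
            body=json.dumps({"article_id": article_id}),
            properties=pika.BasicProperties(delivery_mode=2),  # persist messages
        )
    conn.close()

def worker():
    """Consumer: each worker handles one article at a time."""
    conn = pika.BlockingConnection(pika.ConnectionParameters("localhost"))
    ch = conn.channel()
    ch.queue_declare(queue=QUEUE, durable=True)
    ch.basic_qos(prefetch_count=1)  # fair dispatch across workers

    def handle(channel, method, properties, body):
        task = json.loads(body)
        print(f"Processing article {task['article_id']}")  # validate/translate here
        channel.basic_ack(delivery_tag=method.delivery_tag)

    ch.basic_consume(queue=QUEUE, on_message_callback=handle)
    ch.start_consuming()
```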

Technical Lead (Delivery)

Qentelli Solutions PVT LTD
Hyderabad
08.2023 - 03.2024
  • Company Overview: Qentelli was formed with the objective of making Intelligent Quality Engineering a reality, delivering future-ready applications that can serve the ever-increasing volume, variety, and velocity expected of today's businesses.
  • Auto magiQ: AutomagiQ is an internal product of Qentelli Solutions Pvt Ltd. The tech stack for the project was AI/ML with Python, NLP, Airflow, Docker, Kubernetes, Git, Jira, AWS (S3), PostgreSQL, TensorFlow, and ML models such as Whisper and sentiment analysis.
  • Clean, preprocess, and transform raw data to make it suitable for AI model training.
  • Identify and collect relevant text data for training and testing NLP models.
  • Experience in developing and implementing Natural Language Processing (NLP) solutions, including data preprocessing, model development, and integration, to enable language understanding and generation in applications such as chatbots.
  • Fine-tune pre-trained models to adapt them to the specific requirements of the project.
  • Develop RESTful APIs, including defining endpoints, methods, request/response formats, and authentication mechanisms.
  • Optimize the API for scalability and performance to handle a large number of concurrent requests, and implement caching mechanisms (a minimal caching sketch follows these bullets).
  • Deploy the RESTful API to production environments using continuous integration/continuous deployment (CI/CD) pipelines.
  • As a developer, created Dockerfiles specifying the steps to build images used to run containers in various environments.
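
A minimal sketch of a cached FastAPI endpoint along the lines described above; the route, the sentiment stub, and the use of functools.lru_cache as the cache are illustrative (a production service would call the trained model and typically use a shared cache such as Redis):

```python
# Illustrative sketch: FastAPI endpoint with a simple in-memory cache.
from functools import lru_cache
from fastapi import FastAPI

app = FastAPI()

@lru_cache(maxsize=1024)
def predict_sentiment(text: str) -> str:
    """Stand-in for an NLP model call; cached so repeated inputs are cheap."""
    return "positive" if "good" in text.lower() else "negative"

@app.get("/sentiment")
def sentiment(text: str):
    return {"text": text, "sentiment": predict_sentiment(text)}
```

Run locally with, for example, `uvicorn main:app --reload` if the file is saved as main.py.
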
  • Versa Flow: Versa Flow is an internal product of Qentelli Solutions. The tech stack for the project was ML/AI with Python, PySpark, deep learning, Airflow, Docker, Kubernetes, Git, Jira, AWS (S3), PostgreSQL, TensorFlow, and ML models such as Whisper and sentiment analysis.
  • Versa Flow has a team size of 6 members; the product analyses data in any format (logs, images, video, audio, Excel, CSV) and converts it with the required formatting and transformations.
  • The formatted/structured data then flows through ML/AI models and transcription to produce suggestions and predictions for the data process.
  • The product supports both online and offline execution: the online process runs through an Airflow data pipeline, while the offline process runs as a Python library.
  • Develop FastAPI APIs, including defining endpoints, methods, request/response formats, and authentication mechanisms.
  • Optimize the API for scalability and performance to handle a large number of concurrent requests, and implement caching mechanisms.
  • Deploy the FastAPI APIs to production environments using continuous integration/continuous deployment (CI/CD) pipelines and code review.
  • Implemented a few FastAPI endpoints to execute the entire product flow, and used nbformat with WebSockets to create a notebook shell and execute it against a kernel (see the notebook-execution sketch below).
  • In another use case, the flow converts video input into a few prompts that are used to generate bots, providing complete single-click automation for data conversion.
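
A small sketch of programmatic notebook creation and execution with nbformat, as mentioned above; here nbclient drives the kernel (the product talked to the kernel over WebSockets), and the cell contents are placeholders:

```python
# Illustrative sketch: build a notebook in code and execute it against a kernel.
import nbformat
from nbclient import NotebookClient

nb = nbformat.v4.new_notebook()
nb.cells.append(nbformat.v4.new_code_cell("result = 2 + 2\nprint(result)"))

client = NotebookClient(nb, timeout=60, kernel_name="python3")
client.execute()  # runs every cell against a live kernel

# Captured outputs live on each executed cell.
for output in nb.cells[0].outputs:
    print(output.get("text", ""))
```
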
  • Brierley: Brierley was a client project for Capillary Tech. The tech stack was Python, Apache NiFi, MinIO, Kafka, Snowflake, PostgreSQL, FastAPI, Kubernetes, .NET, Jira, Git, and Jenkins, with a team size of 10 members.
  • Executed the daily Circle K campaigns, handled data extraction, and maintained customer and membership data for reward allocation; generated reports, analysed the data, and created target groups.
  • Ran campaigns on the created target groups using the NiFi data-flow pipeline.
  • Monitored the generated reports, changed the required views per requirements, and reran the processes; the data had to be encrypted and the generated files placed into an SFTP location through NiFi processors.
  • POC (ResoBot): ResoBot is a POC that uses multiple PDF documents to build a chatbot for a restaurant app.
  • Used Langchain and Streamlit to upload PDFs, convert the content into embeddings stored in a vector database, and return the relevant information for a search keyword (a minimal sketch follows).
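
A minimal sketch of the ResoBot flow, assuming recent langchain-community/langchain-openai packages, FAISS as the vector store, and OpenAI embeddings; names and parameters are illustrative rather than the original implementation:

```python
# Illustrative sketch: upload a PDF, index it, and retrieve relevant chunks for a query.
import tempfile

import streamlit as st
from langchain_community.document_loaders import PyPDFLoader
from langchain_community.vectorstores import FAISS
from langchain_openai import OpenAIEmbeddings
from langchain_text_splitters import RecursiveCharacterTextSplitter

st.title("ResoBot (sketch)")
uploaded = st.file_uploader("Upload a PDF", type="pdf")
query = st.text_input("Ask a question about the document")

if uploaded and query:
    # Persist the upload so PyPDFLoader can read it from disk.
    with tempfile.NamedTemporaryFile(suffix=".pdf", delete=False) as tmp:
        tmp.write(uploaded.read())
        pdf_path = tmp.name

    pages = PyPDFLoader(pdf_path).load()
    chunks = RecursiveCharacterTextSplitter(
        chunk_size=1000, chunk_overlap=100
    ).split_documents(pages)

    store = FAISS.from_documents(chunks, OpenAIEmbeddings())
    for doc in store.similarity_search(query, k=3):
        st.write(doc.page_content)
```

Run with `streamlit run app.py` (assuming the file is named app.py) and an OPENAI_API_KEY in the environment.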

Lead Consultant

SakSoft LTD
Bengaluru
07.2020 - 12.2022
  • Erlang Voice 2022 RPA Automation: Automated the NMTS tool for port-in and port-out requests.
  • Processed port-in and port-out requests using Oracle CX procedures, with data processing in pandas and automation via Selenium WebDriver (a simplified sketch follows this list).
  • HRDMS: Centralized the HR documentation process, handled data migration, and fixed document-portal bugs using Go, HTML, CSS, JS, MongoDB, MSSQL, Google APIs, OpenText Content Server, and Python.
  • The HRDMS system was implemented for 4 countries, with data migration for 3 countries, reducing DMS risk factors by 25%.
  • OLO Finance Transmission: Automated invoice processes for 500+ vendors using RPA (Automation Anywhere), Python, Excel, PDF, Camelot, and MSSQL. In total, 50+ vendors' invoice processes were automated, reducing resource workload by 15%.
  • MNI: Automated web-update activities and ticket status updates in the database and Siebel UI using Python, XML, and Oracle CX.
  • CPQ: Automated a report-downloader scheduler using Java, FTP, Outlook, and Oracle CX.
  • CPQ: Automated reporting and routine process tasks using Python, DataFrames, HTML, and Oracle CX.
  • Chatbot: Delivered a chatbot POC using Python, Keras, TensorFlow, Angular, HTML, and CSS, reducing human intervention in report generation.
  • UK Questionnaire: Automated the feedback questionnaire via email using Python, SMTP, Oracle CX, DataFrames, Thymeleaf templates, and Jenkins, reducing human intervention by 20%.
  • Change Management: Fully automated routine change-management processes for different departments, including schedules, emails, SharePoint calendar plotting, reschedule updates, and deletion of cancelled schedules in SharePoint, while avoiding duplicate entries and keeping audit logs, using Python, Oracle CX, HTML, Thymeleaf templates, REST APIs, and pandas.
  • Delivered mail automation and report generation in Phase 2 and a chatbot in Phase 3 using TensorFlow and Keras.
  • MNI: Built a dashboard for the MNI tool using Angular, JS, and HTML.
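
A simplified sketch of the Selenium-plus-pandas pattern used for the port-in/port-out automation mentioned above; the URL, element IDs, and CSV layout are hypothetical placeholders, and the real flow also invoked Oracle CX procedures:

```python
# Illustrative sketch: read pending port requests with pandas and submit each
# one through a web form with Selenium WebDriver.
import pandas as pd
from selenium import webdriver
from selenium.webdriver.common.by import By

requests_df = pd.read_csv("port_requests.csv")  # columns: request_id, msisdn, request_type

driver = webdriver.Chrome()
try:
    for row in requests_df.itertuples():
        driver.get("https://nmts.example.com/port-request")  # placeholder URL
        driver.find_element(By.ID, "requestId").send_keys(str(row.request_id))
        driver.find_element(By.ID, "msisdn").send_keys(str(row.msisdn))
        driver.find_element(By.ID, "requestType").send_keys(row.request_type)
        driver.find_element(By.ID, "submit").click()
finally:
    driver.quit()
```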

Python Developer

People Tech Group
Hyderabad
02.2020 - 06.2020
  • Company Overview: People Tech Group was founded in 2012, is headquartered in Redmond, WA, USA, and has offices and a presence across different locations worldwide. People Tech is both a services and a product-based company, with tie-ups with major IT giants such as Microsoft, Oracle, Amazon, and Wolters Kluwer, catering to the latest technologies in the IT arena.
  • Abotme: Understood the existing application and enhanced the functionality of its energy-monitoring dashboard using Python, Django, HTML, CSS, AWS, captcha generation, and SQL.

Python Developer

ESoft Labs (Client: Accenture)
Gurgaon
03.2018 - 01.2019
  • Edit Analytics: For BCCL (Times of India), developed dashboards such as fuel prices, pollution data, traffic updates, and sports data using Python, Scrapy, data analysis, HTML, CSS, data visualization, LDAP, SQLite3, JS, NGINX, and Gunicorn. Worked as an individual contributor.
  • The Edit Analytics portal cut the workload of developing TOI articles from report analysis of the above data by 60%.

Software Engineer

Pranam India PVT LTD
Hyderabad
08.2014 - 03.2018
  • Worked on various projects (eVidyaloka, Plabro, Mee Purohit, OTT, data integration) and as a support engineer on various web-scraping projects, with web development using Django, Flask, HTML, CSS, S3, MySQL, and SQLite3.
  • Mostly worked on web scraping: crawled data from various sources, stored it in databases, handled troubleshooting, automated routine processes, and monitored processes in NGINX. Acted as a team player.

Education

Bachelor of Technology - Electronics & Communication Engineering

Newtons Institute of Engineering
Macherla
01.2014

Skills

  • LANGUAGES: Python, Go, PySpark, HTML, CSS, JavaScript
  • PYTHON LIBRARIES & FRAMEWORKS:
    Core & Web: Django, Flask, FastAPI, Scrapy, Requests, Arrow, Unittest, Datetime, JSON, YAML, Logging, Files, Multi-processing, SMTP Lib
    Data Science & ML/AI: Pandas, Numpy, Matplotlib, Seaborn, Scikit-learn (Sklearn), SciPy, Keras, TensorFlow, PyTorch (Beginner)
    GenAI & NLP: Langchain, Pydantic AI, OpenAI (GPT models), Lavague (for LLM-powered UI Automation), Chatterbot, Hugging Face Transformers (Beginner), NLTK, spaCy (Beginner), LlamaIndex (Beginner), Whisper, Sentiment Analysis
    Automation & Web Interaction: Selenium, Robot Framework, Web driver, BS4, Camelot
    Database & ORM: SQLAlchemy, SQLite, MySQL, Mongo
    Cloud & DevOps Related: Boto3 (for AWS), Poetry (Dependency Management)
    Other: Geopandas, pypdf2, Gradio, Pickle
  • DATABASES: MySQL, MongoDB, PostgreSQL, SQLite
  • AI/ML & GENAI CONCEPTS: Machine Learning (Supervised & Unsupervised), Deep Learning (Neural Networks), Natural Language Processing (NLP), Large Language Models (LLMs), Prompt Engineering, GenAI Agent Design & Implementation (e.g., using Langchain Agents), Vector Databases (conceptual: FAISS, Chroma), Fine-tuning LLMs, RAG (Retrieval Augmented Generation), AI-driven Process Automation
  • DEVOPS & CLOUD:
    CI/CD & Version Control: Git, GitLab CI/CD, Jenkins
    Containerization & Orchestration: Docker, PKS (Pivotal Container Service), Kubernetes (K8s - fundamental understanding)
    AWS: S3, EC2, ELB, CloudWatch, Auto Scaling, Lambda, Athena, VPC
    Messaging Queues: RabbitMQ (RMQ)
  • ADDITIONAL: REST API Design & Development, Robotic Process Automation (AA - conceptual understanding and integration), SVN, Kibana, Alteryx, YTD, Minio, Airflow, Apache NiFi, DeltaLake, Microservices (conceptual understanding)

Certification

  • Programming, Data Structures and Algorithms using Python, Kelly Technologies
  • Alteryx Core level certification

Accomplishments

  • Received 5 Work Performance Spot Awards from SakSoft.
  • Received appreciation from the local DSP (Gurazala) for CCTV data recovery.
  • Successfully completed street-light automation for three villages using LDR technology in 2014.

Languages

Telugu
First Language
English
Advanced (C1)
Hindi
Intermediate (B1)

Timeline

Technical Lead

Dell (via ObjectWin)
06.2024 - Current

Technical Lead (Delivery)

Qentelli Solutions PVT LTD
08.2023 - 03.2024

Lead Consultant

SakSoft LTD
07.2020 - 12.2022

Python Developer

People Tech Group
02.2020 - 06.2020

Python Developer

ESoft Labs (Client: Accenture)
03.2018 - 01.2019

Software Engineer

Pranam India PVT LTD
08.2014 - 03.2018

Bachelor of Technology - Electronics & Communication Engineering

Newtons Institute of Engineering