ANKITH SHETTY

Mangalore

Summary

Dynamic professional with nearly 2 years of hands-on experience in database management, software development, and cloud-based solutions. Adept at designing, developing, and executing complex SQL queries and integrating data using SQL Server Integration Services (SSIS). Proficient in reporting with SQL Server Reporting Services (SSRS) and familiar with Software Development Life Cycle (SDLC) models. Trained extensively on Azure Data Factory, and skilled in SQL Server Management Studio (SSMS) and ETL tools such as SSIS and Informatica Intelligent Cloud Services (IICS). Experienced in using Apache Airflow to schedule and manage workflows, with a solid understanding of integrating Airflow DAGs with Snowflake, Azure Blob Storage, and Amazon S3 for efficient data storage and manipulation. Competent in using Snowflake for cloud-based data solutions and well-versed in Jira for project management. Known for excellent written and oral communication skills, with the ability to convey technical concepts in business contexts.

Overview

2
years of professional experience

Work History

Employee

Tietoevry India Pvt Ltd
Bangalore
2022.09 - Current

Duration: 1 year 10 months (September 5th, 2022 – Present)

Professional Training:

  • Received 45 days of intensive training on database management from DLITE, an external organization, organized by the company.

Technical Expertise:

  • SQL Proficiency: Extensive experience in the design, development, and execution of complex queries.
  • Data Integration: Strong knowledge of integrating data using SQL Server Integration Services (SSIS).
  • Data Reporting: Proficient in reporting data using SQL Server Reporting Services (SSRS).
  • SDLC Knowledge: In-depth understanding of Software Development Life Cycle (SDLC) models.

Tools and Technologies:

  • Database Management: Skilled in using SQL Server Management Studio (SSMS).
  • ETL and Reporting Tools: Experienced with ETL tools such as SSIS and SSRS, and Informatica Intelligent Cloud Services (IICS).
  • Cloud and Data Storage: Trained on and worked with Azure Data Factory; hands-on experience with Snowflake for data storage and manipulation.
  • Workflow Automation: Practical experience with Apache Airflow, managing scheduled and unscheduled Directed Acyclic Graphs (DAGs) using Python.
  • Data Integration: Expertise in integrating Airflow DAGs with Snowflake, Azure Blob Storage, and Amazon S3 buckets.

Project Management:

  • Agile Tools: Solid understanding and practical experience with Jira software for project tracking and management.

Communication Skills:

  • Business and Technical Communication: Excellent written and oral communication skills, with the ability to convey information effectively in both business and technical contexts.

This comprehensive skill set underscores my ability to handle complex data projects, from database management and integration to reporting and workflow automation, ensuring efficient and effective data solutions.


Training on Database

DLITE
  • I have developed a deep understanding and proficiency in SQL queries, allowing me to effectively manipulate and manage complex datasets. My experience includes successfully completing various tasks involving SSIS packages, demonstrating my ability to design, develop, and deploy robust data integration solutions.
  • Additionally, I have created comprehensive SSRS reports based on diverse datasets, showcasing my capability in generating insightful and actionable business intelligence reports.
  • I am highly skilled in using MS Word, MS Excel, and MS PowerPoint, which has enabled me to create professional documents, perform advanced data analysis, and deliver impactful presentations. My familiarity with these tools extends to advanced functionalities, ensuring efficiency and precision in my work.
  • Furthermore, I have significant experience working in Agile environments, which has honed my ability to collaborate effectively with cross-functional teams, adapt to changing requirements, and deliver high-quality solutions within tight deadlines. This experience has also instilled in me a strong understanding of Agile methodologies, including sprint planning, daily stand-ups, and iterative development, which I apply to ensure continuous improvement and project success.

Projects and POCs

Online Timesheet Application

Technology and Tools:

  • Technology: SQL
  • Tools: Visual Studio Code, Visual Studio 2022, SQL Server, Jira

Project Overview:

We were tasked with developing an application that enables employees to create and submit timesheets to their managers for payroll processing. Each employee needed to fill out a daily timesheet detailing the number of hours worked, the specific project they were working on, or whether they were on leave. This timesheet was then submitted to their manager for approval.

Responsibilities:

As the only team member with specialized training in database management, I played a crucial role in the project. My responsibilities included:

  • Designing the database schema to efficiently handle timesheet data.
  • Implementing and optimizing SQL queries for both employee and manager functionalities.
  • Testing the database design and queries to ensure data integrity and performance.

Key Features:

  • Employee Perspective: Employees could log their daily work hours, project details, or leave status.
  • Manager Perspective: Managers could review, approve, or reject submitted timesheets based on the provided data.

By leveraging my database expertise, I ensured that the application’s backend was robust, scalable, and met the requirements of both employees and managers.
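The schema and queries behind a timesheet workflow like this can be sketched as follows. This is a minimal, hypothetical version using SQLite for illustration; the real application used SQL Server, and all table, column, and status names here are assumptions rather than the production design.

```python
import sqlite3

# Hypothetical, simplified timesheet schema: an employee logs daily entries,
# and a manager reviews the entries submitted by direct reports.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE employee (
    employee_id INTEGER PRIMARY KEY,
    name        TEXT NOT NULL,
    manager_id  INTEGER REFERENCES employee(employee_id)
);
CREATE TABLE timesheet (
    timesheet_id INTEGER PRIMARY KEY,
    employee_id  INTEGER NOT NULL REFERENCES employee(employee_id),
    work_date    TEXT NOT NULL,
    hours_worked REAL,
    project      TEXT,
    status       TEXT NOT NULL DEFAULT 'Submitted'  -- Submitted / Approved / Rejected
);
""")

# Employee perspective: log a day's work (which implicitly submits it).
conn.execute("INSERT INTO employee VALUES (1, 'Manager', NULL), (2, 'Employee', 1)")
conn.execute(
    "INSERT INTO timesheet (employee_id, work_date, hours_worked, project) "
    "VALUES (2, '2024-06-03', 8.0, 'Timesheet App')"
)

# Manager perspective: list pending timesheets of direct reports, then approve one.
pending = conn.execute("""
    SELECT t.timesheet_id, e.name, t.work_date, t.hours_worked, t.project
    FROM timesheet t
    JOIN employee  e ON e.employee_id = t.employee_id
    WHERE e.manager_id = 1 AND t.status = 'Submitted'
""").fetchall()
conn.execute("UPDATE timesheet SET status = 'Approved' WHERE timesheet_id = ?",
             (pending[0][0],))
```

The self-referencing `manager_id` column lets one table serve both perspectives: the same schema answers "my timesheets" for an employee and "my reports' pending timesheets" for a manager.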

Training POC

Database Bootcamp
  • Developed a strong understanding and proficiency in SQL queries, enabling efficient data manipulation and retrieval.
  • Successfully completed assigned tasks involving the creation and optimization of SSIS packages, demonstrating a robust understanding of ETL processes.
  • Delivered comprehensive SSRS reports based on given datasets, showcasing the ability to transform raw data into meaningful insights.
  • Achieved proficiency in Microsoft Office Suite, including MS Word, MS Excel, and MS PowerPoint, facilitating effective documentation, data analysis, and presentation.
  • Gained substantial experience working in an Agile environment, contributing to iterative development cycles and collaborative team efforts to achieve project goals efficiently.

Cloud

Airflow, Snowflake Bootcamp
  • Understanding of the basic concepts of Airflow, including scheduling and triggering DAGs.
  • Hands-on experience using Python code to generate DAGs of multiple kinds.
  • Used DAGs to load data into Snowflake from both local and cloud storage.
  • Cloud services used include Azure Data Factory, Amazon S3, and SFTP.
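The core idea of an Airflow DAG — tasks that only run after all of their upstream dependencies — can be illustrated with the standard library alone. This toy sketch stands in for the local/cloud-to-Snowflake flow described above; the task names are hypothetical, and a real Airflow DAG would express the same dependencies with operators and `>>`.

```python
from graphlib import TopologicalSorter

# Toy stand-in for an Airflow DAG: each task maps to the set of upstream
# tasks it depends on. Task names are illustrative only.
dag = {
    "extract_local":    set(),
    "extract_s3":       set(),
    "stage_azure_blob": {"extract_local"},
    "load_snowflake":   {"stage_azure_blob", "extract_s3"},
    "validate_counts":  {"load_snowflake"},
}

# static_order() yields a valid execution order for the whole graph,
# which is essentially what a scheduler computes before triggering tasks.
order = list(TopologicalSorter(dag).static_order())

# Invariant: every task runs after all of its upstream tasks.
for task, upstream in dag.items():
    assert all(order.index(u) < order.index(task) for u in upstream)
```

Airflow's scheduler does the same dependency resolution, but per scheduled run and with retries, sensors, and parallel execution of independent branches (here, the two extract tasks could run concurrently).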

HL7 Processing POC

Process and store HL7 messages
  • I spearheaded a proof of concept (POC) focused on parsing, processing, and storing HL7 messages, a critical data interchange standard used in hospitals. This project demanded a thorough understanding of HL7 message structures, as well as the ability to extract and clean diverse data elements such as patient IDs, names, addresses, dates of birth, phone numbers, and medical event details.
  • The complexity of this undertaking was significant, requiring intricate code to handle various HL7 segments like MSH, PID, EVN, and PV1.
  • I implemented robust regex-based parsing techniques to ensure accurate data extraction and cleaning.
  • Additionally, I developed a sophisticated data validation and insertion mechanism to populate multiple database tables, ensuring data integrity and consistency.
  • This project not only highlighted my technical prowess in handling complex data formats but also demonstrated my ability to create scalable, maintainable solutions for critical healthcare data management tasks.
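The parsing approach described above can be sketched briefly. HL7 v2 messages really do use carriage-return-separated segments and `|`-separated fields, but the sample message, field cleaning rules, and output structure here are illustrative assumptions, not the POC's actual code (which also handled MSH's special separator fields, repetitions, and escape sequences).

```python
import re

# Illustrative HL7 v2 sample message (segments: MSH, EVN, PID, PV1).
message = (
    "MSH|^~\\&|HIS|HOSPITAL|LAB|HOSPITAL|202406031200||ADT^A01|MSG0001|P|2.3\r"
    "EVN|A01|202406031200\r"
    "PID|1||12345^^^HOSP^MR||Doe^John||19800115|M|||42 Main St^^Mangalore\r"
    "PV1|1|I|WARD1^101^A\r"
)

def parse_hl7(msg):
    """Split an HL7 message into {segment_id: [fields, ...]}."""
    segments = {}
    for line in filter(None, msg.split("\r")):      # segments end with CR
        fields = line.split("|")                    # fields split on '|'
        segments[fields[0]] = fields
    return segments

segments = parse_hl7(message)
pid = segments["PID"]

# Extract and clean a few PID fields; components are separated by '^'.
patient_id = pid[3].split("^")[0]                   # PID-3: patient identifier
family, given = pid[5].split("^")[:2]               # PID-5: patient name
# Regex-based cleaning: HL7 dates are YYYYMMDD; reformat as ISO 8601.
dob = re.sub(r"^(\d{4})(\d{2})(\d{2})$", r"\1-\2-\3", pid[7])

record = {"patient_id": patient_id, "name": f"{given} {family}", "dob": dob}
```

A record shaped like this maps naturally onto the multiple database tables mentioned above (patients, events, visits), with validation applied before each insert.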

Education

Bachelor of Engineering (B.E.) in Electronics and Communication Engineering

Mangalore Institute of Technology and Management
01.2022

Skills

  • SQL
  • SSIS
  • SSRS
  • IICS
  • C programming
  • Python
  • HTML
  • CSS
  • Airflow
  • Snowflake
  • Azure Data Factory

Training

Training and Experience

DLITE Database Training
Received extensive training from DLITE on database management, focusing on SQL queries. This training included:

  • Completing assigned SSIS package tasks.
  • Generating SSRS reports based on given datasets.
  • Working with several IICS transformations.
  • Gaining proficiency in MS Word, MS Excel, and MS PowerPoint.
  • Acquiring experience in working within an Agile environment.

Online Timesheet Application (Duration: 2 months)
Technologies and Tools: SQL, Visual Studio Code, Visual Studio 2022, SQL Server, Jira
Developed an application for employees to create and submit timesheets to their managers for payroll processing. Key responsibilities and features included:

  • Employee Functionality: Employees could log daily work hours, project details, or leave status, and submit these timesheets to their managers.
  • Manager Functionality: Managers had the ability to review, approve, or reject the submitted timesheets based on the provided data.
  • As the sole database-trained team member, I was responsible for designing, implementing, and testing the database schema and queries for both employee and manager perspectives.

Database Bootcamp
Acquired a solid understanding of database concepts including normalization, database modeling, and other related topics. Gained hands-on experience with SQL queries using SSMS.

Airflow Bootcamp
Learned the fundamentals of Airflow, including scheduling and triggering DAGs. Developed hands-on experience with generating various types of DAGs using Python code.

Snowflake Bootcamp
Gained in-depth knowledge of Snowflake's architecture, data storage, and processing capabilities. Acquired hands-on experience with Snowflake's SQL dialect and data loading techniques.

Azure Certifications
Completed the Microsoft Azure AZ-900 and DP-203 certifications, gaining a comprehensive understanding of Azure services and data engineering on Azure.

Timeline

Employee

Tietoevry India Pvt Ltd
2022.09 - Current

Training on Database

DLITE

Projects and POCs

Online Timesheet Application

Training POC

Database Bootcamp

Cloud

Airflow, Snowflake Bootcamp

HL7 Processing POC

Process and store HL7 messages

Bachelor of Engineering (B.E.) in Electronics and Communication Engineering

Mangalore Institute of Technology and Management
ANKITH SHETTY