Senior Software Engineer

Raghavendra Darisha

Bangalore

Summary

Senior software engineer with extensive experience in technical and database management. Proven ability to develop innovative solutions that enhance productivity and drive organizational growth. Strong analytical and critical thinking skills, adaptable to various challenges in the IT sector.

Overview

5 years of professional experience
1 Certification

Work History

Senior Software Engineer

Nitor InfoTech Ltd
Bangalore
08.2023 - Current

Software Engineer

IBM India Pvt Ltd
Bangalore
12.2021 - 06.2023

Data Analyst

Capgemini India Pvt Ltd
Bangalore
08.2021 - 11.2021

Data Analyst Intern

Kandra Digital
Bangalore
06.2020 - 07.2021

Education

B.Tech - Mechanical

JNTUA University
Anantapur
01.2017

High School Diploma

Govt. Polytechnic
Anantapur
04.2013

Skills

  • Cloud technologies: AWS and Azure
  • Database management: Snowflake, MySQL, Oracle, SQL Server, and BigQuery
  • Programming languages: SQL, NoSQL (Cypher), Python, and PySpark
  • ETL tools: Matillion and AWS Glue
  • BI tools: Sisense and Power BI
  • DevOps tools: Jira and GitHub
  • Generative AI principles (RAG)
  • Operating systems: Windows and Unix/Linux

Professional Experience Highlights

  • 4.2 years of progressive experience in software development, including 2.5 years as a dedicated Snowflake and SQL developer
  • Proficient in data warehousing concepts, complemented by hands-on experience with Matillion ETL
  • SnowPro Core certified data engineer, consistently delivering high-quality solutions within agile environments
  • Proven track record in successful Teradata to Snowflake migrations, and SAP BODS workflow migration to Matillion
  • Expert in creating and optimizing complex SQL queries (insert, update, delete) for MySQL, Oracle, and Snowflake databases
  • Extensive experience in setting up and managing Snowflake data warehouse applications, and Snowpipe for continuous data ingestion
  • Adept at leveraging advanced Snowflake features, including cloning for backups and time travel for data recovery
  • Skilled in developing data warehouses on Snowflake Cloud Database and integrating diverse data formats, such as CSV and JSON
  • Strong command over the Snowflake Admin module and utilizing SnowSQL for efficient bulk data loading
  • Excellent understanding and practical application of joins, subqueries, common table expressions (CTEs), views, materialized views, and sequences
  • Proficient in using COPY/INSERT, PUT, and GET commands for seamless data loading and unloading between Snowflake tables and internal/external stages (see the sketch after this list)
  • In-depth knowledge of Snowflake data sharing capabilities
  • A collaborative and effective team player, capable of performing under stringent time constraints
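
A minimal Snowflake SQL sketch of the staging, cloning, and time-travel operations named above; every object name here (my_stage, raw_events, the file paths) is a hypothetical placeholder, not project code:

    -- Upload a local file to a named internal stage (all names are placeholders;
    -- PUT compresses to .gz by default)
    PUT file:///tmp/events.csv @my_stage;

    -- Bulk load the staged file into a table with the COPY command
    COPY INTO raw_events
      FROM @my_stage/events.csv.gz
      FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1);

    -- Zero-copy clone as a backup before risky changes
    CREATE TABLE raw_events_backup CLONE raw_events;

    -- Time travel: query the table as it was one hour ago
    SELECT * FROM raw_events AT (OFFSET => -3600);

    -- Unload query results back to the stage, then download them
    COPY INTO @my_stage/export/ FROM (SELECT * FROM raw_events);
    GET @my_stage/export/ file:///tmp/export/;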

Certification

  • SnowPro Core Certified Data Engineer
  • Generative AI Level 101 Certification

Languages

  • English: Proficient (C2)
  • Telugu: Proficient (C2)
  • Hindi: Upper Intermediate (B2)
  • Tamil: Beginner (A1)
  • Kannada: Upper Intermediate (B2)

Projects Handled

Enterprise Data Platform for Juniper Networks
Client : Juniper Networks, US
Role : Software Engineer
Technology : Snowflake UI, Matillion, HVR, SAP CRM & ECC, Jira.
Description:
Juniper Networks, Inc. is an American multinational corporation headquartered in Sunnyvale, California. The company develops and markets networking products, including routers, switches, network management software, network security products, and software-defined networking technology.
An Enterprise Data Platform was created in Snowflake, where all source data is consolidated, processed per business requirements, and provided to downstream consumers.
The project migrates data from the SAP BW system into Snowflake, using Matillion as the ELT tool to process data from source systems such as SFDC, Oracle, APIs, and MS SQL Server. Data from SAP CRM & ECC is replicated into Snowflake using the HVR tool, and pipelines were built to process the data into the target layer.
Roles and Responsibilities:
  • Responsible for all activities related to the development, implementation, and support of ELT processes.
  • Enhanced existing pipelines by optimizing the code as per requirements.
  • Supported users with data issues in dashboards, backtracing each issue and fixing the code.
  • Optimized existing code and pipelines, validating the code against test scenarios.
  • Worked on Matillion orchestration and transformation jobs.
  • Extracted data from APIs using Matillion and transformed it as per client requirements (see the sketch after this list).
  • Identified data quality issues using the Right Data tool and worked on the root cause and the fix.
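
As a hedged illustration of the target-layer processing (not the project's actual code; every table and column name here is hypothetical), raw JSON landed by the ELT tool can be flattened in Snowflake like this:

    -- Flatten raw API JSON (a VARIANT column) into a typed target table;
    -- all object and field names here are hypothetical
    CREATE OR REPLACE TABLE target.orders AS
    SELECT
        v.value:order_id::NUMBER      AS order_id,
        v.value:customer::STRING      AS customer,
        v.value:amount::NUMBER(12, 2) AS amount,
        r.loaded_at
    FROM raw.api_orders r,
         LATERAL FLATTEN(INPUT => r.payload:orders) v;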

Migration of Data from Oracle to Snowflake & SAP BODS Workflows into Matillion
Client : ENEXIS GROEP, Netherlands
Role : Software Engineer
Technology : Snowflake UI, Oracle Server, Matillion, SAP BODS, Jira.
Description:
Enexis primarily works on a reliable and sustainable energy supply for today and for the future. The companies within Enexis Groep are committed, quick, and agile, and are thus able to meet the requirements of society and the challenges in the energy world. By employing their knowledge, expertise, and energy in all possible ways, they are accelerating the energy transition.
Over 120,000 new renewable energy connections were added in 2020, an unprecedented increase for one year. Enexis is doing everything it can to connect renewable energy initiatives as quickly as possible, but the electricity grid cannot keep up with the growth everywhere. The production capacity of renewable energy generators increased in the past year by nearly 60%.
Roles and Responsibilities:
  • Responsible for all activities related to the development, implementation, administration, and support of Matillion ETL processes.
  • Imported the jobs designed by the source team and created orchestration and transformation jobs equivalent to the SAP BODS workflows.
  • Validated each workflow through various checks, such as verifying the exact workflow name, target table, and timestamp values.
  • Ran the orchestration and transformation jobs via a master job by simply providing the exact target table name.
  • Performed test scenarios such as row count checks, content checks, and delta checks (see the validation sketch after this list).
  • Compared the results in Snowflake with the Oracle copy.
  • Analyzed data mismatches and made the necessary changes to the transformation logic in Matillion; reported any data issue in the source database to the client.
  • Logged all comments for a particular workflow in Jira.
  • Delivered the jobs to the client once everything matched.
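
A minimal sketch of the kind of row-count, content, and delta checks described above; the schema, table, and column names are hypothetical, and the Oracle copy is assumed to be replicated into a Snowflake schema for comparison:

    -- All schema/table/column names below are hypothetical placeholders

    -- Row count check: migrated table vs. the replicated source copy
    SELECT
        (SELECT COUNT(*) FROM target.customer)      AS snowflake_rows,
        (SELECT COUNT(*) FROM source_copy.customer) AS oracle_copy_rows;

    -- Content check: rows present on one side but not the other
    SELECT * FROM target.customer
    MINUS
    SELECT * FROM source_copy.customer;

    -- Delta check: records loaded since the previous run
    SELECT COUNT(*) AS new_rows
    FROM target.customer
    WHERE load_ts > (SELECT MAX(run_ts) FROM audit.last_run);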

Migration of Data from Teradata into Snowflake
Client : AUSGRID
Role : Associate Software Engineer
Technology : Snowflake UI, AWS S3.
Description:
Ausgrid is an engineering services company located in Australia that needs expertise in building up technologies to support its growth and get off the ground. The company derives its strength from experience, strategic operational excellence, and a focus on innovation. By making quick and informed decisions in a complex, fast-paced, competitive business environment, it provides innovative solutions that meet critical business objectives and foster creativity and continuous improvement.
Roles and Responsibilities:
  • Involved in migrating objects from Teradata to Snowflake.
  • Used the COPY command to bulk-load data from AWS S3.
  • Performed data validation checks by taking sample records and preparing a report with the obtained results.
  • Created Snowpipe for continuous data loading.
  • Created internal and external stages and transformed data during load.
  • Used temporary and transient tables on different datasets.
  • Cloned production data for code modifications and testing.
  • Shared sample data with the customer for UAT via grant access (see the sketch after this list).
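
A hedged sketch of the S3 load pattern described above; the stage, bucket, storage integration, table, and role names are all hypothetical placeholders:

    -- External stage over the S3 landing bucket (names and integration are placeholders)
    CREATE OR REPLACE STAGE landing_stage
      URL = 's3://example-bucket/landing/'
      STORAGE_INTEGRATION = s3_int
      FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1);

    -- One-off bulk load with the COPY command
    COPY INTO analytics.trips FROM @landing_stage;

    -- Snowpipe for continuous ingestion of newly arriving files
    CREATE OR REPLACE PIPE analytics.trips_pipe
      AUTO_INGEST = TRUE
    AS
      COPY INTO analytics.trips FROM @landing_stage;

    -- Transient table for an intermediate dataset (no fail-safe storage cost)
    CREATE TRANSIENT TABLE analytics.trips_stg LIKE analytics.trips;

    -- Grant read access on sample data to a UAT role
    GRANT SELECT ON TABLE analytics.trips TO ROLE uat_reader;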

ALDI 1.0
Client : ALDI
Role : Data Analyst
Technology : MySQL
Description:
The Aldi Group is the largest German supermarket corporation, currently holding a market share of 26%. Founded in 1898, it consists today of several cooperatives of independent supermarkets all operating under the umbrella organization Aldi AG & Co KG, with headquarters in Hamburg. There are approximately 4,100 stores with the Aldi nameplate that range from small corner stores to hypermarkets.
Roles and Responsibilities:
  • Responsible for defining the scope of the project, gathering business requirements, and preparing analysis and documentation.
  • Prepared reports as per requirements.
  • Worked on creating tables, indexes, and constraints.
  • Developed various queries by joining multiple tables.
  • Wrote SQL queries using joins and subqueries to retrieve data from the database (see the sketch after this list).
  • Wrote dynamic SQL queries to create database objects.
  • Involved in implementing data integrity validation checks through constraints.
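
A minimal MySQL sketch of the table, constraint, index, and join patterns listed above; the schema and column names are hypothetical:

    -- Hypothetical tables with integrity constraints
    CREATE TABLE stores (
        store_id INT PRIMARY KEY,
        city     VARCHAR(64) NOT NULL
    );

    CREATE TABLE sales (
        sale_id  INT PRIMARY KEY,
        store_id INT NOT NULL,
        amount   DECIMAL(10, 2) CHECK (amount >= 0),
        sold_on  DATE,
        CONSTRAINT fk_sales_store
            FOREIGN KEY (store_id) REFERENCES stores (store_id)
    );

    CREATE INDEX idx_sales_sold_on ON sales (sold_on);

    -- Join plus subquery: cities whose average sale exceeds the overall average
    SELECT st.city, AVG(sa.amount) AS avg_sale
    FROM stores st
    JOIN sales sa ON sa.store_id = st.store_id
    GROUP BY st.city
    HAVING AVG(sa.amount) > (SELECT AVG(amount) FROM sales);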
