Divya Raghu

Bangalore

Summary

Practical Database Engineer with in-depth knowledge of data manipulation techniques and computer programming, paired with expertise in integrating and implementing new software packages and products into existing systems. Offering a 4.9-year background managing various aspects of the development, design, and delivery of database solutions. Tech-savvy and independent professional with outstanding communication and organizational abilities and a proven track record of working in fast-paced environments. Highly motivated, with a desire to take on new challenges, a strong work ethic, adaptability, and exceptional interpersonal skills. Adept at working effectively unsupervised and quickly mastering new skills.

Overview

4.9 years of professional experience

Work History

Senior Data Engineer/Analyst

Dell International Services
Bangalore
09.2021 - Current
  • Collaborated with other teams to understand their requirements and deliver solutions accordingly.
  • Developed data pipelines to ingest and process large datasets.
  • Integrated multiple heterogeneous sources into one unified source via ETL processes.
  • Expertise in Informatica PowerCenter, Power BI, SQL, and Teradata; design and develop ETL jobs.
  • Working as a developer on regular backlog business requirements based on their priorities.
  • Working as a solution architect to provide data lake solutions to the business and explain the underlying architecture.
  • Handling production issues and maintaining data availability, which is essential for performance and business continuity, keeping the team as a benchmark for data availability.
  • Involved in building an ETL CI/CD pipeline.
  • Basic knowledge of Python programming.
  • Interacting with clients to understand their business requirements and change requests.
  • Experience building CI/CD pipelines in GitLab.
  • Good knowledge of AWS, Python, DevOps (GitLab), machine learning, and AI models.
  • Optimized existing queries to improve performance and reduce server load.
  • Managed security protocols and access control policies within the organization's databases and systems.
  • Participated in regular meetings with management to assess and address issues and identify and implement improvements.
  • Managed performance monitoring and tuning while identifying and repairing issues within database realm.
  • Adept at troubleshooting and identifying current issues and providing effective solutions.

Data Engineer/Analyst

Apexon Consulting
Bangalore
09.2019 - 06.2021
  • Developed mappings with different transformations such as Expression, Aggregator, Sorter, Lookup, and Normalizer.
  • Developed sessions using Informatica PowerConnect with sources such as flat files and Oracle, loading into targets such as flat files and Oracle.
  • Created other tasks such as command tasks and email tasks.
  • Created worklets required for the workflow.
  • Involved in developing test cases for smoke, regression, and end-to-end scenarios.
  • Created unit test case documents.
  • Verified column mapping between source and target.
  • Involved in the execution of smoke, regression, and end-to-end scenarios for each release.
  • Involved in defect tracking, management, and reporting activities.
  • Involved in sprint retrospective meetings to prepare sprint retrospective documents.
  • Monitoring daily/weekly/monthly jobs.
  • Involved in on-demand ETL runs.
  • Analyzing ETL job failures and fixing them as per cardinal SLA norms.
  • Updating daily job status to the onsite team.
  • Importing data from RDBMS into Hive/HDFS and exporting data from Hive/HDFS to RDBMS using Sqoop.
  • Developed and worked with Sqoop incremental loads to populate Hive external tables.
  • Created Hive external tables to store results on top of HDFS.
  • Created zero-copy clones in Snowflake.
  • Worked on task creation to schedule and automate Snowflake jobs.
  • Involved in loading data from the Linux file system into HDFS.
  • Writing HDFS CLI commands.
  • Creating Hive tables with partitions and buckets.
  • Developed Spark SQL queries to process Hive tables using HiveContext.
  • Imported data from different sources such as HDFS and HBase into Spark RDDs.
  • Experienced in implementing Spark RDD transformations and actions for business analysis.

Education

BCA - Bachelor of Computer Applications

Jyoti Nivas College Autonomous University
Bengaluru
07-2019

Skills

Languages: SQL, Python

ETL/DWH: Informatica PowerCenter

Databases: Teradata, Oracle 11g/12c, IBM DB2

IDEs: Eclipse, Spyder, Jupyter Notebook

Repositories: Git, SVN, Bitbucket

Data Modelling: Star Schema, Snowflake Schema, ER Model, Dimensional Model

Reporting Tools: Power BI, Tableau, knowledge of Grafana

Project Management Frameworks: Agile, Waterfall

Tools: TOAD, SQL Developer

OS/Environment: Windows, Linux, UNIX

Affiliations

  • Effective personality.
  • Flexible team player.
  • Quick learner, keen to learn new technologies.
  • Effective communicator and good listener.

Accomplishments

  • Led the successful launch of the B2B and CSG surveys for Voice of Customer, showing dedication and leadership in this role throughout the year. My continuous efforts during the weekly refreshes for B2B and quarterly samples for CSG have driven customer success.
  • Awarded for going above and beyond to support the business in stabilizing the B2B survey code over the last few sprints and for working with team members to improve overall execution through automation.
  • Implemented the solution for DSR automation, which handles privacy requests for NPS data and closes them without any manual intervention. With these efforts, we are compliant with Dell Privacy for all the VOC survey programs.
  • DSR automation, the Quebec rollout, and many more improvements to the VOC product are a direct result of my commitment and contribution. My willingness to give my best and to ensure risks are managed on time enabled us to deliver much-needed outcomes for the business within hard timelines. With my direct contribution, we have a solid plan for FY24 to maximize automation and streamline handshakes between processes so the VOC product becomes more seamless.
  • The Voice of Customer (VOC) product roadmap transition from FY23-Q4 to FY24-Q1 has been excellent in terms of preparation, planning, business engagement, and outcome. Successfully delivering 9 CRs in FY24-Q1 and executing two projects as per plan speaks to the discipline, detailed work, and commitment.
  • Diligent in delivering assigned tasks; kick-started Project #5 by successfully deploying the objects to production as committed to the business, without any shift in the timelines.
  • Involved in multiple triage activities, including the SDC 5.5.0 upgrade, migration of scripts from SDC to edge nodes, migration of SVC accounts to AD, Control-M support as needed for the team, and ensuring timely delivery of Project #5.

Languages

English: First Language
Telugu: Proficient (C2)
Kannada: Proficient (C2)
Hindi: Intermediate (B1)
Tamil: Upper Intermediate (B2)

Timeline

Senior Data Engineer/Analyst

Dell International Services
09.2021 - Current

Data Engineer/Analyst

Apexon Consulting
09.2019 - 06.2021

BCA - Bachelor of Computer Applications

Jyoti Nivas College Autonomous University