Chandra Teja Reddy G

Anantapur

Summary

Experienced Data Architect and Data Modeller with a proven track record at Dun & Bradstreet and Infosys. I excel in designing and implementing scalable data solutions using SQL, Python, XML, JSON, AWS, and GCP. My expertise includes developing robust DTS/BDL processes for data consistency, alongside efficient data management in PostgreSQL and MySQL. I specialize in data modeling for strategic decision-making, streamlining complex data structures and driving significant organizational efficiencies. A collaborative leader, I have also designed Disaster Recovery processes, led NFR testing for performance optimization, and ensured data quality during UAT.

Overview

6 years of professional experience
1 certification

Work History

Dun & Bradstreet

Data Architect L1
12.2022 - Current

Job overview

  • Led the development of comprehensive High-Level Design (HLD) and Low-Level Design (LLD) for various applications, with a strong focus on scalability and performance optimization to align with evolving business requirements. Collaborated closely with cross-functional teams to ensure seamless integration of design components, adhering to industry best practices.
  • Specialized in advanced Data Modeling methodologies, leveraging tools such as Erwin Data Modeler for streaming applications. Designed models that enhanced efficient data ingestion, mastering processes, and supported timely downstream product publishing. Applied agile data modeling techniques to accommodate real-time data streams, significantly improving data processing efficiency and the accuracy of insights.
  • Developed and deployed agile JSON and XML schemas using Altova XMLSpy, ensuring flexible schema design to support evolving data requirements. Utilized BigQuery to capture and process streaming data in real-time, enabling efficient data analysis and storage. For mastered data, implemented PostgreSQL to ensure consistent and reliable long-term storage, optimizing database performance and ensuring data integrity.
  • Utilized extensive SQL expertise to analyze and transform source data, driving precise design decisions for target applications. Applied a combination of statistical tools and data profiling techniques to uncover actionable insights, ensuring alignment with business goals and improving the accuracy of data flows.
  • Designed robust ETL processes and Data Transformation Schemas to facilitate the smooth transfer of data between multiple sources and targets. Ensured consistency and integrity throughout the data mapping process, minimizing potential data loss and ensuring the reliability of the transferred data. Applied a strong understanding of Data Warehouse principles to ensure robust storage, accessibility, and processing of data across systems.
  • Contributed significantly to improving Data Quality by implementing profiling techniques and tools to identify and address data issues, enhancing overall data governance.
  • Automation & Efficiency Tools: Designed and built utility tools that significantly reduced manual effort for frequent, repetitive tasks. These tools streamlined critical operations such as application recovery and precise data corrections in production environments, drastically reducing errors and improving operational efficiency.
  • Disaster Recovery (DR) Leadership: Spearheaded the design and implementation of comprehensive system data Disaster Recovery processes, developing robust strategies and protocols to ensure rapid system restoration and business continuity in critical outage scenarios. This included defining RTO/RPO objectives and establishing failover mechanisms.
  • Non-Functional Requirements (NFR) Excellence: Championed Non-Functional Requirements (NFR) testing for developed systems. Meticulously designed and executed performance, scalability, and reliability tests, identifying and resolving critical performance bottlenecks. This proactive approach significantly improved system efficiency, stability, and user experience.
  • User Acceptance Testing (UAT) & Data Integrity: Played a pivotal role in User Acceptance Testing (UAT) phases, working closely with business stakeholders. Focused intensely on validating data correctness, ensuring data quality, and confirming that the delivered solutions met precise business requirements and maintained the highest standards of data integrity.

Infosys

Data Engineer
06.2021 - 12.2022

Job overview

  • Extracted data from various sources and organized it into structured, accessible formats to streamline data processing and analysis.
  • Focused on data modeling using tools like Erwin Data Modeler, ensuring that data is efficiently structured for streaming applications and storage solutions.
  • Developed JSON schemas to facilitate smooth data integration and ensure flexibility as data requirements evolved, enabling better data management and seamless integration between applications.
  • Wrote optimized SQL queries to process data efficiently, ensuring consistency and accuracy across datasets.
  • Participated in the development of Data Transformation processes, enabling smooth transitions between different data formats and systems.
  • Monitored and ensured data quality, applying data validation techniques to maintain the integrity and accuracy of the processed information.

National Institute of Electronics & Information Technology (NIELIT)

Big Data Analytics Internship
05.2021 - 07.2021

Job overview

  • Trained in Big Data Analytics using Python and Hadoop.

MSM Technologies

Software Engineer (Data Modelling)
06.2020 - 06.2021

Job overview

  • Designed and maintained data models using Erwin Data Modeler and ER Studio.
  • Wrote and optimized SQL queries for efficient data management and analysis.
  • Utilized Excel for data organization, analysis, and reporting.
  • Presented data insights clearly to stakeholders using Excel reports.
  • Contributed to data-driven projects by applying data modeling and SQL skills effectively.

Mahaugha Solutions Limited

Data Analyst Intern
05.2019 - 07.2019

Job overview

  • Established the data and reporting infrastructure from the ground up, utilizing Tableau and SQL to deliver real-time insights into product performance, marketing funnels, and business KPIs.
  • Devised and executed A/B experiments for products, resulting in a 19 basis point increase in conversion rates and a 12 basis point reduction in churn.
  • Successfully implemented a long-term pricing experiment that enhanced customer value by 25%.
  • Constructed operational reporting using Tableau to identify areas for improving contractors' annual revenue.

Education

Sri Venkateswara University

Bachelor of Technology in Computer Science and Engineering
09.2020

Sri Chaitanya Junior Kalasala

Intermediate
04.2016

Radha School of Learning

10th Standard
04.2014

Skills

  • Data Architecture
  • Data Modelling
  • Data Transformations
  • Data Profiling
  • Systems Design
  • DBMS
  • SQL
  • JSON
  • XML
  • C programming
  • Erwin Data Modeler
  • Python
  • GCP
  • AWS
  • BigQuery
  • Tableau
  • Statistics
  • Master Data Management

Certification

  • Google Data Analytics Professional Certification, Coursera, 09/28/22
  • Google Cloud Architect, Udemy, In Progress

Awards

  • Infosys Internal Appreciation, 06/28/22, Received commendation from the team manager for exemplary performance and significant contributions to team objectives.
  • The Keystone Award, 04/10/24, Teja is a dynamic problem-solver with a passion for innovation. He constantly seeks new opportunities to improve processes and inspire change within the team. His proactive approach and ability to generate fresh ideas have led to significant improvements, driving both efficiency and creativity. Teja's commitment to action makes him an outstanding candidate for the Keystone (Igniter) award.

Projects

Real-time Streaming Data Platform for Customer 360

Description: Led the architectural design (HLD/LLD) and implementation of a real-time data streaming platform to consolidate diverse customer interaction data (web clicks, app usage, CRM events) into a unified Customer 360 view.

Key Skills Demonstrated:

  • High-Level & Low-Level Design: Detailed architecture for streaming ingestion, processing, and serving layers.
  • Data Modeling: Designed agile data models for real-time streams in Erwin Data Modeler, accommodating evolving customer data.
  • Streaming Technologies: Utilized BigQuery for real-time capture and analytics; defined efficient JSON/XML schemas (Altova XMLSpy).
  • Data Mastering & Storage: Implemented PostgreSQL for persistent storage of mastered customer profiles, optimizing for read/write performance.
  • ETL/ELT: Designed data transformation pipelines for clean, consistent data.
  • Performance Optimization: Focused on minimizing latency and ensuring high throughput for millions of events per second.
  • Data Quality & Governance: Implemented profiling and validation rules at ingestion to ensure data accuracy.

Languages

English
Proficient
C2

Timeline

Data Architect L1
Dun & Bradstreet
12.2022 - Current
Data Engineer
Infosys
06.2021 - 12.2022
Big Data Analytics Internship
National Institute of Electronics & Information Technology (NIELIT)
05.2021 - 07.2021
Software Engineer (Data Modelling)
MSM Technologies
06.2020 - 06.2021
Data Analyst Intern
Mahaugha Solutions Limited
05.2019 - 07.2019
Sri Venkateswara University
Bachelor of Technology in Computer Science and Engineering
Sri Chaitanya Junior Kalasala
Intermediate
Radha School of Learning
10th Standard