
Kadiyala Lekhya Chowdary

Software Engineer
Andhra Pradesh

Summary

Enthusiastic software developer eager to contribute to team success through hard work, attention to detail, and excellent organizational skills. Clear understanding of requirements and motivated to learn, grow, and excel in the software industry.

Overview

2 years of professional experience
7 years of post-secondary education

Work History

Project Engineer

Wipro Technologies
09.2020 - Current
  • 2 years of experience with the Azure platform (Azure Data Factory, Azure Databricks), client interaction, and requirement analysis.
  • Participated in all phases of the Software Development Life Cycle (SDLC), including analysis, design, coding, and testing.
  • Comprehensive problem-solving and troubleshooting skills.
  • Good verbal and written communication skills.
  • Willingness to learn, team facilitation, passion for technology, and adaptability.

Azure Data Engineer

Microsoft

Project Name: Enterprise Data Lake Platform

Description:

The Enterprise Data Lake House is Microsoft Digital's unified, foundational, governed, and trusted "data on tap" shared capability, offering publishers a rich self-serve data publishing experience and subscribers self-serve access management. It provides rich data life-cycle management (DLM), unified GDPR compliance, and transparent, governed, scheduled processing of data. Data is published as standardized assets available as Delta-formatted (ACID-compliant) entities. Azure Purview governs and manages efficient data access. The data is available on an Azure ADLS Gen2 path as a Delta-formatted asset that can be subscribed to with bring-your-own compute on a chosen engine such as Azure Databricks (ADB) or Synapse.

The source-published data is available as Delta-formatted entities that several tenants subscribe to for building analytical data products for scorecards, analysis, reporting, and ML. EDLP currently exposes a data-freshness HTTP API for subscribers, who can query it to get the latest status and freshness details for an entity. Taking a dependency on the API, they can build downstream pipelines for data products. The API serves as an informative endpoint that subscribers use to check for dependency fulfillment and schedule downstream pipeline jobs.
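As a minimal sketch of how a subscriber might gate a downstream job on such a freshness check (illustrative only: the response schema, field names, and entity name below are hypothetical assumptions, not taken from the actual EDLP API):

```python
import json
from datetime import datetime, timedelta, timezone

def is_entity_fresh(api_response: str, entity: str, max_age_hours: int = 24) -> bool:
    """Return True if the named entity was refreshed within max_age_hours.

    The JSON schema here ("entities", "name", "lastRefreshedUtc") is a
    hypothetical stand-in for the EDLP freshness API payload.
    """
    payload = json.loads(api_response)
    for item in payload.get("entities", []):
        if item.get("name") == entity:
            refreshed = datetime.fromisoformat(item["lastRefreshedUtc"])
            age = datetime.now(timezone.utc) - refreshed
            return age <= timedelta(hours=max_age_hours)
    return False  # unknown entity: treat as a missed dependency

# A downstream scheduler could run its pipeline only when this check passes:
sample = json.dumps({"entities": [
    {"name": "Sales", "lastRefreshedUtc": datetime.now(timezone.utc).isoformat()}
]})
print(is_entity_fresh(sample, "Sales"))  # prints True
```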

Roles & Responsibilities:

  • Creating and updating triggers and pipelines in Azure Data Factory.
  • Monitoring trigger runs and checking the data flow from source to destination.
  • Writing SQL queries to inspect ADLS data using an Azure Databricks (ADB) resource.
  • Providing SPN/UPN-level access to downstream users using Postman scripts.
  • Creating and configuring clusters.
  • Checking cluster logs and troubleshooting accordingly.
  • Checking metadata and updating it in Cosmos DB based on user requirements.
  • Writing NoSQL queries in Cosmos DB.
  • Debugging issues and re-pulling missing data using data pipelines.
  • Checking SPN-level access in ADLS and Azure Active Directory.
  • Sound knowledge of ACL propagation for ADLS paths.

Education

Bachelor of Technology - ECE

Sri Venkateswara College of Engineering And Technology
07.2016 - 05.2020

Board of Intermediate Education

Sri Chaitanya Junior College
Vijayawada
06.2014 - 05.2016

SSC

Raju High School
Rayachoty
07.2013 - 05.2014

Skills

    Azure Data Factory


Timeline

Project Engineer

Wipro Technologies
09.2020 - Current

Bachelor of Technology - ECE

Sri Venkateswara College of Engineering And Technology
07.2016 - 05.2020

Board of Intermediate Education

Sri Chaitanya Junior College
06.2014 - 05.2016

SSC

Raju High School
07.2013 - 05.2014

Azure Data Engineer

Microsoft