Kubra Rasheed

Hyderabad

Summary

Data engineer with over a decade of experience designing, developing, and implementing advanced data solutions. Skilled in AWS cloud services, Python, Snowflake, Informatica PowerCenter, SQL, and Unix, applied to enhance data-driven decision-making and streamline data workflows. Demonstrated success in owning end-to-end data pipeline processes, maintaining data integrity, and delivering data solutions that support organizational objectives.

Overview

12 years of professional experience

Work History

Senior Consultant

Deloitte Consulting USI
11.2015 - Current
  • Architected and implemented a comprehensive AWS cloud data warehousing solution that consolidated disparate data sources, reducing query times by 50% and supporting a 30% increase in concurrent data analysis tasks.
  • Increased data processing efficiency by designing and deploying an ETL pipeline using AWS Glue and Lambda, which processed over 1TB of data daily with 99.9% uptime.
  • Initiated and managed the transition to serverless architectures with AWS technologies, cutting operational costs by 20% and enhancing scalability for high-traffic events.
  • Established a secure and efficient data-sharing mechanism between Amazon Redshift Serverless and provisioned Amazon Redshift clusters, enabling seamless cross-cluster data access without data duplication, significantly improving data processing efficiency and reducing storage costs by 50%.
  • Developed and deployed AWS Lambda functions using Docker container images, enabling custom runtimes and dependencies, and optimized Dockerfiles to improve image build efficiency and performance, increasing reusability by 40%.
  • Automated the build, tag, and push of Docker images to ECR using AWS CDK, and integrated AWS CodeBuild and CodePipeline to streamline deployment workflows, ensuring consistent, secure, and high-performing deployments and improving reusability by 35%.
  • Designed and implemented a robust data pipeline leveraging AWS services including S3, SQS, SNS, Lambda, and Redshift, ensuring efficient data processing and storage.
  • Developed a dead-letter Lambda function to handle failed messages, enhancing the reliability and fault tolerance of the pipeline.
  • Designed and implemented a one-time migration and batch data processing solution for the world's second-largest investment firm, integrating 13 legacy mainframe files through 6 predecessor and 4 final producer jobs (over 1 billion records) into AWS Aurora, executing over 80 business rules, reducing code complexity by ~50% and cutting the client's daily batch cost by ~90%.
  • Led the architectural design and execution of a data migration project to Snowflake, transitioning from legacy systems to a cloud-based environment. Developed a comprehensive strategy utilizing Snowflake's bulk loading capabilities, including Snowpipe and SnowSQL, along with zero-copy cloning and time travel for data validation, ensuring data integrity and minimizing downtime.
  • Leveraged BladeBridge Analyzer and Converter tools to analyze and transform metadata for over 10,000 Informatica workflows, enabling a seamless data migration from Teradata to Snowflake.

Systems Engineer

Tata Consultancy Services Ltd
12.2012 - 11.2015
  • Analyzed Business Requirement Documents (BRD) and defined steps for data extraction, business logic implementation, and target loading, ensuring alignment with business goals.
  • Gathered information for various Change Data Capture scenarios, enhancing data accuracy and consistency.
  • Developed and monitored over 100 complex mappings, mapplets, transformations, and workflows using Informatica tools (Source Analyzer, Transformation Developer, Mapplet Designer, Mapping Designer, Workflow Manager, and Workflow Monitor), improving data processing efficiency by 30%.
  • Created reusable UNIX scripts with advanced techniques (awk, sed, grep) and PMCMD commands, optimizing interaction with the Informatica Server and automating data loading processes, resulting in a 40% increase in performance and reusability.

Education

Bachelor of Engineering - Computer Science

Visvesvaraya Technological University
07.2012

Skills

  • AWS cloud technologies
  • Python
  • Snowflake
  • Informatica PowerCenter
  • SQL
  • Unix
  • Data Pipeline Management
  • Data Analysis and Reporting
  • Problem-Solving
  • Collaboration and Leadership
  • Data Engineering
  • RDBMS
  • BladeBridge Converter tool

Personal Information

Title: Senior Consultant | AWS Data Engineer | Snowflake Data Engineer

Tech Stack

EC2, SNS, SQS, S3, ECR, Lambda, Redshift, DynamoDB, CloudWatch, Athena, Step Functions, ETL, Data Warehousing, Snowflake, Oracle, Teradata, PostgreSQL, Unix Shell Scripting, SQL, Python, SnowSQL, Informatica PowerCenter 9.1 HotFix, WinSCP, Autosys, Snowpipe, BladeBridge, Docker, Git
