Developed robust data pipelines using SSIS, optimizing data transfer processes and ensuring data integrity, which reduced ETL processing time by 30% and increased data accuracy by 20%
Gathered technical business requirements from customers through Jira and Zendesk tickets, converting them into SQL scripts and SSIS data pipelines while ensuring adherence to best practices and standards
Documented solution architecture for knowledge transfer and troubleshooting
Collaborated with functional consultants and data scientists, using complex SQL queries, Python, and o9 platform IBPL queries to troubleshoot issues and ensure data accuracy
Mentored junior team members, sharing best practices and fostering a collaborative learning environment
Recognized with spot awards for exceptional contributions to team projects and initiatives.
Senior Technical Associate
Merilytics
Hyderabad
06.2021 - 10.2021
Contributed to the development of a comprehensive Power BI dashboard, providing customers with insights into key performance indicators (KPIs) crucial for decision-making
Wrote complex T-SQL queries and stored procedures to extract, transform, and analyze data from various sources, ensuring data accuracy and integrity
Collaborated closely with stakeholders to understand business requirements and provide technical solutions aligned with organizational objectives
Projects
Spotify Data Pipeline: Developed a comprehensive Spotify data pipeline on AWS, beginning with data extraction from the Spotify API using Python within AWS Lambda, with the raw data stored as JSON in an S3 bucket. Used AWS Glue with PySpark to transform the JSON data into separate dataframes for song, album, and artist data, which were converted to CSV format and written back to the S3 bucket. Scheduled daily execution through Amazon CloudWatch and ingested the transformed CSV data into Snowflake tables using Snowpipe connected to S3.
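A minimal sketch of the Lambda extraction step described above, assuming hypothetical environment variable names (SPOTIFY_CLIENT_ID, SPOTIFY_CLIENT_SECRET, PLAYLIST_ID, RAW_BUCKET) and the spotipy client library; the actual project may have pulled from different Spotify endpoints:

    # Sketch of the Lambda extraction step. Environment variable names and
    # the S3 key layout are hypothetical placeholders, not project values.
    import os
    import json
    from datetime import datetime, timezone

    import boto3
    import spotipy
    from spotipy.oauth2 import SpotifyClientCredentials

    s3 = boto3.client("s3")

    def lambda_handler(event, context):
        # Authenticate against the Spotify API with client credentials
        # supplied via Lambda environment variables.
        auth = SpotifyClientCredentials(
            client_id=os.environ["SPOTIFY_CLIENT_ID"],
            client_secret=os.environ["SPOTIFY_CLIENT_SECRET"],
        )
        sp = spotipy.Spotify(client_credentials_manager=auth)

        # Pull raw track data for one playlist; song, album, and artist
        # fields are split out later by the Glue/PySpark job.
        data = sp.playlist_tracks(os.environ["PLAYLIST_ID"])

        # Store the untransformed JSON in S3 under a timestamped key so
        # the downstream Glue job can pick it up.
        key = f"raw/spotify_{datetime.now(timezone.utc):%Y%m%d%H%M%S}.json"
        s3.put_object(
            Bucket=os.environ["RAW_BUCKET"],
            Key=key,
            Body=json.dumps(data),
        )
        return {"statusCode": 200, "body": key}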
E-Commerce Data Pipeline: Engineered an end-to-end Azure data pipeline, orchestrating the download and storage of user data in Azure Blob Storage using Azure Data Factory. Implemented a subsequent pipeline for Parquet-format file transfer and event-triggered processing in Azure Blob Storage, followed by data transformations adhering to Medallion Architecture principles in Azure Databricks. Leveraged SQL for analytics, culminating in the creation of Delta tables and the extraction of valuable business insights.
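A minimal sketch of the kind of bronze-to-silver Medallion transformation described above, as it might run in an Azure Databricks notebook; the mount paths and column names are hypothetical placeholders:

    # Sketch of a bronze-to-silver Medallion step in Databricks. Storage
    # paths and column names are hypothetical, not from the actual project.
    from pyspark.sql import SparkSession, functions as F

    # Databricks provides a configured SparkSession at runtime.
    spark = SparkSession.builder.getOrCreate()

    # Bronze: read the raw Parquet files landed by the Data Factory pipeline.
    bronze = spark.read.parquet("/mnt/bronze/user_data/")

    # Silver: basic cleansing -- deduplicate, normalize types, drop bad rows.
    silver = (
        bronze.dropDuplicates(["user_id"])
        .withColumn("signup_date", F.to_date("signup_date"))
        .filter(F.col("user_id").isNotNull())
    )

    # Persist as a Delta table for downstream SQL analytics (the gold layer).
    silver.write.format("delta").mode("overwrite").save("/mnt/silver/user_data/")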