Experienced Data Engineer with three years of hands-on practice in Azure and AWS ecosystems, specializing in end-to-end data solution design. Proficient in SQL, Python scripting, and ETL processes across diverse on-premises and cloud environments. Skilled in data warehousing, adept at version control using Git and Azure DevOps, and committed to implementing rigorous data quality assurance measures. A clear communicator who adapts quickly to new technologies, with a reliable track record of on-time project delivery.
· Employed AWS Glue for data extraction, transformation, and loading tasks.
· Orchestrated workflows using AWS Step Functions and AWS Lambda functions.
· Utilized CloudWatch for error monitoring and Amazon SES for automated email notifications.
· Leveraged Amazon Redshift for effective data storage and Power BI for insightful reporting.
· Used AWS Glue DataBrew to profile data and improve data quality.
· Ensured data security and governance by implementing AWS Lake Formation.
· Developed a comprehensive framework for constructing a data lake, reducing development rework.
· Implemented a metadata-driven architecture to enhance flexibility and adaptability.
· Applied effective data modelling techniques for the data warehouse (DWH) and enabled parallel data processing.
· Implemented system-assigned and user-assigned managed identities for seamless, credential-free connections between Azure resources.
· Microsoft Certified: Azure Data Engineer Associate (DP-203), Microsoft
· Microsoft Certified: Azure Fundamentals (AZ-900), Microsoft
· AWS Certified Cloud Practitioner (CLF-C01), Amazon Web Services (AWS)
· Academy Accreditation - Databricks Lakehouse Fundamentals, Databricks
· Python for Data Science, Forsk Coding School
Provided guidance and mentorship to five junior team members, fostering their professional growth and enhancing team performance.
Conducted peer knowledge-sharing sessions on topics including a framework for data ingestion from diverse sources into Azure, and building a data lake in AWS using Amazon S3 and AWS Lake Formation.