Project name: North Asia L'Oréal Dashboard
- Experienced Data Engineer with over three years of experience, building client pipelines for sell-out and stock data across different markets, along with media dashboards
- Collaborated in deploying Cloud Run services that perform ETL jobs loading data into BigQuery from multiple sources: GCS files, SharePoint, and Oracle
- Worked on data modelling for tables in BigQuery
- Developed Python code, deployed on Cloud Run, to unmerge rows, melt columns, and unzip source files
- Designed and implemented data pipelines, improving data processing efficiency by 40%
- Worked with large datasets in Google Cloud Storage, utilizing SQL for data manipulation and analysis
- Utilized BigQuery, Cloud Run, Cloud Composer, and Cloud Shell to build scalable data solutions
- Redesigned the architecture of DAGs to improve workflow, reducing processing time by 50%
- Orchestrated end-to-end data pipelines using dbt for ETL transformations, ensuring data integrity and quality
- Handled release/move-to-production jobs, delivering modules to users with minimal issues
- Hands-on experience with Airflow to schedule and orchestrate workflows between various on-premises and GCP components
- Leveraged GitHub for version control and CI/CD pipelines, enhancing collaboration and maintainability of data workflows
- Automated infrastructure provisioning on cloud platforms (GCP) using Terraform, enabling scalable and reliable environments
- Hands-on experience pushing code and scripts to Cloud Source Repositories (CSR) and GitHub
- Created complex logic and transformations using warehouse concepts in BigQuery, and optimized BigQuery scripts for cost and performance
- Migrated projects between regions, ensuring seamless transition and minimizing downtime
- Mentored new team members, improving their productivity through strategic problem-solving guidance
- Implemented version-controlled CI/CD pipelines for dbt models and Terraform configurations, ensuring consistent and auditable deployments
- Implemented dbt (data build tool) to streamline data transformation pipelines, ensuring consistent and scalable data modelling and transformation
- Collaborated cross-functionally with data engineers, analysts, and DevOps teams to optimize data workflows and infrastructure management, fostering a cohesive data-driven environment
- Created authorized views in BigQuery to expose the required data to downstream consumers (Power BI, business teams, and stakeholders)
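
The Terraform-based provisioning mentioned above might look like this sketch for a BigQuery dataset and a GCS landing bucket. The project ID, names, and region are placeholders, not the project's actual configuration.

```hcl
terraform {
  required_providers {
    google = {
      source = "hashicorp/google"
    }
  }
}

provider "google" {
  project = "my-gcp-project"     # placeholder project ID
  region  = "asia-northeast1"
}

# BigQuery dataset holding the dashboard's curated tables (hypothetical name)
resource "google_bigquery_dataset" "dashboard" {
  dataset_id = "north_asia_dashboard"
  location   = "asia-northeast1"
}

# GCS bucket used as the landing zone for source files
resource "google_storage_bucket" "landing" {
  name     = "my-gcp-project-landing" # placeholder; bucket names are globally unique
  location = "ASIA-NORTHEAST1"
}
```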
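
A version-controlled CI pipeline for dbt models, as described above, could be expressed as a GitHub Actions workflow along these lines. The workflow name, paths, and the `ci` dbt target are assumptions for illustration only.

```yaml
name: dbt-ci   # hypothetical workflow name

on:
  pull_request:
    paths:
      - "models/**"
      - "dbt_project.yml"

jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.11"
      - name: Install dbt
        run: pip install dbt-bigquery
      - name: Resolve packages and compile models
        run: |
          dbt deps
          dbt compile --target ci   # "ci" target is an assumed profile
```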
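
The row-unmerging, column-melting, and unzipping work described above could be sketched roughly as follows in pandas. This is a minimal illustration, not the project's actual code: the function names, column names, and zip layout are all hypothetical.

```python
import io
import zipfile

import pandas as pd


def unzip_first_csv(zip_bytes: bytes) -> pd.DataFrame:
    """Extract the first CSV found inside a zip payload (e.g. a file pulled from GCS)."""
    with zipfile.ZipFile(io.BytesIO(zip_bytes)) as zf:
        name = next(n for n in zf.namelist() if n.endswith(".csv"))
        with zf.open(name) as f:
            return pd.read_csv(f)


def unmerge_rows(df: pd.DataFrame, key_cols: list[str]) -> pd.DataFrame:
    """Forward-fill key columns left blank by merged cells in spreadsheet exports."""
    out = df.copy()
    out[key_cols] = out[key_cols].ffill()
    return out


def melt_month_columns(df: pd.DataFrame, id_cols: list[str]) -> pd.DataFrame:
    """Melt wide per-month columns into long (month, value) rows for loading into BigQuery."""
    return df.melt(id_vars=id_cols, var_name="month", value_name="sellout")
```

In a Cloud Run service, these steps would typically run in sequence: read the zipped upload, unmerge the key columns, melt to long format, then load the result into BigQuery.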