Seeking a position that allows me to develop my skills, offers room for advancement, and lets me apply the knowledge and abilities I have gained through education and experience to benefit the organization.
Python
Tools: Cube.js, PySpark, Airflow, BigQuery, PostgreSQL, Apache Kafka.
Description: Businesses use dashboards to analyse and assess the effectiveness of their plans against high-quality KPIs. In this project we applied data mining, SQL-based data analysis, and metrics analysis, and built executive KPI reports, funnel reports, and drill-down reports using Cube.js, BigQuery, and PostgreSQL.
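A minimal sketch, assuming a hypothetical project, dataset, and event table, of the kind of funnel aggregation these BigQuery-backed reports were built on:

# Funnel report sketch: count distinct users at each funnel step.
# Project, dataset, table, and event names below are illustrative only.
from google.cloud import bigquery

client = bigquery.Client()

FUNNEL_SQL = """
    SELECT event_name, COUNT(DISTINCT user_id) AS users
    FROM `my_project.analytics.events`
    WHERE event_name IN ('visit', 'signup', 'purchase')
    GROUP BY event_name
    ORDER BY users DESC
"""

for row in client.query(FUNNEL_SQL).result():
    print(f"{row.event_name}: {row.users} users")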
Tools: Python, Spark, Apache Kafka, Airflow, PostgreSQL, GCP (Dataproc, BigQuery, GCS buckets).
Description: Built an ETL that ingested transactional and event data from a web app with 100,000 daily active users, saving over $85,000 annually in external vendor costs. The project also used numerical ratings of items shared between the active user and other users to measure how similar their profiles are, and then used those similarities to predict recommendations of unseen items for the active user.
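A minimal user-based collaborative-filtering sketch of that recommendation step, using illustrative data only: compare the active user's item ratings to other users' ratings, then score unseen items by the similarity-weighted average of those users' ratings.

import numpy as np

# rows = users, columns = items; 0 means the item has not been rated
ratings = np.array([
    [5, 4, 0, 1],
    [4, 5, 3, 0],
    [1, 0, 5, 4],
], dtype=float)

def cosine(u, v):
    mask = (u > 0) & (v > 0)              # compare only co-rated items
    if not mask.any():
        return 0.0
    return float(np.dot(u[mask], v[mask]) /
                 (np.linalg.norm(u[mask]) * np.linalg.norm(v[mask])))

active = 0                                 # index of the active user
sims = np.array([cosine(ratings[active], ratings[u])
                 for u in range(len(ratings))])
sims[active] = 0.0                         # ignore self-similarity

for item in range(ratings.shape[1]):
    if ratings[active, item] == 0:         # only predict unseen items
        rated_by = ratings[:, item] > 0
        weight = sims[rated_by].sum()
        pred = (np.dot(sims[rated_by], ratings[rated_by, item]) / weight
                if weight > 0 else 0.0)
        print(f"predicted rating for item {item}: {pred:.2f}")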
Tools: Python, Spark, Apache Kafka, Airflow, Hive, PostgreSQL, GCP (Dataproc, BigQuery, GCS buckets, Cloud Composer).
Description: Built an ETL that ingested streaming and transactional event data from a web application across 6 primary data sources, maintaining 99.8% data pipeline uptime, using Python, Spark, Apache Kafka, Airflow, BigQuery, and GCS buckets. Also automated ETL processes across billions of rows of data, reducing manual workload by 25%.
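A minimal sketch in the spirit of this pipeline, assuming a hypothetical Kafka topic, broker, event schema, and GCS bucket: a Spark Structured Streaming job that consumes JSON events from Kafka and lands them as Parquet in GCS for downstream BigQuery loads.

# Kafka -> Spark Structured Streaming -> GCS (Parquet) ingestion sketch.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import StringType, StructField, StructType, TimestampType

spark = SparkSession.builder.appName("event-ingestion").getOrCreate()

event_schema = StructType([                               # assumed event shape
    StructField("user_id", StringType()),
    StructField("event_name", StringType()),
    StructField("event_ts", TimestampType()),
])

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")     # hypothetical broker
    .option("subscribe", "web-events")                     # hypothetical topic
    .load()
    .select(from_json(col("value").cast("string"), event_schema).alias("e"))
    .select("e.*")
)

query = (
    events.writeStream.format("parquet")
    .option("path", "gs://my-bucket/events/")              # hypothetical bucket
    .option("checkpointLocation", "gs://my-bucket/checkpoints/events/")
    .trigger(processingTime="1 minute")
    .start()
)
query.awaitTermination()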
Travelling, cooking, and a passion for technology.