Adaptable software engineer with a strong foundation in backend development, data engineering, and system design. Proven ability to deliver scalable, high-performance solutions in fast-paced, collaborative environments. Quick learner with strong problem-solving skills and a focus on building clean, efficient systems; works effectively across cross-functional teams, continuously optimizing for performance, reliability, and impact.
Backend Engineering & API Development
• Integrated user-facing front-end components with robust server-side logic to deliver seamless web experiences
• Built modular, reusable code libraries for scalable and maintainable backend services (see the endpoint sketch below)
• Optimized applications for speed and scalability through efficient code design and system tuning
• Implemented security best practices, including data protection mechanisms and vulnerability mitigation
• Designed and maintained secure, high-performance data storage solutions
• Partnered with front-end teams to ensure smooth UI-backend integration
• Debugged and resolved critical issues in production systems, enhancing application reliability
• Researched and adopted modern backend development best practices and emerging technologies
• Authored detailed technical documentation for APIs and data schemas to support team collaboration
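A minimal sketch of the kind of modular, validated backend endpoint described above; FastAPI, pydantic, and the /users route are illustrative assumptions, not details of a specific project:

    # Minimal sketch of a modular, validated endpoint (FastAPI and pydantic
    # are assumptions for illustration; no specific framework is claimed).
    from fastapi import FastAPI, HTTPException
    from pydantic import BaseModel, Field

    app = FastAPI()

    class UserIn(BaseModel):
        # Field constraints double as input validation, one of the security
        # best practices referenced above.
        email: str = Field(..., max_length=254)
        name: str = Field(..., min_length=1, max_length=100)

    @app.post("/users", status_code=201)
    def create_user(user: UserIn) -> dict:
        # Lightweight sanity check; persistence is stubbed out here.
        if user.email.count("@") != 1:
            raise HTTPException(status_code=422, detail="invalid email")
        return {"email": user.email, "name": user.name}

Served with, e.g., uvicorn module:app, the route rejects malformed payloads with 422 and returns 201 with the validated body.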
Database Engineering & Performance Tuning
• Refactored and optimized complex SQL queries, leveraging indexing and query-tuning best practices to improve performance
• Developed and maintained real-time data pipelines using PostgreSQL CDC events, Debezium, and NATS for high-throughput data streaming (see the consumer sketch below)
• Implemented effective caching strategies across application layers to reduce latency and improve performance
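A minimal sketch of the CDC consumer pattern referenced above, reading Debezium's JSON change-event envelope from NATS with the nats-py client; the subject name, connection URL, and handler logic are hypothetical placeholders:

    # Consumes PostgreSQL CDC events (Debezium's JSON envelope) from NATS.
    # Subject name and URL are hypothetical; requires the nats-py client.
    import asyncio
    import json

    import nats

    async def main() -> None:
        nc = await nats.connect("nats://localhost:4222")

        async def handle(msg) -> None:
            payload = json.loads(msg.data).get("payload", {})
            # Debezium op codes: c=create, u=update, d=delete, r=snapshot read.
            print(f"op={payload.get('op')} after={payload.get('after')}")

        await nc.subscribe("cdc.public.orders", cb=handle)  # hypothetical subject
        await asyncio.sleep(60)   # consume for a while...
        await nc.drain()          # ...then flush and close cleanly

    asyncio.run(main())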
Data Engineering & Workflow Orchestration
• Designed, developed, and deployed ETL pipelines from scratch, ensuring end-to-end data flow reliability
• Built batch processing workflows using Prefect and Mage for large-scale data transformation
• Utilized Apache Airflow and Prefect to orchestrate automated, fault-tolerant data workflows (see the flow sketch below)
• Performed data wrangling using Python and SQL to cleanse and transform raw data into analytics-ready formats
• Developed ELT pipelines using Airbyte for efficient data ingestion and transformation across heterogeneous sources
• Deployed scalable, containerized data solutions using Docker and Kubernetes alongside open-source tooling
• Leveraged dbt to design and implement modular, incremental data models in PostgreSQL, streamlining data workflows and supporting analytics use cases
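A minimal sketch of a batch ETL flow of the kind described above, using Prefect 2's flow/task API with retries; the source path, target table, and stubbed extract/load bodies are illustrative assumptions:

    # Batch ETL flow sketch using Prefect 2's flow/task API; the source path,
    # target table, and extract/load bodies are illustrative stand-ins.
    from prefect import flow, task

    @task(retries=3, retry_delay_seconds=30)
    def extract(path: str) -> list[dict]:
        # A real task would read from an API, object store, or database.
        return [{"id": 1, "amount": "10.50"}, {"id": 2, "amount": "3.25"}]

    @task
    def transform(rows: list[dict]) -> list[dict]:
        # Cleanse raw strings into analytics-ready numeric types.
        return [{**r, "amount": float(r["amount"])} for r in rows]

    @task
    def load(rows: list[dict], table: str) -> None:
        # Stubbed load step; a real pipeline would write to PostgreSQL.
        print(f"loading {len(rows)} rows into {table}")

    @flow(name="daily-batch-etl")
    def daily_batch_etl(path: str = "s3://bucket/raw/orders.json") -> None:
        load(transform(extract(path)), table="analytics.orders")

    if __name__ == "__main__":
        daily_batch_etl()

Task-level retries give the fault tolerance noted above without custom error-handling code in each step.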
DevOps & Testing
• Implemented automated unit testing for backend systems to ensure code reliability and reduce regression bugs (see the test sketch below)
• Maintained source control using Git for version tracking and collaborative development
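A minimal sketch of the automated unit-testing approach noted above, using pytest; parse_amount is a hypothetical helper invented for illustration:

    # Unit-test sketch with pytest; parse_amount is a hypothetical helper
    # invented for illustration, not code from a specific project.
    import pytest

    def parse_amount(raw: str) -> float:
        """Hypothetical helper that cleanses a raw currency string."""
        return float(raw.replace(",", "").strip())

    @pytest.mark.parametrize(
        "raw, expected",
        [("10.50", 10.5), (" 1,000.00 ", 1000.0)],
    )
    def test_parse_amount(raw: str, expected: float) -> None:
        assert parse_amount(raw) == expected

    def test_parse_amount_rejects_garbage() -> None:
        with pytest.raises(ValueError):
            parse_amount("not-a-number")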