Protegrity Data Security Implementation
Role: Data Engineer / Security Integration Specialist, Tools & Tech: Protegrity, Hadoop, Spark, Hive, HDFS, Policy Manager
- Implemented enterprise-grade data encryption and tokenization solutions using Protegrity to secure sensitive PII and financial data.
- Integrated Protegrity into the big data ecosystem (Hadoop, Spark, Hive), tuning protection policies to minimize tokenization latency and preserve job performance (see the sketch after this list).
- Collaborated with data governance and compliance teams to align data protection controls with industry regulations (e.g., PCI DSS, GDPR).
- Performed end-to-end testing and fine-tuning of security workflows to ensure operational efficiency and scalability.
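For context, a minimal PySpark sketch of how vendor-supplied tokenization UDFs can be applied to PII columns in a Spark pipeline. The UDF name (protect_str), the data-element names, and the table names are hypothetical placeholders; the actual functions are registered by the Protegrity protector deployed on the cluster and mapped to policies defined in Policy Manager.

```python
# Sketch: applying a tokenization UDF to sensitive columns in Spark.
# protect_str, the data-element names, and table names are illustrative only.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder
    .appName("pii-tokenization")
    .enableHiveSupport()
    .getOrCreate()
)

customers = spark.table("raw.customers")

# expr() calls the registered Hive/Spark UDF by name; each call maps a
# sensitive column to a policy "data element" defined in Policy Manager.
tokenized = (
    customers
    .withColumn("ssn", F.expr("protect_str(ssn, 'ssn_element')"))
    .withColumn("card_number", F.expr("protect_str(card_number, 'pan_element')"))
)

# Only the tokenized view is persisted downstream; clear-text PII never lands
# in the curated zone.
tokenized.write.mode("overwrite").saveAsTable("curated.customers_tokenized")
```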
Hortonworks to In-House Platform Data Migration
Role: Data Engineer / Migration Specialist, Tools & Tech: Hortonworks, Spark, Hive, HDFS, Airflow, Python, SQL
- Led the migration of enterprise data pipelines from Hortonworks to an internally hosted Big Data environment, ensuring compatibility and optimized performance.
- Analyzed and redesigned existing ETL workflows for scalability, rebuilding them using Spark and Hive with best practices for data processing efficiency.
- Tuned Spark jobs and Hive queries to improve execution times and reduce resource consumption (see the tuning sketch after this list).
- Managed schema mapping, data validation, and reconciliation to maintain data integrity and consistency throughout the transition (see the reconciliation sketch after this list).
- Ensured minimal downtime by conducting detailed impact assessments, dry runs, and post-migration performance monitoring.
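A minimal sketch of the kind of Spark-side tuning applied to the rebuilt jobs; the configuration values, table names, and partition column are illustrative rather than production settings.

```python
# Sketch: Spark session tuning and a join optimized for the migrated cluster.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("orders-etl")
    .config("spark.sql.shuffle.partitions", "400")   # right-size shuffles for available cores
    .config("spark.sql.adaptive.enabled", "true")    # let AQE coalesce small/skewed partitions
    .config("spark.executor.memory", "8g")
    .config("spark.executor.cores", "4")
    .config("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
    .enableHiveSupport()
    .getOrCreate()
)

# Partition pruning plus a broadcast hint on the small dimension table cuts
# shuffle volume on a common enrichment join. 'ds' is an assumed partition column.
orders = spark.table("migrated.orders").where("ds = '2023-01-01'")
dims = spark.table("migrated.order_status_dim")
enriched = orders.join(dims.hint("broadcast"), "status_id")
enriched.write.mode("overwrite").saveAsTable("curated.orders_enriched")
```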
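And a minimal sketch of a post-migration reconciliation check: comparing row counts and an order-independent checksum between a legacy table and its migrated counterpart. The table and column names are illustrative.

```python
# Sketch: row-count and checksum reconciliation between source and target tables.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder
    .appName("migration-reconciliation")
    .enableHiveSupport()
    .getOrCreate()
)

def table_fingerprint(table_name, key_columns):
    """Row count plus an order-independent checksum over the chosen columns."""
    df = spark.table(table_name)
    # Hash each row's key columns, then sum a numeric slice of the hash so the
    # result does not depend on row order or partitioning.
    hashed = df.select(
        F.sha2(F.concat_ws("||", *[F.col(c).cast("string") for c in key_columns]), 256).alias("h")
    )
    checksum = (
        hashed
        .select(F.conv(F.substring("h", 1, 15), 16, 10).cast("decimal(38,0)").alias("n"))
        .agg(F.sum("n").alias("checksum"))
        .collect()[0]["checksum"]
    )
    return {"rows": df.count(), "checksum": checksum}

key_cols = ["order_id", "amount", "status"]   # illustrative key columns
source = table_fingerprint("legacy.orders", key_cols)
target = table_fingerprint("migrated.orders", key_cols)

assert source["rows"] == target["rows"], f"Row count mismatch: {source} vs {target}"
assert source["checksum"] == target["checksum"], "Checksum mismatch: drill down to column-level diffs"
print("Reconciliation passed:", source)
```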