Data Architect with extensive experience in defining enterprise data architecture vision, strategies, principles, and standards. Skilled in collaborating with business stakeholders, system designers, and development teams to ensure a cohesive and scalable data architecture. Adept at aligning data solutions with business objectives to drive efficiency and innovation. Certified Azure AI Associate, with expertise in leveraging Azure's AI and machine learning capabilities to develop intelligent, data-driven solutions. Experienced in cloud data platforms, predictive analytics, and enterprise data management, with a strong focus on driving business value through advanced analytics and AI-driven insights.
Project Overview:
Built a scalable data pipeline integrating MES, SAP, and third-party systems using AWS Glue, Greengrass, Lambda, S3, and Athena. Enabled real-time and batch processing for production monitoring, quality control, and predictive maintenance, ensuring optimized data ingestion, governance, and security compliance.
Responsibilities:
Technologies Used:
· Cloud & Data Processing: AWS Glue, AWS Athena, AWS Lambda, AWS Greengrass
· Analytics & Visualization: Power BI
· Development & Automation: AWS CodeCommit, AWS CodePipeline, TypeScript CDK, Python, PySpark, SparkSQL
· AI & ML: AWS Bedrock, LangChain
Project Overview:
Developed a Centralized Data Store (CDS) and data pipeline using Azure SQL Server and Azure Data Factory (ADF) to integrate data from JIRA, SonarQube, finance systems, demand and resource planning systems, and SharePoint lists. Automated data ingestion and transformation to generate key KPIs such as sprint velocity, SonarQube efficiency, and resource utilization, visualized in Power BI for actionable insights and improved decision-making.
Responsibilities:
· Designed & implemented a Centralized Data Store (CDS) and data pipeline using Azure SQL Server & ADF to integrate data from JIRA, SonarQube, finance, demand/resource planning systems, and SharePoint lists.
· Developed & optimized ETL workflows in ADF for automated data ingestion, transformation, and consolidation.
· Designed scalable data models in Azure SQL Server, ensuring performance and efficient reporting.
· Implemented security measures including RBAC, data encryption, masking, and secure data transmission.
Technologies Used:
· Cloud & Database: Azure SQL Server, Azure Data Lake
· Data Integration & ETL: Azure Data Factory (ADF), Azure Functions
· Security & Compliance: Role-Based Access Control (RBAC), Data Encryption, Data Masking, Secure Data Transmission
· Data Storage & Processing: Azure Blob Storage, Azure Synapse Analytics
· Visualization & Reporting: Power BI
· Source Systems: JIRA, SonarQube, Finance Systems, Demand & Resource Planning Systems, SharePoint Lists
· Scripting & Automation: Python, SQL
Project Overview:
Supported DTNA’s large-scale Records Management initiative by migrating petabytes of data from on-premises file shares to the cloud using OpenText ECM. This initiative enhanced enterprise content management, accessibility, and compliance within the organization.
Responsibilities:
· Provided consultation on functional and business processes, demonstrating migration strategies and best practices.
· Designed and documented end-to-end data migration solutions.
· Developed and implemented a migration process/tool to transfer large-scale data from on-premises file shares to Azure.
Technologies Used:
· Cloud & Data Migration: Azure Data Factory, Azure Data Lake
· Scripting & Automation: PowerShell, Python
Project Overview:
Countrywide is a leading provider of estate agency, lettings, financial services, land, new homes, auctions, surveying, conveyancing, and property management services, with a dedicated corporate business team delivering an integrated approach for corporate clients. Implemented Salesforce Einstein solutions for the marketing organization.
Responsibilities:
As a Salesforce Einstein consultant:
· Provided consultation on functional and business processes.
· Developed and maintained dashboards on the Salesforce Einstein platform for all marketing business units.
· Set up an Einstein Discovery model to analyze lead generation across campaigns, helping the marketing team select the most effective tactics.
Technologies Used:
· Einstein Analytics
· Einstein Discovery
Project Overview:
Designed and implemented an end-to-end data solution for Govia Thameslink Railway (GTR) to analyze ride quality data and develop a prediction model for railway track deterioration over time. This Proof of Concept (PoC) aimed to enable timely track maintenance, reducing last-minute speed restrictions and improving overall railway safety and efficiency.
Responsibilities:
· Designed end-to-end data architecture to analyze ride quality data and predict railway track deterioration over time.
· Developed a predictive analytics model to enable timely maintenance and reduce last-minute speed restrictions.
· Led a Proof of Concept (PoC) to demonstrate insights on track conditions, improving decision-making for railway maintenance.
· Designed an end-to-end analytical landscape with Power BI dashboards connected to a backend Azure Data Lake, enabling the system to analyze and detect defects in railway track.
· Conceptualized the architecture to process and cleanse on-board sensor data, designed and set up Power BI connectivity for effective visualization of findings, and built the tender dashboards for the paid PoC.
Technologies Used:
· Cloud & Database: Azure Data Lake
· Data Integration & ETL: Azure Synapse Analytics, Azure Data Factory (ADF), Azure Functions
· Security & Compliance: Role-Based Access Control (RBAC), Data Encryption, Data Masking, Secure Data Transmission
· Data Storage & Processing: Azure Blob Storage, Azure Synapse Analytics
· Visualization & Reporting: Power BI
· Scripting & Automation: Python, SparkSQL
Project Overview:
Designed and implemented a modernized Data Warehouse solution for Countrywide, a leading provider of estate agency, lettings, financial services, and property management. Led the migration to Azure Synapse Analytics, optimizing data architecture for enhanced analytics and reporting. Built and led a BI & Analytics team, focusing on Azure-based ETL, data modeling, and Power BI visualization. Developed automated processes and continuous improvement initiatives to streamline data operations, ensuring scalable and efficient enterprise data management.
Responsibilities:
· Redesigned the Data Warehouse solution and led its migration to Azure Synapse Analytics.
· Captured and translated business requirements into logical and physical data models for optimal data architecture.
· Led and built a BI & Analytics team from the ground up, focusing on Microsoft Azure services (ETL, Azure Synapse, Power BI).
· Provided technical solution guidance and led a team of five BI developers in data warehouse design and implementation.
· Owned end-to-end development including requirements gathering, data modeling, database design, ETL development, query optimization, and report creation.
· Designed and developed automation processes as part of a continuous improvement initiative, enhancing efficiency and scalability.
· Led the development, validation, publishing, and maintenance of logical and physical data models within the Enterprise Data Warehouse (EDW).
· Contributed to estimations and solution proposals, ensuring feasibility and alignment with business needs.
Technologies Used:
· Cloud & Data Warehouse: Azure Synapse Analytics, Azure Data Lake
· ETL & Data Integration: Azure Data Factory (ADF), DTS, SSIS
· Data Modeling & Database: MS SQL Server (2005/2008/2016)
· BI & Reporting: SSRS, Power BI
· Other Tools & Skills: Azure Data Studio, SSAS, Python, R, PySpark, T-SQL, JIRA, Podio, Team Foundation Server (TFS)