Deepak Uttam

Solution Architect
Bengaluru

Summary

Data Architect and Developer with over 15 years of experience in Data & AI applications (Oracle, Databricks, Snowflake), covering the design, implementation, testing, and deployment of data lake / data warehouse solutions and data pipelines that deliver optimal results and business value in high-growth environments. Excellent communicator who collaborates effectively with cross-functional teams and stakeholders. Strong understanding of both traditional and modern data principles, with a record of creating innovative solutions that drive business growth.

Overview

15
years of professional experience
3
Certifications
2
Languages

Work History

Solution Architect

phData
02.2025 - Current
  • Client - NB - Architected and led the migration of enterprise data systems from legacy dimensional models (Markit EDM, Informatica, MS SQL) to a modern Data Vault 2.0 architecture using dbt Core, AutomateDV, and Snowflake, incorporating a robust data quality framework built on Soda (see the hub-loading sketch after this list).
  • Client - Solenis - Designed and implemented a new dimensional model to support analytical reporting, modernizing the existing data platform with Coalesce and Snowflake for scalable, maintainable data workflows.
  • Communicated product and implementation status to partners and clients at both technical and functional levels.
  • Provided technical leadership and mentoring for junior team members, fostering a supportive learning environment that promoted skill development and career growth.
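
Below is a minimal PySpark sketch of the Data Vault 2.0 hub-loading pattern referenced above. It is illustrative only: the production implementation used dbt Core and AutomateDV macros on Snowflake, and all table and column names here are hypothetical. In line with common Data Vault 2.0 practice, the hub key is derived as an MD5 hash of the trimmed, upper-cased business key, and the hub load is insert-only.

    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.master("local[*]").appName("dv_hub_sketch").getOrCreate()

    # Hypothetical staged customer records (business key plus DV metadata columns).
    staged = spark.createDataFrame(
        [("C001", "2025-02-01", "CRM"), ("C002", "2025-02-01", "CRM")],
        ["customer_id", "load_date", "record_source"],
    )

    # Hub primary key: MD5 hash of the trimmed, upper-cased business key.
    hub_rows = staged.withColumn(
        "hub_customer_hk", F.md5(F.upper(F.trim(F.col("customer_id"))))
    ).select("hub_customer_hk", "customer_id", "load_date", "record_source")

    # Insert-only load: keep only business keys not already present in the hub.
    existing_hub = spark.createDataFrame([], hub_rows.schema)  # stand-in for the target hub
    new_rows = hub_rows.join(existing_hub, on="hub_customer_hk", how="left_anti")
    new_rows.show(truncate=False)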

Application Architect

Accenture
07.2021 - 01.2025
  • Company Overview: Client - Highmark Health
  • Tools & Tech - Google Cloud Services, Databricks, BigQuery, Postgres, Git, DevOps, PySpark
  • Designed, implemented, and deployed the CIF (Common Ingestion Framework) data lake warehouse solution, which accepts a variety of data from batch and streaming sources and promotes it through bronze, silver, and gold layers by applying data standardization, data quality checks, and business rules; the curated data is then consumed by API, Analytics, and ML teams. Databricks PySpark is used for data wrangling and business-rule implementation, GCS for storage, and BigQuery/Postgres for analytics (a bronze-to-silver sketch follows this entry).
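
A minimal sketch of the bronze-to-silver promotion described above, assuming a Databricks-style PySpark environment; column names and the DQ rule are hypothetical stand-ins for the CIF standardization and data quality checks.

    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.master("local[*]").appName("cif_sketch").getOrCreate()

    # Hypothetical raw (bronze) records as they might land from a batch source.
    bronze = spark.createDataFrame(
        [("  alice ", "2024-01-03", "42"), (None, "2024-01-04", "7")],
        ["member_name", "event_date", "claim_count"],
    )

    # Standardization: trim strings and cast types before DQ evaluation.
    standardized = (
        bronze
        .withColumn("member_name", F.trim(F.col("member_name")))
        .withColumn("event_date", F.to_date("event_date"))
        .withColumn("claim_count", F.col("claim_count").cast("int"))
    )

    # Simple DQ rule: silver keeps rows with a present business key;
    # failures are quarantined for review rather than silently dropped.
    silver = standardized.filter(F.col("member_name").isNotNull())
    quarantine = standardized.filter(F.col("member_name").isNull())

    silver.show()
    quarantine.show()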

Module Designer and Data Engineer

Cognizant Technologies Ltd
10.2020 - 07.2021
  • Company Overview: Client - Telstra Communication
  • Tools & Tech - Azure ADLS Gen-2, Data Factory, Databricks, SQL Server, Python, Control-M, Git
  • We were an agile team responsible for ingesting raw data from various strategic sources into the DataCore/InfraCo Data Lake to support the DaVinci Mission. We enabled downstream consumers by providing standardized datasets in the Data Lake and Data Warehouse, and supplied data extracts for use cases including Loyalty, eNPS, Complaints, and DES. Data was sourced into the Data Lake primarily via batch and API patterns (see the ingestion sketch below).
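
A stripped-down illustration of the API ingestion pattern mentioned above: fetch a payload and land it unchanged in a date-partitioned raw zone. The endpoint and lake path are hypothetical placeholders; the actual pipeline used Azure Data Factory and ADLS Gen-2.

    import datetime
    import json
    import pathlib
    import urllib.request

    # Hypothetical endpoint and landing path standing in for ADLS Gen-2.
    ENDPOINT = "https://example.com/api/v1/complaints"
    RAW_ZONE = pathlib.Path("datalake/raw/complaints")

    def land_raw(endpoint: str, raw_zone: pathlib.Path) -> pathlib.Path:
        """Pull one payload and write it verbatim to a dated raw partition."""
        with urllib.request.urlopen(endpoint) as resp:
            payload = json.load(resp)
        partition = raw_zone / datetime.date.today().isoformat()
        partition.mkdir(parents=True, exist_ok=True)
        out_file = partition / "payload.json"
        out_file.write_text(json.dumps(payload))
        return out_file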

Sr. Database Developer

Epsilon
09.2016 - 10.2019
  • Company Overview: Client - Bank Of America
  • Tools & Tech - Hive, PySpark, Netezza, Oracle 12c, Python, SQuirreL SQL, SQL Developer, AWS Redshift, Unix, Git
  • Participated in the Atlas migration project for Bank of America. The system previously ran on a Netezza database with all scripting in Perl; during the migration it moved to a Big Data environment and the scripting was rewritten in Python. ActiveBatch was used for job scheduling. The system supports reporting and decision-making and holds a high volume of data.

Oracle Database Developer

Virtusa Consulting Pvt Ltd
10.2013 - 09.2016
  • Company Overview: Client - British Telecom
  • Tools & Tech - Oracle 10g/11g, SQL*Plus, SQL Developer, TOAD
  • The Global Services Enterprise Data Warehouse (GS-EDW) is a central repository created by integrating data from different components across the BTGS stack. GS-EDW stores current as well as historical data, used to build trending reports for senior management and for external customers. GS-EDW transforms these data to generate weekly and monthly volumetric reports for orders and incidents (a minimal rollup sketch follows this entry).
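
As a stack-neutral illustration of the weekly/monthly volumetric rollups described above (the production reports were built in Oracle SQL; the records here are hypothetical):

    from collections import Counter
    from datetime import date

    # Hypothetical incident records: (incident_id, raised_on).
    incidents = [
        ("INC001", date(2015, 3, 2)),
        ("INC002", date(2015, 3, 4)),
        ("INC003", date(2015, 4, 1)),
    ]

    # Volumetric rollups: incident counts per ISO week and per month.
    weekly = Counter(d.isocalendar()[:2] for _, d in incidents)
    monthly = Counter((d.year, d.month) for _, d in incidents)

    print(weekly)   # Counter({(2015, 10): 2, (2015, 14): 1})
    print(monthly)  # Counter({(2015, 3): 2, (2015, 4): 1})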

Software Engineer

Tech Mahindra
07.2010 - 10.2013
  • Company Overview: Client - British Telecom
  • Tools & Tech - Oracle 10g/11g, SQL*Plus, SQL Developer, TOAD
  • The Global Services Enterprise Data Warehouse (GS-EDW) is a central repository created by integrating data from different components across the BTGS stack. GS-EDW stores current as well as historical data, used to build trending reports for senior management and for external customers. GS-EDW transforms these data to generate weekly and monthly volumetric reports for orders and incidents.

Education

Master of Computer Applications

Maulana Azad National Institute of Technology, NIT Bhopal
06.2010

Skills

    Data Architecture

Certification

Azure Certified Data Engineer, 2020

Personal Information

Nationality: Indian

Timeline

  • Solution Architect, phData, 02.2025 - Current
  • Application Architect, Accenture, 07.2021 - 01.2025
  • Module Designer and Data Engineer, Cognizant Technologies Ltd, 10.2020 - 07.2021
  • Sr. Database Developer, Epsilon, 09.2016 - 10.2019
  • Oracle Database Developer, Virtusa Consulting Pvt Ltd, 10.2013 - 09.2016
  • Software Engineer, Tech Mahindra, 07.2010 - 10.2013
  • Master of Computer Applications, Maulana Azad National Institute of Technology (NIT Bhopal), 06.2010