
T Senthil Kumar

Summary

I have worked in Information Technology for over 15 years, specializing in data ingestion and maintenance, data integration, data migration, and data lakes on AWS and the Azure cloud platform. Hands-on experience designing and implementing data orchestration using Kubernetes clusters, Docker, Azure Key Vault, Azure File Share, and Azure container integration with Astronomer Airflow on hybrid cloud. Good experience designing and implementing Snowflake serverless container orchestration using Snowpark and Python, and designing and implementing data lakehouses using Snowflake, ADLS Gen2, ADF, and SHIR. Helped with the transformation from on-prem Netezza to a Snowflake data lakehouse, and designed and implemented a medallion architecture with Snowflake on the Azure cloud platform. Experienced in designing and implementing modern data warehouses using Snowflake and AWS cloud, including data ingestion from various databases (Oracle, Exadata, big data, MS SQL) using the StreamSets and Oracle GoldenGate ETL tools. Data architect with working experience designing and implementing end-to-end Snowflake infrastructure combining customer-managed keys in Azure Key Vault Premium with Tri-Secret Secure HSM keys. Handled structured and semi-structured/raw (JSON/Avro/Parquet) data ingestion and data management between the data warehouse and the data lake. Hands-on experience with Snowflake SSO and SCIM setup with Azure AD, automating RBAC. Hands-on design and implementation experience with DevOps tooling: GitHub, dbt, Airflow, Flyway, and schemachange. Design and implementation experience with on-prem-to-cloud database migration using IICS, IICS Mass Ingestion, StreamSets, Apache Kafka, and the Oracle GoldenGate for Big Data adapter.

Overview

22 years of professional experience

1 Certification

Work History

Technical Architect

TCS Netherlands B.V
10.2023 - Current
  • Designed and implemented a data orchestration project in the healthcare domain
  • Designed and implemented Astronomer Airflow on hybrid cloud, helping the data team move data from on-prem file servers to the Snowflake Data Cloud
  • Integrated Azure Key Vault for security, Azure File Share for raw data intermediate staging, and Azure Kubernetes Service for executing Airflow components
  • Executed Airflow DAGs and Docker images through Airflow and conducted performance testing
  • Improved business productivity for clients by [Number]% by re-engineering and designing infrastructures.

Snowflake Architect

ADT Netherlands B.V
11.2022 - 10.2023
  • Designed and implemented a modern data lakehouse using the Snowflake Data Cloud in Azure
  • Analyzed the current environment, conducted workshops with the client, and provided solutioning for the modern medallion architecture
  • Integrated on-prem SSIS and ODI with the Snowflake cloud data warehouse
  • Implemented an Azure Sentinel SIEM-based solution to capture events from the cloud-based environment
  • Worked on Time Travel, database replication and failover, connection failure and client redirection, and database failover/failback
  • Migrated on-prem Netezza databases to the Snowflake Data Cloud using Azure Data Factory with SHIR
  • Set up secure data sharing and a disaster recovery site for Snowflake
  • Implemented Tri-Secret Secure with a customer-managed HSM key in Azure Key Vault Premium

Technical Architect

Cognizant
09.2018 - 10.2022
  • Designed and implemented a data lakehouse with medallion architecture
  • Integrated Azure ADLS Gen2 as an external stage for centralized data and implemented Snowpipe to ingest data into Snowflake per data team requests
  • Designed and implemented CI/CD with GitHub and schemachange for the transformation CI/CD process
  • Implemented the Snowflake security model with data classification, RBAC roles, and data access control
  • Designed a Snowflake self-service workspace and implemented Snowflake monitoring and logging
  • Provided design support for the IICS, IDQ, EDC, and AXON data management products and their Snowflake drivers
  • Supported the implementation of data quality and governance
  • Involved in Snowflake account setup; designed and implemented business continuity and disaster recovery for the production Snowflake data warehouse in the Azure cloud
  • Worked on Time Travel, database replication and failover, connection failure and client redirection, and database failover/failback
  • Implemented a high-level cost tracking report for Snowflake and provided the cost tracking solution to the client
  • Interfaced with stakeholders and customers through workshops and conferences, and aided in solutioning

Architect

Cognizant
09.2018 - Current
  • Designed and implemented a Snowflake enterprise data warehouse in the client environment on AWS cloud
  • Implemented an asset-centric design ensuring segregation of asset data by providing physical databases in Snowflake on AWS
  • Established the base structure to facilitate CI/CD with dbt and automated transformations on the Snowflake raw staging layer
  • Involved in new team creation, GitLab CI user creation, new asset database creation, and automation of Snowflake transformations
  • Designed and implemented AWS S3 storage integration for data classification
  • Set up Snowflake Tri-Secret Secure security
  • Created a high-level plan and executed history data migration and ongoing data ingestion from various data sources (Oracle, Exadata, MS SQL, and big data) to Snowflake using the StreamSets ETL tool
  • Handled data drift using the StreamSets tool
  • Used the Oracle GoldenGate for Big Data adapter (Classic) with Apache Kafka to extract data from Oracle Exadata, sent it as Avro messages to an S3 location, and loaded it into Snowflake using Snowpipe
  • Designed and implemented disaster recovery for the production Snowflake data warehouse in AWS cloud
  • Provided design support for zero-copy cloning, data sharing, streams and tasks, the StreamSets ETL tool, and Oracle GoldenGate

Senior System Engineer

SGS MCNET
07.2011 - 08.2018
  • Provided database administration support for 12c and 19c production databases
  • Involved in Oracle database storage migration from Sun Solaris to Hitachi storage
  • Upgraded databases from 11.2.0.4 to 19c
  • Designed and implemented a disaster recovery site using Oracle GoldenGate

Advisory System Analyst

IBM India Private Limited
06.2008 - 02.2010
  • Provided E-Business Suite 11i support, cloning databases and application servers from RAC to RAC and RAC to non-RAC environments
  • Migrated databases from P575 to P595 servers
  • Diagnosed and troubleshot database problems such as errors and bugs, performance problems, runaway processes, connectivity problems, and poor SQL statements
  • Troubleshot performance issues in custom code using SQL Trace, explain plans, and AWR reports, and monitored system-wide performance using Statspack reports
  • Took ownership of complex database incidents and problems and provided speedy resolutions
  • Defined backup strategies and reduced recovery time

Executive Systems

Mascon Global Ltd
12.2005 - 02.2007
  • Installed and configured new Oracle databases on Solaris and Linux operating systems
  • Performed day-to-day DBA activities in non-prod, QA, and trial environments
  • Handled database refreshes across development, test, and QA environments
  • Patched databases using the OPatch utility
  • Performed backup and recovery: cold/hot and RMAN backups
  • Gained exposure to Solaris shell scripting
  • Interacted with Oracle Support Services for product-related issues and implemented Oracle recommendations

Platform Consultant

Digite Infotech
01.2005 - 12.2005
  • Effectively managed the Qatar Planning Council product-oriented database in a 24x7 production environment and provided on-call and day-to-day support
  • Maintained the Oracle 9i database on a Solaris V880 server and implemented backup strategies
  • Performed proactive database-level performance tuning, with a good working knowledge of Oracle internals
  • Managed appropriate use of free space within tablespaces and reclaimed space whenever possible
  • Reorganized tables and indexes within databases when needed
  • Monitored production Oracle alert logs for database errors and system resource availability, and responded to system memory and data issues
  • Expert with OEM Event and Job Managers, Change Manager, and diagnostics using Statspack

Programmer

Kals Information Systems
04.2003 - 12.2004
  • Joined as an education project trainee working on the Ingres II database
  • Installed and configured Oracle 9i on Windows NT
  • Used SQL*Loader to load bulk data into the database
  • Worked on the Ingres II database for an insurance project
  • Implemented the Auto Debit System module for the client Takaful Malaysia in Ingres
  • Effectively handled deadlock issues on the forms

Education

MCA

AV Institute of Technology
06.2002

BSc

PGP College of Arts and Science
06.1999

Skills

  • Enterprise architecture design
  • Infrastructure automation
  • Communicating with clients
  • Client engagement
  • Client communications
  • Business solutions

Certification

  • Oracle Cloud Architect Associate Certified
  • TOGAF 9 Certified
  • SnowPro Core Certified Professional
  • AWS Certified Solutions Architect - Associate
  • Oracle 12c RAC Trained Professional
  • Exadata X3 DMA - Certified Database Machine Administrator
  • ITIL Foundation Certified Professional
