Satyandra Kumar

Summary

Led a team of architects in designing and implementing cloud strategies for enterprise clients, resulting in increased operational efficiency and cost savings of up to 30%. Developed and communicated cloud architecture standards, best practices, and patterns to ensure consistency and adherence to industry standards. Collaborated with sales and pre-sales teams to provide technical expertise during client engagements, contributing to a 20% increase in client acquisition. Oversaw the evaluation and selection of cloud technologies and vendors to ensure alignment with business goals and industry trends. Delivered data engineering solutions using Azure Data Factory and Databricks. Acted as a subject matter expert in cloud security, compliance, and governance, ensuring solutions met industry-specific regulatory requirements. Worked in Agile delivery models and was closely involved in delivery under the Scaled Agile Framework (SAFe).

Overview

17 years of professional experience
1 Certification

Work History

Data Architect

LTI Mindtree
01.2021 - 11.2023
  • Led the design and implementation of a secure and scalable cloud infrastructure for a large-scale Electronic Health Record (EHR) system, resulting in a 30% reduction in response times and improved system availability
  • Collaborated with compliance and security teams to ensure that the cloud architecture met industry-specific regulatory requirements such as HIPAA, GDPR, and HITECH Act
  • Implemented Azure Active Directory for identity and access management, enabling seamless and secure authentication for healthcare providers and staff
  • Designed and implemented a disaster recovery plan using Azure Site Recovery, reducing recovery time objectives (RTO) by 40% and ensuring minimal data loss in case of a disaster
  • Implemented Azure Monitor and Log Analytics to proactively monitor system health, detect anomalies, and generate actionable insights for performance optimization
  • Worked closely with data scientists to implement machine learning models for predictive analytics, enabling early detection and intervention for critical patient conditions.

Architect

Wipro
06.2019 - 12.2019
  • Worked on a POC for one of the customer's upcoming new footprints within the cloud and data practice; involved in preparing RFPs, estimations, and customer calls
  • Worked onsite with the customer and the pre-sales team to understand requirements and ensure a smooth project transition
  • Retrieved data from legacy systems
  • Cleaned data to improve accuracy and reliability
  • Updated target databases with the cleaned data
  • Used Amazon Redshift as the data-warehousing tool in the AWS cloud.

Data Engineer and Sr. Data Engineer

Capgemini, Accenture
01.2012 - 05.2018
  • Architected and implemented a distributed data processing pipeline using Apache Spark, resulting in a 30% reduction in processing time for complex analytics tasks
  • Designed and maintained data warehousing solutions using technologies like Amazon Redshift and Google BigQuery, enabling efficient querying and reporting capabilities
  • Built an ETL pipeline to load the Amazon Customer Reviews dataset, using a series of AWS Glue Python Shell jobs to establish database connections to an AWS Redshift cluster
  • Used Redshift Spectrum to read data from S3 and load it into Redshift tables, and executed aggregation queries with results written back to S3 via UNLOAD
  • Led the development of ETL processes using Informatica, PySpark, AWS Glue, and Redshift to ingest, transform, and load data from various sources into data lakes and warehouses
  • Worked with Azure Data Factory and ADLS for ETL loads
  • Worked with Apache Airflow and other tools to orchestrate data pipelines
  • Worked in an onsite-offshore model, collaborating closely with customers.

Consultant

Tech Mahindra
01.2007 - 12.2011
  • Loaded and migrated data into Siebel from non-Siebel sources such as inventory, billing, and order management systems
  • Used Siebel EIM (Enterprise Integration Manager) to load the data, with Informatica as the staging layer
  • Designed jobs, prepared IFB files, ran jobs in the EIM component, and monitored execution, collecting log files afterward
  • Acted as a business analyst liaising between the customer and internal teams to ensure requirements were understood and implemented with due diligence.

Domains Worked

Manufacturing, Healthcare, Banking, Communications

Technical Skills

Hadoop, Spark, Kafka, Hive, Python, PySpark, Cloudera, Amazon Web Services (AWS), Microsoft Azure, Google Cloud Platform (GCP), SQL, NoSQL (MongoDB, Cassandra), Amazon Redshift, Google BigQuery, Snowflake, Apache NiFi, Talend, Apache Airflow, Informatica, Microservices, Serverless, Containers (Docker, Kubernetes), Virtual Private Cloud (VPC), VPN, Direct Connect, CDN, Identity & Access Management (IAM), Security Groups, Encryption, WAF, Firewall Rules, Jenkins, GitLab CI/CD, Terraform, Ansible

Education Certification

Bachelor of Engineering in Electronics and Instrumentation

Master of Business Administration in Finance

Core Competencies

Data engineering, Data architecture, Data orchestration, Machine learning, Cloud architecture, Data integration with ETL processes

Title

Data Architect

Certification

  • AWS Solutions Architect Professional certified
  • Azure Solutions Architect certified

Timeline

Data Architect

LTI Mindtree
01.2021 - 11.2023

Architect

Wipro
06.2019 - 12.2019

Data Engineer and Sr. Data Engineer

Capgemini, Accenture
01.2012 - 05.2018

Consultant

Tech Mahindra
01.2007 - 12.2011