Efficient Cloud Engineer with years of experience building cloud infrastructure. Applies strong managerial skills when negotiating with vendors and coordinating tasks with other IT team members. Implements best practices to create cloud functions, applications, and databases.
Certified in Python, DP-200 & DP-201 (Azure Data Engineer Associate), and DA-100 (Analyzing Data with Power BI); willing to learn new skills.
Overview
11 years of professional experience
1 Certification
Work History
Technical Lead
Prodapt Solutions
Miami
12.2021 - Current
Designed the architecture and developed a product capable of running multiple AI-based engines (similar to AWS Lambda), with payment gateway support and user access management. Technology stack: Flask, React, Redis, Celery, PostgreSQL, Docker.
Designing and building web environments on AWS, including services such as EC2, Redshift clusters, ElastiCache (Redis), S3, Amazon AppFlow, IAM, AWS Glue, Athena, Secrets Manager, SNS, Step Functions, API Gateway, VPC, QuickSight, Lambda, and CloudWatch.
Building data pipelines and applying data transformations for batch and real-time messaging systems.
Assessing the effectiveness and accuracy of new data sources and data gathering techniques.
Ability to work on various aspects of Data transformations and Data Modelling.
Enhancing data collection procedures to include information that is relevant for building analytic systems.
Excellent knowledge of scripting in Python (including Dash) and R.
Experience visualizing and presenting data using BI tools, e.g. Power BI and Tableau.
Created Glue jobs (crawlers, virtual databases), validated the Data Catalog, and moved data from S3 to Redshift.
Worked with monitoring solutions like CloudWatch.
Created interactive queries with the Athena service, which makes it easy to analyze data in Amazon S3 using standard SQL.
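The Athena workflow above can be sketched in Python with boto3. This is a minimal illustration, not code from the role: the database name, table, and S3 results location are assumptions.

```python
def run_athena_query(sql: str, database: str, output_s3: str) -> str:
    """Start an Athena query over S3 data and return its execution id."""
    import boto3  # deferred import so the module loads without AWS dependencies
    athena = boto3.client("athena")
    resp = athena.start_query_execution(
        QueryString=sql,
        QueryExecutionContext={"Database": database},
        ResultConfiguration={"OutputLocation": output_s3},
    )
    return resp["QueryExecutionId"]

# Illustrative standard-SQL query against a hypothetical "orders" table
QUERY = """
SELECT order_date, SUM(amount) AS daily_total
FROM orders
GROUP BY order_date
ORDER BY order_date
"""

if __name__ == "__main__":
    # Database and results bucket are placeholders
    qid = run_athena_query(QUERY, "sales_db", "s3://my-athena-results/")
    print(qid)
```

Athena writes query results to the configured S3 output location, so no cluster needs to be provisioned for ad-hoc analysis.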
Created IAM roles granting AWS services the specific permissions they require.
Used EventBridge to set up rules that watch for specific event types and scheduled triggers using cron expressions.
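A scheduled EventBridge rule of this kind can be sketched as follows; the rule name and schedule are hypothetical, and the helper simply builds EventBridge's six-field cron syntax (minute, hour, day-of-month, month, day-of-week, year).

```python
def nightly_cron(hour: int, minute: int = 0) -> str:
    """Build an EventBridge cron expression for a daily schedule (UTC)."""
    # EventBridge cron fields: minute hour day-of-month month day-of-week year.
    # "?" is required in day-of-week when day-of-month is "*".
    return f"cron({minute} {hour} * * ? *)"

def create_rule(rule_name: str, schedule: str) -> None:
    """Create (or update) an enabled EventBridge rule with the given schedule."""
    import boto3  # deferred so the pure helper above works without AWS deps
    events = boto3.client("events")
    events.put_rule(Name=rule_name, ScheduleExpression=schedule, State="ENABLED")

if __name__ == "__main__":
    # Rule name is a placeholder, not from the resume
    create_rule("nightly-etl-trigger", nightly_cron(2))
```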
Created step functions for the data orchestration using visual workflow service. Workflows manage failures, retries, parallelization, service integration, and observability.
Incorporated SNS into the Step Function execution, which helped simplify the application architecture and reduce costs.
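A Step Functions definition combining retries with an SNS notification on failure might look like the sketch below (Amazon States Language expressed as a Python dict). The state names, Lambda ARN, and SNS topic are illustrative placeholders.

```python
import json

# Hypothetical state machine: one task with retries, falling back to SNS on failure
definition = {
    "StartAt": "LoadToRedshift",
    "States": {
        "LoadToRedshift": {
            "Type": "Task",
            "Resource": "arn:aws:lambda:us-east-1:111111111111:function:load-to-redshift",
            # Retry transient task failures with exponential backoff
            "Retry": [{
                "ErrorEquals": ["States.TaskFailed"],
                "IntervalSeconds": 10,
                "MaxAttempts": 3,
                "BackoffRate": 2.0,
            }],
            # Any unrecovered error routes to the notification state
            "Catch": [{"ErrorEquals": ["States.ALL"], "Next": "NotifyFailure"}],
            "End": True,
        },
        "NotifyFailure": {
            "Type": "Task",
            "Resource": "arn:aws:states:::sns:publish",
            "Parameters": {
                "TopicArn": "arn:aws:sns:us-east-1:111111111111:etl-alerts",
                "Message": "ETL step failed",
            },
            "End": True,
        },
    },
}

print(json.dumps(definition, indent=2))
```

The `Retry` and `Catch` blocks are how Step Functions manages failures and retries declaratively, so the task code itself stays free of orchestration logic.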
Created layers for AWS Lambda functions (awswrangler, boto3, psycopg2, pysftp).
Used tools like Postman, SwaggerHub, and AWS developer portal.
Created Lambda functions to move data from S3 to Redis using Python.
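An S3-to-Redis Lambda of this shape can be sketched as below. This is an assumed implementation, not the original: the ElastiCache endpoint, key layout, and TTL are placeholders, and the `redis` client is presumed to ship via a Lambda layer as in the layers bullet above.

```python
import json

def record_key(bucket: str, key: str) -> str:
    """Derive the Redis key under which a cached S3 object is stored."""
    return f"s3cache:{bucket}:{key}"

def handler(event, context):
    """Triggered by S3 put events; copies each new object into Redis."""
    import boto3   # available in the Lambda runtime
    import redis   # assumed to be provided by a Lambda layer
    s3 = boto3.client("s3")
    r = redis.Redis(host="my-elasticache-endpoint", port=6379)  # placeholder host
    for rec in event["Records"]:  # one record per S3 object created
        bucket = rec["s3"]["bucket"]["name"]
        key = rec["s3"]["object"]["key"]
        body = s3.get_object(Bucket=bucket, Key=key)["Body"].read()
        r.set(record_key(bucket, key), body, ex=3600)  # cache with a 1-hour TTL
    return {"statusCode": 200,
            "body": json.dumps({"cached": len(event["Records"])})}
```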
Expertise in developing production ready Spark applications utilizing Spark-Core, DataFrames, Spark-SQL, Spark-ML and Spark-Streaming API's
Proficient at using Spark APIs for streaming real time data, staging, cleansing, applying transformations and preparing data for machine learning needs.
System Analyst
Hexaware Technologies
Chennai
06.2019 - 12.2021
Built data pipelines and ETL (extract, transform, load) processes into the new Snowflake environment from Azure Data Factory/SAP BODS.
Created, deployed, and scheduled Azure Data Factory pipelines on demand.
Experienced in writing live Real-time Processing and core jobs using Spark Streaming with Kafka as a data pipeline system using Scala programming.
Built Spark applications in Scala and involved in end-to-end development, build, and deployment cycles.
Created Azure Storage accounts (Blob, File, Queue, and Table) using the portal.
Designed and developed Azure SQL/Data Warehouse solutions.
Migrated SQL Server databases to Snowflake using Azure Data Factory and SAP BODS.
Set up data engineering pipelines from on-premises SQL Server to Azure SQL Database using Azure Data Factory with a self-hosted integration runtime.
Adapted existing scripts, code, and pipelines to changing requirements.
Created stored procedures and functions in SQL.
Created new reports requested by clients in SAP BO/Power BI.
Created new jobs in SAP BODS/Azure Data Factory as per client requests to source data from SQL Server/PI AF into Snowflake/EDW tables feeding the Power BI/SAP BO reports.
Performed impact analysis on service requests raised for enhancements to existing SAP BO/Power BI reports.
Extensively worked on data extraction, transformation, and loading from source to target systems using Azure Data Factory and SAP BODS, sourcing from SQL Server and PI systems.
Extracted data from various sources into Power BI Desktop, transformed it in Power Query Editor, and generated reports and dashboards.
Created calculated columns and measures using DAX expressions such as aggregate and time and date functions.
Associate
Cognizant Technologies Solutions
Chennai
06.2016 - 06.2019
Involved in unit testing and preparing test cases.
Implemented intermediate tables to improve the performance.
Involved in migrations using StarTeam.
Involved in performance tuning.
Developed mappings to pull the history from different databases.
Used debugger in identifying bugs in existing mappings by analyzing data flow, evaluating transformations.
Developed mappings/sessions using Informatica PowerCenter 9.1 for data loading.
Extracted data from Oracle and SQL Server, then used Teradata for data warehousing.
Strong knowledge of the ETL tool Informatica.
Tuned multiple Informatica workflows through techniques like partitioning and by using persistent and dynamic lookups where required.
Reduced aggregator and other transformations by pushing SQL logic down to the source level in ETL.
Knowledge on working with SQL
Working on other tasks like reviewing checklists, deployment documents, audit level documents and other documents related to the enhancements/New developments.
Prepared EDI and data profiling of source systems.
Prepared Teradata load scripts.
Developed load and transform programs and ensured timely delivery for each iteration.
Performed integration and unit tests for all transformations.
Performed analysis, design, development, and testing of the credit card source system.
Prepared Teradata scripts: BTEQ, load scripts, and FastExport.
Implemented new enhancements into the existing EDW as application change requests.
Created new mappings according to business needs, modified existing code, performed unit testing, and prepared test results based on business scenarios.
Improved performance by creating persistent lookup caches for tables across multiple business requirements.
MIS Executive
CSS Corp Private
Chennai
01.2013 - 02.2016
Experience in Informatica PowerCenter: Mapping Designer, Workflow Manager, Workflow Monitor, and various transformations such as Source Qualifier,
Expression, Joiner, Normalizer, Router, Aggregator, Mapplet Designer, and Update Strategy.
Good presentation skills and excellent team member with good communication and interpersonal skills.
Fast learner with the ability to grasp new technologies.
Created numerous SQL scripts using inner, outer, and cross joins.
Created views and materialized views to pull data from Oracle database.
Exported the database and imported it into development and test environments whenever required.
Worked on data extraction, data cleaning, data transformation and data loading process