Extensive experience with Big Data technologies. Confident in Hadoop, Apache Spark, Scala, Python, AWS, HDFS, Hive, and PL/SQL. Experience with Hadoop ecosystem tools such as Hive, Sqoop, Pig, and Oozie. Strong knowledge of backups, restores, recovery models, database shrink operations, DBCC commands, and clustering. Experience in troubleshooting and resolving database integrity issues, log shipping issues, blocking and deadlocking issues, and connectivity and security issues.
Domain: Banking, Retail, and Telecom.
Architecture: Batch processing of big data sources at rest. Real-time processing of big data in motion. Interactive exploration of big data. Predictive analytics and machine learning. Store and process data in volumes too large for a traditional database. Transform unstructured data for analysis and reporting. Capture, process, and analyze unbounded streams of data in real time or with low latency.
Title: Data Engineer
CBD HLT REDS MSP2 Merge, Consultant, JP Morgan Chase, 09/06/21 – Present
The Home Lending Servicing LOB is starting an initiative to consolidate all data in MSP Clients 156 and 465 into one, where MSP Client 465 will be the only instance going forward. This is the code deployment and testing effort related to the MSP 2-to-1 initiative for the CFL reports. From a firm-wide perspective, completing this will help streamline and improve current processes, reduce testing requirements and timelines for implementing updates, and increase efficiency for Chase applications processing the data from MSP, which will help us efficiently service our customers' requests.
Develop and implement pipelines that extract, transform, and load data into an information product that helps the organization reach its strategic goals (a minimal batch ETL sketch follows the consulting entries below).
Focus on ingesting, storing, processing, and analyzing large datasets.
Create scalable, high-performance web services for tracking data.
Investigate alternatives for data storage and processing to ensure implementation of the most streamlined solutions.
Serve as a mentor for junior staff members by conducting technical training sessions and reviewing project outputs.
Develop and maintain data pipelines and take responsibility for Apache Hadoop development and implementation.
Work closely with the data science team to implement data analytics pipelines.
Maintain security and data privacy, working closely with the data protection officer.

AT&T IOTA Migration, Consultant, AT&T, 11/15/18 – 08/28/21
The AT&T Project is generally consistent with the Town Plan insofar as it expands wireless broadband service into areas of the Town with poor service quality, while minimizing and mitigating adverse impacts to historic districts, public parks, necessary wildlife habitats, special flood hazard areas, primary agricultural soils, and designated scenic roadways.
Create Scala/Spark jobs for data transformation and aggregation (see the transformation and unit-test sketch after the consulting entries).
Produce unit tests for Spark transformations and helper methods.
Write Scaladoc-style documentation with all code.
Design data processing pipelines.

Disney Streaming Services, Consultant, Disney, 02/02/17 – 06/30/18
Disney Streaming Services is a technology subsidiary of The Walt Disney Company located in Manhattan, New York City. Disney Streaming is a business unit within Disney Media and Entertainment Distribution (DMED), managing operations of The Walt Disney Company's streaming services, including Disney+, Hulu, ESPN+, and STAR+.
Working on migration tasks from source to destination.
Creating and monitoring tasks in different environments, including Prod, Dev, and QA.
Good knowledge of creating Qlik Replicate tasks for source and target endpoints.
Experienced in adding tables to tasks.
Monitoring full load and change processing modes, apply throughput, latency issues, and incoming data changes.
Managing task settings and endpoint connections, and importing and exporting tasks.
Providing on-call support on weekends.
Supporting 200+ databases across the Prod, Dev, and QA environments.
Creating logins, assigning proper permissions to them, mapping logins to databases, and granting appropriate permissions.
Planning backup and recovery strategies for various database environments.
Monitoring space and adding space to databases when needed.
Performing daily SQL Server health checks and preparing a report based on them (an illustrative health-check sketch also follows the consulting entries).
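The batch ETL pipeline work described above can be illustrated with a minimal Scala/Spark sketch. The HDFS paths, the Hive database name (analytics), and the column names (event_ts, amount, txn_date) are illustrative assumptions, not project code.

```scala
// Minimal sketch of a batch ETL pipeline: extract raw JSON from HDFS, transform it,
// and load it into a partitioned Hive table. Paths, database, and column names
// (raw_events, event_ts, amount, txn_date) are illustrative assumptions.
import org.apache.spark.sql.{SparkSession, functions => F}

object BatchEtlSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("batch-etl-sketch")
      .enableHiveSupport()                     // needed to write Hive tables
      .getOrCreate()

    // Extract: semi-structured JSON landed on HDFS (hypothetical path).
    val raw = spark.read.json("hdfs:///data/landing/raw_events/")

    // Transform: drop incomplete records and derive a partition column.
    val cleaned = raw
      .filter(F.col("amount").isNotNull)
      .withColumn("txn_date", F.to_date(F.col("event_ts")))

    // Load: persist as a partitioned Hive table for reporting and analysis.
    cleaned.write
      .mode("overwrite")
      .partitionBy("txn_date")
      .saveAsTable("analytics.raw_events_cleaned")

    spark.stop()
  }
}
```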
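For the Scala/Spark transformation and aggregation jobs and the unit tests around them, the pattern is sketched below: the aggregation is kept in a pure DataFrame-to-DataFrame function so a ScalaTest suite can exercise it against a local SparkSession. The column names (client_id, status, outstanding_balance) and test values are hypothetical.

```scala
// Sketch of a testable Spark transformation: aggregation logic lives in a pure function,
// and a ScalaTest suite runs it against a local SparkSession with hand-built rows.
// Column names and values are illustrative assumptions, not project data.
import org.apache.spark.sql.{DataFrame, SparkSession, functions => F}
import org.scalatest.funsuite.AnyFunSuite

object Aggregations {
  /** Sum outstanding balance per client, considering only active records. */
  def balanceByClient(loans: DataFrame): DataFrame =
    loans
      .filter(F.col("status") === "ACTIVE")
      .groupBy("client_id")
      .agg(F.sum("outstanding_balance").as("total_outstanding"))
}

class AggregationsSpec extends AnyFunSuite {
  private val spark = SparkSession.builder()
    .master("local[1]")
    .appName("aggregations-spec")
    .getOrCreate()

  import spark.implicits._

  test("balanceByClient sums only active loans") {
    val input = Seq(
      ("c1", "ACTIVE", 100.0),
      ("c1", "ACTIVE", 50.0),
      ("c1", "CLOSED", 999.0)
    ).toDF("client_id", "status", "outstanding_balance")

    val result = Aggregations.balanceByClient(input).collect()
    assert(result.length == 1)
    assert(result.head.getAs[Double]("total_outstanding") == 150.0)
  }
}
```

Keeping the transformation separate from the job's I/O is what makes the Scaladoc-style documentation and unit tests straightforward to maintain.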
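The daily SQL Server health check was an operational routine; purely as an illustration, and staying in Scala like the other sketches, the JDBC snippet below enumerates the online databases on one instance and runs a standard integrity check (DBCC CHECKDB) on each. The host, port, and credentials are placeholders, and the real routine covered more than integrity checks.

```scala
// Illustrative sketch only: enumerate online databases on a SQL Server instance over JDBC
// and run DBCC CHECKDB against each, printing a simple pass/fail summary for the daily report.
// Host, port, and credentials are placeholders; the actual health check was broader than this.
import java.sql.DriverManager
import scala.collection.mutable.ListBuffer
import scala.util.{Failure, Success, Try}

object SqlServerHealthCheckSketch {
  def main(args: Array[String]): Unit = {
    val url = "jdbc:sqlserver://db-host:1433;databaseName=master;" +
      "user=health_user;password=***;encrypt=true;trustServerCertificate=true"
    val conn = DriverManager.getConnection(url)
    try {
      // Collect the names of all online databases on the instance.
      val stmt = conn.createStatement()
      val rs = stmt.executeQuery("SELECT name FROM sys.databases WHERE state_desc = 'ONLINE'")
      val databases = ListBuffer.empty[String]
      while (rs.next()) databases += rs.getString("name")

      // Run an integrity check per database and record the outcome.
      databases.foreach { db =>
        Try(stmt.execute(s"DBCC CHECKDB (N'$db') WITH NO_INFOMSGS")) match {
          case Success(_) => println(s"$db: CHECKDB OK")
          case Failure(e) => println(s"$db: CHECKDB FAILED - ${e.getMessage}")
        }
      }
    } finally conn.close()
  }
}
```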
Mars Infosol, Software Trainee, 02/01/16 – 08/31/18, Chennai, Tamil Nadu, India
Technologies: Java Spark, Core Java, PL/SQL, JUnit, Bitbucket, Git, Oracle SQL Developer, Postman, Jira