Hewlett Packard Inc (HPI): P360 EDL to Cloud Migration
Role: ETL Tester
Environment: QBOT Tool, Databricks, Azure Data Factory, MS SQL Server, Windows, Linux

This is an EDL-to-cloud migration project. Data arrives daily from an SFTP path as a ZIP file, moves into stage folders, and then into the input path. Files are consumed from the input path after unzipping, and processed files are moved to the archive path. We tested multiple source feeds, including E2open, BMC Helix, Gypsy, FCM, GCW dependent tables, and masters. Testing ran in parallel production against live source data, with EDL (Hive) as the source and FDL (Databricks) as the target. We validated row counts for each table and ran QBoT checks for all source feeds (a sketch of this count reconciliation follows the USAA project below). Test cases were uploaded to TestRail, with defects attached there as well. The project was tested in the ITG and parallel production environments and then handed over for business validation. The resulting reports gave the data quality team valuable insight, helping them identify and resolve data discrepancies and improve overall data integrity. Automation reduced manual effort by 95%, ensuring smooth and efficient data processing.

- Understood the requirements through the functional specification document.
- Wrote test cases based on functional specs and business requirement documents.
- Prepared test strategy and test approach documents.
- Prepared and validated test data for different applications.
- Executed test cases and recorded results in the tracker.

USAA Claims Modernization
Role: ETL Tester
Environment: Informatica PowerCenter, Oracle, DB2, Windows, Unix

A data migration project in which data is transformed and loaded across four layers, including Guidewire; we validated and tested the data in each layer for accuracy against the mapping document.

- Designed test scenarios and test cases for all extracts against the mapping document.
- Updated the mapping requirement sheet whenever requirements changed.
- Logged and tracked defects, engaging the development team whenever discussion was needed.
- Worked on defect retesting, regression testing, smoke testing, and positive and negative testing.
- Performed manual testing by executing the operations described in the design steps.
- Validated the Guidewire ClaimCenter UI to confirm that the expected data was populated.
- Compared expected results with actual outcomes and recorded the results.
- Analyzed application data using SQL to validate the ETL process (see the SQL reconciliation sketch below).
- Reported bugs in JIRA and retested resolved defects.
- Modified and created SQL queries for various scenarios for quality assurance and analysis.
- Communicated with UAT team members.
- Supported production loads, checking for new defects and reporting them in JIRA.
- Found the root cause of issues and documented the analysis in JIRA.
- Joined daily stand-up calls for work status and ran KT sessions for the team.
- Received a Rising Star award from the client for delivering tasks on time with high quality.
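The table-level count validation described in the HPI project above can be illustrated with a short PySpark sketch, since both the Hive (EDL) source and the Databricks (FDL) target are queryable from a Spark session. This is a minimal sketch, not the project's actual framework; the database and table names are hypothetical placeholders.

```python
# Minimal sketch of source-vs-target count reconciliation, assuming the
# Hive (EDL) and Databricks (FDL) tables are both reachable from one
# Spark session. Database and table names are hypothetical placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("edl_fdl_count_check").getOrCreate()

# Hypothetical mapping of source (Hive) tables to target (Databricks) tables.
TABLE_PAIRS = [
    ("edl_db.e2open_orders", "fdl_db.e2open_orders"),
    ("edl_db.bmc_helix_tickets", "fdl_db.bmc_helix_tickets"),
]

def count_check(source_table: str, target_table: str) -> bool:
    """Compare row counts between a source table and its target."""
    src = spark.table(source_table).count()
    tgt = spark.table(target_table).count()
    status = "PASS" if src == tgt else "FAIL"
    print(f"{status}: {source_table}={src} vs {target_table}={tgt}")
    return src == tgt

results = [count_check(s, t) for s, t in TABLE_PAIRS]
print(f"{sum(results)}/{len(results)} tables matched")
```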
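The SQL reconciliation referenced in the USAA responsibilities can take the form of a set-difference check between layers: any row present in one layer but missing from the next points to a load or transformation gap. A minimal sketch follows, assuming the python-oracledb driver; the connection details, schemas, and key columns are hypothetical placeholders, not the project's actual objects.

```python
# Hedged sketch of a source-minus-target check between two ETL layers
# in Oracle, using the python-oracledb driver. Connection details,
# table names, and key columns are hypothetical placeholders.
import oracledb

conn = oracledb.connect(user="qa_user", password="***", dsn="host:1521/service")

# Rows present in the staging layer but missing from the target layer.
MINUS_QUERY = """
    SELECT claim_id, policy_no, loss_date FROM stg.claims
    MINUS
    SELECT claim_id, policy_no, loss_date FROM tgt.claims
"""

with conn.cursor() as cur:
    cur.execute(MINUS_QUERY)
    missing = cur.fetchall()

if missing:
    # Each row here is a candidate defect to log in JIRA.
    print(f"FAIL: {len(missing)} rows missing from target; sample: {missing[:5]}")
else:
    print("PASS: target layer contains every staging row for the selected keys")
conn.close()
```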
GENWORTH CCM Project (Contact Centre Modernization), Genworth Financial Inc
Role: ETL Tester
Environment: AWS Services (S3 Buckets, Athena), Azure Data Factory, Microsoft SQL Server, PostgreSQL, Windows, Unix

Genworth Financial Inc, established in 1871 as the Life Insurance Company of Virginia and headquartered in Richmond, Virginia, offers life insurance, long-term care insurance, mortgage insurance, and annuities. As part of the daily process, files are transferred from the NAS folder to the landing S3 bucket in AWS using AWS DataSync, a secure online service that automates and accelerates data transfer between on-premises storage and AWS storage services. The migration moves files from the on-premises NAS folder to AWS cloud storage. There are multiple source systems, including CLOAS, CALYPSO, PEGA, PPLUS, and Amazon Connect. Currently, the 594 source files have been converted into 594 tables in the Athena database, and 17 tables from the EDH-T population are loaded into RDS. Athena views holding the current snapshot are created by joining the various source tables, as defined by the EDH-T domain table structure (a validation sketch against such a view follows this project). Automation reduced manual effort by 98%, ensuring smooth and efficient data processing.

- Prepared the test plan for the project.
- Wrote test cases covering the data validations to be performed.
- Uploaded test cases into HP TestRail and executed them there.
- Maintained defects in HP TestRail.
- Performed SIT covering different sets of scenarios.
- Performed data validation in Excel, using VLOOKUP to match the data (a pandas equivalent is sketched below).
- Wrote SQL queries to validate the data in Oracle.
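One way to validate an Athena snapshot view of the kind described above is to run a count query through the boto3 Athena client and inspect the result. The sketch below shows only the start-poll-fetch pattern; the region, database name, view name, and S3 output location are hypothetical placeholders.

```python
# Hedged sketch: validating a row count on an Athena snapshot view with
# boto3. The region, database, view name, and S3 output location are
# hypothetical placeholders.
import time
import boto3

athena = boto3.client("athena", region_name="us-east-1")

def run_count(database: str, view: str, output_s3: str) -> int:
    """Run SELECT COUNT(*) against an Athena view and return the count."""
    qid = athena.start_query_execution(
        QueryString=f"SELECT COUNT(*) FROM {view}",
        QueryExecutionContext={"Database": database},
        ResultConfiguration={"OutputLocation": output_s3},
    )["QueryExecutionId"]

    # Poll until the query reaches a terminal state.
    while True:
        state = athena.get_query_execution(QueryExecutionId=qid)[
            "QueryExecution"]["Status"]["State"]
        if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
            break
        time.sleep(2)
    if state != "SUCCEEDED":
        raise RuntimeError(f"Athena query ended in state {state}")

    rows = athena.get_query_results(QueryExecutionId=qid)["ResultSet"]["Rows"]
    # Row 0 is the header; row 1 holds the count value.
    return int(rows[1]["Data"][0]["VarCharValue"])

# Hypothetical names: count rows in the current snapshot view.
view_count = run_count("edht_db", "vw_policy_snapshot", "s3://qa-athena-results/")
print(f"snapshot view rows: {view_count}")
```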
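The Excel VLOOKUP matching noted in the responsibilities has a direct pandas analogue: a left merge on the business key, where unmatched rows play the role of VLOOKUP's #N/A results. This is an illustrative sketch only; the file names and key column are hypothetical.

```python
# Hedged pandas analogue of the Excel VLOOKUP matching: a left merge on
# the business key, flagging target rows with no source match. File
# names and column names are hypothetical placeholders.
import pandas as pd

source = pd.read_csv("source_extract.csv")   # e.g., a CLOAS extract
target = pd.read_csv("target_extract.csv")   # e.g., an Athena unload

merged = target.merge(
    source,
    on="policy_id",             # hypothetical business key
    how="left",
    suffixes=("_tgt", "_src"),
    indicator=True,
)

# "left_only" rows exist in the target but have no source counterpart,
# the same condition a failed VLOOKUP (#N/A) would surface in Excel.
unmatched = merged[merged["_merge"] == "left_only"]
print(f"{len(unmatched)} target rows with no matching source row")
```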