
Spandana Veronika

Rajahmundry, Andhra Pradesh

Summary

  • 5+ years of overall experience in Data Warehousing, ETL, and Business Intelligence; regular practitioner of Informatica PowerCenter, IICS, Snowflake, and Azure Data Factory.
  • Extracted data from Oracle, Teradata, SQL databases, and flat files, staged it in a single place, and used various transformation components such as Filter, Router, Union, Sequence Generator, Sorter, Lookup, Joiner, and Aggregator.
  • Analyzed specifications provided by clients; created mappings, sessions, and workflows using Informatica PowerCenter.
  • Participated in testing and performance tuning by identifying and resolving bottlenecks in mapping logic.
  • Developed and maintained Unix shell scripts to automate test data setup, job monitoring, and file handling in the ETL process; used shell scripting to schedule and validate ETL workflows and perform pre- and post-load validations.
  • Executed tests to verify data was extracted, transformed, and loaded correctly; validated data accuracy, completeness, and consistency across systems.
  • Documented all test results, provided summary reports, and reported daily development status.
  • Attended sprint planning and status calls; monitored jobs in Control-M.
  • Worked closely with the onsite team to understand business requirements and apply business rules.
  • Good team player.

Overview

5+ years of professional experience

Work History

CELEBAL TECHNOLOGY PVT LTD

Senior Consultant
07.2022 - 12.2023
  • ETL Developer (Informatica to Azure Migration)
  • Analyzed and assessed existing Informatica ETL workflows to extract business logic, data flows, and transformation rules for accurate migration to Azure Data Factory and Databricks.
  • Developed ADF pipelines to ingest data from Teradata and flat files into Azure Data Lake Storage (ADLS), applying dynamic parameterization and reusable components.
  • Built scalable transformation logic using Databricks (PySpark) to replicate complex Informatica transformations such as joins, filters, lookups, and aggregations.
  • Implemented multi-layered data architecture in Azure: Raw → Staging → Curated → Snowflake, ensuring clean separation of concerns and data traceability.
  • Loaded transformed data into Snowflake, designing and populating dimension and fact tables using star schema modeling for analytics and reporting purposes.
  • Performed end-to-end data validation to ensure that data loaded via Azure pipelines matches the data previously processed by Informatica (row counts, hash totals, sample-level checks).
  • Conducted data quality checks (e.g., null validation, type checks, referential integrity) and routed invalid records to error handling tables for review and reprocessing.
  • Developed audit and logging mechanisms to capture job metadata (row counts, status, timestamps) for monitoring pipeline execution and ensuring accountability.
  • Collaborated with cross-functional teams (business analysts, data engineers, QA, DevOps) in an Agile environment to gather requirements and deliver migration milestones on time.
  • Documented ETL processes, validation results, and architecture changes, and provided knowledge transfer and handover to support teams post-migration.
  • Project 1: Siam Commercial Bank
  • Technologies: Azure Data Factory, Informatica, Teradata, Snowflake, SQL, Databricks
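The end-to-end validation described above (row counts and hash totals) can be reduced to a small sketch. This is illustrative only: the sample rows are hypothetical, and a real run would compare the legacy Informatica output against the Azure-pipeline output.

```python
import hashlib

def hash_total(rows):
    """Order-independent hash total: hash each row, XOR the digests together."""
    total = 0
    for row in rows:
        digest = hashlib.sha256("|".join(map(str, row)).encode()).hexdigest()
        total ^= int(digest, 16)
    return total

def validate(legacy_rows, migrated_rows):
    """Compare two extracts on row count and hash total."""
    return {
        "row_count": len(legacy_rows) == len(migrated_rows),
        "hash_total": hash_total(legacy_rows) == hash_total(migrated_rows),
    }

# Hypothetical sample: same data in a different order should still match.
legacy = [(1, "A", 100.0), (2, "B", 250.5)]
migrated = [(2, "B", 250.5), (1, "A", 100.0)]
```

Sample-level checks then spot-compare individual rows, which a hash total cannot localize on its own.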

Project -2

BAJAJ ALLIANZ LIFE INSURANCE COMPANY LTD
  • Designed and developed end-to-end ETL pipelines using IICS, extracting data from Oracle, transforming it with complex business rules, and loading into Snowflake data warehouse.
  • Implemented incremental data loading strategies using watermark columns and CDC logic to ensure only new or changed records are processed efficiently.
  • Built and maintained Slowly Changing Dimension (SCD) Type 2 mappings in IICS to preserve historical changes in dimension tables.
  • Used star schema dimensional modeling techniques to design and load fact and dimension tables in Snowflake for optimized reporting and analysis.
  • Created reusable mapplets and used parameter files to avoid hard-coding in IICS, supporting dynamic ETL workflows across different environments.
  • Applied various IICS transformations such as Expression, Lookup, Joiner, Aggregator, and Filter to handle data cleansing, enrichment, and business logic implementation.
  • Developed and tuned complex SQL queries for data extraction, transformation, validation, and loading between Oracle and Snowflake.
  • Created Snowflake views to support Power BI reports, ensuring downstream BI teams had access to clean, structured, and analysis-ready datasets.
  • Delivered curated data to downstream teams by managing access, building data marts, and sharing Snowflake objects for cross-team usage.
  • Performed data quality profiling to identify data anomalies and implemented various data quality rules using IICS; routed invalid records to error tables for further review.
  • Audited the complete data flow from source to target, capturing key metadata like record counts, job status, timestamps, and errors into a centralized audit table for monitoring and traceability.
  • Project 2: Bajaj Allianz Life Insurance Company Ltd
  • Technologies: Oracle, SQL, IICS, Snowflake, Power BI
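The watermark-based incremental load mentioned above can be illustrated with a minimal sketch. The `id`/`modified_at` fields and in-memory rows are hypothetical stand-ins for an Oracle source table and a persisted watermark value.

```python
from datetime import datetime

def incremental_extract(source_rows, watermark):
    """Return rows modified after `watermark`, plus the advanced watermark."""
    changed = [r for r in source_rows if r["modified_at"] > watermark]
    new_watermark = max((r["modified_at"] for r in changed), default=watermark)
    return changed, new_watermark

# Hypothetical source snapshot; only id=2 is newer than the watermark.
rows = [
    {"id": 1, "modified_at": datetime(2023, 1, 1)},
    {"id": 2, "modified_at": datetime(2023, 3, 5)},
]
changed, new_wm = incremental_extract(rows, datetime(2023, 2, 1))
```

In the actual mappings, this selection feeds the SCD Type 2 logic, which closes out the current dimension row and inserts a new versioned row for each change.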

Virtusa

ETL Informatica Developer.
01.2022 - 06.2022
  • Designed and developed ETL pipelines in Informatica IICS to extract data from SQL Server and flat files (CSV and delimited formats), transform data based on business rules, and load into Snowflake.
  • Implemented a layered ETL architecture comprising inbound, processing, and outbound layers to streamline data flow and ensure high data quality and traceability.
  • Utilized IICS Indirect File List feature to process multiple flat files efficiently by referencing a control file that lists all input files dynamically.
  • Developed dynamic file creation logic in outbound processes, generating files at runtime with names based on parameters such as date, region, or product category.
  • Applied Transaction Control transformation to manage commit and rollback operations dynamically based on file-level or record-level conditions, ensuring data integrity during dynamic file processing.
  • Used key IICS transformations like Update Strategy, Router, Normalizer, Rank, and Filter to implement complex business logic and conditional data routing.
  • Built error-handling frameworks, routing invalid or rejected records to designated error tables in Snowflake with detailed error descriptions for analysis.
  • Created audit tables in Snowflake to capture ETL execution details including record counts, execution timestamps, and job statuses to monitor data pipeline health.
  • Implemented pre- and post-session SQL commands within IICS to perform tasks such as truncating staging tables, validating source system availability, and updating control tables.
  • Collaborated with cross-functional teams including QA, business stakeholders, and Snowflake developers to validate data accuracy, support testing phases, and ensure successful production deployment.
  • Project: Societe Generale
  • Duration: Jan 2022 – June 2022
  • Technologies: IICS, Snowflake
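The indirect-file-list pattern above (a control file naming the real input files, with output names built from runtime parameters) can be sketched as follows. The file names, the `region` parameter, and the naming convention are hypothetical; in IICS the control-file lines come from disk rather than a Python list.

```python
import os
from datetime import date

def plan_outputs(control_lines, region, run_date):
    """For each file named in the control list, build a parameterized output name."""
    outputs = []
    for line in control_lines:
        path = line.strip()
        if not path:
            continue  # tolerate blank lines in the control file
        base = os.path.splitext(os.path.basename(path))[0]
        outputs.append(f"{base}_{region}_{run_date:%Y%m%d}.csv")
    return outputs

# Hypothetical control-file content listing two input files.
control = ["in/sales_jan.csv", "", "in/sales_feb.csv"]
```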

COGNIZANT

Informatica Developer.
07.2021 - 12.2021
  • Worked with ETL team for the Migration of DATA JOBS from DATASTAGE to INFORMATICA.
  • Analyzed DataStage jobs and created corresponding mappings in Informatica PowerCenter using the required transformations, loading data into Oracle.
  • Used the Debugger and session logs for effective error tracking.
  • Used transformations such as Source Qualifier, Joiner, Aggregator, Sorter, Router, and Filter.
  • Prepared unit testing documents, detailed design documents, and knowledge transfer documents.
  • Used Workflow Manager for creating, validating, testing, and running sequential workflows.
  • Used parameter files to avoid hard-coding and promote code reusability.
  • Performed unit testing for jobs and scripts to validate data, investigating and resolving any errors found.
  • Developed various mappings with the collection of all Sources, Targets, and Transformations using Informatica Designer.
  • Extensively worked on developing ETL programs supporting data extraction, transformation, and loading using Informatica PowerCenter.
  • Created various transformations such as Aggregator, Expression, Source Qualifier, Joiner, Filter, Router, Sorter, Union, and Sequence Generator.
  • Developed ETL test strategies and test plans.
  • Ran test cases during development and deployment phases and validated the data.
  • Documented test results and data discrepancies; created test summary reports and metrics.
  • Working knowledge of JIRA and Confluence.
  • Project: Zoetis Life Science
  • Technologies: Informatica 10.4, DataStage 11.7, Oracle, SQL, DBeaver
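The parameter-file practice above (externalizing environment-specific values rather than hard-coding them) can be illustrated outside Informatica with a small config sketch; the section and key names are hypothetical, and in PowerCenter this role is played by a `.prm` file attached to the session.

```python
from configparser import ConfigParser

# Hypothetical parameter content; one section per environment.
PARAMS = """
[DEV]
src_conn = oracle_dev
tgt_table = STG_ORDERS

[PROD]
src_conn = oracle_prod
tgt_table = ORDERS
"""

def load_params(env):
    """Resolve environment-specific parameters without touching the mapping logic."""
    cfg = ConfigParser()
    cfg.read_string(PARAMS)
    return dict(cfg[env])
```

Promoting code between environments then only changes which section is read, not the mapping itself.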

SYNITI

Software Developer.
03.2021 - 07.2021
  • Responsible for designing and developing ETL pipelines from heterogeneous sources to the data warehouse layer.
  • Developed Mappings using various Transformations like lookup, Update Strategy, Aggregator, Router, Sequence generator, Expression
  • Designed, Developed and unit tested the ETL process for loading the Medicaid/Medicare records across STAGE, LOAD and ACCESS schemas
  • Loaded data into Teradata warehouse using informatica by integrating data from different sources.
  • Extended the functionalities of existing ETL process of Medicaid for Medicare.
  • Worked on defects, coordinated with the testing team, and created mapping documents.
  • Involved in project analysis and conducted peer reviews.
  • Used session logs, workflow logs and debugger to debug the session and analyze the problem associated with the mappings and generic scripts.
  • Maintained documents for Design reviews, Engineering Reviews, ETL Technical specifications, Unit test plans, Migration checklists and Schedule plans.
  • Preparing the weekly status report and coordinating weekly status calls with technology lead/business.
  • Implemented enhancements and change requests.
  • Project: Envision Healthcare
  • Duration: Mar 2021 to July 2021
  • Technologies: Informatica PowerCenter, Oracle, flat files

Randstad India Pvt Ltd

Senior Software Engineer.
11.2020 - 02.2021
  • Perform data cleansing, validation, and transformation to ensure data quality.
  • Implement complex transformations using Informatica’s transformation components.
  • Monitor and analyze ETL processes for performance bottlenecks.
  • Implement error handling and logging mechanisms in ETL processes.
  • Responsible for designing and developing of ETL pipeline from heterogeneous source to data warehouse layer using IICS
  • Developed Mappings using various Transformations like Lookup, Update Strategy, Aggregator, Router, Sequence generator, Expression
  • Designed, developed, and unit-tested the ETL process for loading credit card information.
  • Troubleshoot and resolve issues related to data integration processes.
  • Project: Kenvue Insurance
  • Client: Legato Health Technologies LLP
  • Duration: Nov 2020 to Feb 2021
  • Technologies: Informatica, Teradata, SQL, Unix

Vissindia Enterprises Pvt Limited

Software Trainee.
08.2018 - 09.2020

  • Designed and developed ETL workflows in Informatica PowerCenter to extract data from multiple sources including flat files and SQL Server databases, transform it based on business rules, and load it into Teradata.
  • Created source-to-target mappings using Informatica transformations such as Expression, Lookup, Joiner, Filter, Aggregator, Sequence Generator, and Router to implement data cleansing, validation, and business logic.
  • Developed reusable mapplets and parameterized sessions to improve code reusability, maintainability, and adaptability across environments (DEV/QA/PROD).
  • Implemented data loading strategies for Teradata, including full loads, incremental loads (based on change data capture or timestamp logic), and optimized bulk loading using Teradata utilities (e.g., TPT, FastLoad).
  • Performed data profiling and validation to ensure source data quality and transformation accuracy before loading into Teradata, using both SQL queries and Informatica data preview.
  • Applied performance tuning techniques at the mapping, session, and database levels to improve ETL job efficiency, including pushdown optimization, partitioning, and indexing on Teradata tables.
  • Conducted unit testing and peer code reviews to ensure data accuracy, validate transformation logic, and reduce defects before system and user acceptance testing.
  • Handled error logging and exception management, routing rejected or invalid records to error tables and creating informative logs for easier troubleshooting.
  • Maintained documentation for ETL jobs, including mapping specifications, technical design documents, data flow diagrams, and scheduling details to support operational transparency.
  • Collaborated with DBAs, QA teams, business analysts, and data architects to understand requirements, troubleshoot issues, ensure adherence to standards, and deliver high-quality data integration solutions.
  • Project: MetLife
  • Duration: Aug 2018 to Sep 2020
  • Technologies: Informatica 10.4, Oracle, SQL, DBeaver
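A recurring pattern across these roles is routing invalid or rejected records to error tables with a reason attached. A minimal sketch, with hypothetical field names and validation rules:

```python
def route_records(records):
    """Split records into (valid, errors); each error carries a reason string."""
    valid, errors = [], []
    for rec in records:
        if rec.get("customer_id") is None:
            errors.append({**rec, "error_reason": "missing customer_id"})
        elif not isinstance(rec.get("amount"), (int, float)):
            errors.append({**rec, "error_reason": "amount is not numeric"})
        else:
            valid.append(rec)
    return valid, errors

# Hypothetical batch: one clean record, two that fail different rules.
batch = [
    {"customer_id": 101, "amount": 99.5},
    {"customer_id": None, "amount": 10.0},
    {"customer_id": 102, "amount": "N/A"},
]
valid, errors = route_records(batch)
```

In the actual pipelines, the error branch lands in a dedicated error table for review and reprocessing rather than a Python list.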

Education

M.Tech - Design And Manufacturing

Centurion University of Technology And Management
Orissa
01.2015

Skills

  • Operating Systems: Windows, Linux
  • ETL Tool: Informatica 9.x/10.x
  • Databases: Oracle, Teradata, MySQL
  • Other tools: Unix, shell scripting, WinSCP, Control-M
  • Cloud Technologies: IICS, Snowflake, Azure Data Factory

Timeline

CELEBAL TECHNOLOGY PVT LTD

Senior Consultant
07.2022 - 12.2023

Virtusa

ETL Informatica Developer.
01.2022 - 06.2022

COGNIZANT

Informatica Developer.
07.2021 - 12.2021

SYNITI

Software Developer.
03.2021 - 07.2021

Randstad India Pvt Ltd

Senior Software Engineer.
11.2020 - 02.2021

Vissindia Enterprises Pvt Limited

Software Trainee.
08.2018 - 09.2020

M.Tech - Design And Manufacturing

Centurion University of Technology And Management

Disclaimer

I hereby declare that the information furnished above is true to the best of my knowledge and belief.


(K S Veronika)

Spandana Veronika