
Rajanand Ilangovan

Bangalore

Summary

  • Over 14 years of experience in the design, development, and maintenance of databases, data warehouses, and business intelligence solutions.
  • Problem solver: able to work both independently and collaboratively in a team, with strong problem-solving and troubleshooting skills.
  • Goal-oriented, with an ardent desire to learn new technologies and tools; a good team player, self-motivated, and a lifelong learner.
  • International work experience: worked in an onshore-offshore model with multicultural teams across Europe and North America; led a team from onshore and delivered a development and migration project.
  • Agile methodologies: worked in Scrum and Kanban, with additional experience in Waterfall.
  • Databases: primarily SQL Server for both OLTP and OLAP applications; trained in MySQL and PL/SQL. Good knowledge of cloud databases and data warehouses such as Azure SQL Database, Azure Synapse Analytics, Google BigQuery, and Exasol.
  • Business intelligence: built numerous BI/DW solutions for BFSI clients, primarily using SQL Server Integration Services for ETL. Experienced with Tableau and Power BI, knowledgeable in MicroStrategy, and involved in data modeling and solution deployment.
  • Cloud computing: strong knowledge of and interest in cloud platforms, including Azure, Google Cloud Platform, Oracle Cloud Infrastructure, and Microsoft Power Platform.

Overview

14 years of professional experience
12 certifications

Work History

Technical Lead - Business Intelligence

Accion Labs
04.2022 - Current

#1 SALES ANALYTICS

Client: Global Academic Publisher
Role: Lead Data Engineer

Project Description:
Led the data engineering design and development for a strategic cloud-based data platform supporting a major academic publishing client. The initiative focused on integrating diverse data sources, including internal systems and third-party data, to power advanced analytics for customer retention, market analysis, and sales optimization.

Key Responsibilities & Achievements:
• Led the technical design and data modeling for multiple data domains within the enterprise data warehouse; authored and secured approval for over 10 Data Flow Diagrams (DFDs) and data models using ER Studio and Enterprise Architect.
• Orchestrated the integration of 3rd party Ringgold data via FTP into a 4-layer medallion architecture (ADLS), standardizing institutional identifiers to enhance customer profiles and enable advanced market and customer analysis.
• Engineered a secure, automated data export process to deliver curated datasets to an external S3 bucket, directly powering a downstream customer retention analysis model.
• Designed and developed a strategic entitlements data model (fact and dimension tables) to provide a unified view of customer access rights, identifying upsell opportunities for journals and books.
• Championed data quality and integrity by diagnosing and resolving a 30% discrepancy in ancillary revenue reporting; root-cause analysis traced the issue to dimension-table duplicates and case-sensitive join conditions specific to the cloud environment, both of which were fixed.

Technologies Used: Azure Synapse Analytics, Delta Lake, PySpark, Spark SQL, Synapse Pipelines, ADLS Gen2, Git, Power BI, Star Schema Data Modeling.
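The two root causes above (dimension duplicates and case-sensitive joins) can be sketched as follows. This is an illustrative plain-Python sketch, not the project's Spark SQL code, and the `customer_id`/`revenue` field names are hypothetical:

```python
# Illustrative sketch: dedupe a dimension and normalize key case before a
# join, so "ABC" and "abc" resolve to one customer instead of two.

def normalize_key(key: str) -> str:
    """Case-fold and trim a business key so joins are case-insensitive."""
    return key.strip().casefold()

def dedupe_dimension(rows):
    """Keep one row per normalized business key (first occurrence wins)."""
    seen, out = set(), []
    for row in rows:
        k = normalize_key(row["customer_id"])
        if k not in seen:
            seen.add(k)
            out.append({**row, "customer_id": k})
    return out

def join_facts(facts, dim):
    """Inner-join facts to the deduplicated dimension on the normalized key."""
    lookup = {row["customer_id"]: row for row in dim}
    return [
        {**fact, **lookup[normalize_key(fact["customer_id"])]}
        for fact in facts
        if normalize_key(fact["customer_id"]) in lookup
    ]
```

Without the dedupe step, a fact row joining two case-variant dimension rows is counted twice, which is exactly how inflated revenue figures arise.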


#2 SINGLE SUPPLIER DASHBOARD (PURCHASE ANALYTICS)

Client: Global Industrial Company
Role: Lead Data Engineer

Project Description:
Led the end-to-end design, development, and deployment of a centralized Purchase Analytics platform to transform fragmented SAP procurement data into actionable intelligence for the global sourcing team. The project aimed to solve critical gaps in spend visibility and price volatility analysis, directly empowering buyers to negotiate more effectively with suppliers.

Key Responsibilities & Achievements:
• Technical Leadership & Architecture: Architected and implemented a scalable, multi-layered data lakehouse on Azure Databricks using a medallion (Bronze/Silver/Gold) architecture with Delta Lake to ensure reliability, performance, and ACID compliance.
• Complex Data Engineering: Engineered and optimized large-scale ETL pipelines in PySpark to process over 125 million records from complex SAP source tables (EKKO, EKPO). Developed sophisticated business logic to compute key KPIs including Spend, % Savings/Increase, and Invisible Spend.
• DevOps & Automation: Implemented and managed the deployment of Databricks notebooks and Azure Data Factory pipelines, ensuring robust and repeatable release processes across environments.
• Cross-Functional Collaboration: Collaborated closely with enterprise architects, SAP functional consultants, and business stakeholders to translate complex procurement requirements, such as tracking historical price changes and TOGR (Turnover of Goods Received), into a tailored star schema data model for efficient Power BI reporting.
• Mentorship & Delivery: Mentored and technically guided a team of 2 data engineers, fostering best practices in Spark optimization and data modeling. Successfully delivered the project, providing a single source of truth that eliminated manual reporting and enabled data-driven supplier negotiations.

Technologies Used: Azure Databricks, Delta Lake, PySpark, Spark SQL, ADF, ADLS Gen2, Git, Power BI, Star Schema Data Modeling.
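The Spend and % Savings KPIs mentioned above follow a simple shape: actual spend against a baseline price. A minimal sketch, assuming illustrative `qty`/`price`/`baseline_price` fields rather than the project's actual SAP columns:

```python
def spend_kpis(orders):
    """Compute total spend and % savings vs. a baseline price.

    Each order: {"qty": int, "price": float, "baseline_price": float}.
    Field names are illustrative, not the real EKKO/EKPO columns.
    """
    spend = sum(o["qty"] * o["price"] for o in orders)
    baseline = sum(o["qty"] * o["baseline_price"] for o in orders)
    # Positive % means savings; negative means a price increase.
    pct_savings = 100.0 * (baseline - spend) / baseline if baseline else 0.0
    return {"spend": spend, "pct_savings": round(pct_savings, 2)}
```

In the real pipeline this logic runs in PySpark over 125M+ rows; the arithmetic is the same.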


#3 MIGRATE CDC TO NATIVE SQL SERVER REPLICATION

Client: Dutch-based IT services company in the legal domain
Role: Lead Data Engineer

Project Description:
Initiated and led a successful project to replace an expensive and underperforming 3rd party CDC tool with a native SQL Server Transactional Replication solution. The legacy system suffered from severe latency (10-14 hours weekly), unexplained data loss, and high licensing costs. The new framework ensured reliable, near-real-time data synchronization, achieved direct cost savings of $175K, and significantly improved data reliability.

Key Responsibilities & Achievements:
• Drove a critical initiative to replace a costly third-party replication tool by architecting and implementing a native SQL Server transactional replication solution, resulting in a direct annual savings of $175K in licensing fees.
• Resolved major data latency and reliability issues by migrating over 170 tables, reducing transaction latency from frequent 10-14 hour delays to a consistent average of 5 seconds.
• Engineered a scalable and reusable framework using batch and SQL scripts to automate the replication setup process for any table, standardizing the workflow and empowering the team to independently manage the system.
• Built a custom monitoring and alerting solution by implementing Tracer Tokens to track replication latency, collecting history of latency to proactively identify and resolve potential bottlenecks.
• Championed knowledge transfer and self-sufficiency by creating comprehensive documentation and conducting training sessions on replication management, empowering the team to independently monitor and maintain the new environment.

Technologies Used: SQL Server Replication, Batch Script, T-SQL
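The tracer-token monitoring above works by posting a token at the publisher and recording when it reaches the subscriber; the gap between the two commit times is the replication latency. A sketch of the summarizing logic (illustrative Python; the actual solution used T-SQL over the tracer-token history tables, and the 30-second threshold is an assumption):

```python
from datetime import datetime
from statistics import mean

def latency_report(token_history, alert_threshold_s=30):
    """Summarize tracer-token latencies and flag threshold breaches.

    token_history: list of (publisher_commit, subscriber_commit) datetime
    pairs, one per tracer token. Threshold and shape are illustrative.
    """
    latencies = [(sub - pub).total_seconds() for pub, sub in token_history]
    return {
        "avg_s": mean(latencies),
        "max_s": max(latencies),
        "breaches": [l for l in latencies if l > alert_threshold_s],
    }
```

Collecting this history over time is what makes bottlenecks visible before users notice them.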



Principal Software Engineer

Accion Labs
10.2020 - 03.2022

#4 ETL HOURLY INCREMENTAL LOAD DEVELOPMENT

Client: Dutch-based IT services company in the legal domain
Role: Data Engineer

Project Description:
Implemented a high-frequency incremental data loading framework to transition from full loads to incremental data processing.

Key Responsibilities & Achievements:
• Improved data availability and efficiency by leading the migration from full-load ETL scripts to a stored procedure-based incremental load, reducing data refresh time by over 80% (from 8 hours to 1.5 hours) and providing stakeholders with near-real-time data for decision-making.
• Enhanced data reliability and performance by collaborating with the Qlik support team to resolve replicate latency issues, ensuring uninterrupted data flow and preventing data loss to secondary systems.
• Drove data security and compliance by preparing and implementing data masking rules for over 300 tables to securely provision production data for use in development and QA environments.
• Streamlined development and testing workflows by configuring and managing database clones and images, and by generating large volumes of data for performance testing.
• Reduced manual effort and project risk by developing automated deployment scripts and preparing deployment manifests, ensuring a smooth and reliable deployment process to staging and production environments.

Technologies Used: SQL Server, T-SQL, Qlik Replicate, Redgate Data Masker, Redgate Clone, Redgate Data Generator, Git
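The incremental-load pattern above typically relies on a high watermark: each run pulls only rows modified since the last successful load, then advances the watermark. A minimal sketch (illustrative Python; the real implementation was stored procedures filtering on a change-tracking column):

```python
from datetime import datetime

def incremental_extract(rows, watermark):
    """Return rows modified after the watermark, plus the new watermark.

    rows: list of {"id": ..., "modified_at": datetime}. Field names are
    illustrative; the real filter lives in a T-SQL stored procedure.
    """
    changed = [r for r in rows if r["modified_at"] > watermark]
    # If nothing changed, keep the old watermark so no rows are skipped.
    new_watermark = max((r["modified_at"] for r in changed), default=watermark)
    return changed, new_watermark
```

This is why the refresh dropped from 8 hours to 1.5: each run touches only the delta, not the full table.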

Senior Software Engineer

Accion Labs
03.2017 - 09.2020

#5 SA - PRODUCT FEATURES DEVELOPMENT AND ENHANCEMENT

Client: Production Finance Software Provider in Entertainment Industry
Role: Senior Developer

Project Description:
Supported the development and enhancement of a financial product by implementing critical US tax form updates (1099-MISC/NEC), addressing production issues, and leading knowledge transfer for junior developers. Focused on ensuring compliance with IRS regulations while improving product functionality and stability.

Key Responsibilities & Achievements:
• Ensured regulatory compliance by successfully implementing all US tax form 1099 MISC and NEC changes ahead of the tax filing season.
• Resolved a critical business logic bug that was causing discrepancies in the General Ledger and cost reports, ensuring accurate financial reporting for stakeholders.
• Re-engineered and modularized a legacy cost report, reducing its runtime by over 68% (from 80 minutes to 25 minutes), and significantly improving code readability and maintainability for future development.
• Mentored junior developers and provided knowledge transfer on product specifics and accounting domain expertise, reducing onboarding time and improving overall team proficiency.

Technologies Used: Microsoft SQL Server, T-SQL


#6 SA - STUDIO INTERFACES DEVELOPMENT

Client: Production Finance Software Provider in Entertainment Industry
Role: SQL Developer

Project Description:
Developed and maintained a bidirectional integration system between a production accounting platform and SAP, ensuring end-of-day reconciliation. Focused on implementing complex business logic via T-SQL, optimizing query performance, and providing robust production support to maintain data consistency across systems.

Key Responsibilities & Achievements:
• Optimized critical data synchronization between the production accounting system and SAP by extensively tuning queries, reducing the daily processing time from 45 minutes to 5 minutes and significantly decreasing the load on the database.
• Ensured financial data integrity and prevented discrepancies for auditors by providing timely support for production and reconciliation issues.
• Drove system-wide data quality by collaborating with business analysts to establish a clear data contract for two-way syncing of ~100,000 daily transactions across systems.
• Streamlined the project lifecycle by working directly with a third-party vendor and teams across multiple time zones to align on requirements, conduct integration testing, and assist business users with initial data exploration.

Technologies Used: SQL Server 2014/2017, T-SQL, Jira, Confluence, Git, BitBucket


#7 LEGACY ISAM TO SQL SERVER DATA MIGRATION

Client: Production Finance Software Provider in Entertainment Industry
Role: Data Engineer

Project Description:
Spearheaded a critical data migration project to modernize the data infrastructure for a production show management system. The project involved reverse-engineering a completely undocumented legacy ISAM (Indexed Sequential Access Method) file-based system and successfully migrating its data and business logic to a modern SQL Server 2014 database. The key challenge was ensuring that the new system perfectly replicated the functionality and output of the legacy environment, with zero discrepancies in operational reports.

Key Responsibilities & Achievements:
• Reverse-Engineered Complex Legacy System: Analyzed raw CSV extracts and deconstructed existing production reports from the undocumented ISAM system to infer the complete data model and complex business rules, effectively recreating the system's logic from the ground up.
• Designed Modern Relational Data Model: Architected a new, rationalized SQL Server schema based on the reverse-engineered logic, optimizing it for clarity, maintainability, and future reporting needs.
• Developed Robust ETL Framework: Engineered scalable and generic SSIS packages to transform and load data from the legacy CSV extracts into the new SQL Server database, ensuring data integrity and accuracy throughout the migration process for 200+ production shows.
• Executed Rigorous Validation Strategy: Designed and implemented a data quality framework to meticulously reconcile reports generated from the new SQL Server database against the original ISAM system outputs, guaranteeing 100% data parity and a successful go-live with zero business disruption.

Technologies Used: SQL Server 2014, SSIS (SQL Server Integration Services), ISAM, Reverse Engineering, Data Modeling, ETL Development, Microsoft Excel, Jira, Confluence
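The validation strategy above boils down to an order-insensitive, duplicate-aware comparison of report extracts from both systems. A sketch of that reconciliation check (illustrative Python, hypothetical row shape; the real framework compared per-show report outputs):

```python
from collections import Counter

def report_parity(legacy_rows, migrated_rows):
    """Compare two report extracts as multisets of rows.

    Rows are dicts; using Counter catches both missing rows and
    accidental duplicates, not just set differences.
    """
    legacy = Counter(tuple(sorted(r.items())) for r in legacy_rows)
    migrated = Counter(tuple(sorted(r.items())) for r in migrated_rows)
    missing = legacy - migrated   # rows present only in the legacy output
    extra = migrated - legacy     # rows present only in the migrated output
    return {
        "parity": not missing and not extra,
        "missing": sum(missing.values()),
        "extra": sum(extra.values()),
    }
```

Running such a check per report, per show, is what lets a migration claim "100% data parity" rather than spot-checked agreement.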

Technical Lead

Capgemini
04.2015 - 03.2017

#8 SQL SERVER TO MICROSOFT APS (PDW) MIGRATION

Client: Leading Insurance Company in Sweden
Role: Lead Developer

Project Description:
Led a major migration initiative to modernize the data warehouse by moving over 600 GB of data and 350+ ETL processes from a traditional SQL Server environment to the Microsoft Analytics Platform System (APS), formerly known as Parallel Data Warehouse (PDW). Key focus areas included performance optimization for a massively parallel processing (MPP) architecture, ensuring compatibility, and coordinating a distributed team to achieve timely milestones.

Key Responsibilities & Achievements:
• Led the technical migration planning, creating the project roadmap and breakdown for converting over 350 SSIS packages to be PDW-compatible, resolving compatibility issues and ensuring functional parity in the new MPP environment.
• Architected and optimized table structures by applying advanced performance tuning techniques, including selecting optimal hash and round-robin distribution strategies to minimize data movement and maximize parallel query performance on the APS platform.
• Orchestrated the end-to-end migration of 600 GB of data, ensuring accuracy and integrity throughout the ETL modernization process. Coordinated closely with offshore team members to deliver on weekly milestones and ensure timely project completion.
• Validated migration success by designing and executing functional and unit testing plans, and communicating results clearly to business users to ensure the new platform met all performance and functional requirements.

Technologies Used: Microsoft APS (PDW), SQL Server, SSIS, T-SQL, Performance Tuning, ETL Migration
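The hash vs. round-robin choice above is about co-location: hashing on the join key sends matching rows to the same compute node, so joins on that key need no data movement, while round-robin spreads rows evenly but forces shuffles at join time. A conceptual sketch (illustrative Python; node count is an assumption, and APS uses its own hash function, not Python's):

```python
def assign_distribution(rows, key, n_nodes=8):
    """Hash-distribute rows across compute nodes by a join key.

    Rows sharing the same key value always land on the same node,
    which is what eliminates data movement for key-aligned joins.
    """
    buckets = {i: [] for i in range(n_nodes)}
    for row in rows:
        buckets[hash(row[key]) % n_nodes].append(row)
    return buckets
```

Round-robin would instead assign row *i* to node `i % n_nodes`, balancing volume perfectly but scattering each key across nodes.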

Senior Software Engineer

Capgemini
12.2012 - 03.2015

#9 DATA WAREHOUSE AND BUSINESS INTELLIGENCE ETL DEVELOPMENT AND MAINTENANCE

Client: Leading Insurance Company in Sweden
Role: ETL Engineer

Project Description:
Owned the end-to-end maintenance, enhancement, and operational integrity of a large-scale enterprise data warehouse (EDW) on Microsoft's Analytic Platform Service (APS/PDW). Led a continuous cycle of implementing ad-hoc business requirements, optimizing ETL processes, and delivering actionable insights through robust reporting, while mentoring junior developers and ensuring system reliability.

Key Responsibilities & Achievements:
• Led the Development & Maintenance of a complex EDW, architecting and implementing SSIS ETL pipelines to ingest data from diverse sources into a multi-layered warehouse environment, supporting critical business intelligence and reporting needs.
• Designed and optimized data marts using both Star and Snowflake schemas to structure data efficiently for consumption by MicroStrategy reports, enhancing performance and user experience for business analysts.
• Championed Process Improvement & Reliability by designing reusable, low-maintenance ETL templates and orchestrating 350+ job streams in IBM Tivoli Workload Scheduler, significantly improving operational efficiency and reducing manual intervention.
• Drove Business Collaboration & Analysis by conducting requirement analysis sessions, performing deep-dive data analysis to produce insights, and coordinating with business users and vendors for UAT and integration testing, ensuring solutions met precise business needs.
• Mentored Junior Team Members through technical knowledge transfers and code reviews, fostering team growth and ensuring adherence to best practices in ETL development and T-SQL programming.
• Ensured System Governance by creating and maintaining comprehensive technical documentation, including mapping documents, KB articles, and system design specs.

Technologies Used: SQL Server (2008 R2, 2012), Microsoft APS/PDW, SSIS, T-SQL, TFS, Data Mart Design (Star & Snowflake Schemas)
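The star-schema marts above serve reporting by joining a central fact table to descriptive dimensions and aggregating. A toy sketch of that access pattern (illustrative Python with hypothetical `cust_key`/`region` names, not the project's model):

```python
def star_join_aggregate(facts, dims, group_attr):
    """Aggregate a fact table by an attribute pulled from a dimension.

    facts: [{"cust_key": k, "amount": x}]; dims: {k: {"region": ...}}.
    This mirrors the fact-to-dimension join a MicroStrategy report issues.
    """
    totals = {}
    for f in facts:
        attr = dims[f["cust_key"]][group_attr]
        totals[attr] = totals.get(attr, 0) + f["amount"]
    return totals
```

A snowflake schema further normalizes the dimensions into sub-tables, trading extra joins for less redundancy.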

Software Engineer

Capgemini
12.2011 - 11.2012

#10 ETL AND REPORT DEVELOPMENT

Client: Financial Services Company in North America
Role: BI Developer

Project Description:
Provided end-to-end operational support and development for a critical ETL and reporting environment. Key responsibilities included ensuring the reliability of daily data loads, developing a suite of business intelligence reports, and implementing enhancements to meet evolving business needs, all while maintaining clear communication with stakeholders.

Key Responsibilities & Achievements:
• Managed and monitored critical production ETL loads using SSIS, ensuring timely and accurate data delivery for business operations and reporting.
• Authored Root Cause Analysis (RCA) documents for production issues, diagnosing failures in SSIS packages and implementing fixes to enhance pipeline stability and prevent recurrence.
• Developed and optimized T-SQL stored procedures to support backend data processing and reporting requirements.
• Managed the End-to-End Change Request Process: Implemented and enhanced changes to existing SSIS packages and SSRS reports based on new business requirements, ensuring solutions were robust, efficient, and thoroughly tested before deployment.

Technologies Used: SQL Server, SSIS, SSRS, T-SQL

Education

Bachelor of Technology - Electrical and Electronics Engineering

Pondicherry University
05.2011

Diploma - Electrical and Electronics Engineering

Karaikal Polytechnic College
04.2008

Skills

  • ETL development
  • T-SQL
  • Batch scripts
  • PowerShell
  • Microsoft SQL Server
  • Cloud computing
  • Azure Synapse Analytics
  • Azure SQL Database
  • Google Cloud Platform
  • Microsoft Azure
  • Power BI
  • Power Query
  • DAX
  • Tableau
  • SQL Server Integration Services
  • SQL Server Reporting Services
  • Azure Data Factory
  • Parallel Data Warehouse
  • SQL Server Management Studio
  • Visual Studio
  • TFS
  • Jira
  • Confluence
  • BitBucket
  • GitLab
  • Apache Airflow
  • Qlik Replicate
  • Redgate Data Masker
  • Redgate SQL Clone
  • Redgate Data Generator
  • Life Insurance
  • Banking
  • Production Accounting
  • Legal Management
  • Data warehousing
  • Data modeling
  • Data pipeline design
  • Data migration
  • Big data processing
  • Spark framework
  • Performance tuning
  • Data integration
  • Business intelligence
  • Database design
  • Relational databases
  • Database development

Certifications

  • Microsoft Certified Solutions Expert (MCSE): Data Management and Analytics
  • Microsoft Certified Solutions Associate (MCSA): SQL Server 2016 Database Development
  • Microsoft Certified Solutions Associate (MCSA): BI Reporting
  • Microsoft Certified Power BI Data Analyst Associate
  • Microsoft Certified Fundamentals - Azure, Azure Data, Azure AI, Power Platform
  • Microsoft Certified Fabric Analytics Engineer Associate
  • Microsoft Certified Azure Data Engineer Associate
  • Microsoft Certified Fabric Data Engineer Associate
  • Databricks Certified Data Engineer Associate
  • SnowPro Certified Platform Associate
  • Tableau Certified Desktop Specialist
  • Google Cloud Certified Associate Cloud Engineer

Languages

Tamil: Bilingual or Proficient (C2)
English: Advanced (C1)

Work Availability

Available all days, Monday through Sunday: morning, afternoon, and evening.

Quote

Good judgment comes from experience. Experience comes from bad judgment.
Jim Horning


Awards

  • Customer delight, Accion Labs, 2017-06-01
  • Best Performer of the year, CIO, Länsförsäkringar, 2014-12-01