Ardhra Balarajan

Digital Solution Specialist
Trivandrum

Summary

Accomplished Lead Data Engineer with over 15 years of experience in data warehousing, data analytics, and enterprise IT solutions. Expertise in cloud-based architecture and solution design across Microsoft Azure and AWS platforms, complemented by a strong background in Azure Data Analytics tools such as Data Lake, Databricks, and Synapse. Strong experience in project and client engagement, requirements analysis, project estimation, roadmap planning, and stakeholder communication. Proven leadership in end-to-end project delivery, team mentoring, and technical governance, focusing on building scalable ETL pipelines and analytics solutions that drive business success. Recognized for analytical skills, adaptability, and a collaborative mindset, consistently delivering innovative solutions that align with organizational goals while excelling in dynamic environments.

Overview

15 years of professional experience
4 languages

Work History

Digital Solution Specialist

Infosys
01.2024 - Current
  • Cloud-to-cloud migration of various applications from the Europe region to the US enterprise cloud computing platform.
  • Building an application for duties and taxes reporting using Azure Data Factory, Azure Databricks, Azure SQL Server, and Power BI reporting.
  • Understand problem spaces and create the MVP plan and digital transformation roadmap.
  • Drive technology selection and own the overall solution development.
  • Provide strategic recommendations to clients.
  • Guide teams through the implementation journey and ensure value generation for clients.
  • Monitored project status and productivity while inspecting performances to decrease process discrepancies by 15%.
  • Inspected deliverables for adherence to quality standards.
  • Answered customer questions about billing, account issues, and upgrade possibilities.
  • Domain: Logistics, Netherlands
  • Technology: Data Analytics-Azure

Project Manager

Infosys
09.2022 - 12.2023
  • Establishing a solid foundation for project execution by creating detailed plans that outline tasks, resources, timelines, and deliverables.
  • Initiate the planning process by clearly defining the project’s scope, goals, and objectives.
  • Assigning tasks to the team and unifying team efforts through collaboration, conflict resolution, and effective team meetings.
  • Identify potential risks, analyze their impact, and develop mitigation strategies.
  • Estimating costs, establishing budgets, tracking spending, and adjusting as necessary to keep the project within financial boundaries while ensuring fiscal efficiency.
  • Maintaining open and transparent communication with clients and stakeholders: providing updates, responding to inquiries, and incorporating feedback.
  • Implement quality control processes to ensure that deliverables meet agreed-upon standards and satisfy client requirements, thereby maintaining project integrity.
  • Met project deadlines without sacrificing build quality or workplace safety.
  • Established effective communication among team members for enhanced collaboration and successful project completion.
  • Coordinated with cross-functional teams to resolve project issues and mitigate risks.
  • Domain: Logistics, Netherlands
  • Technology: Azure

Technical Lead

Infosys
06.2020 - 08.2022
  • Ground Operation Dashboard: Framework for a next-generation ground operations organization that supports persona-based insights through intuitive visualization. Developing a station dashboard in Azure that covers the main service and financial KPIs. Azure Data Factory pipelines connect to the various raw sources and ingest the data into ADLS; Databricks processes the raw data, and the prepared base data is stored in a designated publish container within ADLS. Azure Databricks is also used to transform the prepared base data into complex report-specific tables, and this report data is exposed through Azure Synapse Analytics (a simplified code sketch of this flow follows this role).
  • Teamed on development of technology roadmap spanning 5 years.
  • Initially part of the data integration and mapping team, responsible for sourcing data from various systems based on the priority of the different incoming programs. Created the data modelling and mapping based on initial data analysis and a centralized repository. Created pipelines in Azure Data Factory to land data in the Azure Data Lake layer, created JSON queries for use in the standard Databricks template script, and moved the data to the publish layer.
  • Responsible for ingestion of required entities and creating complex KPIs in the functional mart.
  • Liaising with the reporting team to integrate the data into Power BI.
  • Scrum lead for domestic ingestion; involved in creating the project plan and roadmaps, performing feasibility analysis for new interfaces, and providing estimates to the client.
  • Coordinating with onshore SMEs/BAs and clients for requirement gathering.
  • Performing unit testing, system integration of interfaces to ensure high system quality.
  • Data migration activities for creating dev and user testing environments.
  • Providing post-development support for the project by resolving interface-related issues.
  • Preparing documentation for configuration and maintaining a centralized repository for entities.
  • Ensuring the daily data ingestion is completed without any issues.
  • Managing mid-size team operating from various geographical locations.
  • Domain: Logistics, Netherlands
  • Technology: Azure
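
Illustrative sketch: the raw-to-publish Databricks step described in the Ground Operation Dashboard entry above might look roughly like the following PySpark snippet. The storage account, container paths, and column names (station_code, turnaround_minutes) are hypothetical placeholders, not the project's actual configuration.

```python
# Illustrative Databricks (PySpark) sketch of the raw -> prepared -> publish flow
# described above. Storage account, container, and column names are placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()  # provided automatically in Databricks

# Hypothetical ADLS Gen2 paths; real paths and credentials would come from the
# workspace configuration / Azure Data Factory pipeline parameters.
raw_path = "abfss://raw@examplestorage.dfs.core.windows.net/ground-ops/flights/"
publish_path = "abfss://publish@examplestorage.dfs.core.windows.net/ground-ops/base/flights/"

# Read the raw data landed by the Azure Data Factory pipelines.
raw_df = spark.read.format("parquet").load(raw_path)

# Prepare the base data: basic cleansing plus a simple service KPI aggregate.
base_df = raw_df.dropDuplicates().withColumn("load_date", F.current_date())

kpi_df = (
    base_df
    .groupBy("station_code", "load_date")
    .agg(
        F.count("*").alias("flights_handled"),
        F.avg("turnaround_minutes").alias("avg_turnaround_minutes"),
    )
)

# Write the prepared base data to the designated publish container, from where
# report-specific tables are built and exposed via Azure Synapse Analytics.
kpi_df.write.format("delta").mode("overwrite").save(publish_path)
```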

Technical Lead

Infosys
02.2019 - 05.2020

Log analysis and dashboard creation: Two integrated road-network interfaces had mismatched data across the multiple applications involved in the integration, which affected the operational service. As a solution, a reporting platform was created to find the root cause and pinpoint the issue. Splunk was used to analyze the logs from the various applications and identify the layer where data went missing. Data from Splunk was integrated into Spotfire using TDV for reporting and dashboard creation.

  • Coordination with the client for requirements gathering and specifications.
  • Coordination with the different application teams, support, third-party vendors, product owners, business users, and the project and service transition teams.
  • Design and develop search heads in Splunk to consume, parse, and search the logs of the different applications.
  • Create views and tables in TDV to consume the data from Splunk.
  • Creating multiple stored procedures for processing the log data.
  • Built dashboard reports as per the client requirements.
  • Monitoring of TDV jobs until the data is loaded successfully.
  • Finding the root cause of TDV job failures.
  • Patching the search heads for fixing any log changes.
  • Analyzing any mismatched or missing movements and reporting them (see the illustrative sketch after this role).
  • Coordinated cross-functional teams to ensure seamless integration of new features, resulting in more robust product offering.
  • Engaging the client and application teams in case of any discrepancy reported in the data.
  • Domain: Logistics, Netherlands
  • Technology: Splunk and Spotfire
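
As a rough illustration of the mismatch/missing-movement analysis referenced above, the following pandas sketch compares movement records extracted from two applications. In the project this reconciliation ran on Splunk/TDV data; the file names and column names here are hypothetical placeholders.

```python
# Illustrative sketch of the mismatch/missing-movement analysis using pandas on
# log extracts. File names and column names are hypothetical placeholders.
import pandas as pd

# Movement records extracted from the two integrated road-network applications.
app_a = pd.read_csv("app_a_movements.csv")  # columns: movement_id, status, timestamp
app_b = pd.read_csv("app_b_movements.csv")  # columns: movement_id, status, timestamp

# Outer join on the movement identifier to find records missing on either side.
merged = app_a.merge(
    app_b, on="movement_id", how="outer", suffixes=("_a", "_b"), indicator=True
)

missing_in_b = merged[merged["_merge"] == "left_only"]
missing_in_a = merged[merged["_merge"] == "right_only"]

# Records present in both systems but with conflicting statuses.
both = merged[merged["_merge"] == "both"]
mismatched = both[both["status_a"] != both["status_b"]]

print(f"Missing in application B: {len(missing_in_b)}")
print(f"Missing in application A: {len(missing_in_a)}")
print(f"Status mismatches: {len(mismatched)}")
```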

Senior Developer/Technical Lead

Infosys
08.2017 - 01.2019
  • Revenue Report using ETL: Standardizing business objects and creating an integrated single source of truth to facilitate data integration and analysis across entities. Creating a common semantic layer with additional granularity and better data quality, from which business users can access and analyze data based on common definitions and standardized attributes and hierarchies across the enterprise. An integrated reporting environment in Teradata, based on a common source and a standard set of definitions, with actionable reports that provide insights into revenue growth, revenue quality, and operational efficiency, enabling senior management to make better decisions.
  • Designing, developing, and enhancing the ETL architecture and jobs.
  • Data ingestion was done using the FastLoad and MultiLoad utilities, with transformation and loading done through BTEQ scripts; Unix scripts were used for minor transformations and for wrapping each job (a simplified wrapper sketch follows this role).
  • Developing the history load and incremental load scripts.
  • Defined and implemented ETL development standards and procedures.
  • Optimizing the ETL and SQL code for better performance.
  • Enhancement of existing ETL jobs.
  • Creating design documents and test case documents as per the standards followed in the project.
  • Creating test cases for ETL scripts and SQL queries.
  • Designing and developing Unix scripts for scheduling ETL jobs.
  • Data migration activities for creating dev and user testing environments following best practices.
  • Prepare documentation for configuration and release procedures.
  • Deploying the code in production and scheduling as per requirements.
  • Creating SQL queries for data validation and tuning SQL performance.
  • Providing L2 support by fixing issues transferred by the L1 support team.
  • Ensuring the daily data ingestion is completed without any issues.
  • Implementing ETL best practices, methodologies in the project.
  • Providing post-development support for the project by resolving interface-related issues.
  • Domain: Logistics, Netherlands
  • Technology: Teradata and Unix
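
A minimal sketch of the kind of job wrapper described above. The project's actual wrappers were Unix shell scripts; this version is shown in Python for illustration, and the script path, log path, and file names are assumptions, not real project artifacts.

```python
# Illustrative job wrapper for a BTEQ script, analogous to the Unix wrappers
# described above. Script and log paths are hypothetical placeholders.
import subprocess
import sys
from datetime import datetime

BTEQ_SCRIPT = "/etl/scripts/load_revenue_incremental.btq"   # assumed path
LOG_FILE = "/etl/logs/load_revenue_incremental.log"         # assumed path


def run_bteq(script_path: str, log_path: str) -> int:
    """Run a BTEQ script, capture its output to a log file, and return its exit code."""
    with open(script_path, "r") as script, open(log_path, "a") as log:
        log.write(f"--- run started {datetime.now().isoformat()} ---\n")
        # BTEQ reads its commands from standard input, as in `bteq < script.btq`.
        result = subprocess.run(
            ["bteq"],
            stdin=script,
            stdout=log,
            stderr=subprocess.STDOUT,
        )
        log.write(f"--- run finished, return code {result.returncode} ---\n")
    return result.returncode


if __name__ == "__main__":
    rc = run_bteq(BTEQ_SCRIPT, LOG_FILE)
    # A non-zero BTEQ return code signals a failure to the scheduler.
    sys.exit(rc)
```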

Developer

Infosys
01.2016 - 07.2017
  • MDM and Customer Integration: Migration of interfaces built using Pig scripts to IBM DataStage, with the data consumed by MDM as the single source of truth. Creation of ETL interfaces in DataStage connecting various sources, including Oracle, Salesforce, and Teradata, to transform and load data into MDM.
  • Adapted quickly to new technologies and programming languages, enhancing overall team productivity.
  • Development and migration of interfaces/DataStage jobs to automate the data flow for the customer entity and its child entities.
  • Creation of an audit framework in Unix using the DataStage command line.
  • Designing and developing Unix scripts for scheduling ETL jobs.
  • Resolved bottlenecks by optimizing the ETL jobs.
  • Performing unit testing and end-to-end integration testing for the mappings and workflows.
  • Resolving data issues by performing data change requests.
  • Development of ETL interfaces, SQL code blocks and Unix scripts.
  • Modified existing ETL scripts to fix defects at the root cause as part of project enhancements.
  • Performing unit testing and end-to-end integration testing for ETL scripts.
  • Deploying the code in production and scheduling as per requirements.
  • Analyzing the existing workflow and tuning their performance.
  • Domain: Logistics, Netherlands
  • Technology: Datastage and Unix

Developer

Infosys
03.2014 - 12.2015
  • Fulfillment and analytics: Report creation for trend analysis based on the fashion sales data available in the EDW tables.
  • Designing and developing reports through SQL queries using Teradata SQL Assistant.
  • Designed customized solutions for proposals to potential customers.
  • Creating SQL queries for data validation and tuning SQL performance.
  • Validation of data, performing unit testing and end-to-end integration testing.
  • Extraction of daily and monthly data for festival sales.
  • Worked on automation of partitioning the tables.
  • Support for the EDW jobs.
  • Domain: Retail, US
  • Technology: Teradata and Unix

Developer

Infosys
02.2013 - 12.2013
  • SKU Remediation: Expanding the SKU length in all the scripts, jobs and interfaces.
  • Adapted quickly to new technologies and programming languages, enhancing overall team productivity.
  • Sole resource leading the open-systems track, analyzing all the Unix bash scripts, SQL databases, tables, and PL/SQL scripts.
  • Analyzing the existing workflow and expanding the SKU size in all the impacted programs.
  • Executing and testing all the jobs and liaising with the front-end team.
  • Checking all the downstream applications impact.
  • Prepare documentation for configuration and release procedures.
  • Deploying the code in production and scheduling as per requirements.
  • Ensuring the daily data ingestion is completed without any issues.
  • Providing post-development support for the project by resolving interface-related issues.
  • Domain: Retail, US
  • Technology: SQL, PL/SQL, Unix

Developer

Infosys
01.2009 - 02.2013
  • Support and Enhancement: Multiple tracks and technologies were used for sourcing and processing publishing data; Serena Dimension was the version, build, and release management tool used to track branching and migration to each environment in a controlled manner.
  • Dedicated single offshore resource supporting all the tracks in versioning, configuration, build, release, and change management across development, test, and production environments.
  • Providing support for new projects and acting as the release manager and configuration controller.
  • Conducted multiple tool-usage sessions for new joiners, including refresher sessions.
  • Administration and management of the server on which the tool was installed.
  • Performed support activities and daily health checks to keep the application running.
  • Worked on HPSM, resolving incidents, service requests, and data changes.
  • Performed annual maintenance activities such as server failover and database partitioning.
  • Domain: Publishing, UK
  • Technology: Serena Dimension, Unix

Education

Bachelor of Science - Computer Application (BCA)

Bharathiyar University
Erode, India
04.2001 -

Skills

Microsoft Azure expertise

Work Availability

Monday to Sunday: morning, afternoon, and evening

Personal Dossier

Languages Known: English, Hindi, Malayalam, and Tamil

Awards & Recognition

  • Awarded for technical excellence across different projects.
  • Recognized as an Unsung Hero for delivery excellence.
  • Received an excellence award for best project.