Virbhadra Sutar

Pune

Summary

With over 14 years of experience as a Data Engineer and Database Analyst, I am an expert in designing, developing, and deploying scalable data solutions using Azure Data Factory (ADF), Databricks, ETL tools such as SSIS, databases such as SQL Server and PostgreSQL, and cloud platforms including Azure and AWS. Adept at building end-to-end data pipelines, optimizing data workflows, and enabling data-driven decision making across products. Strong expertise in SQL, Python, PySpark, and Delta Lake.

Overview

15
years of professional experience

Work History

Tech Lead Database Analysis

Fiserv
07.2023 - Current
  • Product: eBill Distribution
  • Designed and developed a centralized reporting solution using Databricks and the medallion architecture to provide insights into the eBill delivery product. The solution enabled data-driven decision making by tracking key metrics such as enrollments, rejections, acceptances, and bill deliveries.
  • Responsibilities
  • Applied data engineering principles to design and develop data pipelines.
  • Applied transformations to raw data using Databricks SQL and Python APIs, generating key metrics and insights.
  • Stored processed data in Delta Lake, ensuring efficient querying and reporting, and managed data storage and retention to meet business requirements.
  • Implemented data quality checks and validation rules to ensure the accuracy and consistency of data, and identified and resolved data quality issues.
  • Developed a centralized reporting solution in Power BI to provide insights into eBill delivery metrics, enabling data-driven decision making and informing product enhancements and marketing strategy.
  • Created interactive and dynamic dashboards in Power BI, providing insights into eBill rejections, enrollments, eBills sent, active/inactive accounts, and product revenue based on eBills sent and late eBill deliveries.
  • Documented the data pipeline architecture, design, and implementation, and shared knowledge with team members to ensure continuity and consistency.
  • Optimized data processing and storage using Delta Lake features such as Z-ordering and the OPTIMIZE command (a sketch follows this list).
  • Mentored and supported engineering teams in implementing architectural guidelines and patterns.
  • Set data modeling standards for the Lakehouse and data warehouse layers, including the Bronze-Silver-Gold layers.
  • Led architectural reviews, proof-of-concept (PoC) evaluations, and technical roadmap planning for data platforms.
  • Optimized the eBill SQL Server database through indexing, partitioning, and query optimization techniques.
  • Replaced manual deployment processes with automated CI/CD pipelines using GitHub and Azure DevOps, enhancing script quality and reliability.
  • Followed best practices while setting up databases and configuring user permissions.
  • Assisted the development team in troubleshooting stored procedures by setting up SQL Profiler to trace SQL statements and capture relevant errors.
  • Reviewed database scripts written by the development team, ensuring compliance with the DB checklist document.
  • Developed, implemented, and optimized queries and stored procedures using T-SQL (DQL, DML, DDL, DCL, and TCL).
  • Utilized monitoring tools like Foglight, SQL Profiler, and Database Engine Tuning Advisor to ensure database efficiency.
  • Key Skills & Tools:
  • Apache Spark, Delta Lake, Python, PySpark, SQL, Databricks Workflows, DLT, DBFS, Auto Loader, SQL Server
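A minimal PySpark sketch of the Bronze-to-Silver hop and Delta optimization described above, assuming a Databricks environment with Delta Lake; the table names (ebill_bronze.events, ebill_silver.events), columns, and event-type values are illustrative placeholders, not the actual product schema.

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# Read raw eBill events from the Bronze layer (hypothetical table).
bronze = spark.read.table("ebill_bronze.events")

# Validation rules: drop malformed rows and keep only known event types,
# then derive a delivery-date column for partitioning and retention.
silver = (
    bronze
    .filter(F.col("account_id").isNotNull())
    .filter(F.col("event_type").isin(
        "ENROLLMENT", "REJECTION", "ACCEPTANCE", "BILL_DELIVERED"))
    .withColumn("event_date", F.to_date("event_ts"))
)

# Append to the Silver Delta table, partitioned by event date.
(silver.write.format("delta")
       .mode("append")
       .partitionBy("event_date")
       .saveAsTable("ebill_silver.events"))

# Compact small files and co-locate rows that are filtered together.
spark.sql("OPTIMIZE ebill_silver.events ZORDER BY (account_id)")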

Database Consultant

Amazon Web Services (AWS)
12.2022 - 07.2023
  • Responsibilities
  • Worked directly with customers to assess and analyze their existing on-premises and cloud-based database environments, identifying areas for performance improvement, cost optimization, and architectural modernization.
  • Provided expert guidance on database migration strategies and implementation, helping customers transition from legacy systems to modern AWS-managed services such as Amazon RDS (SQL Server, PostgreSQL) and Amazon Aurora.
  • Designed and implemented scalable and secure data integration workflows using AWS Glue, enabling customers to efficiently catalog, transform, and migrate data from diverse sources to data lakes and warehouses (a sketch follows this list).
  • Recommended appropriate AWS storage solutions such as Amazon S3, including designing lifecycle policies, data classification, and bucket configuration for optimized performance and cost.
  • Collaborated with cross-functional AWS solution architects, product teams, and customer engineering teams to deliver best practices in cloud-native database architecture and DevOps integration.
  • Assisted customers in designing high availability, disaster recovery, and backup strategies using features of AWS RDS, Aurora Global Databases, and Amazon S3 Glacier.
  • Conducted performance tuning, indexing strategies, and query optimization for customer databases, enhancing system efficiency and reducing operational cost.
  • Delivered technical presentations, proofs of concept (PoCs), and workshops tailored to customer-specific needs to showcase the value of AWS database solutions.
  • Provided post-migration support and knowledge transfer to customer teams, ensuring smooth operational transition and long-term success on AWS.
  • Key Skills & Tools:
  • Amazon RDS for SQL Server, AWS Glue, Amazon Aurora, Amazon S3
  • Installed, configured, tested and maintained operating systems, application software, and system management tools.
  • Integrated new technologies into existing systems, enhancing functionality and user experience.
  • Solved complex technical issues related to database management, ensuring optimal system performance at all times.
  • Managed multiple simultaneous projects effectively while adhering to strict deadlines and quality standards.
  • Conducted thorough audits of existing databases, identifying areas for improvement and implementing necessary changes.
  • Educated users on best practices for database management, increasing productivity and reducing errors.
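A short AWS Glue (PySpark) job sketch of the catalog-to-S3 integration pattern described above; the Glue database, table, column mappings, and S3 path are hypothetical placeholders.

import sys
from awsglue.transforms import ApplyMapping
from awsglue.utils import getResolvedOptions
from awsglue.context import GlueContext
from awsglue.job import Job
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glueContext = GlueContext(SparkContext())
job = Job(glueContext)
job.init(args["JOB_NAME"], args)

# Read a source table already cataloged by a Glue crawler.
src = glueContext.create_dynamic_frame.from_catalog(
    database="legacy_db", table_name="orders")

# Rename and retype columns on the way into the data lake.
mapped = ApplyMapping.apply(
    frame=src,
    mappings=[
        ("order_id", "int", "order_id", "int"),
        ("order_ts", "string", "order_ts", "timestamp"),
    ])

# Land the curated data in S3 as Parquet.
glueContext.write_dynamic_frame.from_options(
    frame=mapped,
    connection_type="s3",
    connection_options={"path": "s3://example-data-lake/curated/orders/"},
    format="parquet")

job.commit()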

Database Specialist

Fiserv
03.2017 - 12.2022
  • Product: MobileWallet
  • MobileWallet is a Fiserv product that uses Apple's built-in Wallet to deliver bills to consumers: the latest bills are displayed as passes inside Apple Wallet. MobileWallet is another eBill distribution channel through which consumers can view, store, and pay their bills.
  • Responsibilities
  • Designed and implemented a scalable data pipeline using Databricks to ingest and process bill delivery events for MobileWallet, a digital billing solution leveraging Apple Wallet.
  • Modeled EventLogs as the central fact table to track bill-related activities (eBill Sent, eBill Delivered, Enrollments), integrating it with dimension tables (BillerInfo, Calendar, Destination, Vertical) for enriched analytics (a query sketch follows this list).
  • Prepared a database migration plan with the solution architect to cover application-side scenarios.
  • Developed batch data ingestion workflows in Delta Lake, ensuring high-throughput, reliable data availability for reporting and insights.
  • Migrated the database from SQL Server 2008 R2 to SQL Server 2016.
  • Built interactive dashboards using Databricks SQL and Power BI/Tableau to visualize key business metrics including:
  • Total eBills Sent and Delivered
  • Consumer Enrollment/Unenrollment Trends
  • Revenue Attribution and Active Account Analysis
  • Collaborated with product, analytics, and executive teams to define data KPIs, shaping strategic decisions around eBill adoption, consumer engagement, and channel effectiveness.
  • Applied data optimization techniques like partitioning, Z-order clustering, and caching to improve query performance over large billing datasets.
  • Ensured data quality and integrity through validations, schema enforcement, and automated testing frameworks in Databricks notebooks.
  • Key Skills & Tools:
  • Azure Databricks, Power BI, Spark SQL, Python, Delta Lake
  • Built databases and table structures for web applications.
  • Reduced data redundancy by designing efficient database normalization techniques.
  • Provided training sessions on best practices in database management, enhancing team knowledge and skill sets.
  • Customized integration processes between various software applications, enabling more effective data sharing across platforms.
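An illustrative Spark SQL query over the star schema described above, joining the EventLogs fact table to the Calendar and BillerInfo dimensions; the tables are assumed to be registered in the metastore, and the column names and event-type values are placeholders.

from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Monthly eBill metrics per biller, computed from the fact table with
# conditional aggregation over the event type.
monthly_metrics = spark.sql("""
    SELECT c.year_month,
           b.biller_name,
           SUM(CASE WHEN e.event_type = 'EBILL_SENT' THEN 1 ELSE 0 END) AS ebills_sent,
           SUM(CASE WHEN e.event_type = 'EBILL_DELIVERED' THEN 1 ELSE 0 END) AS ebills_delivered,
           SUM(CASE WHEN e.event_type = 'ENROLLMENT' THEN 1 ELSE 0 END) AS enrollments
    FROM EventLogs e
    JOIN Calendar c ON e.date_key = c.date_key
    JOIN BillerInfo b ON e.biller_key = b.biller_key
    GROUP BY c.year_month, b.biller_name
""")
monthly_metrics.show()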

Senior Software Engineer

Mphasis
08.2010 - 03.2017
  • Product: CDW Data Warehouse
  • This repository permits Delphi Customs to generate ad hoc reports used for analysis, audit compliance, and business planning. The application receives daily transmissions from multiple vendor sources via File Transfer Protocol for Canadian, Mexican, and US data. The CDW database stores historical data for up to 5 years.
  • Responsibilities
  • Studied the existing design, data transformations, data sources, destinations, and execution of the DTS packages used in CDW.
  • Involved in requirement gathering, package design, development, testing, and deployment.
  • Identified and optimized long-running queries.
  • Created a common SSIS template that could be reused with minimal changes and shared it across the team.
  • Used data flow components such as Merge Join, Multicast, Lookup, Derived Column, Union All, and Conditional Split.
  • Wrote C# code to email the support team on package failure, using a Script task in the OnError and OnTaskFailed event handlers (a sketch of this pattern follows the list).
  • Troubleshot errors by adding watch windows to view package and task variable values.
  • Used SSIS expressions to evaluate package variables using conditional operators and string functions.
  • Generated an XML configuration file through the package configuration organizer and added connection managers and variables.
  • Created an error log file to record errors to a text file if any task fails.
  • Troubleshot data issues using data viewers: added data viewers to output paths and used the grid view to inspect raw data in columns and rows.
  • Identified data issues during testing, using built-in functions in SQL queries to find duplicate records.
  • Involved in writing technical documentation for the developed packages.
  • Performed SQL database backups, added users, and restored databases in development or test environments for testing.
  • Scheduled packages on SQL Server Agent through the command line and Integration Services using a proxy user.
  • Environment: Visual Studio 2008, SSIS, C#, Windows 2008 R2 Server, SQL Server 2008
  • Developed scalable applications using agile methodologies for timely project delivery.
  • Managed multiple projects simultaneously while maintaining strict deadlines and high-quality standards.
  • Maintained comprehensive documentation of development work, facilitating knowledge sharing among team members.
  • Enhanced software functionality by identifying and resolving complex technical issues.
  • Streamlined development workflows to increase team efficiency and reduce time spent on repetitive tasks.
  • Proactively identified areas for process improvement, implementing changes that led to significant time savings for the team.
  • Regularly reviewed peers' code contributions, offering constructive feedback to enhance overall product quality.
  • Delivered exceptional client support by promptly addressing concerns and implementing requested changes or enhancements to software solutions.
  • Mentored junior developers, fostering professional growth and enhancing team productivity.
  • Collaborated with cross-functional teams to design innovative software solutions.
  • Analyzed proposed technical solutions based on customer requirements.
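The failure-notification pattern from the OnError/OnTaskFailed handlers above, sketched in Python for consistency with the other examples (the original was a C# Script task); the SMTP host, addresses, and task details are placeholders.

import smtplib
from email.message import EmailMessage

def notify_support(task_name: str, error_text: str) -> None:
    # Build an HTML-formatted failure alert with a plain-text fallback.
    msg = EmailMessage()
    msg["Subject"] = f"ETL package failure: {task_name}"
    msg["From"] = "etl-alerts@example.com"
    msg["To"] = "support-team@example.com"
    msg.set_content(error_text)
    msg.add_alternative(
        f"<html><body><h3>Task {task_name} failed</h3>"
        f"<pre>{error_text}</pre></body></html>",
        subtype="html")
    # Hand the message to the (placeholder) SMTP relay.
    with smtplib.SMTP("smtp.example.com") as server:
        server.send_message(msg)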

Software Engineer

Mphasis
08.2010 - 03.2017
  • Product: JobBossI
  • The JobBossI system transfers parts data from the Lockport model shop work system to the JobBoss application. The older JobBossI application was developed in MS Access and required manual intervention to upload data and generate an output file, which was then uploaded to the JobBoss application; the MS Access utility frequently failed to transform the data.
  • Responsibilities
  • Studied the existing MS Access JobBossI utility, gathered new requirements from the client, and proposed the SSIS solution to Delphi.
  • Extracted data directly from the Lockport Model Shop System database, transformed it, and sent it to the destination.
  • Used the Foreach Loop container, Sequence container, Execute SQL task, Script task, and Data Flow task in the control flow.
  • Used data flow components OLE DB Source, Conditional Split, Derived Column, Union All, and OLE DB Destination.
  • Wrote new stored procedures and modified existing ones to validate, filter, and process the data.
  • Used an SSIS event handler to email the support team if the package failed at any point in the execution flow.
  • Generated an XML configuration file through the package configuration organizer and added connection properties (such as the Lockport Model Shop and JobBoss server names, user name, and password) and package variable properties (such as package variable values).
  • Deployed the SSIS package to the staging and production servers.
  • Configured the package against the file system using indirect configuration: generated an XML configuration file and used an environment variable as the pointer to it (a sketch of this pattern follows the list).
  • Troubleshot data issues using data viewers: added data viewers to output paths and used the grid view to inspect raw data in columns and rows.
  • Troubleshot errors by adding watch windows to view package and task variable values.
  • Used SSIS expressions to evaluate package variables using mathematical and string functions.
  • Scheduled a SQL Server Agent job to execute the package every half hour throughout the day.
  • Took SQL database backups, added users, and restored databases in development or test environments for testing.
  • Debugged VBA code to understand the data flow, and documented the existing data flow design and the new SSIS design.
  • Fixed defects raised after SSIS package deployment and worked with the client on user acceptance testing.
  • Wrote C# code to build an HTML-formatted email body and assign it to a package variable.
  • Environment: Visual Studio 2008, SSIS, C#, Windows 2008 R2 Server, SQL Server 2008
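The indirect-configuration pattern above (an environment variable pointing at an XML configuration file, so the same package runs unchanged on staging and production), sketched in Python; the variable name, file path, and XML layout are illustrative placeholders.

import os
import xml.etree.ElementTree as ET

def load_config() -> dict:
    # The environment variable holds the path to the XML config file,
    # mirroring SSIS indirect configuration.
    config_path = os.environ["JOBBOSSI_CONFIG"]
    root = ET.parse(config_path).getroot()
    # Collect <property name="...">value</property> entries.
    return {prop.get("name"): prop.text for prop in root.iter("property")}

config = load_config()
jobboss_server = config.get("JobBossServerName")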

Education

Master of Computer Application (M.C.A.)

Pune University
Pune, Maharashtra
01.2010

Bachelor of Computer Application (B.C.A.)

Shivaji University
Kolhapur, Maharashtra
01.2007

Skills

  • Databricks (Azure), Python, Apache Spark, ETL, Data Warehousing, Data Modeling, T-SQL, SSIS, Power BI, DAX functions
  • Apache Spark, Delta Lake, Python, PySpark, SQL, Databricks Workflows, DLT, DBFS, Auto Loader, SQL Server
  • Agile methodology
  • Performance optimization
  • Debugging techniques
