IAD MIS Development (ICICI Bank Ltd)
Tools: HP Vertica 7.x, Sybase ASE 15.5, UNIX shell scripting, vSQL, DbVisualizer, T-SQL, Python
Team Size: 10 People
Duration: 11/2019 - Present
Description:
Internal Audit Group, ICICI: Techno-Functional Role
In this role, I bridged business requirements and data warehouse structures to deliver accurate data for MIS report generation.
- Business Requirements Analysis: Communicated with the respective business teams to understand their requirements and translated these into functional information.
- Data Mapping and Query Optimization: Identified source tables and mapped the required data from DWH tables via the Sybase server. Formulated logical information by writing optimized queries using Sybase and HP Vertica databases.
- Data Loading and SQL Script Writing: Authored loading scripts for HP Vertica Database, creating both simple and complex SQL and T-SQL queries.
- Data Verification and Delivery: Ensured data met specified parameters and provided it to business users for analysis and campaign execution.
- DDL Script Creation: Created Teradata and Vertica table DDL scripts using Erwin for staging and target environments.
- HP Vertica Development: Designed and developed HP Vertica anchor tables and projections, analyzed query logs, and made necessary corrections to projections.
- Bulk and Delta Loading: Developed HP Vertica vSQL scripts for bulk and delta loading of stage and target tables.
- Data Transfer Scripts: Created scripts for copying data between various Vertica environments.
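The bulk- and delta-load scripting described above can be sketched in Python as a generator of Vertica load SQL. This is a minimal illustration; the table, column, and file names are hypothetical placeholders, not the actual IAD MIS objects, and the real scripts ran via vsql against the bank's schemas.

```python
# Sketch: generate Vertica bulk- and delta-load SQL (hypothetical names).

def bulk_load_sql(table: str, data_file: str, delimiter: str = "|") -> str:
    """Build a Vertica COPY statement for a bulk stage-table load."""
    return (
        f"COPY {table} FROM LOCAL '{data_file}' "
        f"DELIMITER '{delimiter}' DIRECT ABORT ON ERROR;"
    )

def delta_load_sql(target: str, stage: str, key: str) -> str:
    """Build a delete-then-insert delta load from stage into target."""
    return (
        f"DELETE FROM {target} WHERE {key} IN (SELECT {key} FROM {stage});\n"
        f"INSERT INTO {target} SELECT * FROM {stage};\n"
        "COMMIT;"
    )
```

The delete-then-insert pattern keeps the delta load idempotent: rerunning the same batch replaces rather than duplicates the affected keys.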
NPA Analysis of Loans
Tools: SQL Server 2008, Query Analyzer, SQL Profiler and Enterprise Manager, SAS, Python
Team Size: 8 People
Duration: 11/2020 - Present
Description:
NPA Data Analysis and Forecasting: Data Analyst
In this role, I was responsible for analyzing the organization's Non-Performing Asset (NPA) data, examining past NPA records, and forecasting the probability of loans becoming NPAs. This required a thorough understanding of the loan cycle and applying data-driven insights.
- Business Requirements Gathering: Interacted with users to understand and gather business requirements for NPA analysis and forecasting.
- System Development: Contributed to the analysis, design, development, testing, deployment, and user training of analytical and transactional reporting systems.
- Stored Procedures: Developed new stored procedures, modified existing ones, and optimized them for performance. Utilized advanced T-SQL features to design and enhance T-SQL interactions with databases and other applications.
- Performance Tuning: Employed indexes to improve individual query performance and enhanced stored procedures. Tuned SQL queries using execution plans for optimal performance.
- Security and Efficiency: Used views to prevent unauthorized access and optimized query performance by assigning relative weights to tables in the catalog.
- Database Optimization: Implemented techniques to optimize the performance of the database by tuning queries and stored procedures, ensuring efficient data processing and retrieval.
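The forecasting idea above can be illustrated with a small sketch: bucket loans by days past due (DPD) and measure the historical slippage rate per bucket as a simple NPA-probability estimate. The 90-day NPA threshold follows the standard RBI convention; the bucket names and record layout are hypothetical, and the production analysis ran in SAS/T-SQL rather than this toy form.

```python
# Sketch: DPD bucketing and historical NPA slippage rates (hypothetical layout).
from collections import defaultdict

def dpd_bucket(days_past_due: int) -> str:
    """Classify a loan by days past due; >90 DPD is treated as NPA."""
    if days_past_due <= 0:
        return "standard"
    if days_past_due <= 30:
        return "sma_0"
    if days_past_due <= 60:
        return "sma_1"
    if days_past_due <= 90:
        return "sma_2"
    return "npa"

def slippage_rates(history):
    """history: iterable of (dpd_at_observation, became_npa) pairs.
    Returns the observed NPA rate per DPD bucket."""
    seen, slipped = defaultdict(int), defaultdict(int)
    for dpd, became_npa in history:
        bucket = dpd_bucket(dpd)
        seen[bucket] += 1
        if became_npa:
            slipped[bucket] += 1
    return {b: slipped[b] / seen[b] for b in seen}
```

The per-bucket rate then serves as a baseline probability that a loan currently in that bucket slips to NPA.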
ICICI Bank Data Warehouse Projects
Tools: Sybase IQ 16, T-SQL, MS SQL 2008, UNIX Shell Scripting, Sybase ASE
Team Size: 30 People
Duration: 11/2016 - 10/2019
Description:
Bank Customer Data Integration: Data Engineer
In this role, I was responsible for gathering and integrating bank customers' personal information from various source systems into target data warehouse (DWH) tables on a monthly basis. This data was utilized by business users to build strategies and business plans.
- Business Requirements Analysis: Communicated with business teams to understand their requirements and translated these into functional information.
- Data Mapping and Query Optimization: Identified source tables and mapped the required data from DWH tables via the Sybase server. Formulated logical information by writing optimized queries using Sybase.
- Data Loading and SQL Script Writing: Authored loading scripts for the Sybase Database, creating both simple and complex SQL and T-SQL queries. Optimized code and improved database efficiency, including re-indexing, updating statistics, recompiling stored procedures, and performing other maintenance tasks.
- Data Verification and Delivery: Ensured data met specified parameters and provided it to business users for analysis and campaign execution.
- Automation with Unix Shell Scripts: Created Unix Shell Scripts to automate the execution process, enhancing efficiency and reducing manual intervention.
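The monthly customer-data integration above can be sketched as a consolidation step: records arriving from several source systems are merged on customer ID, with the most recent snapshot winning. Field names and sample rows here are hypothetical; the actual logic ran as Sybase SQL driven by shell scripts.

```python
# Sketch: merge customer records from multiple sources, latest wins
# (hypothetical field names and sample data).
from datetime import date

def consolidate(records):
    """records: iterable of dicts with 'cust_id' and 'as_of' keys.
    Returns one record per customer, keeping the latest 'as_of'."""
    merged = {}
    for rec in records:
        current = merged.get(rec["cust_id"])
        if current is None or rec["as_of"] > current["as_of"]:
            merged[rec["cust_id"]] = rec
    return merged

rows = [
    {"cust_id": 1, "as_of": date(2018, 1, 31), "city": "Mumbai"},
    {"cust_id": 1, "as_of": date(2018, 2, 28), "city": "Pune"},
    {"cust_id": 2, "as_of": date(2018, 2, 28), "city": "Delhi"},
]
final = consolidate(rows)
```

Keying on customer ID with a latest-wins rule mirrors a monthly delta merge into the target DWH table.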
Cross Sell Product Data Mart (ICICI Bank Ltd)
Tools: Teradata, BTEQ, FastLoad, MultiLoad, Fast Export, SQL, Vertica
Team Size: 10 People
Duration: 11/2010 - 06/2014
Description:
Business Intelligence Unit (BIU), ICICI: Techno-Functional Role
In this role, I was responsible for gathering data for analysis, cross-selling, and campaign purposes. I developed approximately 50 data marts for various banking products (Loan/Liability/Ctl) to support business users in running campaigns and tracking their responses.
- Requirements and Data Gathering: Collaborated with business teams to gather and understand requirements, translating them into design specifications for developers.
- BTEQ Scripting and Testing: Wrote and executed BTEQ scripts for validation, session testing, and ensuring data integrity between source and target databases, as well as for report generation.
- Data Loading into Teradata: Loaded data from legacy systems and flat files into Teradata using complex MultiLoad and FastLoad scripts.
- Teradata Loader Connections: Created Teradata external loader connections such as MultiLoad, Upsert, Update, and FastLoad for data loading into target tables in the Teradata database.
- Primary Index Creation: Designed proper Primary Indexes (PI), considering planned data access and even data distribution across all available AMPs.
- Complex Query Execution: Authored numerous BTEQ scripts to run complex queries on the Teradata database.
- Loader Utilities and Logs: Utilized Teradata loader connections for data loading and wrote Teradata utility scripts (FastLoad, MultiLoad), managing loader logs for tracking and troubleshooting.
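The utility scripting above can be illustrated with a generator for a skeletal FastLoad control script. This is a hedged sketch only: the logon string, table, column types, and file name are placeholders, and real scripts carried site-specific sessions, checkpoints, and error handling.

```python
# Sketch: generate a skeletal Teradata FastLoad control script
# (placeholder logon, table, and column names; fixed VARCHAR types).

def fastload_script(table: str, columns: list, data_file: str) -> str:
    """Return a minimal FastLoad script loading data_file into table."""
    col_defs = ",\n  ".join(f"{c} (VARCHAR(100))" for c in columns)
    col_refs = ", ".join(f":{c}" for c in columns)
    return "\n".join([
        ".LOGON tdpid/user,password;",                      # placeholder
        f"DROP TABLE {table}_err1;",
        f"DROP TABLE {table}_err2;",
        f"BEGIN LOADING {table} ERRORFILES {table}_err1, {table}_err2;",
        f"DEFINE\n  {col_defs}\n  FILE={data_file};",
        f"INSERT INTO {table} VALUES ({col_refs});",
        "END LOADING;",
        ".LOGOFF;",
    ])
```

Generating the control file from a column list keeps the DEFINE block and the INSERT's :column references in sync automatically.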