Dilip Vasgi

Summary

Meticulous data professional with expertise in testing, validating, and reformulating models, and in using statistical software to manipulate data and forecast outcomes. Proven history of designing and implementing data-collection innovations, recognized for high productivity and efficient task completion. Skilled in machine learning, big data analytics, and statistical modeling to deliver actionable insights. Strong communication, problem-solving, and teamwork abilities enable successful collaboration across departments to drive projects to completion. Highly motivated, with a strong work ethic, adaptability, and a desire to take on new challenges; adept at working effectively unsupervised and quickly mastering new skills.

Overview

8 years of professional experience

Work History

Sr. Oracle Database Developer

Mizuho Financial Group
NYC
01.2022 - Current
  • Designed migration strategies for moving databases to the cloud on both Azure and AWS.
  • Recommended data analysis tools to address business issues.
  • Developed new functions and applications to conduct analyses.
  • Applied feature selection algorithms to predict potential outcomes.
  • Designed surveys, opinion polls and assessment tools to collect data.
  • Experienced in migrating on-premises Oracle databases to Azure Cloud VMs.
  • Experienced in migrating on-premises Oracle databases to the AWS Cloud.
  • Worked on Azure PaaS and migration from on-premises SQL Server to SQL MI.
  • Created linked servers from Azure to on-premises.
  • Migrated SQL Server 2016 databases to the Azure cloud.
  • Worked with the application team on SSL certificates in cloud environments.
  • Successfully migrated critical applications to Azure and AWS environments.
  • Migrated an on-premises MySQL server to Azure Database for MySQL, and installed MySQL Workbench and client tools to establish connectivity to Azure PaaS.
  • Extensive knowledge of Oracle Data Guard, both physical standby and logical standby.
  • Experienced with Oracle GoldenGate replication.
  • Created RDS databases on AWS cloud
  • Applied PSU7 patches in a rolling fashion on a 3-node Oracle 12c RAC cluster.
  • Handled database Jira tickets daily on a priority basis.
  • Experienced with the AWS Management Console.
  • Increased EBS-backed storage capacity when the root volume filled, using the AWS EBS volume-resize feature.
  • Migrated On-premises Oracle database to EC2 instances on AWS for Dev, Test and Prod databases
  • Created EBS volumes and snapshots
  • Attached EBS volumes to existing EC2 instances
  • Used AWS CloudWatch to collect metrics and logs.
  • Knowledgeable in administering AWS services through the console.
  • Modified existing procedures, functions, and packages per requirements using the TOAD environment.
  • Created a PL/SQL procedure, using the built-in UTL_FILE package, to write out a shell script for running the SQL*Loader tool.
  • Performed ETL loads of internal and external data from OLTP and external sources to the ODS and data warehouse/mart, including mapping and validations, job scheduling and controls, Unix shell scripting for automated jobs, log-file monitoring, and SQL performance tuning.
  • Created Database tables in Global Distribution Systems (GDS) architecture and performed DDL, DML and DCL commands
  • Analyzed Dealer CRM system and provided source to target mapping from ODS and warehouse to consuming systems for reporting needs
  • Designed, developed, and optimized Oracle PL/SQL packages, stored procedures, and functions to create robust commissions reporting applications
  • Created database objects such as tables, views, triggers, types, and synonyms per project requirements
  • Collaborated with IT leads to translate requirements into technical design, ensuring consistent coding and source versioning standards (SVN)
  • Performed complex calculations for commissions reports, working with analysts to understand and enhance processes
  • Utilized advanced Oracle features, including Analytical functions, MERGE statements, WITH clauses, partitioning, and job logging, for efficient code
  • Followed established performance tuning techniques to optimize PL/SQL code, reducing runtime and resource usage
  • Developed Unix scripts for data transfers and integration, using sFTP for secure report delivery from Unix servers to Windows shares
  • Conducted comprehensive unit and system testing; performed UAT to align results with production calculation reports
  • Resolved discrepancies and ensured accurate data validation and transformation
  • Maintained source code in SVN repository, adhering to versioning and review standards
  • Documented and implemented ControlM/Redwood jobs, creating detailed change requests and scheduling jobs effectively
  • Utilized the ServiceNow ticketing system for managing tasks and requests
  • Ensured compliance with SOX procedures for all code deployments
  • Provided critical production support by monitoring daily/weekly/monthly batch processes and resolving job failures
  • Communicated promptly with the ControlM/Redwood group for re-run or restart of failed jobs
  • Handled data requests, minor enhancements, and table updates based on ad-hoc requests
  • Collaborated closely with commissions analysts, DBA teams, and project managers for streamlined project execution
  • Escalated potential issues promptly to ensure deadlines were met
  • Created conceptual, logical and physical data models for use in different business areas.
  • Designed and implemented data migration from legacy systems to new databases.
  • Developed and maintained predictive models to identify customer segments for targeted marketing campaigns.
  • Integrated trained models into web applications using Flask framework or other web frameworks.
  • Tested, validated and reformulated models to foster accurate prediction of outcomes.
  • Cleaned and manipulated raw data.
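
The shell-script generation mentioned above can be sketched with the built-in UTL_FILE package. This is an illustrative sketch only — the directory object, file name, and loader command are assumptions, not the actual production code:

```sql
-- Sketch: generate a shell script that invokes SQL*Loader.
-- DATA_DIR is an assumed Oracle DIRECTORY object; all names are hypothetical.
CREATE OR REPLACE PROCEDURE write_loader_script IS
  l_file UTL_FILE.FILE_TYPE;
BEGIN
  l_file := UTL_FILE.FOPEN('DATA_DIR', 'run_loader.sh', 'w');
  UTL_FILE.PUT_LINE(l_file, '#!/bin/sh');
  UTL_FILE.PUT_LINE(l_file,
    'sqlldr userid=$DB_USER/$DB_PASS control=load_emp.ctl log=load_emp.log');
  UTL_FILE.FCLOSE(l_file);
EXCEPTION
  WHEN OTHERS THEN
    -- Close the handle on failure so the OS file is not left open.
    IF UTL_FILE.IS_OPEN(l_file) THEN
      UTL_FILE.FCLOSE(l_file);
    END IF;
    RAISE;
END write_loader_script;
/
```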

Sr. Oracle Database Developer

TD Bank
Cherry Hill Township
08.2019 - 12.2021
  • Worked as part of the backend database team on Oracle 10g.
  • Involved in the development and maintenance of various database objects, such as packages and procedures.
  • Rewrote five separate packages with very similar logic, from the ground up, into a single package using parameterized procedures and dynamic SQL, dramatically improving code maintenance and issue identification through debugging.
  • Extensively involved in ETL code using PL/SQL in order to meet requirements for Extract, transformation and loading of data from source to target data structures
  • Involved in multi-tier Java and J2EE based applications support, responsible for writing business logic using core Java, SQL queries for the backend RDBMS
  • Used DML syntax to extract data from the database.
  • Extensively used PL/SQL programming in backend and frontend functions, procedures, and packages to implement business logic and rules using TOAD.
  • Performed data analysis, data profiling, validity checks, data cleansing rules, etc.
  • Created and edited database objects such as tables, views, indexes, and constraints using TOAD.
  • Performed backend programming for Java applications using JDBC connections
  • Created an AWS RDS Aurora DB cluster & Configured AWS RDS Aurora database users to allow each individual user privileges to perform their related tasks
  • Configured an AWS Virtual Private Cloud (VPC) and connected it to the on-Premises data center using AWS VPN Gateway for CloudFront distribution
  • Performed AWS lambda functions on S3
  • Migrated the production MySQL DB to the new AWS RDS Aurora instance
  • Implemented AWS High-Availability using AWS Elastic Load Balancing (ELB), which performed a balance across instances in multiple Availability Zones
  • Defined AWS Security Groups, which acted as virtual firewalls that controlled the traffic, allowed reaching one or more AWS EC2 instances
  • Managed application and patch management of applications running on AWS
  • Designed and configured Azure Virtual Networks (VNets), subnets, Azure network settings, DHCP address blocks, DNS settings, security policies and routing
  • Migrated moderate workloads from on-premises to Azure IaaS.
  • Published web services APIs using Azure API management service
  • Created DDL, DML, DCL Scripts and established relationships between tables using Primary Keys & Foreign Keys
  • Utilized Oracle Enterprise Manager (OEM) to manage users, privileges, profiles, schemas
  • Utilized PL/SQL tables and bulk collections to perform very fast data loads.
  • Used ROWID-based queries to perform updates across the entire breadth of the data set, again leveraging PL/SQL tables and bulk collections for speed.
  • Used SQL Developer Data Modeler to perform data modeling tasks, create entity-relationship diagrams (ERDs) for selected tables in a schema, and generate database objects by producing DDL statements.
  • Rewrote the entire PL/SQL package used to send daily load-status updates to the testing team; the original email formatting was inelegant and misaligned, so used HTML tags in PL/SQL to deliver a well-organized, readable status email.
  • Performed unit testing of the PL/SQL code using the popular utPLSQL framework, and used SQL Developer to build test packages.
  • Actively involved in the optimization and tuning of SQL queries by utilizing the explain plan to identify the problem areas of the query.
  • Optimized PL/SQL procedures, resolving high-cost cursors by splitting them into parameterized cursors and using inner loops to perform the needed operations much faster.
  • Extensively used SQL Loader to load data into the target tables.
  • Wrote control files based on both positional mapping and for comma-separated or tilde-separated values.
  • Provided support for the reporting team, which was using Crystal Reports on the database.
  • Analyzed and corrected the SQL queries to give the expected values.
  • Wrote UNIX shell scripts to schedule time-based loads for the cron jobs.
  • Implemented triggers that fired upon detecting only updates that occurred in the column for logging purposes.
  • Created views in order to support the reporting team and the BI team.
  • Performed tuning of views and resolved cross joins that occurred, which held up the BI team.
  • Extensively worked with the BA and QA teams to fix the defects that were logged through testing.
  • Created conceptual, logical and physical data models for use in different business areas.
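
The bulk-loading bullets above refer to the standard PL/SQL collection and bulk-bind pattern. A minimal sketch, assuming hypothetical `stg_orders` (staging) and `orders` (target) tables:

```sql
-- Sketch of batched BULK COLLECT + FORALL; table names are hypothetical.
DECLARE
  TYPE t_order_tab IS TABLE OF stg_orders%ROWTYPE;
  l_orders t_order_tab;
  CURSOR c_src IS SELECT * FROM stg_orders;
BEGIN
  OPEN c_src;
  LOOP
    -- Fetch rows in batches of 1000 to bound PGA memory use.
    FETCH c_src BULK COLLECT INTO l_orders LIMIT 1000;
    EXIT WHEN l_orders.COUNT = 0;
    -- One SQL-engine context switch per batch instead of per row.
    FORALL i IN 1 .. l_orders.COUNT
      INSERT INTO orders VALUES l_orders(i);
    COMMIT;
  END LOOP;
  CLOSE c_src;
END;
/
```

The `LIMIT` clause plus `EXIT WHEN collection.COUNT = 0` is the key to this pattern: it keeps memory flat on arbitrarily large source tables while still cutting per-row context-switch overhead.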

Oracle Database Developer

CVS Health
Woonsocket
08.2018 - 07.2019
  • Involved in production support for the entire decision-support team, working ticket queues and providing on-call support for most of the existing production systems.
  • Defined security system role profiles and responsibilities for business groups, providing access levels to users.
  • Responsible for creating and modifying the PL/SQL procedures, functions, triggers according to the business requirement - new tickets (enhancements)
  • Populated data into the database using Data Loader and ETL tools.
  • Used AJAX in suggestive search and to display dialog boxes with JSF, JavaScript, HTML and CSS for some front end
  • Created database objects like tables, views, procedures, packages
  • Used DBMS packages to implement scheduling and dynamic SQL, with extensive use of REF CURSORs.
  • Developed SQL Loader scripts for bulk data loads as part of daily data extraction
  • Responsible for creating indexes to increase performance and complex SQL queries for joining multiple tables
  • Enhanced DML query syntax for more powerful data access and processing.
  • Used TOAD to find and fix database problems with constraints, triggers, extents, indexes, and grants.
  • Tuned and optimized the SQL statements to improve the performance of the application using Explain Plan and SQL trace utilities
  • Created database triggers to block DML operations at improper times in transactional tables, enforcing security.
  • Created scripts to create new tables, views, queries for new enhancement in the application using TOAD
  • Worked in an Agile environment with constantly shifting priorities and business needs.
  • Provided best practices in database related issues to application team
  • Used combination of Java and SQL statements embedding them in Java programs for various issues
  • Responsible for developing reports as part of new enhancements.
  • Performed data migration using Data Loader, ETL tools, and Informatica.
  • Experience with UML, use cases, and the TFS asset management tool.
  • Involved in writing design requirement documents and the BRD.
  • Involved in writing Test documentation and performing unit, integration tests for the same modules in the existing application
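
The SQL*Loader bulk loads mentioned above are driven by control files; a minimal illustrative control file for a comma-separated daily feed (file, table, and column names are hypothetical):

```
LOAD DATA
INFILE 'daily_extract.csv'
APPEND INTO TABLE stg_daily_extract
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
(
  record_id,
  customer_name,
  load_dt  DATE "YYYY-MM-DD"
)
```

Swapping `TERMINATED BY ','` for `'~'` handles tilde-separated feeds, and fixed-width files use positional specifications (`POSITION(1:10)`) instead of delimiters.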

Data Warehouse Analyst

HDFC
India
01.2017 - 07.2018
  • Reviewed project requirements analysis document for requirements and Test scripts
  • Supported the design and development of an ETL strategy to perform environment and system-integration validation, including ETL using Informatica, Java data access, and web-services technology.
  • Conducted JAD sessions and Requirement gathering meetings with users, stakeholders, and subject matter experts to define and document Data Modeling requirements
  • Led and managed metadata, data dictionary, and data governance efforts.
  • Database used was Oracle 11g
  • Managed and coordinated deployment and testing efforts for system integration and user acceptance testing.
  • Supported the creation and execution of ETL and database test cases in each environment and release in the iterative development lifecycle.
  • Walked through the transformation-rules document and the test cases to confirm these rules were being tested.
  • Performed physical and logical data modeling using the IBM Rational Data Architect (RDA) CASE tool.
  • Reverse-engineered the physical and logical data models from the transactional database using the RDA CASE tool.
  • Provided project management support during the testing initiative.
  • Attended daily and weekly Scrum meetings to track progress and help resolve issues.

Education

Master’s - CS

Indiana Wesleyan University
01.2020

Bachelor’s - CS

Avanthi Degree College, Osmania University
01.2017

Skills

  • SQL
  • PL/SQL
  • C
  • UNIX Shell Script
  • XML
  • MySQL
  • Java
  • Oracle 19c
  • Oracle 18c
  • Oracle 12c
  • Oracle 11g
  • Oracle 10g
  • Oracle 9i
  • MS-Access 2003
  • SQL Server
  • Windows NT
  • Windows 95
  • Windows 98
  • Windows 2000
  • UNIX
  • Order Management
  • Inventory
  • BOM
  • Purchasing
  • Accounts Payables
  • HRMS
  • Receivables
  • SYSADMIN
  • AOL
  • TOAD
  • SQL Loader
  • SQL*Plus
  • Reports 6i
  • Forms 6i
  • Oracle Workflow Builder
  • XML/BI Publisher
  • JDeveloper
  • utPLSQL testing framework
  • APEX
  • PL/SQL Developer
  • Report2Web
  • SharePoint
  • Jira
  • Oracle Developer/2000
  • Bugzilla
  • SQR
  • Informatica
  • COGNOS
  • COBOL
  • FOCUS
  • WEB FOCUS
  • HTML
  • JavaScript
  • VBScript
  • Visual Basic
  • Access
  • People Tools
  • Data integration
  • Database design

Timeline

Sr. Oracle Database Developer

Mizuho Financial Group
01.2022 - Current

Sr. Oracle Database Developer

TD Bank
08.2019 - 12.2021

Oracle Database Developer

CVS Health
08.2018 - 07.2019

Data Warehouse Analyst

HDFC
01.2017 - 07.2018

Master’s - CS

Indiana Wesleyan University

Bachelor’s - CS

Avanthi Degree College, Osmania University