Anitha Joshi

Bangalore

Summary

Highly skilled Snowflake professional with a proven track record of designing and implementing data migration strategies, resulting in significant improvements in data retrieval time and accessibility. Adept at providing data analysis, training, and support to drive revenue growth, improve customer satisfaction, and increase team productivity. Proficient in programming languages such as Python. Experienced with logical and physical data modeling, Teradata, and IBM InfoSphere DataStage in the areas of data migration, data integration, and data cleansing for Financial Markets Services Delivery.

  • Designing and developing machine learning models and algorithms that address specific business needs, from recommendation systems to predictive analytics.
  • Collecting, cleaning, and analyzing data from various sources to identify patterns, trends, and insights that drive decision-making.
  • Collaborating with cross-functional teams, including data scientists, software developers, and domain experts, to integrate AI solutions into existing systems.
  • Evaluating the performance of AI models, fine-tuning parameters, and optimizing algorithms to achieve desired outcomes.
  • Ensuring data privacy and security in AI applications, complying with regulations and industry standards.
  • Strong applied knowledge of software development through implementation, including good design, documentation, and supportability practices.
  • Strong understanding of advanced data warehouse concepts (factless fact tables, temporal and bi-temporal models, etc.).
  • Detail oriented with strong organizational and analytical skills.
  • Strong experience in data warehousing using Teradata, DataStage, Erwin, and decision support system tools such as Essbase.
  • Strong understanding of and work experience with conceptual, physical, and semantic data models.
  • Extensive experience across the full development SDLC, including requirement analysis, design, development, and testing.
  • Sound knowledge of the Teradata database; worked extensively on the Teradata FSLDM model and utilities such as MLOAD, FLOAD, and BTEQ.
  • Extensive work with DataStage Parallel Extender stages such as Join, Funnel, Remove Duplicates, Merge, and Lookup, making heavy use of reject links, job parameters, and stage variables when developing jobs.
  • Development efforts included reengineering ETL processes using Orbiter for data integration, data migration, and data cleansing.

Overview

15 years of professional experience
1 certification

Work History

Lead Data Analyst

AT&T Communication Services India Pvt. Ltd.
Bangalore
08.2023 - Current
  • Learnt and adapted new technologies to build cutting-edge artificial intelligence products
  • Defined and designed a new data analytics platform for a global team of Solutions Architects
  • Architected and implemented robust data solutions for clients, resulting in a 20% growth in the analytics service line
  • Successfully created an EDW from base OLTP source systems to deliver BI reporting
  • Supported multiple BUs for both real-time and historical reporting
  • Involved in creating a BI product, helping business teams achieve their productivity and profitability goals through data insights and analytics
  • Involved in all phases of the SDLC, from requirement gathering, design, development, and testing to production, user training, and support for the production environment
  • Created new mapping designs using various tools in Snowflake
  • Developed mappings using the required transformations in Snowflake according to technical specifications
  • Created complex mappings that implemented business logic to load data into the staging area
  • Developed mappings/sessions using Snowflake for data loading
  • Performed data manipulations using various transformations such as Filter, Expression, Lookup (connected and unconnected), Aggregate, Update Strategy, Normalizer, Joiner, Router, Sorter, and Union
  • Developed workflows using the Task Developer, Worklet Designer, and Workflow Designer in Workflow Manager, and monitored the results using Workflow Monitor
  • Built reports according to user requirements
  • Extracted data from Oracle and SQL Server, then used Teradata for data warehousing
  • Implemented slowly changing dimension (SCD) methodology for accessing the full history of accounts (a brief SQL sketch follows below).
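
As an illustration of the SCD approach mentioned above, the following is a minimal Type 2 sketch in Snowflake-style SQL; the dim_account and stg_account tables and their columns are hypothetical placeholders, not the actual project schema.

    -- Step 1: close out the current dimension row for accounts whose tracked attribute changed
    UPDATE dim_account
    SET    effective_end_date = CURRENT_DATE,
           is_current = FALSE
    FROM   stg_account s
    WHERE  dim_account.account_id = s.account_id
      AND  dim_account.is_current = TRUE
      AND  dim_account.account_status <> s.account_status;

    -- Step 2: insert a fresh current version for accounts with no open row
    -- (brand-new accounts, plus the changed accounts just closed in step 1)
    INSERT INTO dim_account
           (account_id, account_status, effective_start_date, effective_end_date, is_current)
    SELECT s.account_id, s.account_status, CURRENT_DATE, NULL, TRUE
    FROM   stg_account s
    LEFT JOIN dim_account d
           ON d.account_id = s.account_id
          AND d.is_current = TRUE
    WHERE  d.account_id IS NULL;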

Data Specialist

IBM India Pvt Ltd
10.2014 - 08.2023
  • Developed and approved strategies for continuous improvement in data management
  • Provided guidance and leadership to perform analysis and deliver solutions
  • Assisted with coordination of database changes within the context of the overall configuration management process
  • Identified, recommended, and implemented emerging modeling and IT trends, developments, tools, and improvements/solutions
  • Managed multiple assignments simultaneously, while working independently and with other engineers and SMEs
  • Collaborated with cross-functional teams in support of business case development and identifying modeling method(s) to provide business solutions
  • Prepared Functional Specification and Mapping documents for the requirements
  • Coordinated with the development team in solving technical problems, mainly pertaining to complex domain areas
  • Created tables and views with standardization as per the client requirement
  • Created sanity-check reports for performance checks on newly created or modified entities
  • Created indexes and performed tuning of the code
  • Created partitions for large tables (an illustrative DDL sketch follows after this list)
  • Created a capacity plan to estimate the required database size
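
As a rough illustration of the table partitioning mentioned above, here is a minimal Teradata-style DDL sketch (Teradata is assumed from the project environments listed later; the txn_history table and its columns are hypothetical placeholders):

    -- Hypothetical large fact table partitioned by month, so queries constrained
    -- on a date range only scan the relevant partitions
    CREATE TABLE txn_history
    (
      txn_id      BIGINT        NOT NULL,
      account_id  INTEGER       NOT NULL,
      txn_date    DATE          NOT NULL,
      txn_amount  DECIMAL(18,2)
    )
    PRIMARY INDEX (account_id)
    PARTITION BY RANGE_N (
      txn_date BETWEEN DATE '2015-01-01' AND DATE '2025-12-31'
               EACH INTERVAL '1' MONTH
    );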

Technical Lead

Wipro Technologies
06.2012 - 09.2012
  • Analyzed RFQs and initiated the study based on the requirements
  • Involved in requirement gathering for the MIS data mart
  • Provided estimations based on the study
  • Prepared the MLPD (Middle Level Platform Design) and Functional Specification
  • Worked as onsite coordinator, helping the offshore team during build and unit testing
  • Supported System Testing and User Acceptance Testing from onsite as well as offshore
  • Tested the mappings to ensure that the data is loaded as per needs
  • Made changes to existing mappings as per changing business requirements
  • Handled resource allocation and review of day-to-day work

Application Consultant

Optimum Solutions Pte Ltd
03.2010 - 10.2011
  • Involved in requirement gathering for the MIS data mart
  • Provided estimations based on the study
  • Prepared the design document based on the requirements
  • Involved in preparing the mapping document for the requirements with the business team and source systems
  • Developed DataStage jobs based on the design
  • Worked with a third-party vendor to get the file transfer configured
  • Created standards documents such as technical specifications, functional specifications, unit test cases, and test results
  • Supported System Testing and User Acceptance Testing
  • Tested the mappings to ensure that the data is loaded as per needs
  • Made changes to existing mappings as per changing business requirements
  • Implemented to production once sign-off was received from users
  • Migrated the production server from one server to another

Education

Bachelor of Science (Computer Science) -

National College (Bangalore University)
01.2004

II PUC -

National College, Bangalore (Pre-University)
01.2001

10th -

National School
01.1999

Skills

  • RDBMS: Snowflake, Teradata Vantage, Oracle, Databricks, MS SQL
  • ETL Tools: IBM InfoSphere DataStage, Informatica
  • Programming Languages: SQL, Shell scripting, Python
  • Concepts: DBMS, data warehousing, dimensional modeling
  • Modeling Tools: Toad, Erwin
  • Cloud Technologies: Azure, AWS

Certification

  • SnowPro Core Certified - Snowflake
  • Teradata Vantage Associate
  • Teradata Vantage Cloud Lake
  • InfoSphere DataStage v11.3

Accomplishments

  • Awarded Star Performer of the Month in 2021.
  • Awarded Star Performer of the Month in 2020.
  • Awarded Star Performer of the Month in 2019.
  • Winner of the "Team Spirit" award, given in recognition of outstanding work and being a good team player.
  • Delivered multiple projects without any defects.
  • Ensured 100% compliance with all Acxiom-stipulated SLAs and deadlines.

Data Warehousing and BI Projects

  • AT&T Communication Services India Pvt. Ltd., Bangalore - Lead Data Analyst
    Environment: Windows 11x, Snowflake, Databricks; 16
    Objective: The main objective of this project is to load data into Snowflake so that downstream systems can use it for analysis. The loaded data is used for different kinds of analysis, such as chronic callers, repeat calls, and chat analysis. For the chat analysis, the ask was to download bulk turn-by-turn data from the IAX website, which holds all the chat data between customers and agents, and to work out how to download the complete chat data when multiple agents were involved.
    Responsibilities: Learnt and adapted new technologies to build cutting-edge artificial intelligence products; defined and designed a new data analytics platform for a global team of Solutions Architects; architected and implemented robust data solutions for clients, resulting in a 20% growth in the analytics service line; created an EDW from base OLTP source systems to deliver BI reporting; supported multiple BUs for both real-time and historical reporting; involved in creating a BI product, helping business teams achieve their productivity and profitability goals through data insights and analytics; involved in all phases of the SDLC, from requirement gathering, design, development, and testing to production, user training, and support; created new mapping designs using various tools in Snowflake; developed mappings using the required transformations in Snowflake according to technical specifications; created complex mappings that implemented business logic to load data into the staging area; developed mappings/sessions using Snowflake for data loading; performed data manipulations using transformations such as Filter, Expression, Lookup (connected and unconnected), Aggregate, Update Strategy, Normalizer, Joiner, Router, Sorter, and Union; developed workflows using the Task Developer, Worklet Designer, and Workflow Designer in Workflow Manager and monitored results with Workflow Monitor; built reports according to user requirements; extracted data from Oracle and SQL Server, then used Teradata for data warehousing; implemented slowly changing dimension methodology for accessing the full history of accounts.
  • Nationwide Banking Society (NBS), Bangalore - Senior Teradata Developer
    Environment: Windows 10x and AIX 5.3, Teradata V16 and UNIX; 13
    Objective: The objective of this project is to load data into the One Application from various source systems and downstream applications as per the business requirement. Apps One is to provide a single data view; however, only selected data of the customer base will be migrated. The group needs an accurate and as complete as possible single customer view to drive customer insight and, in turn, revenue generation from marketing and leads activities.
    Responsibilities: Developed and approved strategies for continuous improvement in data management; provided guidance and leadership to perform analysis and deliver solutions; assisted with coordination of database changes within the context of the overall configuration management process; identified, recommended, and implemented emerging modeling and IT trends, developments, tools, and improvements/solutions; managed multiple assignments simultaneously, while working independently and with other engineers and SMEs; collaborated with cross-functional teams in support of business case development and identifying modeling method(s) to provide business solutions; prepared Functional Specification and Mapping documents for the requirements; coordinated with the development team in solving technical problems, mainly pertaining to complex domain areas; created tables and views with standardization as per the client requirement; created sanity-check reports for performance checks on newly created or modified entities; created indexes and performed tuning of the code; created partitions for large tables; created a capacity plan to estimate the required database size.
  • Dubai Parks and Recreations (DPR), Bangalore - Project Lead
    Environment: Windows 9x and AIX 5.3, DataStage 11.x PX, Netezza, and UNIX; 3
    Objective: The main objective of this project was to calculate revenue, tickets, and admissions. The purpose was to load the fact and dimension tables; once these were loaded, a flash report would be generated for the calculation of revenue and admissions.
    Responsibilities: Ensured the jobs ran properly every day and the data got loaded; ensured reports were generated on time and sent to the business users; answered all business users' queries about the reports; developed the DataStage jobs based on the given requirements; involved in analysis and debugging of the jobs and reports; worked with the DW development team to implement data strategies, build data flows, and develop conceptual data models; worked with the lead to establish standards around flexibility of the data model to ensure deviations stay within established guard rails; worked with development, application, and other data architects to provide an overall solution approach; identified and recommended process improvements that significantly reduce workloads or improve quality for assigned areas of responsibility; provided technical assistance and mentoring to staff for developing conceptual and logical database designs and the physical database; prepared Functional Specification and Mapping documents for the requirements.
  • Bank of America (BOA), Bangalore - Senior DataStage Developer
    Environment: Windows 9x and AIX 5.3, DataStage 11.x PX, DB2, and UNIX; 8
    Objective: The objective of this project is to migrate the Merrill Lynch legacy systems to Bank of America systems. Data transformation, lookups, and related processing are performed as per the business rules. The end objective of the migration is to reconcile Merrill Lynch and Bank of America data for each business day. However, the migration covers only a selected product data view of circa 85% of the combined customer base. The group needs an accurate and as complete as possible single customer view to drive customer insight and, in turn, revenue generation from marketing and leads activities.
    Responsibilities: Analyzed RFQs and initiated the study based on the requirements; involved in requirement gathering; identified the source systems and carried the data to the required target tables as per needs; provided estimations based on the study; performed performance tuning of the batch run and database-level performance tuning; made changes to existing mappings as per changing business requirements; supported System Testing and User Acceptance Testing from offshore; tested the mappings to ensure that the data is loaded as per needs; handled resource allocation and review of day-to-day work; was also involved in resource estimation for the requirement.
  • RB (Reckitt Benckiser) Healthcare, Bangalore - Technical Lead
    Environment: Windows XP and AIX 5.3, SQL Server, DataStage 11.x PX, and Connect Direct; 10
    Objective: The objective of this project is to migrate site-related data to Optiva, a healthcare front end. Data is extracted from the source and moved to staging; from staging, the data is transformed as per Optiva requirements, all the business rules are applied, and the data is migrated to Optiva.
    Responsibilities: Analyzed RFQs and initiated the study based on the requirements; involved in requirement gathering; identified the source systems and carried the data to the required target tables as per needs; provided estimations based on the study; prepared the mapping data sheet (MDS) for each of the subject areas; performed performance tuning of the batch run and database-level performance tuning; made changes to existing mappings as per changing business requirements; supported System Testing and User Acceptance Testing from offshore; tested the mappings to ensure that the data is loaded as per needs; handled resource allocation and review of day-to-day work; was also involved in resource estimation for the requirement.
  • RSA (Royal and Sun Alliance) Insurance, Bangalore - Technical Lead
    Environment: Windows XP and AIX 5.3, SQL Server, DataStage 11.x PX, and Connect Direct; 10
    Objective: The objective of this project is to standardize the claim incident types for all UKRIS claims so that rollups aggregate correctly based on the Transactor premium file. These files are converted as per the business mapping and sent to downstream systems so that SAP reconciliation matches and the data can be used to make claim processing easier for the customer. The business team wants to keep the premium files in control on the claims and cancel the required errors so that customers do not lose any claim processing from RSA, and to capture claims on the balance sheet as per the business requirements.
    Responsibilities: Analyzed RFQs and initiated the study based on the requirements; involved in requirement gathering; identified the source systems and carried the data to the required target tables as per needs; provided estimations based on the study; prepared the Technical Design Document (TDD) from the Conversion Design Document (CDD) supplied by the onsite coordinator (the CDD is the high-level design document prepared by the onsite technical and business analysts); performed performance tuning of the batch run and database-level performance tuning; made changes to existing mappings as per changing business requirements; supported System Testing and User Acceptance Testing from offshore; tested the mappings to ensure that the data is loaded as per needs; handled resource allocation and review of day-to-day work.
  • ANZ Bank, Bangalore - Technical Lead
    Environment: Windows XP and AIX 5.3, Teradata V12.1, DataStage 11.x PX, and Connect Direct; 6
    Objective: The objective of the project is to provide regulatory details about the bank to the Australian Government. The government needs regulatory details from ANZ Bank, which can be generated by data warehousing means. The group needs an accurate and as complete as possible single customer view to drive customer insight and, in turn, revenue generation from marketing and leads activities.
    Responsibilities: Analyzed RFQs and initiated the study based on the requirements; involved in requirement gathering; identified the source systems and carried the data to the required target tables as per needs; provided estimations based on the study; prepared the Technical Design Document (TDD) from the Conversion Design Document (CDD) supplied by the onsite coordinator (the CDD is the high-level design document prepared by the onsite technical and business analysts); enhanced the existing data model, which is based on the FSLDM; created tables and views with industry-level standardization; created indexes and performed tuning of the code; created partitions for large tables; created a capacity plan to estimate the required database size; performed performance tuning of the batch run and database-level performance tuning; made changes to existing mappings as per changing business requirements; involved in estimating the effort required for the modification/development of reports; supported System Testing and User Acceptance Testing from offshore; tested the mappings to ensure that the data is loaded as per needs; handled resource allocation and review of day-to-day work.
  • Lloyds Banking Group (LTSB), Bristol - DataStage Designer
    Environment: Windows XP and AIX 5.3, Teradata V12.1, DataStage 8.1 PX, and Connect Direct; 15
    Objective: The objective of this project is to load data into the Group Data Warehouse (GDW) from various source systems and downstream applications as per the business requirement. For Lloyds Banking Group prior to the HBOS acquisition, GDW provided a single customer view; however, migrating only selected product data at Release C means that GDW will provide an LBG view of circa 85% of the combined customer base. The group needs an accurate and as complete as possible single customer view to drive customer insight and, in turn, revenue generation from marketing and leads activities.
    Responsibilities: Analyzed RFQs and initiated the study based on the requirements; involved in requirement gathering for the MIS data mart; provided estimations based on the study; prepared the MLPD (Middle Level Platform Design) and Functional Specification; worked as onsite coordinator, helping the offshore team during build and unit testing; supported System Testing and User Acceptance Testing from onsite as well as offshore; tested the mappings to ensure that the data is loaded as per needs; made changes to existing mappings as per changing business requirements; handled resource allocation and review of day-to-day work.
  • Standard Chartered Bank (SCB), Financial Markets Services Delivery, Singapore - ETL Consultant
    Environment: Windows XP and Linux, Teradata 12.1, DataStage 8.1 PX, Control-M, and PM4 Services; 15
    Objective: FMETaL (Financial Markets ETL) is an interface application within Financial Markets that simplifies complexity and provides standardization for interfacing systems in real time and batch. Financial Markets Service Delivery in SCB captures data from the various processing systems and provides a single point of contact for management information and decision systems. FMETaL creates a centralized repository of information for derivatives, equities, bonds, repos, forex, etc., enabling faster and more efficient system integration, a scalable, manageable, and reliable solution across all verticals within Financial Markets, and a timely, responsive service provided by a pool of specialists dedicated to Financial Markets.
    Responsibilities: Involved in requirement gathering for the MIS data mart; provided estimations based on the study; prepared the design document based on the requirements; involved in preparing the mapping document for the requirements with the business team and source systems; developed DataStage jobs based on the design; worked with a third-party vendor to get the file transfer configured; created standards documents such as technical specifications, functional specifications, unit test cases, and test results; supported System Testing and User Acceptance Testing; tested the mappings to ensure that the data is loaded as per needs; made changes to existing mappings as per changing business requirements; implemented to production once sign-off was received from users; migrated the production server from one server to another.
