Results-driven Data Analyst with 7+ years of experience, including 2 years in e-discovery. Specializing in Snowflake, SQL, Tableau, Excel, Python, Unix, requirement gathering, Jira, Confluence, Nuix, EnCase, and Relativity. Demonstrated success in managing end-to-end product development lifecycles, fostering innovation, and delivering measurable outcomes. Proficient in real-time, inventory-centric MIS, business impact analysis, and communicating performance metrics. Collaborative communicator skilled in resolving customer issues, improving operational efficiency, and leading cross-functional projects. Experienced with business intelligence tools and internal audits, and a contributor to enterprise-level projects aimed at improving GOCM and process efficiencies.
· Identified and gathered relevant data to support business inquiries and track Key Performance Indicators (KPIs).
· Developed and maintained accessible data sets for diverse business needs, ensuring data quality and consistency.
· Conducted comprehensive data exploration and cleansing, assessing data integrity and proposing enhancements.
· Utilized analytical tools such as Snowflake, SQL, Tableau, Python, and Excel to describe, analyze, and visualize data sets.
· Created and updated reports and dashboards to monitor business performance and track asset utilization (see the illustrative query below).
· Collaborated closely with stakeholders and cross-functional teams on data-related projects to understand organizational data dynamics.
· Presented data insights and recommendations to non-technical audiences, facilitating informed decision-making.
· Technology – Snowflake, Jira, SQL, ADF, Tableau, Excel
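For illustration, a minimal sketch of the kind of Snowflake query behind such a utilization dashboard; the table and column names (asset_usage_log, dim_asset, hours_used) are hypothetical placeholders rather than actual project objects.

    -- Illustrative Snowflake query feeding a monthly asset-utilization KPI.
    -- All object names are hypothetical.
    WITH monthly_usage AS (
        SELECT
            DATE_TRUNC('month', usage_date) AS usage_month,
            asset_id,
            SUM(hours_used)                 AS hours_used
        FROM asset_usage_log
        GROUP BY 1, 2
    )
    SELECT
        u.usage_month,
        d.asset_category,
        SUM(u.hours_used)          AS total_hours_used,
        COUNT(DISTINCT u.asset_id) AS assets_in_use,
        ROUND(SUM(u.hours_used) / (COUNT(DISTINCT u.asset_id) * 720) * 100, 2)
                                   AS utilization_pct  -- assumes ~720 available hours per month
    FROM monthly_usage u
    JOIN dim_asset d
      ON d.asset_id = u.asset_id
    GROUP BY u.usage_month, d.asset_category
    ORDER BY u.usage_month, d.asset_category;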
Worked closely with business users to gather, interpret, and document user and data requirements in Business Requirement Documents (BRDs), providing input on source data extraction criteria.
Imported data from various sources and performed data pre-processing and ETL tasks, including custom queries, joins, and data blending.
Proactively identified potential concerns and risks, sharing risk-management best practices with the team.
Utilized SnowSQL and Snowpipe for continuous data loading and implemented bulk data loads with COPY INTO (see the load sketch below).
Created data sharing between Snowflake accounts, established internal and external stages, and transformed data during load.
Redesigned Views in Snowflake to optimize performance and enhance data retrieval efficiency.
Developed reports in Tableau based on Snowflake connections, ensuring data accuracy and validating reports against the Snowflake database.
Provided rules for data standardization across multiple sources and supported UAT and PROD implementation.
Demonstrated proficiency in complex SQL and PL/SQL, along with theoretical knowledge of Azure, and contributed to the initial phase of the project as Data Lead.
Created Talend mappings to populate dimension and fact tables, ensuring accurate data transfer.
Validated data loaded from MySQL into Snowflake to confirm source-to-target alignment (see the validation sketch below).
Initiated data governance efforts using the BigID tool, focusing on identifying personally identifiable information (PII) in Microsoft databases and generating reports.
Established a standardized process for identifying PII data, including engagement with DB owners, scanning databases, performing QC checks, and obtaining approvals for data access and reporting.
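For illustration, a minimal sketch of the Snowflake bulk and continuous load pattern described above; the stage, file format, pipe, and table names (sales_stage, csv_ff, raw.sales_orders) are hypothetical, and the storage integration/credentials for the external stage are omitted.

    -- Illustrative bulk load and Snowpipe setup; all object names are hypothetical.
    CREATE OR REPLACE FILE FORMAT csv_ff
        TYPE = 'CSV' FIELD_OPTIONALLY_ENCLOSED_BY = '"' SKIP_HEADER = 1;

    CREATE OR REPLACE STAGE sales_stage
        URL = 's3://example-bucket/sales/'   -- placeholder; storage integration omitted
        FILE_FORMAT = csv_ff;

    -- One-off bulk load with COPY INTO
    COPY INTO raw.sales_orders
        FROM @sales_stage
        PATTERN = '.*orders.*[.]csv'
        ON_ERROR = 'CONTINUE';

    -- Continuous loading with Snowpipe (auto-ingest on new files)
    CREATE OR REPLACE PIPE raw.sales_orders_pipe AUTO_INGEST = TRUE AS
        COPY INTO raw.sales_orders
        FROM @sales_stage;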
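For illustration, a minimal source-to-target validation sketch for the MySQL-to-Snowflake checks noted above; customer_orders is a hypothetical table assumed to exist in both systems, and the two result sets are compared after each load.

    -- Run on MySQL (source):
    SELECT COUNT(*)          AS row_count,
           SUM(order_amount) AS amount_total,
           MAX(updated_at)   AS last_updated
    FROM customer_orders;

    -- Run on Snowflake (target), then compare the two result sets:
    SELECT COUNT(*)          AS row_count,
           SUM(order_amount) AS amount_total,
           MAX(updated_at)   AS last_updated
    FROM analytics.customer_orders;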
Processed ESI discovery data using specialized platforms to extract text and metadata from electronic documents, ensuring precise and efficient data extraction.
Created and maintained schema objects such as tables, views, stored procedures, and triggers to uphold referential integrity and optimize data management.
Loaded data from flat data files and SQL server database tables using SQL queries, enhancing data accessibility and usability.
Spearheaded the development of processes for specific components of the eDiscovery mandate, ensuring compliance and efficient execution.
Developed and executed SQL queries with multi-table joins, group functions, subqueries, and set operations to extract and analyze data effectively (see the example query below).
Collaborated with the Data Mapping team to analyze externally received metadata files, map the data to standard system database tables, and apply transformations based on business requirements.
Facilitated effective communication with cross-functional teams to gather requirements, address issues, and ensure seamless execution of eDiscovery processes.
Utilized SQL, data editors, regular expressions, Microsoft Excel, data forensic software (e.g., Nuix), and Relativity to address client requests and perform various tasks efficiently.
Processed electronic litigation data using data forensic software and data extraction tools, ensuring accurate and efficient data processing.
Hosted processed data on the Confidential Relativity platform, providing clients with secure and convenient online access.
Cleaned, edited, imported, and exported client-provided data while ensuring data integrity and advising on optimal data handling methods.
Troubleshot issues related to data hosting, metrics tracking, and reporting, resolving them promptly to maintain smooth operations.
Developed and implemented new processes for the processing team, resulting in improved data load time and operational efficiency.
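For illustration, an example of the kind of query described above, combining multi-table joins, group functions, a subquery, and a set operation; the custodians, documents, and review_batches tables are hypothetical eDiscovery-style examples, not actual case data.

    -- Illustrative reporting query; all object names are hypothetical.
    SELECT c.custodian_name,
           COUNT(d.doc_id)     AS reviewed_docs,
           MIN(d.date_created) AS earliest_doc,
           MAX(d.date_created) AS latest_doc
    FROM custodians c
    JOIN documents d
      ON d.custodian_id = c.custodian_id
    WHERE d.doc_id IN (SELECT doc_id
                       FROM review_batches
                       WHERE batch_status = 'Complete')
    GROUP BY c.custodian_name

    UNION ALL

    -- Custodians with no documents collected yet
    SELECT c.custodian_name, 0, NULL, NULL
    FROM custodians c
    WHERE NOT EXISTS (SELECT 1
                      FROM documents d
                      WHERE d.custodian_id = c.custodian_id);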
Data Analysis, Jira, Confluence, Pentaho
Project Management, Business Analysis
Requirement Gathering, Project Planning, and Documentation
Client Communication
Agile Methodology
SDLC (Software Development Lifecycle), Object-Oriented Concepts
· Tableau Analyst
· Microsoft Certified: Azure Data Engineer Associate
· Microsoft Certified: Azure Data Fundamentals
· Microsoft Certified: Azure Fundamentals