Highly experienced Data Engineer with proven expertise in Power BI and Snowflake. Brings 5 years of experience in designing data architectures, developing ETL and BI solutions, and optimizing performance. Designed robust data architectures and significantly reduced data load times.
Overview
5 years of professional experience
1 Certification
Work History
Lead Engineer
Nitor Infotech Pvt. Ltd.
02.2023 - Current
Built a FHIR resource handler, gathering and capturing resources in tabular format using Snowflake. Developed 40+ interfaces and supported junior engineers on their major/critical issues.
Responsible for training new joiners so they can onboard to the project and quickly become comfortable with it.
Actively handle real-time interface issues as they arise and help clients by providing solutions.
Designed robust data architectures and significantly reduced data load times.
Analyzed user requirements to develop software solutions and created technical specifications.
Developed, tested, debugged, and documented software programs using Python, HTML, and CSS.
Created custom roles in Azure Active Directory for access control management of users and groups across multiple subscriptions.
Implemented backup solutions for cloud-based systems using Azure Backup service.
Migrated legacy workloads from on-premise data centers to public cloud platforms like Microsoft Azure.
Created a positive learning environment by encouraging creativity, problem-solving, teamwork, and collaboration.
Ingested JSON and XML data into NoSQL objects.
Performed schema validation and audit log tracing.
Participated in data modeling, including both forward and reverse engineering.
Data Engineer
Zenoti
02.2022 - 02.2023
A product enhancement and data migration project focused on data integrity and mapping along with client satisfaction.
Constructed new ELT processes to move and transform data from various sources into Snowflake data warehouse, reducing data retrieval time by 20%
Collaborated with 3 cross-functional teams to design a data architecture strategy, resulting in a 40% increase in data accessibility
Created 30+ compelling Power BI reports that provided actionable insights for key stakeholders
Created procedures, UDFs, and triggers for business requirements.
Built a FHIR resource handler, gathering and capturing resources in tabular format using Snowflake.
Flattened complex JSON into tabular data sets in Snowflake.
Debugged SQL procedures and created procedures to handle the new approach to data transformation from practice management to EHR/EMR.
Helped new joiners and teammates with their challenges and roadblocks.
Product Engineer
Nextgen Healthcare
01.2021 - 02.2022
Primarily focused on designing and optimizing data architectures to meet the client's analytical and reporting needs.
Coordinated activities with other departments to ensure timely completion of projects.
Developed product design specifications and requirements.
Investigated customer complaints to determine root cause of product failure.
Managed and optimized performance of data pipelines for enhanced efficiency.
Handled JIRA requests for product enhancements and completed them within sprints.
Created procedures, UDFs, and triggers for business requirements.
Ensured the quality of data sent from different data sources.
Developed interfaces using the Mirth engine with JavaScript, HL7, EDI, and 837/835 files.
Configured high-usage alerts in Datadog and created Tableau dashboards for different servers.
Served as SME on interface issues, specifically code mapping and data syncing, using JavaScript, HL7, and basic Python.
Created pipelines using different transformations to migrate data from source to destination.
Responsible for data quality and data transformation from source to target.
Senior Integration Engineer
Sutherland Healthcare Solutions
10.2019 - 01.2021
A healthcare claims processing project involving the processing of scanned healthcare claims forms and their conversion into client-specified data files.
Performed data conversions for clients moving from hosted to on-premises environments.
Handled data conversion, data mapping, data cleansing, data quality checks, and data modelling for clients moving from third-party systems to on-premises/hosted environments.
Performed data extractions in both on-premises and hosted environments.
Worked closely with clients to migrate their data from on-premises to the cloud.
Integrated data from source databases and flat files into staging and then into the target using Informatica.
Extracted data from Oracle and flat files, then used SQL Server for data warehousing.
Responsible for enabling secure communication from the PM application to the clinical module, allowing providers and patients to see the required details.
This also helps maintain HIPAA compliance, as HL7 is used for data pushes and connections.
Responsible for data quality and data analysis from source to target.
The team used Informatica for data loads into the target environment.
Analyzed, cleansed, scrubbed, and formatted data from source to target.
Worked closely with the upstream team to generate extracted data as flat files.
Created various mappings, database connections, mapping parameters, and variables.
Moved data from hosted environments to on-premises servers.
Performed ETL testing, pipeline creation, business analysis, manual testing, and data audits for projects done by other team members; gave knowledge transfer to new joiners on the technical, process, and workflow aspects of US healthcare (Medicare and Medicaid payers).
Maintained hard-coded templates, framing and updating them as per business needs while integrating.
ETL/ELT: Informatica PowerCenter/Cloud, SSIS, Matillion, AWS Glue, Pentaho
Markup languages: HTML, XML
Data cleaning tools: Advanced Excel, Notepad
Oral and written communications
Design reviews
Amazon Web Services
Data modeling tools: SQL Data Modeler (Oracle), Erwin Data Modeler
Key Contribution Area
Constructed new ELT processes to move and transform data from various sources into the data warehouse, reducing data retrieval time by 20%
Collaborated with 3 cross-functional teams to design a data architecture strategy, resulting in a 40% increase in data accessibility
Built a FHIR resource handler, gathering and capturing resources in tabular format using Snowflake. Joined calls with the client to understand the data structure and trends, incorporating business requirements through workarounds and technical solutioning.
Technology Competencies
JavaScript
Informatica PowerCenter 10.x
Pentaho
Matillion
Snowflake
MySQL
Data warehousing
Data modeling
Python
Data visualization (Power BI, DAX queries)
Cloud platforms (AWS, GCP)
Project Highlights
Data Engineer (Interoperability Engineer): a product enhancement and data migration project focused on data integrity and mapping along with client satisfaction. Tools: SQL, HL7, FHIR, FileZilla, WinSCP, Matillion, Python, Mirth Connect, JavaScript, XML, HTML, REST API, Power BI. Delivered the project within the TAT, holding bi-weekly calls with the client to validate the data before pushing it to the production environment. Created ETL/ELT pipelines and worked on data mapping, setup, unit testing, pre-checks, and data loads to ingest data from sources and transform it for meaningful business visualization.
Product Engineer (Product Engineering): primarily focused on designing and optimizing data architectures to meet the client's analytical and reporting needs. Tools: SQL, Snowflake, HL7, FHIR, FileZilla, WinSCP, Informatica PowerCenter, Python, Mirth Connect, JavaScript, XML, HTML. Delivered the project within the TAT, holding bi-weekly calls with the client to validate the data before pushing it to the production environment. Responsible for validating the vendor's specifications, creating workarounds, handling connection setup with different tools, and implementing data bridges for the client.
Senior Integration Engineer (Senior Engineer): a healthcare claims processing project involving the processing of scanned healthcare claims forms and their conversion into client-specified data files, with data loaded into different relational database instances (SSMS). Claims processing involved data capture, validation, quality checks, and a final quality check, along with manual testing of the applications required for these activities; the work was carried out in onsite/offshore mode with an average team size of 12 associates. Tools: Pentaho ETL, SQL, SSMS, HL7, FHIR resources, JavaScript, XML, HTML. Performed data mapping and integration, validating data once migrated from source to stage. Worked on AWS EC2 instances, creating security groups and assigning them to different host systems. Extracted data from the legacy system and loaded it onto the staging site in the new system using an ETL tool. Created validation reports to validate records from source to target; once validated, created interfaces from the new software to the different EHRs/EMRs.
Educational Background
B.Tech (Mechanical), WBUT
Diploma (Mechanical), WBSCTE