Software developer with 8 years of experience building programs to specific requirements. Adaptable, diligent, and knowledgeable in Python, C#, and VBA.
Created automations to collate data and generate monthly reports
Worked alongside data analysts to build Python- and Selenium-based web scrapers to harvest data
Responsible for migrating existing C# scripts to Python
Analyzed the existing C# model, which performs fintech calculations to compute ratings
Worked alongside data analysts and data scientists to understand the flow of models, mainly for clients in the APAC and EMEA regions
Debugged and maintained PySpark job scripts
Optimized PySpark job code to make processing faster
Project - SFAddin (C#)
This in-house plugin computes financial queries in Excel
Responsible for adding new features to the DLL wrapper, which in turn calls the project's main REST API
Debugged any issues or bugs found in production
Spearhead technical aspects within Scalable Data Department as a key member of the newly formed "Data Integration & Management" team.
Collaborate in an agile environment with highly skilled colleagues, utilizing languages such as Python and Java to bring data products, including machine learning models, into production.
Work closely with data scientists, employing SQL and Big Data tools, to extract meaningful insights and reveal substantial business value through efficient data integration and management processes.
Lead the design and development of robust software solutions, employing cutting-edge technologies, to ensure compliance with regulatory data-related requirements. This includes implementing automated deletions required by data protection rules, aligning with the Engineering Department's core competencies.
Serve as a technical bridge between data teams, engineering teams, and legal departments, facilitating seamless communication and collaboration.
Drive the adoption of modern software development practices, actively participating in agile teams, implementing continuous integration and deployment pipelines, overseeing test automation strategies, and leveraging cloud-based infrastructure for optimal scalability.
Apply a security-focused mindset to write clean, testable, and maintainable code, with a high emphasis on security aspects throughout the software development lifecycle.
SENIOR SOFTWARE DEVELOPER
COFORGE PVT LTD
07.2021 - 08.2022
Worked as a Software Developer
Jobs and responsibilities were as follows:
Project - Noon Data Matching and Data Mining
Supported and debugged the existing matching automation
Wrote configuration files to run against different catalogs
Initiated web-crawling scripts for various domains
Collated the data and produced reports
Automated and monitored the currently running scripts
Worked alongside data analysts to build Python- and Selenium-based web scrapers to harvest data
Wrote microservices in Go
Used continuous integration/continuous delivery (CI/CD) tools such as Docker and Jenkins to deploy applications on AWS
Designed, built, and managed an ELK (Elasticsearch, Logstash, Kibana) cluster for centralized logging and in-app search
Project - Data Mining Project
Supported and debugged the existing tool, built on Python, Flask, Django, and RabbitMQ
Created Python test scripts using pytest for data validation
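The pytest-based data-validation work described above can be sketched as follows; the record schema (name, price, country) and the validation rules are hypothetical, chosen only to illustrate the pattern:

```python
# Hypothetical data-validation helpers in the style of the pytest scripts
# described above; the schema and rules are illustrative, not the real ones.

def validate_record(record):
    """Return a list of validation errors for one data record."""
    errors = []
    if not record.get("name"):
        errors.append("name is missing")
    price = record.get("price")
    if not isinstance(price, (int, float)) or price < 0:
        errors.append("price must be a non-negative number")
    if record.get("country") not in {"IN", "AE", "SA"}:
        errors.append("country is not in the allowed set")
    return errors


# pytest collects any top-level function named test_* and runs it
def test_valid_record_passes():
    assert validate_record({"name": "SKU-1", "price": 9.5, "country": "IN"}) == []


def test_missing_price_fails():
    errors = validate_record({"name": "SKU-2", "country": "AE"})
    assert "price must be a non-negative number" in errors
```

Running `pytest` against such a file reports each failing rule per record, which is what makes this style convenient for bulk data validation.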
SOFTWARE DEVELOPER
AFFLUENT GLOBAL SOLUTIONS
07.2020 - 03.2021
Worked as a Software Developer
Jobs and responsibilities were as follows:
Project- RBQCM Tool
Intermediate knowledge of packages such as pandas, requests, Beautiful Soup 4, and SQLAlchemy
Managed large datasets using pandas DataFrames and MySQL
Created numerous APIs for a Flask-based web tool to move data between the database and views
Integrated views with controllers and mapped routes in Flask projects
Worked alongside UI developers to integrate views into the front end
Project - Endorsement Billing Tool
Extracted information from SQL views into Excel-based views
Wrote DAX queries in Power BI to extract and clean data and generate reports
Automated tasks for clients in scripting languages such as Python, VBA, and C#
Project - Insurer MI Tool
Summarized insurer data, drew insights from repeating patterns, and generated PowerPoint (PPT) files
Built Python automation scripts to extract data from Excel files and load it into Microsoft PowerPoint presentations
Built charts and diagrams in PowerPoint presentations using Python scripts
Analyzed and cleaned large datasets
Built Python web-crawler scripts to scrape data from various websites
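A minimal stand-in for the scraping scripts described above, using only the standard library's HTML parser (the actual scripts used requests, Beautiful Soup, and Selenium); the `class="price"` markup is an assumed page structure:

```python
# Toy price extractor using only the stdlib, illustrating the kind of
# scraping described above; the "price" class name is a made-up example.
from html.parser import HTMLParser


class PriceParser(HTMLParser):
    """Collects the text of elements whose class attribute is 'price'."""

    def __init__(self):
        super().__init__()
        self._in_price = False
        self.prices = []

    def handle_starttag(self, tag, attrs):
        if dict(attrs).get("class") == "price":
            self._in_price = True

    def handle_endtag(self, tag):
        self._in_price = False

    def handle_data(self, data):
        if self._in_price and data.strip():
            self.prices.append(data.strip())


html = '<div><span class="price">AED 49.00</span><span class="price">AED 12.50</span></div>'
parser = PriceParser()
parser.feed(html)
print(parser.prices)  # ['AED 49.00', 'AED 12.50']
```

In a real crawler the `html` string would come from an HTTP fetch of each target page, with the parsed prices collated into the report files.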
SOFTWARE ENGINEER/DATA ENGINEER
LTTS, LARSEN & TOUBRO TECHNOLOGY SERVICES PVT LTD
12.2018 - 07.2020
Worked as a Software Engineer/Data Engineer
Jobs and responsibilities were as follows:
Project - Validator for Denzo
Created modules that scan images for objects using the TensorFlow library
Improved image quality for better OCR detection using the scikit-image library
Built a module to plot test results across multiple datasets using Matplotlib
Built TensorFlow algorithms to extract information from images and video
Built Excel macros for report generation according to the client's nomenclature
Built automation using openpyxl (Python for MS Excel) and docx2txt (Python for MS Word) to extract data and create reports
Project - Price Monitoring Automation for Denzo
Details: summarized customer data and drew insights from repeating patterns
Built web crawlers in Python to gather pricing and other data from the web
Intermediate knowledge of web-scraping frameworks such as Scrapy and Web-Harvest
Intermediate knowledge of packages such as pandas, requests, Beautiful Soup 4, and SQLAlchemy
Developed a Python algorithm to segregate chat data into their respective categories and then translate regional-language data into English
Performed system analysis, documentation, testing, implementation and user support for platform transitions
Used Selenium WebDriver in Python scripts to extract information from web pages
Automated various tasks using the Pyauto package
Developed SSIS packages to extract, transform, and load (ETL) data from SQL Server into the data warehouse
Hands-on experience in performance tuning and query optimization
Imported data from SQL Server and Azure SQL databases into Power BI to generate reports
Created Azure Blob Storage to import/export data to/from CSV files
Collected data daily and collated weekly reports based on various requirements
Built Python web-crawler scripts to scrape data from various websites
Intermediate knowledge of packages such as pandas, requests, Beautiful Soup 4, and SQLAlchemy
Managed large datasets using pandas DataFrames and MySQL
Automated tasks for clients in scripting languages such as Python, VBA, and C#
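The chat-segregation step mentioned above can be sketched as a simple keyword-based classifier; the categories and keyword lists here are illustrative, and the follow-on translation step is omitted:

```python
# Illustrative keyword-based segregation of chat messages into categories,
# in the spirit of the algorithm described above; keywords are made up.
CATEGORY_KEYWORDS = {
    "billing": {"invoice", "payment", "refund"},
    "delivery": {"shipping", "courier", "delayed"},
}


def categorize(message):
    """Assign a chat message to the first category whose keywords it hits."""
    words = set(message.lower().split())
    for category, keywords in CATEGORY_KEYWORDS.items():
        if words & keywords:
            return category
    return "other"


print(categorize("My refund has not arrived"))   # billing
print(categorize("The courier lost my parcel"))  # delivery
```

A production version would use richer features than bag-of-words overlap, but the segregate-then-translate pipeline shape stays the same.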
PROGRAMMER ANALYST
ECLERX PVT LTD
03.2017 - 12.2018
After pursuing a Bachelor's in Computer Engineering, I joined a firm called ECLERX as a Programmer Analyst to learn more about software development
Here I learned how to design and develop web apps using Python and frameworks such as Flask, Django, and Bottle
Besides that, I developed web-crawler scripts for various websites in languages such as VBA, Python, and C#
Project - APAC region language identifier and extractor using NLP
It identifies the language of text for the APAC region and extracts the meaning of the text
Identified product information from the captured data, mapped it to the correct existing product information, and mapped other aspects of the products
Used a fastText model in the API to extract language information
Designed an algorithm to map the correct information based on the extracted information
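As a toy stand-in for the language-identification step (the actual project used a fastText model), a stopword-overlap heuristic shows the shape of the problem; the word lists below are illustrative:

```python
# Naive stopword-overlap language identifier, a pure-Python stand-in for
# the fastText model used in the real project; word lists are illustrative.
STOPWORDS = {
    "en": {"the", "is", "and", "of"},
    "id": {"yang", "dan", "di", "itu"},   # Indonesian
    "tl": {"ang", "ng", "sa", "mga"},     # Tagalog
}


def detect_language(text):
    """Pick the language whose stopwords overlap the text the most."""
    words = set(text.lower().split())
    best = max(STOPWORDS, key=lambda lang: len(words & STOPWORDS[lang]))
    return best if words & STOPWORDS[best] else "unknown"


print(detect_language("ang mga bata sa park"))  # tl
print(detect_language("the speed of light"))    # en
```

A trained model such as fastText replaces the hand-made word lists with learned character n-gram features, but the API surface (text in, language label out) is the same.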
Project - Amaze (Internal Tool)
It is a Python-based framework built on the Bottle framework
It uses the boto3 AWS SDK to fetch details of Amazon products
Used AWS Lambda functions to extract data from different data sources
Built Excel macros for report generation according to the client's nomenclature
Wrote Python scripts to manage AWS resources via API calls using the boto3 SDK, and worked with the AWS CLI to gather pricing and other data from the web
Created web-scraping bots to scrape various websites
Built automation using openpyxl (Python for MS Excel) and docx2txt (Python for MS Word) to extract data and create reports
Worked alongside the tech team on Redis integration
Intermediate knowledge of packages such as pandas, requests, Beautiful Soup 4, and SQLAlchemy
Developed a Python algorithm to segregate chat data into their respective categories and then translate regional-language data into English
Performed system analysis, documentation, testing, implementation, and user support for platform transitions
Used Selenium WebDriver in Python scripts to extract information from web pages
Automated various tasks using the Pyauto package
Developed SSIS packages to extract, transform, and load (ETL) data from SQL Server into the data warehouse
Hands-on experience in performance tuning and query optimization
Imported data from SQL Server and Azure SQL databases into Power BI to generate reports
Created DAX queries to generate computed columns in Power BI
Generated computed tables in Power BI using DAX
Involved in creating new stored procedures and optimizing existing queries and stored procedures
Created Azure Blob Storage to import/export data to/from CSV files
Used Power BI and Power Pivot to develop data-analysis prototypes, and Power View and Power Map to visualize reports
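The boto3 resource-management scripts mentioned above can be sketched like this; the tag names are hypothetical, and the AWS call lives in an uninvoked `main()` so the pure filter-building logic can run without credentials:

```python
# Sketch of an AWS resource-management script in the spirit of the boto3
# work described above; tag names are made up for illustration.
def build_tag_filters(tags):
    """Translate {"Team": "data"} into the Filters list boto3 expects."""
    return [{"Name": f"tag:{key}", "Values": [value]} for key, value in tags.items()]


def main():
    # Imported here so the helper above stays dependency-free.
    import boto3

    ec2 = boto3.client("ec2", region_name="ap-south-1")
    resp = ec2.describe_instances(Filters=build_tag_filters({"Team": "data"}))
    for reservation in resp["Reservations"]:
        for instance in reservation["Instances"]:
            print(instance["InstanceId"], instance["State"]["Name"])


# main() is not invoked here; run it in an environment with AWS credentials.
```

The same filter-building pattern applies across boto3's describe/list calls, which is what makes a small helper like this reusable in management scripts.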
Project - Price Monitoring and competitive benchmarking
Details: summarized customer data and drew insights from repeating patterns
Built web crawlers in Python to gather pricing and other data from the web
Intermediate knowledge of web-scraping frameworks such as Scrapy and Web-Harvest
JR. SOFTWARE APPRENTICE (INTERN)
Dezignolics
03.2016 - 03.2017
Dezignolics, Thane, Maharashtra
Project - Price Monitoring and competitive benchmarking
Details: summarized customer data and drew insights from repeating patterns
Built web crawlers in Python to gather pricing and other data from the web
Intermediate knowledge of web-scraping frameworks such as Scrapy and Web-Harvest