DevOps Lead / Data Engineer with 9+ years of experience; a meticulous, results-oriented data science expert with analytical acumen in econometric modelling, algorithm development, and machine learning methodologies. Builds data-intensive applications, tackles challenging architectural and scalability problems, and collects and curates data for organizations in the healthcare field.

Conducted extensive research on revenue management and pricing analytics in the hospitality sector.
Applied various machine learning techniques to build dynamic pricing models and maximize profits.
Gathered pricing data from different aggregators by performing web scraping in Python for competitive analysis (a scraping sketch follows this section).
Developed an algorithm for yield management using the concept of price elasticity of demand (an elasticity sketch follows this section).
Deployed multiple loss minimization and optimization techniques.
Led the development of a hotel performance assessment and pricing analysis platform built on the k-NN algorithm (a k-NN sketch follows this section).
Created a recommendation engine to suggest an ideal cluster price for each identified hotel segment (see the clustering sketch following this section).
Created multivariate regression-based attribution models using adstock analysis of digital marketing data (an adstock sketch follows this section).
Developed segmentation models using K-means clustering to explore new user segments.
Conceptualized and implemented a sentiment analysis tool to rate hotels based on subjective customer reviews (a sentiment sketch follows this section).
Established the Data Sciences division from scratch by recruiting a team of 8 Data Analysts.
Deployed R to develop a customer segmentation algorithm, boosting sales leads and increasing market share by 28%.

Expert understanding of DevOps principles and Infrastructure as Code concepts and techniques.
Experience working with Continuous Integration (CI), Continuous Delivery (CD), and continuous testing tools.
Security and compliance experience (e.g., IAM in the cloud); built Docker images and used them to deploy to gcloud clusters.
Experience designing, architecting, and implementing scalable cloud-based web applications using AWS and GCP.
Set up alerting and monitoring using Stackdriver in GCP.
Worked on Google Cloud Platform (GCP) services such as Compute Engine, Cloud Load Balancing, Cloud Storage, Cloud SQL, Stackdriver monitoring, and cloud deployment compliance/auditing/monitoring tools.
Customer/stakeholder focus: ability to build strong relationships with application teams, cross-functional IT, and global/local IT teams.
Good leadership and teamwork skills: works collaboratively in an agile environment with DevOps application 'pods' to provide the GCP-specific capabilities and skills required to deliver the service.
Operational effectiveness: delivers solutions that align with approved design patterns and security standards.
Demonstrable cloud service provider experience (ideally GCP), including infrastructure build.
Environment: Jenkins, Atlassian Stash, Docker, Vagrant, Red Hat, Java, JSON, API, AWS and Azure.
Built and deployed Docker containers in migrating from a monolithic architecture to a microservices architecture.
System build and continuous integration (building an automated CI/CD pipeline).
Automated provisioning, configuration, and management of GitLab and TeamCity.
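The competitive-analysis bullet above mentions web scraping in Python. A minimal sketch of that idea, assuming a hypothetical aggregator URL and CSS selectors; requests and BeautifulSoup are one common toolchain, though the resume does not name the actual libraries used:

```python
# Minimal scraping sketch: the URL and the CSS selectors below are
# illustrative assumptions, not a real aggregator's markup.
import requests
from bs4 import BeautifulSoup

def scrape_prices(url: str) -> list[dict]:
    """Fetch a listings page and pull hotel name/price pairs."""
    resp = requests.get(url, timeout=10)
    resp.raise_for_status()
    soup = BeautifulSoup(resp.text, "html.parser")
    rows = []
    # ".hotel-card", ".hotel-name", ".price" are assumed selectors.
    for card in soup.select(".hotel-card"):
        name = card.select_one(".hotel-name")
        price = card.select_one(".price")
        if name and price:
            rows.append({"hotel": name.get_text(strip=True),
                         "price": price.get_text(strip=True)})
    return rows

if __name__ == "__main__":
    print(scrape_prices("https://example-aggregator.com/hotels?city=mumbai"))
```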
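For the yield-management bullet, a worked sketch of price elasticity of demand: the arc elasticity between two observed price/demand points drives a simple price nudge. The 5% adjustment step and the sample numbers are illustrative assumptions, not the actual pricing algorithm:

```python
# Illustrative yield-management rule based on price elasticity of demand.
def arc_elasticity(p0, q0, p1, q1):
    """Arc (midpoint) elasticity between two observed price/demand points."""
    dq = (q1 - q0) / ((q1 + q0) / 2)
    dp = (p1 - p0) / ((p1 + p0) / 2)
    return dq / dp

def suggest_price(current_price, elasticity, step=0.05):
    """Nudge price toward higher revenue: cut when demand is elastic,
    raise when it is inelastic."""
    if abs(elasticity) > 1:            # elastic: volume gains outweigh discounts
        return current_price * (1 - step)
    return current_price * (1 + step)  # inelastic: price hikes lose few bookings

# Example: demand fell from 120 to 90 rooms when the rate rose 4000 -> 5000.
e = arc_elasticity(4000, 120, 5000, 90)   # about -1.29 (elastic)
print(round(e, 2), suggest_price(5000, e))
```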
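The hotel performance platform is described as k-NN based. A hedged sketch using scikit-learn's KNeighborsRegressor to benchmark a nightly rate against the nearest peer properties; the features and rates below are invented for illustration:

```python
# k-NN benchmark sketch: predict a comparable nightly rate for a hotel
# from its nearest peer properties. All data here is made up.
import numpy as np
from sklearn.neighbors import KNeighborsRegressor

# Columns: [star_rating, review_score, distance_to_center_km]  (assumed)
X = np.array([[3, 7.9, 2.0], [4, 8.5, 1.2], [5, 9.1, 0.5],
              [3, 7.2, 3.5], [4, 8.8, 0.8]])
y = np.array([3200, 5400, 9800, 2700, 6100])   # nightly rates (illustrative)

knn = KNeighborsRegressor(n_neighbors=3)
knn.fit(X, y)

# Benchmark a new 4-star property against its 3 nearest peers.
print(knn.predict([[4, 8.2, 1.0]]))
```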
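The segmentation and cluster-price bullets suggest a shape like the following: K-means groups hotels into segments, and the median rate within each segment doubles as the suggested cluster price. Features and figures are assumptions:

```python
# K-means segmentation plus a per-cluster price suggestion (segment median).
import numpy as np
from sklearn.cluster import KMeans

# Columns: [avg_daily_rate, occupancy_pct]  (illustrative)
X = np.array([[3000, 62], [3200, 58], [8000, 80],
              [8500, 77], [5000, 70], [5200, 68]])

km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)
for c in range(3):
    rates = X[km.labels_ == c][:, 0]
    print(f"segment {c}: suggested price = {np.median(rates):.0f}")
```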
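For the attribution-model bullet, a sketch of adstock analysis: geometrically decay channel spend so past advertising carries over into later periods, then fit a multivariate regression of bookings on the adstocked channels. The 0.5 decay rate and the data are illustrative:

```python
# Adstock attribution sketch: decay ad spend, then regress the outcome
# on the adstocked channels via ordinary least squares.
import numpy as np

def adstock(spend, decay=0.5):
    """Carry a fraction of past spend forward into each period."""
    out = np.zeros_like(spend, dtype=float)
    for t, s in enumerate(spend):
        out[t] = s + (decay * out[t - 1] if t else 0.0)
    return out

search = np.array([10, 12, 8, 15, 11], dtype=float)   # channel spend
social = np.array([5, 7, 6, 9, 8], dtype=float)
bookings = np.array([120, 150, 130, 180, 160], dtype=float)

X = np.column_stack([np.ones(5), adstock(search), adstock(social)])
coef, *_ = np.linalg.lstsq(X, bookings, rcond=None)
print(dict(zip(["intercept", "search", "social"], coef.round(2))))
```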
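The sentiment analysis bullet could look roughly like this TF-IDF plus logistic regression scorer; the labeled reviews are invented, so treat it purely as a shape sketch rather than the actual tool:

```python
# Toy sentiment scorer for hotel reviews: TF-IDF features + logistic regression.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

reviews = ["spotless rooms and friendly staff", "terrible service, dirty lobby",
           "great location, loved the breakfast", "noisy AC and rude check-in"]
labels = [1, 0, 1, 0]   # 1 = positive, 0 = negative (invented training data)

model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(reviews, labels)

# Probability of a positive review doubles as a 0-1 hotel rating signal.
print(model.predict_proba(["clean but the staff was unhelpful"])[0, 1])
```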
Brief: Make predictions using data containing values like crime rate, age, accessibility, and population; and predict from a patient's characteristic data set whether they are diabetic or not (a hedged sketch of the latter follows).
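A minimal sketch of the diabetes brief, assuming Pima-style patient features (glucose, BMI, age) and a logistic regression classifier; the rows and labels are invented for illustration:

```python
# Classify patients as diabetic or not from characteristic data.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Columns: [glucose, bmi, age] -- an assumed subset of typical diabetes features.
X = np.array([[148, 33.6, 50], [85, 26.6, 31], [183, 23.3, 32],
              [89, 28.1, 21], [137, 43.1, 33], [116, 25.6, 30]])
y = np.array([1, 0, 1, 0, 1, 0])   # 1 = diabetic, 0 = not diabetic

X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.33, stratify=y, random_state=0)
clf = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
print(clf.predict(X_te), clf.score(X_te, y_te))
```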
Paytm: Paytm is by far the largest online payment platform in India, allowing its users to transfer money to other Paytm wallet users at zero cost.
Automated deployments into the UAT and pre-production environments.
Maintained the pipeline associated with end-to-end deployment delivery.
Implemented a testing environment for Kubernetes deployments and the Kubernetes clusters; experienced with Apache Tomcat and Linux environments.
Hands-on experience deploying and troubleshooting JAR, WAR, and EAR files in domain and clustered environments.
Deployed applications to various environments (Development, Test, QA, and Production) on multiple Tomcat servers.
Explored ways to enhance data quality and reliability and identified opportunities for data acquisition (a data-quality sketch follows this list).
Ensured daily jobs ran on schedule and that any issues were investigated and resolved promptly (a job health-check sketch follows this list).
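For the data-quality line above, a hedged example of what a quality pass might check: nulls, duplicates, and out-of-range values on an incoming extract. The column names and rows are assumed for illustration:

```python
# Data-quality pass sketch: null, duplicate, and range checks with pandas.
import pandas as pd

df = pd.DataFrame({
    "patient_id": [101, 102, 102, 104],
    "age":        [34, 51, 51, -3],          # -3 is a deliberate bad value
    "charge":     [1200.0, None, 980.0, 450.0],
})

issues = {
    "null_charges":      int(df["charge"].isna().sum()),
    "duplicate_ids":     int(df["patient_id"].duplicated().sum()),
    "ages_out_of_range": int((~df["age"].between(0, 120)).sum()),
}
print(issues)
```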
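And for the daily-jobs line, a minimal health-check sketch that flags jobs which missed today's run or finished in a failed state; the job-status records are illustrative, not from a real scheduler:

```python
# Daily-job health check: flag overdue or failed jobs from a status table.
from datetime import date

jobs = [
    {"name": "load_claims",  "last_run": date(2024, 5, 20), "status": "success"},
    {"name": "refresh_mart", "last_run": date(2024, 5, 19), "status": "success"},
    {"name": "score_models", "last_run": date(2024, 5, 20), "status": "failed"},
]

def overdue_or_failed(jobs, today):
    """Return names of jobs that did not run today or did not succeed."""
    return [j["name"] for j in jobs
            if j["last_run"] < today or j["status"] != "success"]

print(overdue_or_failed(jobs, date(2024, 5, 20)))
# -> ['refresh_mart', 'score_models']
```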