Sayed Huzaifa

Data Engineer Lead
Kanpur, Uttar Pradesh

Summary

DevOps Lead / Data Engineer with 9+ years of experience; a meticulous, results-oriented data science practitioner with analytical acumen in econometric modelling, algorithm development, and machine learning methodologies. Experienced in building data-intensive applications, tackling challenging architecture and scalability problems, and collecting and organizing data for healthcare organizations. Conducted extensive research on revenue management and pricing analytics in the hospitality sector, built dynamic pricing, segmentation, attribution, and sentiment analysis models, and established a Data Sciences division from scratch by recruiting a team of 8 Data Analysts.

  • Expert understanding of DevOps principles and Infrastructure as Code concepts and techniques.
  • Experience with Continuous Integration (CI), Continuous Delivery (CD), and continuous testing tools, as well as security and compliance (e.g., IAM); builds Docker images and deploys them to Google Cloud clusters.
  • Experience designing, architecting, and implementing scalable cloud-based web applications on AWS and GCP; set up alerting and monitoring with Stackdriver in GCP.
  • Worked with Google Cloud Platform (GCP) services such as Compute Engine, Cloud Load Balancing, Cloud Storage, Cloud SQL, Stackdriver monitoring, and cloud deployment compliance/auditing/monitoring tools.
  • Customer/stakeholder focus: builds strong relationships with application teams, cross-functional IT, and global/local IT teams.
  • Good leadership and teamwork skills: works collaboratively in an agile environment with DevOps application 'pods' to provide the GCP-specific capability and skills required to deliver the service.
  • Operational effectiveness: delivers solutions that align with approved design patterns and security standards; demonstrable cloud service provider experience (ideally GCP), including infrastructure build.
  • Built and deployed Docker containers to move from a monolithic to a microservices architecture; built automated CI/CD pipelines; automated provisioning, configuration, and management of GitLab and TeamCity.
  • Environment: Jenkins, Atlassian Stash, Docker, Vagrant, Red Hat, Java, JSON, API, AWS, and Azure.

Overview

10 years of professional experience

Work History

Data Engineer Lead

Telus International
Noida, Uttar Pradesh
05.2021 - Current
  • Telus International provides communications products and services, including data, Internet Protocol, voice, entertainment, and health offerings, delivering 80+ services to 40,000+ customers worldwide.
  • Analytics & machine learning methodologies; optimization & algorithm development; statistical modelling & analysis.
  • Conducted extensive research on revenue management and pricing analytics in the hospitality sector.
  • Applied various machine learning techniques to build dynamic pricing models and maximize profits.
  • Gathered pricing data from different aggregators by performing web scraping in Python for competitive analysis.
  • Developed an algorithm for yield management using the concept of price elasticity of demand.
  • Deployed multiple loss minimization & optimization techniques.
  • Led the development of a hotel performance assessment and pricing analysis platform built on the k-NN algorithm.
  • Created a recommendation engine to suggest an ideal cluster price for various identified hotel segments.
  • Created multivariate regression-based attribution models using ad-stock analysis of digital marketing data.
  • Developed segmentation models using K-means clustering to explore new user segments.
  • Conceptualized and implemented a sentiment analysis tool to rate hotels based on subjective customer reviews.
  • Established the Data Sciences division from scratch by recruiting a team of 8 Data Analysts.
  • Deployed R to develop a customer segmentation algorithm, boosting sales leads and increasing market share by 28%.
  • Working on PL/SQL procedures and data loads into the data warehouse.
  • Working with application owners and DBAs to ensure the data warehouse integrates well across the environment and has accurate, timely data.
  • Build, operate, and maintain data architecture and infrastructure for optimal data preparation and transformation.
  • Follow up with stakeholders on Data Engineering related tasks.
  • Ensure and plan for new products and datasets to be accounted for in the data warehouse.

DevOps Engineer/Data Engineer

Paytm Payments Bank
Bangalore, Karnataka
12.2019 - 05.2021

Brief: Built prediction models on data containing values such as crime rate, age, accessibility, and population, and models that predict from a patient's characteristic data whether they are diabetic.
Paytm is by far the largest online payment platform in India, allowing users to transfer money to other Paytm wallet users at zero cost.
  • Automated deployment into the UAT and pre-production environments.
  • Maintained the pipeline associated with end-to-end deployment delivery.
  • Implemented the testing environment for Kubernetes and managed deployments to the Kubernetes clusters.
  • Experience with Apache Tomcat and Linux environments.
  • Hands-on experience deploying and troubleshooting JAR, WAR, and EAR files in domain and clustered environments.
  • Deployed applications in environments such as Development, Test, QA, and Production on multiple Tomcat servers.
  • Explored ways to enhance data quality and reliability and identified opportunities for data acquisition.
  • Ensured the daily jobs are run and any issues are looked into and solved in time.

DevOps Engineer

NetApp
Bangalore, Lucknow, IN
07.2017 - 11.2019
  • NetApp storage equipment is used by enterprises and service providers to store and share large amounts of digital data across physical and hybrid cloud environments.
  • Automated deployment into the UAT and pre-production environments.
  • Maintained the pipeline associated with end-to-end deployment delivery.
  • Implemented the testing environment for Kubernetes and managed deployments to the Kubernetes clusters.
  • Experience with Apache Tomcat and Linux environments.
  • Hands-on experience deploying and troubleshooting JAR, WAR, and EAR files in domain and clustered environments.
  • Deployed applications in environments such as Development, Test, QA, and Production on multiple Tomcat servers.
  • Developed environment provisioning solutions using Docker and Vagrant; conceptualized & developed solutions for the development team to build, deploy, monitor & test applications.
  • Pipeline management & deployment: maintained the in-house developed DevOps tool CTL (Common Testing Lab).
  • Maintained Git workflows for version control (source code management).
  • Provisioned servers and deployed features using Chef/Ansible.
  • Troubleshooting Splunk forwarders on end machines; getting new users into Splunk and assigning particular roles; onboarding logs into Splunk; troubleshooting log validation; installing apps in Splunk.
  • Configuration of a variety of services including Compute, Storage, and SDN (VPC and XPN).
  • Designed a continuous deployment pipeline & integrated Test-Kitchen, Vagrant, Git, Jenkins & Chef across geographically separated hosting zones in AWS, Azure, and Google Compute.
  • Developed automation & deployment templates for relational/NoSQL databases including MSSQL, MySQL, MongoDB & Redis; Splunk integration with Kubernetes.
  • Worked with customers to understand requirements and provide relevant solution architecture on Splunk.
  • Working with Splunk support on various Splunk issues and Splunk licenses; monitoring logs from Splunk.
  • Big Data ecosystem and Hadoop.
  • Languages: Python, Go; professional DevOps engineer.

DevOps Engineer

Iqra Technologies
Kanpur, Uttar Pradesh
01.2014 - 07.2017
  • Shadoworks Technologies is a pioneer in providing Big Data-enabled solutions across banking, insurance and telecom industries.
  • Designed a continuous deployment pipeline & integrated Test-Kitchen, Vagrant, Git, Jenkins & Chef across geographically separated hosting zones in AWS, Azure, and Google Compute.
  • Developed automation & deployment templates for relational/NoSQL databases including MSSQL, MySQL, MongoDB & Redis; Splunk integration with Kubernetes.
  • Worked with customers to understand requirements and provide relevant solution architecture on Splunk.
  • Working with Splunk support on various Splunk issues and Splunk licenses; monitoring logs from Splunk.

Education

Master of Computer Applications (MCA)

INTEGRAL UNIVERSITY, LUCKNOW
Mar '13
Top 15 percentile of the class

Skills

  • Machine Learning Methodologies
  • Optimization Techniques
  • Attribution Models
  • Segmentation Models
  • Leadership & Training
  • Time Series Analysis
  • Sentiment Analysis
  • Text Mining
  • Data Mining & Analytics
  • Predictive & Statistical Modelling
  • Logistic Regression
  • Customer Segmentation
  • Account Mining
  • Containerization
  • Version Control
  • Project Management
  • Team Management
  • Automation
  • Pipeline Management
  • Database Administration
  • Load Balancing
  • Configuration Management
  • Cloud Technology Management
  • Virtualization
  • Infrastructure Monitoring
  • Security & User Administration
  • Networking & Configuration

Timeline

Data Engineer Lead

Telus International
05.2021 - Current

DevOps Engineer/Data Engineer

Paytm Payments Bank
12.2019 - 05.2021

DevOps Engineer

NetApp
07.2017 - 11.2019

DevOps Engineer

Iqra Technologies
01.2014 - 07.2017

Master of Computer Applications (MCA)

INTEGRAL UNIVERSITY, LUCKNOW

Work Preference

Work Type

Full Time

Important To Me

Company Culture, Work-life balance, Career advancement, Paid sick leave, Flexible work hours, Personal development programs, Work from home option, 4-day work week