
xgboost-algorithm

Here are 240 public repositories matching this topic...

awesome-gradient-boosting-papers

Create a model to assess the likelihood of a death event due to heart failure. This can help hospitals assess the severity of patients with cardiovascular disease (a minimal modeling sketch follows below).

  • Updated Mar 19, 2022
  • Jupyter Notebook
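A minimal sketch of how such a classifier could be trained with XGBoost, assuming a CSV with a binary DEATH_EVENT target as in the commonly used heart-failure clinical records dataset; the file path, column name, and hyperparameters are illustrative assumptions, not taken from the repository.

```python
# Illustrative sketch: XGBoost classifier for heart-failure death events.
# File path and DEATH_EVENT column are assumptions based on the public dataset.
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score
from xgboost import XGBClassifier

df = pd.read_csv("heart_failure_clinical_records.csv")
X = df.drop(columns=["DEATH_EVENT"])
y = df["DEATH_EVENT"]

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=42
)

model = XGBClassifier(
    n_estimators=300,
    max_depth=4,
    learning_rate=0.05,
    subsample=0.8,
    eval_metric="logloss",
)
model.fit(X_train, y_train)

# Report discrimination on the held-out set.
proba = model.predict_proba(X_test)[:, 1]
print("Test ROC AUC:", roc_auc_score(y_test, proba))
```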

A binary classification model is developed to predict the probability that an applicant will pay back a loan. The customer's previous loan journey was used to extract features through several strategies, including manual and automated feature engineering and deep learning (CNN, RNN). Various machine learning algorithms, including boosted methods (XGBoost, LightGBM, CatBoost) and a deep neural network, were used to build the classifier, and their performances were compared (see the sketch below).

  • Updated Nov 22, 2021
  • Jupyter Notebook
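A hedged sketch of how boosted classifiers could be compared for this task. Synthetic features stand in for the engineered loan-journey features described above; the model choices and hyperparameters are assumptions, not the repository's exact setup.

```python
# Compare two boosted classifiers on an imbalanced binary task with cross-validated ROC AUC.
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from xgboost import XGBClassifier
from lightgbm import LGBMClassifier

# Synthetic stand-in for engineered loan features (80/20 class balance).
X, y = make_classification(n_samples=5000, n_features=30, weights=[0.8, 0.2], random_state=0)

models = {
    "XGBoost": XGBClassifier(n_estimators=200, max_depth=5, learning_rate=0.1,
                             eval_metric="auc"),
    "LightGBM": LGBMClassifier(n_estimators=200, num_leaves=31, learning_rate=0.1),
}

# 5-fold cross-validation, ROC AUC as the comparison metric.
for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=5, scoring="roc_auc")
    print(f"{name}: mean AUC = {scores.mean():.3f}")
```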

A machine-learning-based minor project that uses various classification algorithms to classify news as FAKE or REAL based on title and body content. Data was collected from 3 different sources, and the project uses algorithms such as Random Forest, SVM, Word2Vec features, and Logistic Regression, achieving 94% accuracy (a minimal sketch follows below).

  • Updated Jan 9, 2019
  • Jupyter Notebook
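A minimal sketch of the title/body text-classification setup described above, using TF-IDF features with logistic regression as one of the mentioned baselines; the file name, column names, and hyperparameters are assumptions for illustration.

```python
# Illustrative fake/real news classifier: TF-IDF features + logistic regression.
import pandas as pd
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score
from sklearn.pipeline import make_pipeline

df = pd.read_csv("news.csv")                 # assumed columns: "title", "text", "label" (FAKE/REAL)
docs = df["title"] + " " + df["text"]        # combine title and body content
labels = df["label"]

X_train, X_test, y_train, y_test = train_test_split(docs, labels, test_size=0.2, random_state=42)

clf = make_pipeline(TfidfVectorizer(stop_words="english", max_features=50000),
                    LogisticRegression(max_iter=1000))
clf.fit(X_train, y_train)
print("Accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```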

This case study predicts the destination a user is most likely to book for the first time. Accurate prediction helps shorten the average time to booking by enabling more personalized recommendations, and it also improves demand forecasting. Browser session data and the user's demographic information are used to create the features that drive the model (see the sketch below).

  • Updated Mar 30, 2021
  • Jupyter Notebook
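A sketch of a multi-class "first booking destination" model, assuming a numeric feature table already engineered from session and demographic data; the file path, column name, and top-5 ranking step are illustrative assumptions.

```python
# Illustrative multi-class destination model with XGBoost.
import numpy as np
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import LabelEncoder
from xgboost import XGBClassifier

# Assumed: engineered numeric features plus a "country_destination" label column.
df = pd.read_csv("user_features.csv")
y = LabelEncoder().fit_transform(df["country_destination"])
X = df.drop(columns=["country_destination"])

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

model = XGBClassifier(objective="multi:softprob", n_estimators=300,
                      max_depth=6, learning_rate=0.1)
model.fit(X_train, y_train)

# Rank destinations per user and keep the top 5, as is common for this task.
proba = model.predict_proba(X_test)
top5 = np.argsort(-proba, axis=1)[:, :5]
print(top5[:3])
```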

In this project, I predicted housing sale prices for King County, USA, which includes Seattle, using homes sold between May 2014 and May 2015. The dataset has 19 house features plus the price and id columns, with 21,613 observations. I implemented several boosting regression models, including Gradient Boosting, eXtreme Gradient Boosting (XGBoost), and AdaBoost, and used permutation importance to filter out irrelevant features (a sketch follows below). The maximum accuracy achieved was around 98.59%.

  • Updated Dec 5, 2018
  • HTML
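A hedged sketch of the boosting-regression plus permutation-importance workflow described above. The file path and column names follow the public King County dataset but are assumptions here, as are the hyperparameters.

```python
# Illustrative workflow: XGBoost regression on house prices, then permutation importance.
import pandas as pd
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split
from xgboost import XGBRegressor

df = pd.read_csv("kc_house_data.csv")
# Keep only numeric house features; drop the target and id columns.
X = df.drop(columns=["price", "id"]).select_dtypes(include="number")
y = df["price"]

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

model = XGBRegressor(n_estimators=500, max_depth=5, learning_rate=0.05)
model.fit(X_train, y_train)
print("R^2 on held-out data:", model.score(X_test, y_test))

# Permutation importance on the test split; low-importance features are candidates to drop.
result = permutation_importance(model, X_test, y_test, n_repeats=10, random_state=42)
ranked = sorted(zip(X.columns, result.importances_mean), key=lambda t: -t[1])
for name, score in ranked[:10]:
    print(f"{name}: {score:.4f}")
```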
