Awesome XGBoost
===============
This page contains a curated list of examples, tutorials, and blog posts about XGBoost use cases.
It is inspired by [awesome-MXNet](https://github.com/dmlc/mxnet/blob/master/example/README.md),
[awesome-php](https://github.com/ziadoz/awesome-php) and [awesome-machine-learning](https://github.com/josephmisiti/awesome-machine-learning).
Please send a pull request if you find things that belong here.
Contents
--------
- [Code Examples](#code-examples)
- [Features Walkthrough](#features-walkthrough)
- [Basic Examples by Tasks](#basic-examples-by-tasks)
- [Benchmarks](#benchmarks)
- [Machine Learning Challenge Winning Solutions](#machine-learning-challenge-winning-solutions)
- [Talks](#talks)
- [Tutorials](#tutorials)
- [Usecases](#usecases)
- [Tools using XGBoost](#tools-using-xgboost)
- [Awards](#awards)
- [Windows Binaries](#windows-binaries)
Code Examples
-------------
### Features Walkthrough
This is a list of short code examples introducing different functionalities of the xgboost packages.
* Basic walkthrough of packages
[python](guide-python/basic_walkthrough.py)
[R](../R-package/demo/basic_walkthrough.R)
[Julia](https://github.com/antinucleon/XGBoost.jl/blob/master/demo/basic_walkthrough.jl)
[PHP](https://github.com/bpachev/xgboost-php/blob/master/demo/titanic_demo.php)
* Customize loss function, and evaluation metric
[python](guide-python/custom_objective.py)
[R](../R-package/demo/custom_objective.R)
[Julia](https://github.com/antinucleon/XGBoost.jl/blob/master/demo/custom_objective.jl)
* Boosting from existing prediction
[python](guide-python/boost_from_prediction.py)
[R](../R-package/demo/boost_from_prediction.R)
[Julia](https://github.com/antinucleon/XGBoost.jl/blob/master/demo/boost_from_prediction.jl)
* Predicting using first n trees
[python](guide-python/predict_first_ntree.py)
[R](../R-package/demo/predict_first_ntree.R)
[Julia](https://github.com/antinucleon/XGBoost.jl/blob/master/demo/predict_first_ntree.jl)
* Generalized Linear Model
[python](guide-python/generalized_linear_model.py)
[R](../R-package/demo/generalized_linear_model.R)
[Julia](https://github.com/antinucleon/XGBoost.jl/blob/master/demo/generalized_linear_model.jl)
* Cross validation
[python](guide-python/cross_validation.py)
[R](../R-package/demo/cross_validation.R)
[Julia](https://github.com/antinucleon/XGBoost.jl/blob/master/demo/cross_validation.jl)
* Predicting leaf indices
[python](guide-python/predict_leaf_indices.py)
[R](../R-package/demo/predict_leaf_indices.R)
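To make the "customize loss function" entry above concrete, here is a minimal, dependency-free sketch of what such an objective computes (the function name `logregobj` and the plain-list `preds`/`labels` arguments are illustrative; the linked demos receive the labels through a `DMatrix`). XGBoost asks a custom objective for the gradient and hessian of the loss with respect to the raw (pre-sigmoid) prediction:

```python
import math

def logregobj(preds, labels):
    """Binary logistic objective: per-example gradient and hessian
    of the log loss with respect to the raw margin score."""
    probs = [1.0 / (1.0 + math.exp(-p)) for p in preds]
    grad = [p - y for p, y in zip(probs, labels)]   # dL/dmargin
    hess = [p * (1.0 - p) for p in probs]           # d^2L/dmargin^2
    return grad, hess

grad, hess = logregobj([0.0, 2.0], [1, 0])
```

The Python and R demos linked above pass a function with this shape to the training call in place of a built-in objective string.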
### Basic Examples by Tasks
Most of the examples in this section are based on the CLI or Python version.
However, the parameter settings can be applied to all versions.
- [Binary classification](binary_classification)
- [Multiclass classification](multiclass_classification)
- [Regression](regression)
- [Learning to Rank](rank)
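Since the parameter settings carry over between interfaces, here is a hedged sketch of how the same `key = value` settings in a CLI config file map one-to-one onto the parameter dict a Python user would build (the `conf_text` contents are illustrative, loosely modeled on the config files shipped with the binary classification demo):

```python
# Illustrative CLI-style configuration fragment (one "key = value" per
# line), loosely modeled on the binary classification demo's config.
conf_text = """\
booster = gbtree
objective = binary:logistic
max_depth = 3
eta = 1.0
num_round = 2"""

# The same settings as the dict a Python user would pass to training;
# parsing the conf lines makes the one-to-one mapping explicit.
params = dict(line.split(" = ") for line in conf_text.splitlines())
```

Note the CLI keeps everything as strings, while in Python you would typically write numeric values (`"max_depth": 3`) directly.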
### Benchmarks
- [Starter script for Kaggle Higgs Boson](kaggle-higgs)
- [Kaggle Tradeshift winning solution by daxiongshu](https://github.com/daxiongshu/kaggle-tradeshift-winning-solution)
- [Benchmarking the most commonly used open source tools for binary classification](https://github.com/szilard/benchm-ml#boosting-gradient-boosted-treesgradient-boosting-machines)
## Machine Learning Challenge Winning Solutions
XGBoost is extensively used by machine learning practitioners to create state-of-the-art data science solutions.
This is a list of machine learning competition winning solutions that used XGBoost.
Please send a pull request if you find ones that are missing here.
- Maksims Volkovs, Guangwei Yu and Tomi Poutanen, 1st place of the [2017 ACM RecSys challenge](http://2017.recsyschallenge.com/). Link to [paper](http://www.cs.toronto.edu/~mvolkovs/recsys2017_challenge.pdf).
- Vlad Sandulescu, Mihai Chiru, 1st place of the [KDD Cup 2016 competition](https://kddcup2016.azurewebsites.net). Link to [the arXiv paper](http://arxiv.org/abs/1609.02728).
- Marios Michailidis, Mathias Müller and HJ van Veen, 1st place of the [Dato Truly Native? competition](https://www.kaggle.com/c/dato-native). Link to [the Kaggle interview](http://blog.kaggle.com/2015/12/03/dato-winners-interview-1st-place-mad-professors/).
- Vlad Mironov, Alexander Guschin, 1st place of the [CERN LHCb experiment Flavour of Physics competition](https://www.kaggle.com/c/flavours-of-physics). Link to [the Kaggle interview](http://blog.kaggle.com/2015/11/30/flavour-of-physics-technical-write-up-1st-place-go-polar-bears/).
- Josef Slavicek, 3rd place of the [CERN LHCb experiment Flavour of Physics competition](https://www.kaggle.com/c/flavours-of-physics). Link to [the Kaggle interview](http://blog.kaggle.com/2015/11/23/flavour-of-physics-winners-interview-3rd-place-josef-slavicek/).
- Mario Filho, Josef Feigl, Lucas, Gilberto, 1st place of the [Caterpillar Tube Pricing competition](https://www.kaggle.com/c/caterpillar-tube-pricing). Link to [the Kaggle interview](http://blog.kaggle.com/2015/09/22/caterpillar-winners-interview-1st-place-gilberto-josef-leustagos-mario/).
- Qingchen Wang, 1st place of the [Liberty Mutual Property Inspection](https://www.kaggle.com/c/liberty-mutual-group-property-inspection-prediction). Link to [the Kaggle interview](http://blog.kaggle.com/2015/09/28/liberty-mutual-property-inspection-winners-interview-qingchen-wang/).
- Chenglong Chen, 1st place of the [Crowdflower Search Results Relevance](https://www.kaggle.com/c/crowdflower-search-relevance). Link to [the winning solution](https://www.kaggle.com/c/crowdflower-search-relevance/forums/t/15186/1st-place-winner-solution-chenglong-chen/).
- Alexandre Barachant (“Cat”) and Rafał Cycoń (“Dog”), 1st place of the [Grasp-and-Lift EEG Detection](https://www.kaggle.com/c/grasp-and-lift-eeg-detection). Link to [the Kaggle interview](http://blog.kaggle.com/2015/10/12/grasp-and-lift-eeg-winners-interview-1st-place-cat-dog/).
- Halla Yang, 2nd place of the [Recruit Coupon Purchase Prediction Challenge](https://www.kaggle.com/c/coupon-purchase-prediction). Link to [the Kaggle interview](http://blog.kaggle.com/2015/10/21/recruit-coupon-purchase-winners-interview-2nd-place-halla-yang/).
- Owen Zhang, 1st place of the [Avito Context Ad Clicks competition](https://www.kaggle.com/c/avito-context-ad-clicks). Link to [the Kaggle interview](http://blog.kaggle.com/2015/08/26/avito-winners-interview-1st-place-owen-zhang/).
- Keiichi Kuroyanagi, 2nd place of the [Airbnb New User Bookings](https://www.kaggle.com/c/airbnb-recruiting-new-user-bookings). Link to [the Kaggle interview](http://blog.kaggle.com/2016/03/17/airbnb-new-user-bookings-winners-interview-2nd-place-keiichi-kuroyanagi-keiku/).
- Marios Michailidis, Mathias Müller and Ning Situ, 1st place [Homesite Quote Conversion](https://www.kaggle.com/c/homesite-quote-conversion). Link to [the Kaggle interview](http://blog.kaggle.com/2016/04/08/homesite-quote-conversion-winners-write-up-1st-place-kazanova-faron-clobber/).
## Talks
- [XGBoost: A Scalable Tree Boosting System](http://datascience.la/xgboost-workshop-and-meetup-talk-with-tianqi-chen/) (video+slides) by Tianqi Chen at the Los Angeles Data Science meetup
## Tutorials
- [Machine Learning with XGBoost on Qubole Spark Cluster](https://www.qubole.com/blog/machine-learning-xgboost-qubole-spark-cluster/)
- [XGBoost Official RMarkdown Tutorials](https://xgboost.readthedocs.org/en/latest/R-package/index.html#tutorials)
- [An Introduction to XGBoost R Package](http://dmlc.ml/rstats/2016/03/10/xgboost.html) by Tong He
- [Open Source Tools & Data Science Competitions](http://www.slideshare.net/odsc/owen-zhangopen-sourcetoolsanddscompetitions1) by Owen Zhang - XGBoost parameter tuning tips
- [Feature Importance Analysis with XGBoost in Tax audit](http://fr.slideshare.net/MichaelBENESTY/feature-importance-analysis-with-xgboost-in-tax-audit)
- [Winning solution of Kaggle Higgs competition: what a single model can do](http://no2147483647.wordpress.com/2014/09/17/winning-solution-of-kaggle-higgs-competition-what-a-single-model-c