<h2><font color="red">Contents</font></h2>
The YAN-PRTools Matlab toolbox now includes 40 common pattern recognition algorithms:
**Feature processing**
1. *mat2ftvec* : Transform sample matrices to a feature matrix
1. *zscore* : Feature normalization
1. *pca* : PCA
1. *kpca* : KPCA
1. *lda* : LDA
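The feature-processing tools follow a train/apply split: statistics are learned on training data and reused on new data. As an illustration only (a minimal Python sketch with hypothetical names, not the toolbox's MATLAB code), z-score normalization looks like this:

```python
import numpy as np

def zscore_fit(X):
    """Learn per-feature mean and std from a samples-x-features matrix."""
    mu = X.mean(axis=0)
    sigma = X.std(axis=0, ddof=1)
    sigma[sigma == 0] = 1.0  # guard against constant features
    return {"mu": mu, "sigma": sigma}

def zscore_apply(model, X):
    """Normalize new data with statistics learned at training time."""
    return (X - model["mu"]) / model["sigma"]

X = np.array([[1.0, 10.0], [2.0, 20.0], [3.0, 30.0]])
model = zscore_fit(X)
Xn = zscore_apply(model, X)
# Each column of Xn now has mean 0 and sample std 1.
```

Keeping `fit` and `apply` separate is what lets one trained model be reused on any number of test sets.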
**Classification**
1. *lr* : Logistic regression
1. *softmax* : Softmax
1. *svm* : Wrapper of libsvm
1. *rf* : Random forest
1. *knn* : K nearest neighbors
1. *gauss* : Wrapper of Matlab's classify function, including methods such as naive Bayes, normal density fitting, and Mahalanobis distance
1. *boost* : AdaBoost with stump weak classifier
1. *tree* : Wrapper of Matlab's tree classifier
1. *ann* : Wrapper of the artificial neural networks in Matlab
1. *elm* : Basic extreme learning machine
**Regression**
1. *ridge* : Ridge regression
1. *kridge* : Kernel ridge regression
1. *svr* : Wrapper of support vector regression in libsvm
1. *simplefit* : Wrapper of Matlab's basic fitting functions, including least squares, robust fitting, quadratic fitting, etc.
1. *lasso* : Wrapper of Matlab's lasso regression
1. *pls* : Wrapper of Matlab's partial least squares regression
1. *step* : Wrapper of Matlab's stepwisefit
1. *rf* : Random forest
1. *ann* : Wrapper of the artificial neural networks in Matlab
1. *elm* : Basic extreme learning machine
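Of the regressors above, ridge regression is the simplest to write down in closed form. A minimal Python sketch (hypothetical names, for illustration rather than the toolbox's implementation):

```python
import numpy as np

def ridge_fit(X, y, lam=1.0):
    """Closed-form ridge: w = (X'X + lam*I)^-1 X'y, bias via an appended ones column."""
    Xb = np.hstack([X, np.ones((X.shape[0], 1))])
    I = np.eye(Xb.shape[1])
    I[-1, -1] = 0.0  # do not penalize the bias term
    return np.linalg.solve(Xb.T @ Xb + lam * I, Xb.T @ y)

def ridge_predict(w, X):
    Xb = np.hstack([X, np.ones((X.shape[0], 1))])
    return Xb @ w

X = np.array([[0.0], [1.0], [2.0], [3.0]])
y = np.array([1.0, 3.0, 5.0, 7.0])  # exactly y = 2x + 1
w = ridge_fit(X, y, lam=1e-6)
```

With a near-zero penalty the fit reduces to ordinary least squares; increasing `lam` shrinks the weights, which is the point of the method.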
**Feature selection**
1. *corr* : Feature ranking based on correlation coefficients (filter method)
1. *fisher* : Feature ranking using Fisher ratio (filter method)
1. *mrmr* : Feature ranking using minimum redundancy maximum relevance (mRMR) (filter method)
1. *single* : Feature ranking based on each single feature's prediction accuracy (wrapper method)
1. *sfs* : Feature selection using sequential forward selection (wrapper method)
1. *ga* : Feature selection using the genetic algorithm in Matlab (wrapper method)
1. *rf* : Feature ranking using random forest (embedded method)
1. *stepwisefit* : Feature selection based on stepwise fitting (embedded method)
1. *boost* : Feature selection using AdaBoost with the stump weak learner (embedded method)
1. *svmrfe_ori* : Feature ranking using SVM-recursive feature elimination (SVM-RFE), the original linear version (embedded method)
1. *svmrfe_ker* : Feature ranking using the kernel version of SVM-RFE (embedded method)
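To make the filter idea concrete, here is a minimal Python sketch of Fisher-ratio ranking for a two-class problem (hypothetical names, illustrating the criterion rather than the toolbox's code): each feature is scored by between-class separation over within-class spread, and features are sorted by that score.

```python
import numpy as np

def fisher_rank(X, y):
    """Rank features by the Fisher ratio (m1 - m2)^2 / (v1 + v2);
    a larger score means a more discriminative feature."""
    c0, c1 = X[y == 0], X[y == 1]
    num = (c0.mean(axis=0) - c1.mean(axis=0)) ** 2
    den = c0.var(axis=0) + c1.var(axis=0) + 1e-12  # avoid division by zero
    score = num / den
    rank = np.argsort(-score)  # indices sorted by descending score
    return rank, score

rng = np.random.default_rng(0)
y = np.repeat([0, 1], 50)
X = rng.normal(size=(100, 3))
X[:, 2] += 3 * y  # shift feature 2 between classes, making it informative
rank, score = fisher_rank(X, y)
# rank[0] should be 2, the deliberately discriminative feature.
```

Filter methods like this are fast because they never train a predictor; wrapper and embedded methods trade that speed for scores tied to an actual model.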
**Representative sample selection (active learning)**
1. *cluster* : Sample selection based on cluster centers
1. *ted* : Transductive experimental design
1. *llr* : Locally linear reconstruction
1. *ks* : Kennard-Stone algorithm
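The Kennard-Stone algorithm picks samples that cover the data space: start from the two most distant points, then repeatedly add the point farthest from the already-selected set. A minimal Python sketch (illustrative, not the toolbox's implementation):

```python
import numpy as np

def kennard_stone(X, n_sel):
    """Return indices of n_sel spread-out samples via the maximin rule."""
    D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    i, j = np.unravel_index(np.argmax(D), D.shape)  # most distant pair
    selected = [i, j]
    while len(selected) < n_sel:
        rest = [k for k in range(len(X)) if k not in selected]
        # pick the candidate whose nearest selected sample is farthest away
        k = max(rest, key=lambda r: D[r, selected].min())
        selected.append(k)
    return selected

X = np.array([[0.0, 0.0], [0.1, 0.0], [10.0, 0.0], [5.0, 5.0]])
sel = kennard_stone(X, 3)
# Picks the extremes (0, 2) first, then the outlying point 3;
# the near-duplicate point 1 is skipped.
```

This greedy maximin rule is why the selected set stays representative: a point nearly identical to one already chosen adds almost nothing.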
<br>
<h2><font color="orange">Interfaces</font></h2>
**Feature processing**
```matlab
[Xnew, model] = ftProc_xxx_tr(X,Y,param) % training
Xnew = ftProc_xxx_te(model,X) % test
```
**Classification**
```matlab
model = classf_xxx_tr(X,Y,param) % training
[pred,prob] = classf_xxx_te(model,Xtest) % test, return the predicted labels and probabilities (optional)
```
**Regression**
```matlab
model = regress_xxx_tr(X,Y,param) % training
rv = regress_xxx_te(model,Xtest) % test, return the predicted values
```
**Feature selection**
```matlab
[ftRank,ftScore] = ftSel_xxx(ft,target,param) % return the feature rank (or subset) and scores (optional)
```
**Representative sample selection (active learning)**
```matlab
smpList = smpSel_xxx(X,nSel,param) % return the indices of the selected samples
```
Please see test.m for sample usage.
Besides, there are three uniform wrappers: ftProc_, classf_, and regress_. They accept an algorithm name string as input and combine the training and test phases.
<br>
<h2><font color="green">Characteristics</font></h2>
* The training (tr) and test (te) phases are split for feature processing, classification and regression to allow more flexible use. For example, one trained model can be applied multiple times.
* The struct "param" is used to pass parameters to algorithms.
* Default parameters are set clearly at the top of the code, along with the explanations.
In brief, I aimed for three main objectives when developing this toolbox:
* A unified and simple interface;
* Convenient observation and adjustment of algorithm parameters, avoiding tedious parameter setting and checking;
* Extensibility: the simple file structure makes it easy to modify the algorithms.
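The param-with-defaults convention can be sketched in a few lines of Python (a dict standing in for a MATLAB struct; the field names here are hypothetical, not the toolbox's actual parameters): fields the caller omits fall back to the defaults declared at the top of the file.

```python
def merge_params(defaults, param=None):
    """Fill unspecified fields with defaults, mirroring a MATLAB param struct."""
    merged = dict(defaults)
    merged.update(param or {})
    return merged

# Hypothetical defaults, declared up front where they are easy to inspect.
DEFAULTS = {"kernel": "rbf", "C": 1.0, "gamma": 0.1}

p = merge_params(DEFAULTS, {"C": 10.0})
# p == {"kernel": "rbf", "C": 10.0, "gamma": 0.1}
```

The caller only ever writes the fields they want to change, which is what keeps parameter setting from becoming tedious.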
<br>
<h2><font color="blue">Dependencies</font></h2>
In the toolbox, 20 algorithms are self-implemented, 11 are wrappers or mainly based on Matlab functions, and 9 are wrappers or mainly based on the third-party toolboxes listed below. These are included in the project; however, you may need to recompile some of them depending on your platform.
* SVM and SVR: Chih-Chung Chang and Chih-Jen Lin, libsvm (this toolbox is so famous that you only need to google it)
* RF: Abhishek Jaiantilal, https://code.google.com/p/randomforest-matlab/
* ELM: Qin-Yu Zhu and Guang-Bin Huang, http://www.ntu.edu.sg/home/egbhuang/elm_random_hidden_nodes.html
* mRMR: Hanchuan Peng, http://www.mathworks.com/matlabcentral/fileexchange/?term=authorid%3A27911
* TED: Kai Yu, Jinbo Bi, and Volker Tresp, http://www.dbs.ifi.lmu.de/~yu_k/ted/
Thanks to the authors and MathWorks Inc.! I know there are many important algorithms not yet contained in the toolbox, so everybody is welcome to contribute new code! Also, if you find any bug in the code, please don't hesitate to let me know!
Ke YAN, 2016, Tsinghua Univ. http://yanke23.com, [email protected]