Matlab Toolbox for Dimensionality Reduction (v0.8.1b)
=====================================================
Information
-------------------------
Author: Laurens van der Maaten
Affiliation: Delft University of Technology
Contact: lvdmaaten@gmail.com
Release date: March 21, 2013
Version: 0.8.1b
Installation
-------------------------
Copy the drtoolbox/ folder into the $MATLAB_DIR/toolbox directory (where $MATLAB_DIR is your Matlab installation directory). Start Matlab and select 'Set path...' from the File menu. Click the 'Add with subfolders...' button, select the folder $MATLAB_DIR/toolbox/drtoolbox in the file dialog, and press Open. Then press the Save button to save your changes to the Matlab search path. The toolbox is now installed.
Some of the functions in the toolbox use MEX-files. Precompiled versions of these MEX-files are distributed with this release, but the compiled version for your platform might be missing. In order to compile all MEX-files, type cd([matlabroot '/toolbox/drtoolbox']) in your Matlab prompt, and execute the function MEXALL.
Features
-------------------------
This Matlab toolbox implements 34 techniques for dimensionality reduction and metric learning. These techniques are all available through the COMPUTE_MAPPING function or through the GUI. The following techniques are available:
- Principal Component Analysis ('PCA')
- Linear Discriminant Analysis ('LDA')
- Multidimensional scaling ('MDS')
- Probabilistic PCA ('ProbPCA')
- Factor analysis ('FactorAnalysis')
- Sammon mapping ('Sammon')
- Isomap ('Isomap')
- Landmark Isomap ('LandmarkIsomap')
- Locally Linear Embedding ('LLE')
- Laplacian Eigenmaps ('Laplacian')
- Hessian LLE ('HessianLLE')
- Local Tangent Space Alignment ('LTSA')
- Diffusion maps ('DiffusionMaps')
- Kernel PCA ('KernelPCA')
- Generalized Discriminant Analysis ('KernelLDA')
- Stochastic Neighbor Embedding ('SNE')
- Symmetric Stochastic Neighbor Embedding ('SymSNE')
- t-Distributed Stochastic Neighbor Embedding ('tSNE')
- Neighborhood Preserving Embedding ('NPE')
- Locality Preserving Projection ('LPP')
- Stochastic Proximity Embedding ('SPE')
- Linear Local Tangent Space Alignment ('LLTSA')
- Conformal Eigenmaps ('CCA', implemented as an extension of LLE)
- Maximum Variance Unfolding ('MVU', implemented as an extension of LLE)
- Landmark Maximum Variance Unfolding ('LandmarkMVU')
- Fast Maximum Variance Unfolding ('FastMVU')
- Locally Linear Coordination ('LLC')
- Manifold charting ('ManifoldChart')
- Coordinated Factor Analysis ('CFA')
- Gaussian Process Latent Variable Model ('GPLVM')
- Deep autoencoders ('Autoencoder')
- Neighborhood Components Analysis ('NCA')
- Maximally Collapsing Metric Learning ('MCML')
- Large Margin Nearest Neighbor metric learning ('LMNN')
Furthermore, the toolbox contains 6 techniques for intrinsic dimensionality estimation. These techniques are available through the function INTRINSIC_DIM. The following techniques are available:
- Eigenvalue-based estimation ('EigValue')
- Maximum Likelihood Estimator ('MLE')
- Estimator based on correlation dimension ('CorrDim')
- Estimator based on nearest neighbor evaluation ('NearNb')
- Estimator based on packing numbers ('PackingNumbers')
- Estimator based on geodesic minimum spanning tree ('GMST')
In addition to these techniques, the toolbox contains functions for prewhitening of data (the function PREWHITEN), exact and estimated out-of-sample extension (the functions OUT_OF_SAMPLE and OUT_OF_SAMPLE_EST), and a function that generates toy datasets (the function GENERATE_DATA).
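For example, a mapping learned on training data can be applied to previously unseen points via OUT_OF_SAMPLE. The sketch below assumes default parameters; the helix dataset and PCA are arbitrary choices here, and exact signatures are documented in the respective help texts:

```matlab
% Sketch: learn a parametric mapping, then embed previously unseen points.
[X, labels] = generate_data('helix', 2000);
[mappedX, mapping] = compute_mapping(X, 'PCA', 2);   % parametric technique
X_new = generate_data('helix', 100);                 % new points from the same manifold
mappedX_new = out_of_sample(X_new, mapping);         % embedding of the new points
```

For techniques that do not support an exact out-of-sample extension, OUT_OF_SAMPLE_EST can be used instead with the training set and its reduced version as additional inputs.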
The graphical user interface of the toolbox is accessible through the DRGUI function.
Usage
-------------------------
All the functions that you should call as a user of the toolbox are located in the same folder as this Readme-file. The folder contains the following files:
- compute_mapping.m This function performs the specified dimension reduction technique on the specified data set. Type HELP COMPUTE_MAPPING to get details on supported techniques and on the parameters of the techniques.
- drgui.m This function allows you to use some of the toolbox functionality via a graphical user interface.
- generate_data.m This function generates some artificial data sets such as the Swiss roll data set.
- intrinsic_dim.m This function performs intrinsic dimensionality estimation using the specified estimator on the specified data set.
- mexall.m This function compiles all the MEX-files that are required to use the toolbox. Run it once, immediately after installation.
- out_of_sample.m This function takes as input a dimension reduction mapping and a set of new test points, and outputs the locations of the test points in the reduced space. This function is only supported by parametric and spectral techniques.
- out_of_sample_est.m This function takes as input a training set, a reduced version of that training set, and a set of new test points, and finds approximate locations of the test points in the reduced space. Only use this function for techniques that do not support out-of-sample extensions.
- prewhiten.m This function whitens data, i.e., it transforms the data to have zero mean and identity covariance.
- reconstruct_data.m This function computes reconstructions of reduced data for linear techniques and autoencoders.
- test_toolbox.m This function runs a full test of all functionalities of the toolbox.
Here is an example on how to use the toolbox:
[X, labels] = generate_data('helix', 2000);
figure, scatter3(X(:,1), X(:,2), X(:,3), 5, labels); title('Original dataset'), drawnow
no_dims = round(intrinsic_dim(X, 'MLE'));
disp(['MLE estimate of intrinsic dimensionality: ' num2str(no_dims)]);
[mappedX, mapping] = compute_mapping(X, 'PCA', no_dims);
figure, scatter(mappedX(:,1), mappedX(:,2), 5, labels); title('Result of PCA');
[mappedX, mapping] = compute_mapping(X, 'Laplacian', no_dims, 7);
figure, scatter(mappedX(:,1), mappedX(:,2), 5, labels(mapping.conn_comp)); title('Result of Laplacian Eigenmaps'); drawnow
This example creates a helix dataset, estimates its intrinsic dimensionality, runs PCA and Laplacian Eigenmaps on the dataset, and plots the results. All functions in the toolbox work both on data matrices and on PRTools datasets (http://prtools.org). For more information on the options for dimensionality reduction, type HELP COMPUTE_MAPPING at your Matlab prompt. Information on the intrinsic dimensionality estimators can be obtained by typing HELP INTRINSIC_DIM.
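For linear techniques and autoencoders, the reduced data can also be mapped back to the original space with RECONSTRUCT_DATA. A minimal sketch, assuming default parameters (the Swiss roll dataset is an arbitrary choice):

```matlab
% Sketch: reduce with PCA, then reconstruct an approximation of the data.
[X, labels] = generate_data('swiss', 1000);
[mappedX, mapping] = compute_mapping(X, 'PCA', 2);
X_rec = reconstruct_data(mappedX, mapping);   % same size as X, up to projection error
```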
Pitfalls
-------------------------
When you run certain code, you might receive an error that a certain file is missing. This happens because some parts of the code use MEX-functions, and although I provide a number of precompiled versions of these MEX-functions in the toolbox, the MEX-file for your platform might be missing. To fix this, type the following at your Matlab prompt:
mexall
Now you have compiled versions of the MEX-files as well. This fix also solves slow execution of the shortest path computations in Isomap.
If you encounter an error concerning CSDP while running the FastMVU-algorithm, the binary of CSDP for your platform is missing. If so, please obtain a binary distribution of CSDP from https://projects.coin-or.org/Csdp/ and place it in the drtoolbox/techniques directory. Make sure it has the right name for your platform (csdp.exe for Windows, csdpmac for Mac OS X (PowerPC), csdpmaci for Mac OS X (Intel), and csdplinux for Linux).
Many methods for dimensionality reduction perform spectral analyses of sparse matrices. You might think that eigenanalysis is a well-studied problem that can easily be solved. However, eigenanalysis of large matrices turns out to be tedious. The toolbox allows you to use two different methods for eigenanalysis:
- The original Matlab functions (based on Arnoldi methods)
- The JDQR functions (based on Jacobi-Davidson methods)
For problems up to 10,000 data points, we advise you to use the Matlab functions; for larger problems, switching to the JDQR functions is often worth trying.
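The eigensolver can be selected via the trailing argument of COMPUTE_MAPPING. The sketch below assumes that this argument is accepted after the technique's own parameters; treat the exact parameter position as an assumption and verify it with HELP COMPUTE_MAPPING:

```matlab
% Sketch: selecting the eigensolver (assumed trailing eig_impl argument;
% verify with HELP COMPUTE_MAPPING). 'Matlab' uses the built-in Arnoldi-based
% solver; 'JDQR' uses the Jacobi-Davidson solver bundled with the toolbox.
[X, labels] = generate_data('swiss', 2000);
mappedX = compute_mapping(X, 'Laplacian', 2, 7, 'JDQR');
```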