**This repo is no longer maintained!**
# [DICTOL](https://github.com/tiepvupsu/DICTOL) - A Discriminative Dictionary Learning Toolbox for Classification (MATLAB version).
_This Toolbox is a part of our [LRSDL project](http://signal.ee.psu.edu/lrsdl.html)._
[**See Python version**](https://github.com/tiepvupsu/DICTOL_python).
**Related publications:**
1. Tiep H. Vu, Vishal Monga. "Fast Low-rank Shared Dictionary Learning for Image Classification." *to appear in IEEE Transactions on Image Processing*. [[paper](https://arxiv.org/abs/1610.08606)].
2. Tiep H. Vu, Vishal Monga. "Learning a low-rank shared dictionary for object classification." *International Conference on Image Processing (ICIP)* 2016. [[paper]](http://arxiv.org/abs/1602.00310).
**Author: [Tiep Vu](http://www.personal.psu.edu/thv102/)**
**Run `DICTOL_demo.m` to see an example.**
If you experience any bugs, please let me know via the [**Issues**](https://github.com/tiepvupsu/DICTOL/issues) tab. I'd really appreciate it and will fix the error ASAP. Thank you.
**On this page:**
<!-- MarkdownTOC -->
- [Notation](#notation)
- [Sparse Representation-based classification \(SRC\)](#sparse-representation-based-classification-src)
- [Online Dictionary Learning \(ODL\)](#online-dictionary-learning-odl)
- [Cost function](#cost-function)
- [Training ODL](#training-odl)
- [LCKSVD](#lcksvd)
- [Dictionary learning with structured incoherence and shared features \(DLSI\)](#dictionary-learning-with-structured-incoherence-and-shared-features-dlsi)
- [Cost function](#cost-function-1)
- [Training DLSI](#training-dlsi)
- [DLSI predict new samples](#dlsi-predict-new-samples)
- [Demo](#demo)
- [Dictionary learning for separating the particularity and the commonality \(COPAR\)](#dictionary-learning-for-separating-the-particularity-and-the-commonality-copar)
- [Cost function](#cost-function-2)
- [Training COPAR](#training-copar)
- [COPAR predict new samples](#copar-predect-new-samples)
- [Demo](#demo-1)
- [LRSDL](#lrsdl)
- [Motivation](#motivation)
- [Cost function](#cost-function-3)
- [Training LRSDL](#training-lrsdl)
- [LRSDL predict new samples](#lrsdl-predict-new-samples)
- [Demo](#demo-2)
- [Fisher discrimination dictionary learning \(FDDL\)](#fisher-discrimination-dictionary-learning-fddl)
- [Cost function](#cost-function-4)
- [Training FDDL](#training-fddl)
- [FDDL predict new samples](#fddl-predect-new-samples)
- [Discriminative Feature-Oriented dictionary learning \(DFDL\)](#discriminative-feature-oriented-dictionary-learning-dfdl)
- [D2L2R2](#dlr)
- [Fast iterative shrinkage-thresholding algorithm \(FISTA\)](#fast-iterative-shrinkage-thresholding-algorithm-fista)
- [References](#references)
<!-- /MarkdownTOC -->
<a name="notation"></a>
# Notation
* `Y`: signals. Each column is one observation.
* `D`: dictionary.
* `X`: sparse coefficient.
* `d`: signal dimension. `d = size(Y, 1)`.
* `C`: number of classes.
* `c`: class index.
* `n_c`: number of training samples in class `c`. Typically, all `n_c` are the same and equal to `n`.
* `N`: total number of training samples.
* `Y_range`: an array storing the range of each class, assuming labels are sorted in ascending order.
Example: If `Y_range = [0, 10, 25]`, then:
- There are two classes: samples from class 1 are columns 1 to 10, and samples from class 2 are columns 11 to 25.
- In general, samples from class `c` range from `Y_range(c) + 1` to `Y_range(c+1)`.
- It follows that the number of classes is `C = numel(Y_range) - 1`.
* `k_c`: number of bases in the class-specific dictionary `c`. Typically, all `k_c` are the same and equal to `k`.
* `k_0`: number of bases in the shared dictionary.
* `K`: total number of dictionary bases.
* `D_range`: similar to `Y_range`, but for the dictionary and excluding the shared dictionary.
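The `Y_range`/`D_range` convention above can be illustrated with a short sketch (Python/NumPy for brevity; the helper `class_columns` is hypothetical, not part of the toolbox):

```python
import numpy as np

# Illustration of the Y_range convention: with labels sorted,
# samples of class c occupy MATLAB columns Y_range(c)+1 .. Y_range(c+1).
Y_range = np.array([0, 10, 25])

C = Y_range.size - 1      # number of classes: 2
n_c = np.diff(Y_range)    # samples per class: [10, 15]

def class_columns(c):
    # 0-based column indices of class c (c = 1..C); MATLAB's 1-based
    # range Y_range(c)+1 .. Y_range(c+1) becomes Y_range[c-1] .. Y_range[c]-1.
    return np.arange(Y_range[c - 1], Y_range[c])

print(class_columns(2))   # columns 10..24, i.e. MATLAB samples 11..25
```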
<a name="sparse-representation-based-classification-src"></a>
# Sparse Representation-based classification (SRC)
* An implementation of Sparse Representation-based Classification [[1]](#fn_src), used here to classify test samples.
* Syntax: `[pred, X] = SRC_pred(Y, D, D_range, opts)`
- INPUT:
+ `Y`: test samples.
+ `D`: the total dictionary. `D = [D_1, D_2, ..., D_C]` with `D_c` being the _c-th_ class-specific dictionary.
+ `D_range`: range of class-specific dictionaries in `D`. See also [Notation](#notation).
+ `opts`: options.
* `opts.lambda`: `lambda` for the Lasso problem. Default: `0.01`.
* `opts.max_iter`: maximum number of FISTA iterations. Default: `100`. See also [this FISTA implementation](https://github.com/tiepvupsu/FISTA).
- OUTPUT:
+ `pred`: predicted labels of test samples.
+ `X`: solution of the lasso problem.
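The decision rule behind `SRC_pred` assigns a test sample to the class whose atoms reconstruct it with the smallest residual. A minimal NumPy sketch of that rule, assuming the sparse code `x` has already been solved (e.g. by FISTA); the function name `src_predict` is hypothetical:

```python
import numpy as np

def src_predict(y, D, D_range, x):
    """Return the class whose dictionary atoms reconstruct y with the
    smallest residual, given a precomputed sparse code x of y over D."""
    C = len(D_range) - 1
    residuals = np.empty(C)
    for c in range(C):
        lo, hi = D_range[c], D_range[c + 1]
        x_c = np.zeros_like(x)
        x_c[lo:hi] = x[lo:hi]              # keep only class-c coefficients
        residuals[c] = np.linalg.norm(y - D @ x_c)
    return np.argmin(residuals) + 1         # 1-based class label, as in MATLAB

# toy example: 2 atoms per class; y is exactly the first atom of class 2
D = np.eye(4)
D_range = [0, 2, 4]
y = D[:, 2]
x = np.array([0.0, 0.0, 1.0, 0.0])
print(src_predict(y, D, D_range, x))        # 2
```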
<a name="online-dictionary-learning-odl"></a>
# Online Dictionary Learning (ODL)
* An implementation of the well-known Online Dictionary Learning method [[2]](#fn_odl).
<a name="cost-function"></a>
## Cost function
<img src = "latex/ODL_cost.png" height = "40"/>
<a name="training-odl"></a>
## Training ODL
* Syntax: `[D, X] = ODL(Y, k, lambda, opts, sc_method)`
- INPUT:
+ `Y`: collection of samples.
+ `k`: number of bases in the desired dictionary.
+ `lambda`: `l1`-norm regularization parameter.
+ `opts`: options.
+ `sc_method`: sparse coding method used in the sparse coefficient update. Possible values:
- `'fista'`: using FISTA algorithm. See also [`fista`](#fista).
- `'spams'`: using SPAMS toolbox [[12]](#fn_spams).
- OUTPUT:
+ `D, X`: as in the problem.
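For the `'fista'` sparse-coding option, the lasso subproblem `min_X 0.5*||Y - D*X||_F^2 + lambda*||X||_1` is solved by alternating a gradient step with soft-thresholding, plus a momentum term. A minimal NumPy sketch of that iteration (an independent illustration, not the toolbox's FISTA code):

```python
import numpy as np

def soft_threshold(u, t):
    # proximal operator of t*||.||_1
    return np.sign(u) * np.maximum(np.abs(u) - t, 0.0)

def fista_lasso(Y, D, lam, max_iter=100):
    """Minimize 0.5*||Y - D X||_F^2 + lam*||X||_1 over X with FISTA."""
    L = np.linalg.norm(D, 2) ** 2          # Lipschitz constant of the gradient
    X = np.zeros((D.shape[1], Y.shape[1]))
    Z, t = X.copy(), 1.0
    for _ in range(max_iter):
        grad = D.T @ (D @ Z - Y)           # gradient of the smooth part at Z
        X_new = soft_threshold(Z - grad / L, lam / L)
        t_new = (1 + np.sqrt(1 + 4 * t * t)) / 2
        Z = X_new + ((t - 1) / t_new) * (X_new - X)   # momentum step
        X, t = X_new, t_new
    return X

# with D = I the solution is simply soft-thresholding of Y
X = fista_lasso(np.array([[1.0], [0.05]]), np.eye(2), 0.1)
print(X)   # approximately [[0.9], [0.0]]
```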
<a name="lcksvd"></a>
# LCKSVD
Check its [project page](http://www.umiacs.umd.edu/~zhuolin/projectlcksvd.html)
<a name="dictionary-learning-with-structured-incoherence-and-shared-features-dlsi"></a>
# Dictionary learning with structured incoherence and shared features (DLSI)
* An implementation of the well-known DLSI method [[5]](#fn_dls).
<a name="cost-function-1"></a>
## Cost function
<img src = "latex/DLSI_cost.png" height = "50"/>
<a name="training-dlsi"></a>
## Training DLSI
* function `[D, X, rt] = DLSI(Y, Y_range, opts)`
* The main DLSI algorithm
* INPUT:
- `Y, Y_range`: training samples and their labels
- `opts`:
+ `opts.lambda, opts.eta`: `lambda` and `eta` in the cost function
+ `opts.max_iter`: maximum iterations.
* OUTPUT:
- `D`: the trained dictionary,
- `X`: the trained sparse coefficient,
- `rt`: total running time of the training process.
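DLSI encourages the class dictionaries to be mutually incoherent by adding, weighted by `eta`, the penalty `sum_{i != j} ||D_i^T D_j||_F^2` to the per-class reconstruction-plus-`l1` cost. Assuming that standard formulation [[5]](#fn_dls), the term can be sketched as follows (the function name is hypothetical):

```python
import numpy as np

def incoherence_penalty(D, D_range):
    """Sum of ||D_i^T D_j||_F^2 over all pairs i != j: the structured
    incoherence term that DLSI weights by eta in its cost function."""
    C = len(D_range) - 1
    blocks = [D[:, D_range[c]:D_range[c + 1]] for c in range(C)]
    total = 0.0
    for i in range(C):
        for j in range(C):
            if i != j:
                total += np.linalg.norm(blocks[i].T @ blocks[j], 'fro') ** 2
    return total

# mutually orthogonal class dictionaries incur zero penalty
print(incoherence_penalty(np.eye(4), [0, 2, 4]))   # 0.0
```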
<a name="dlsi-predict-new-samples"></a>
## DLSI predict new samples
* function `pred = DLSI_pred(Y, D, opts)`
* predict the label of new input `Y` given the trained dictionary `D` and
parameters stored in `opts`
<a name="demo"></a>
## Demo
Run `DLSI_top` in the MATLAB command window.
<a name="dictionary-learning-for-separating-the-particularity-and-the-commonality-copar"></a>
# Dictionary learning for separating the particularity and the commonality (COPAR)
* An implementation of COPAR [[7]](#fn_cor).
<a name="cost-function-2"></a>
## Cost function
<img src = "latex/COPAR_cost.png" height = "50"/>
where:
<img src = "latex/COPAR_cost1.png" height = "50"/>
<a name="training-copar"></a>
## Training COPAR
* function `[D, X, rt] = COPAR(Y, Y_range, opts)`
* INPUT:
- `Y, Y_range`: training samples and their labels
- `opts`: a struct
+ `opts.lambda, opts.eta`: `lambda` and `eta` in the cost function
+ `opts.max_iter`: maximum iterations.
* OUTPUT:
- `D`: the trained dictionary,
- `X`: the trained sparse coefficient,
- `rt`: total running time of the training process.
<a name="copar-predect-new-samples"></a>
## COPAR predict new samples
* function `pred = COPAR_pred(Y, D, D_range_ext, opts)`
* predict labels of the input `Y`
* INPUT:
- `Y`: test samples
- `D`: the trained dictionary
- `D_range_ext`: range of class-specific dictionaries in `D`, extended with the shared dictionary (see also [Notation](#notation)).