# CNTK Examples: Image - Getting Started
## Overview
|Data: |The MNIST dataset (http://yann.lecun.com/exdb/mnist/) of handwritten digits.
|:---------|:---
|Purpose |This folder contains a number of examples that demonstrate the usage of BrainScript to define basic networks for deep learning on image tasks.
|Network |Simple feed-forward networks including dense layers, convolution layers, dropout and batch normalization for classification and regression tasks.
|Training |Stochastic gradient descent both with and without momentum.
|Comments |There are seven configuration files; details are provided below.
## Running the example
### Getting the data
These examples use the MNIST dataset to demonstrate various network configurations. The MNIST dataset is not included in the CNTK distribution, but it can easily be downloaded and converted by following the instructions in [DataSets/MNIST](../DataSets/MNIST). We recommend keeping the downloaded data in that folder, since the configuration files in this folder assume that location by default.
### Setup
Compile the sources to generate the cntk executable (not required if you downloaded the binaries).
__Windows:__ Add the folder of the cntk executable to your path
(e.g. `set PATH=%PATH%;c:/src/cntk/x64/Release/;`)
or prefix the call to the cntk executable with the corresponding folder.
__Linux:__ Add the folder of the cntk executable to your path
(e.g. `export PATH=$PATH:$HOME/src/cntk/build/Release/bin/`)
or prefix the call to the cntk executable with the corresponding folder.
### Run
Run the example from the current folder (recommended) using:
`cntk configFile=01_OneHidden.cntk`
or run from any folder and specify the `GettingStarted` folder as the `currentDirectory`,
e.g. running from the `Image` folder using:
`cntk configFile=GettingStarted/01_OneHidden.cntk currentDirectory=GettingStarted`
An Output folder will be created in the `Image/GettingStarted` folder, which is used to store intermediate results and trained models.
## Details
There are seven cntk configuration files in the current folder. These cntk configuration files use BrainScript, a custom script language for CNTK. To learn more about BrainScript, please follow the introduction of [BrainScript Basic Concepts](https://docs.microsoft.com/en-us/cognitive-toolkit/Brainscript-Basic-concepts).
### 01_OneHidden.cntk
This is a simple, one-hidden-layer network that achieves an error rate of `1.76%`. Since this model does not assume any spatial relationships between the pixels, it is often referred to as "permutation invariant".
To run this example, use the following command:
`cntk configFile=01_OneHidden.cntk`
In this example, the MNIST images are first normalized to the range `[0,1)`, then fed into a single dense hidden layer with 200 nodes. A [rectified linear unit (ReLU)](http://machinelearning.wustl.edu/mlpapers/paper_files/icml2010_NairH10.pdf) activation function is added for nonlinearity. Afterwards, another dense linear layer is added to generate the output label. Training uses cross entropy after softmax as the cost function.
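As a rough sketch (not a verbatim copy of the config), the model definition in `01_OneHidden.cntk` has the following shape in BrainScript; `featScale` and `labelDim` follow the naming conventions of the CNTK examples:

```
# scale helper: multiply the input by a constant factor (maps pixel values into [0,1))
Scale {f} = x => Constant {f} .* x

model = Sequential (
    Scale {featScale} :        # featScale = 1/256
    DenseLayer {200} : ReLU :  # hidden layer with 200 nodes plus ReLU nonlinearity
    LinearLayer {labelDim}     # linear output layer over the 10 digit classes
)

# cross entropy after softmax as the training criterion
ce = CrossEntropyWithSoftmax (labels, model (features))
```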
In the `SGD` block, `learningRatesPerSample = 0.01*5:0.005` indicates using 0.01 as learning rate per sample for 5 epochs and then 0.005 for the rest. More details about the SGD block are explained [here](https://docs.microsoft.com/en-us/cognitive-toolkit/Brainscript-SGD-Block).
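For illustration, a minimal `SGD` block could look as follows; the `epochSize`, `minibatchSize`, and `maxEpochs` values here are assumptions rather than values copied from the config:

```
SGD = {
    epochSize = 60000                       # assumed: full MNIST training set per epoch
    minibatchSize = 64                      # assumed minibatch size
    learningRatesPerSample = 0.01*5:0.005   # 0.01 per sample for 5 epochs, then 0.005
    maxEpochs = 10                          # assumed epoch count
}
```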
The MNIST data is loaded with a simple CNTK text-format reader. The train and test datasets are converted by running the Python script in [DataSets/MNIST](../DataSets/MNIST). For more information on the reader block, please refer to the documentation [here](https://docs.microsoft.com/en-us/cognitive-toolkit/Brainscript-Reader-block).
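A sketch of such a reader block, assuming the `Train-28x28_cntk_text.txt` file name produced by the conversion script:

```
reader = {
    readerType = "CNTKTextFormatReader"
    file = "$DataDir$/Train-28x28_cntk_text.txt"      # assumed output of the MNIST script
    input = {
        features = { dim = 784 ; format = "dense" }   # 28x28 pixels, flattened
        labels   = { dim = 10  ; format = "dense" }   # one-hot digit labels
    }
}
```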
### 02_OneConv.cntk
In the second example, we add a convolution layer to the network. Convolution layers were inspired by biological processes and have become extremely popular in image-related tasks, where neighboring pixels are highly correlated. One of the earliest papers on convolutional neural networks can be found [here](http://yann.lecun.com/exdb/publis/pdf/lecun-01a.pdf).
To run this example, use the following command:
`cntk configFile=02_OneConv.cntk`
After normalization, a convolution layer with `16` kernels of size `(5,5)` is added, followed by a ReLU nonlinearity. Then, we perform max pooling on the output feature map, with size `(2,2)` and stride `(2,2)`. A dense layer of 64 hidden nodes is then added, followed by another ReLU, and another dense layer to generate the output. This network achieves an error rate of `1.22%`, better than the previous network.
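In BrainScript, this architecture corresponds roughly to the following model definition (a sketch under the same naming assumptions as above):

```
model = Sequential (
    Scale {featScale} :
    ConvolutionalLayer {16, (5:5), pad = true} : ReLU :  # 16 kernels of size 5x5
    MaxPoolingLayer {(2:2), stride = (2:2)} :            # 2x2 max pooling with stride 2
    DenseLayer {64} : ReLU :                             # dense layer with 64 hidden nodes
    LinearLayer {labelDim}                               # linear output layer
)
```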
In practice, one would stack multiple convolution layers to improve classification accuracy. State-of-the-art convolutional neural networks can achieve error rates below 0.5% on MNIST. Interested readers can find more examples in [Classification/ConvNet](../Classification/ConvNet).
### 03_OneConvDropout.cntk
In the third example, we demonstrate the use of dropout layers. Dropout is a network regularization technique that helps combat overfitting, in particular when the network contains many parameters. Dropout, together with ReLU activation, are the two key techniques that enabled Alex Krizhevsky, Ilya Sutskever, and Geoffrey Hinton to win the ILSVRC-2012 competition, which arguably changed the course of computer vision research. Their paper can be found [here](https://papers.nips.cc/paper/4824-imagenet-classification-with-deep-convolutional-neural-networks.pdf).
To run this example, use the following command:
`cntk configFile=03_OneConvDropout.cntk`
Compared with the previous example, we add a dropout layer after max pooling. Dropout can also be added after a dense layer if needed. The dropout rate is specified in the `SGD` block, as `dropoutRate = 0.5`.
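Sketched in BrainScript, the only model-side change relative to `02_OneConv` is composing a `Dropout` step into the pipeline; the rate itself lives in the `SGD` block:

```
model = Sequential (
    Scale {featScale} :
    ConvolutionalLayer {16, (5:5), pad = true} : ReLU :
    MaxPoolingLayer {(2:2), stride = (2:2)} :
    Dropout :                  # randomly zeroes activations; rate set via dropoutRate in SGD
    DenseLayer {64} : ReLU :
    LinearLayer {labelDim}
)
```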
With dropout, the network improves slightly to a `1.10%` error rate.
### 04_OneConvBN.cntk
In the fourth example, we add [batch normalization](https://arxiv.org/abs/1502.03167) to the network. Batch normalization was designed to address the internal covariate shift problem caused by input and parameter changes during training. The technique has proven very useful for training very deep and complicated networks.
In this example, we simply added a batch normalization layer to the `02_OneConv.cntk` network. To run this example, use the following command:
`cntk configFile=04_OneConvBN.cntk`
The network achieves an error rate of around `0.96%`, which is better than the previous examples. Due to the small training dataset and the extremely simple network, we have to stop training early (after 10 epochs) to avoid overfitting.
This CNTK configuration file also demonstrates the use of custom layer definitions in BrainScript. Note that `ConvBnReluPoolLayer` and `DenseBnReluLayer` are both custom layers that combine several basic layer types.
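As a sketch of what such a composite layer can look like (the parameter list here is an assumption; see `04_OneConvBN.cntk` for the actual definition):

```
# convolution + batch normalization + ReLU + max pooling as one reusable layer
ConvBnReluPoolLayer {outChannels, kernelShape, poolShape, poolStride} = Sequential (
    ConvolutionalLayer {outChannels, kernelShape, pad = true, bias = false} :
    BatchNormalizationLayer {spatialRank = 2} :   # per-channel statistics for conv feature maps
    ReLU :
    MaxPoolingLayer {poolShape, stride = poolStride}
)
```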
### 05_OneConvRegr.cntk
In the fifth example, we show how CNTK can be used to perform a regression task. To keep the task simple and avoid introducing a new dataset, we treat the digit labels of MNIST as regression targets rather than classification targets. We then reuse the same network architecture as in `02_OneConv`, only replacing the cost function with squared error. To run this example, use the following command:
`cntk configFile=05_OneConvRegr.cntk`
The trained network achieves a root-mean-square error (RMSE) of around 0.05. To see more sophisticated examples of regression tasks, please refer to [Regression](../Regression).
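The substantive change relative to `02_OneConv` is the criterion node; sketched in BrainScript, the swap looks like this:

```
ol = model (features)
# squared error replaces cross entropy with softmax for the regression objective
err = SquareError (labels, ol)
```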
### 06_OneConvRegrMultiNode.cntk
In the sixth example, we show how to train CNTK with multiple processes (GPUs) for a regression task. CNTK uses MPI for multi-node training, and it currently supports four parallel SGD algorithms: `DataParallelSGD`, `BlockMomentumSGD`, `ModelAveragingSGD`, and `DataParallelASGD`. We reuse the same network architecture as in `05_OneConvRegr`, only adding a parallel-training configuration to the `SGD` block.
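A sketch of such a configuration, using the documented `parallelTrain` options (the exact settings in `06_OneConvRegrMultiNode.cntk` may differ):

```
SGD = {
    # ... usual SGD settings ...
    parallelTrain = {
        parallelizationMethod = "DataParallelSGD"   # or BlockMomentumSGD, ModelAveragingSGD, DataParallelASGD
        parallelizationStartEpoch = 1
        distributedMBReading = true
        dataParallelSGD = { gradientBits = 32 }     # 32 bits = no gradient quantization
    }
}
```

Multi-process runs are launched through MPI, e.g. `mpiexec -n 4 cntk configFile=06_OneConvRegrMultiNode.cntk` to train with 4 workers.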