# CBAM-Keras
This is a Keras implementation of ["CBAM: Convolutional Block Attention Module"](https://arxiv.org/pdf/1807.06521).
This repository also includes an implementation of ["Squeeze-and-Excitation Networks"](https://arxiv.org/pdf/1709.01507), so you can train and compare the base CNN model, the base model with a CBAM block, and the base model with an SE block.
## CBAM: Convolutional Block Attention Module
**CBAM** proposes an architectural unit called the *"Convolutional Block Attention Module" (CBAM)* block, which improves representation power with an attention mechanism: focusing on important features and suppressing unnecessary ones.
This work can be seen as a descendant of, and an improvement on, ["Squeeze-and-Excitation Networks"](https://arxiv.org/pdf/1709.01507).
### Diagram of a CBAM_block
<div align="center">
<img src="https://github.com/kobiso/CBAM-keras/blob/master/figures/overview.png">
</div>
### Diagram of each attention sub-module
<div align="center">
<img src="https://github.com/kobiso/CBAM-keras/blob/master/figures/submodule.png">
</div>
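The two attention sub-modules shown above can be sketched in Keras roughly as follows. This is a minimal illustration using the `tensorflow.keras` API; the function names, defaults, and layer choices here are assumptions for clarity, not the repository's exact implementation in `models/attention_module.py`.

```python
import tensorflow as tf
from tensorflow.keras import layers

def channel_attention(x, ratio=8):
    """Channel attention: shared MLP over avg- and max-pooled features."""
    channels = x.shape[-1]
    shared_mlp = tf.keras.Sequential([
        layers.Dense(channels // ratio, activation='relu'),
        layers.Dense(channels),
    ])
    avg = shared_mlp(layers.GlobalAveragePooling2D()(x))
    mx = shared_mlp(layers.GlobalMaxPooling2D()(x))
    scale = tf.sigmoid(avg + mx)                       # (batch, channels)
    return x * layers.Reshape((1, 1, channels))(scale)

def spatial_attention(x, kernel_size=7):
    """Spatial attention: conv over channel-wise avg and max maps."""
    avg = tf.reduce_mean(x, axis=-1, keepdims=True)    # (b, h, w, 1)
    mx = tf.reduce_max(x, axis=-1, keepdims=True)      # (b, h, w, 1)
    concat = layers.Concatenate(axis=-1)([avg, mx])    # (b, h, w, 2)
    scale = layers.Conv2D(1, kernel_size, padding='same',
                          activation='sigmoid')(concat)
    return x * scale

def cbam_block(x, ratio=8):
    """CBAM applies channel attention first, then spatial attention."""
    x = channel_attention(x, ratio)
    return spatial_attention(x)
```

Note the sequential ordering: the channel-refined feature map is fed into the spatial sub-module, which is the arrangement the paper found to work best.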
### Classification results on ImageNet-1K
<div align="center">
<img src="https://github.com/kobiso/CBAM-keras/blob/master/figures/exp4.png">
</div>
<div align="center">
<img src="https://github.com/kobiso/CBAM-keras/blob/master/figures/exp5.png" width="750">
</div>
## Prerequisites
- Python 3.x
- Keras
## Prepare Data set
This repository uses the [*Cifar10*](https://www.cs.toronto.edu/~kriz/cifar.html) dataset, which is downloaded automatically when you run the training script.
(Note that you **cannot run the Inception-series models** on Cifar10: their smallest supported input size is 139, while Cifar10 images are 32x32. Use the Inception-series models with another dataset instead.)
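If you still want to try an Inception-series model on Cifar10, one possible workaround (an assumption on my part, not something this repository does) is to upsample the 32x32 images past the 139-pixel minimum before feeding them to the network:

```python
from tensorflow.keras import Input, Model, layers

# Upsample 32x32 Cifar10 inputs by 5x so they satisfy the
# Inception-series minimum input size of 139 (32 * 5 = 160 >= 139).
inputs = Input(shape=(32, 32, 3))
upsampled = layers.UpSampling2D(size=5)(inputs)
resize_model = Model(inputs, upsampled)
print(resize_model.output_shape)  # (None, 160, 160, 3)
```

Naive upsampling adds no information, so accuracy gains over a natively small-input model are not guaranteed.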
## CBAM_block and SE_block Supportive Models
You can train and test the base CNN model, the base model with a CBAM block, and the base model with an SE block.
The models in the list below can be run with a **CBAM_block** or **SE_block** added.
- Inception V3 + CBAM / + SE
- Inception-ResNet-v2 + CBAM / + SE
- ResNet_v1 + CBAM / + SE (ResNet20, ResNet32, ResNet44, ResNet56, ResNet110, ResNet164, ResNet1001)
- ResNet_v2 + CBAM / + SE (ResNet20, ResNet56, ResNet110, ResNet164, ResNet1001)
- ResNeXt + CBAM / + SE
- MobileNet + CBAM / + SE
- DenseNet + CBAM / + SE (DenseNet121, DenseNet161, DenseNet169, DenseNet201, DenseNet264)
### Change *Reduction ratio*
To change the *reduction ratio*, set the `ratio` argument of the `se_block` and `cbam_block` functions in `models/attention_module.py`.
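The reduction ratio controls how aggressively the attention bottleneck compresses the channel dimension. As an illustration, here is a minimal SE-style block showing where `ratio` enters; the signature is an assumption for this sketch, not the exact code in `models/attention_module.py`:

```python
from tensorflow.keras import layers

def se_block(x, ratio=8):
    """Squeeze-and-Excitation sketch: the Dense bottleneck has
    channels // ratio units, so a larger ratio means a smaller
    (cheaper, more compressed) excitation MLP."""
    channels = x.shape[-1]
    s = layers.GlobalAveragePooling2D()(x)                      # squeeze
    s = layers.Dense(channels // ratio, activation='relu')(s)   # bottleneck
    s = layers.Dense(channels, activation='sigmoid')(s)         # excite
    return x * layers.Reshape((1, 1, channels))(s)
```

With 64 input channels, `ratio=8` gives an 8-unit bottleneck while `ratio=16` gives a 4-unit one, trading capacity for parameters.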
## Train a Model
You can simply train a model with `main.py`.
1. Set the `attention_module` parameter.
    - e.g. `attention_module = 'cbam_block'`
2. Set the model you want to train.
    - e.g. `model = resnet_v1.resnet_v1(input_shape=input_shape, depth=depth, attention_module=attention_module)`
3. Set other parameters such as *batch_size*, *epochs*, *data_augmentation*, and so on.
4. Run the `main.py` file.
    - e.g. `python main.py`
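The steps above boil down to the standard Keras compile-and-fit loop. Here is a runnable toy version of that workflow, using a tiny stand-in model and random data in place of the repository's `resnet_v1` and Cifar10 (all names and sizes here are illustrative):

```python
import numpy as np
from tensorflow.keras import Input, Model, layers

# Stand-in for the repository's model builder (e.g. resnet_v1.resnet_v1).
inputs = Input(shape=(32, 32, 3))
x = layers.Conv2D(8, 3, activation='relu')(inputs)
x = layers.GlobalAveragePooling2D()(x)
outputs = layers.Dense(10, activation='softmax')(x)
model = Model(inputs, outputs)

# Step 3: other training parameters (batch_size, epochs, ...).
model.compile(optimizer='adam', loss='categorical_crossentropy')

# Random stand-in data shaped like Cifar10 (32x32x3, 10 classes).
x_train = np.random.rand(16, 32, 32, 3).astype('float32')
y_train = np.eye(10)[np.random.randint(0, 10, 16)]
history = model.fit(x_train, y_train, batch_size=8, epochs=1, verbose=0)
```

In the actual repository, `main.py` wires these pieces together, including the Cifar10 loading and optional data augmentation.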
## Related Works
- Blog: [CBAM: Convolutional Block Attention Module](https://kobiso.github.io//research/research-CBAM/)
- Repository: [CBAM-TensorFlow](https://github.com/kobiso/CBAM-tensorflow)
- Repository: [CBAM-TensorFlow-Slim](https://github.com/kobiso/CBAM-tensorflow-slim)
- Repository: [SENet-TensorFlow-Slim](https://github.com/kobiso/SENet-tensorflow-slim)
## Reference
- Paper: [CBAM: Convolutional Block Attention Module](https://arxiv.org/pdf/1807.06521)
- Paper: [Squeeze-and-Excitation Networks](https://arxiv.org/pdf/1709.01507)
- Repository: [Keras: Cifar10 ResNet example](https://github.com/keras-team/keras/blob/master/examples/cifar10_resnet.py)
- Repository: [keras-squeeze-excite-network](https://github.com/titu1994/keras-squeeze-excite-network)
## Author
Byung Soo Ko / kobiso62@gmail.com