[train hyper-parameters: Namespace(batch_size=8, epochs=50, freeze_layers=True, lr=0.0002, model='s')]
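The header is the `repr` of an `argparse.Namespace`, so the run was configured from command-line flags. A minimal sketch of a parser that reproduces this exact header, assuming flag names and defaults taken from the values shown (`build_parser` is a hypothetical helper; the real script may define its flags differently):

```python
import argparse

def build_parser():
    # Flag names mirror the Namespace fields in the log header;
    # defaults are the values recorded for this run.
    p = argparse.ArgumentParser()
    p.add_argument('--batch_size', type=int, default=8)
    p.add_argument('--epochs', type=int, default=50)
    p.add_argument('--freeze_layers', type=bool, default=True)
    p.add_argument('--lr', type=float, default=0.0002)
    p.add_argument('--model', type=str, default='s')
    return p

args = build_parser().parse_args([])  # no CLI overrides: use defaults
header = f"[train hyper-parameters: {args}]"
print(header)
```

Note that `Namespace`'s `repr` lists attributes alphabetically, which matches the field order in the header. (`type=bool` is a known argparse pitfall for parsing `--freeze_layers False` from the command line, but it reproduces the default-valued header shown here.)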
[epoch: 1]
train loss:0.3796 train accuracy:0.5761
val loss:0.3903 val accuracy:0.7697
[epoch: 2]
train loss:0.2947 train accuracy:0.8117
val loss:0.3599 val accuracy:0.7632
[epoch: 3]
train loss:0.2342 train accuracy:0.8234
val loss:0.3392 val accuracy:0.7829
[epoch: 4]
train loss:0.1918 train accuracy:0.8420
val loss:0.3189 val accuracy:0.8180
[epoch: 5]
train loss:0.1625 train accuracy:0.8583
val loss:0.3104 val accuracy:0.8377
[epoch: 6]
train loss:0.1414 train accuracy:0.8715
val loss:0.2999 val accuracy:0.8443
[epoch: 7]
train loss:0.1258 train accuracy:0.8776
val loss:0.2966 val accuracy:0.8706
[epoch: 8]
train loss:0.1154 train accuracy:0.8858
val loss:0.2905 val accuracy:0.8575
[epoch: 9]
train loss:0.1039 train accuracy:0.8942
val loss:0.2848 val accuracy:0.8706
[epoch: 10]
train loss:0.0986 train accuracy:0.8984
val loss:0.2859 val accuracy:0.8838
[epoch: 11]
train loss:0.0903 train accuracy:0.9042
val loss:0.2827 val accuracy:0.8838
[epoch: 12]
train loss:0.0852 train accuracy:0.9106
val loss:0.2800 val accuracy:0.8969
[epoch: 13]
train loss:0.0812 train accuracy:0.9144
val loss:0.2749 val accuracy:0.9035
[epoch: 14]
train loss:0.0782 train accuracy:0.9128
val loss:0.2717 val accuracy:0.9189
[epoch: 15]
train loss:0.0729 train accuracy:0.9242
val loss:0.2774 val accuracy:0.9057
[epoch: 16]
train loss:0.0696 train accuracy:0.9258
val loss:0.2776 val accuracy:0.9057
[epoch: 17]
train loss:0.0684 train accuracy:0.9244
val loss:0.2765 val accuracy:0.9189
[epoch: 18]
train loss:0.0661 train accuracy:0.9272
val loss:0.2742 val accuracy:0.9057
[epoch: 19]
train loss:0.0632 train accuracy:0.9300
val loss:0.2693 val accuracy:0.9211
[epoch: 20]
train loss:0.0603 train accuracy:0.9369
val loss:0.2739 val accuracy:0.9189
[epoch: 21]
train loss:0.0603 train accuracy:0.9319
val loss:0.2713 val accuracy:0.9123
[epoch: 22]
train loss:0.0581 train accuracy:0.9353
val loss:0.2730 val accuracy:0.9320
[epoch: 23]
train loss:0.0562 train accuracy:0.9364
val loss:0.2760 val accuracy:0.9276
[epoch: 24]
train loss:0.0569 train accuracy:0.9362
val loss:0.2725 val accuracy:0.9123
[epoch: 25]
train loss:0.0557 train accuracy:0.9356
val loss:0.2741 val accuracy:0.9342
[epoch: 26]
train loss:0.0536 train accuracy:0.9389
val loss:0.2700 val accuracy:0.9276
[epoch: 27]
train loss:0.0535 train accuracy:0.9436
val loss:0.2732 val accuracy:0.9342
[epoch: 28]
train loss:0.0505 train accuracy:0.9456
val loss:0.2709 val accuracy:0.9408
[epoch: 29]
train loss:0.0484 train accuracy:0.9492
val loss:0.2725 val accuracy:0.9408
[epoch: 30]
train loss:0.0490 train accuracy:0.9444
val loss:0.2719 val accuracy:0.9342
[epoch: 31]
train loss:0.0481 train accuracy:0.9458
val loss:0.2718 val accuracy:0.9342
[epoch: 32]
train loss:0.0490 train accuracy:0.9481
val loss:0.2730 val accuracy:0.9276
[epoch: 33]
train loss:0.0459 train accuracy:0.9494
val loss:0.2751 val accuracy:0.9276
[epoch: 34]
train loss:0.0453 train accuracy:0.9514
val loss:0.2764 val accuracy:0.9342
[epoch: 35]
train loss:0.0456 train accuracy:0.9506
val loss:0.2782 val accuracy:0.9342
[epoch: 36]
train loss:0.0448 train accuracy:0.9494
val loss:0.2773 val accuracy:0.9408
[epoch: 37]
train loss:0.0445 train accuracy:0.9494
val loss:0.2764 val accuracy:0.9342
[epoch: 38]
train loss:0.0449 train accuracy:0.9533
val loss:0.2756 val accuracy:0.9408
[epoch: 39]
train loss:0.0451 train accuracy:0.9558
val loss:0.2787 val accuracy:0.9342
[epoch: 40]
train loss:0.0437 train accuracy:0.9542
val loss:0.2816 val accuracy:0.9342
[epoch: 41]
train loss:0.0419 train accuracy:0.9569
val loss:0.2804 val accuracy:0.9342
[epoch: 42]
train loss:0.0421 train accuracy:0.9569
val loss:0.2833 val accuracy:0.9408
[epoch: 43]
train loss:0.0425 train accuracy:0.9550
val loss:0.2785 val accuracy:0.9408
[epoch: 44]
train loss:0.0412 train accuracy:0.9533
val loss:0.2739 val accuracy:0.9474
[epoch: 45]
train loss:0.0419 train accuracy:0.9531
val loss:0.2768 val accuracy:0.9342
[epoch: 46]
train loss:0.0397 train accuracy:0.9589
val loss:0.2799 val accuracy:0.9408
[epoch: 47]
train loss:0.0407 train accuracy:0.9584
val loss:0.2764 val accuracy:0.9342
[epoch: 48]
train loss:0.0395 train accuracy:0.9594
val loss:0.2801 val accuracy:0.9474
[epoch: 49]
train loss:0.0398 train accuracy:0.9600
val loss:0.2836 val accuracy:0.9408
[epoch: 50]
train loss:0.0403 train accuracy:0.9597
val loss:0.2758 val accuracy:0.9408
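Validation accuracy peaks at 0.9474 (epochs 44 and 48) while training accuracy keeps climbing, so the final epoch is not the best checkpoint. A small parser sketch for finding the best epoch in logs of this exact format; the embedded excerpt copies three epochs from the run above:

```python
import re

# Three epochs copied verbatim from the training log.
log = """\
[epoch: 44]
train loss:0.0412 train accuracy:0.9533
val loss:0.2739 val accuracy:0.9474
[epoch: 48]
train loss:0.0395 train accuracy:0.9594
val loss:0.2801 val accuracy:0.9474
[epoch: 50]
train loss:0.0403 train accuracy:0.9597
val loss:0.2758 val accuracy:0.9408
"""

epoch = None
best = (0.0, None)  # (best val accuracy, epoch it occurred)
for line in log.splitlines():
    m = re.match(r"\[epoch: (\d+)\]", line)
    if m:
        epoch = int(m.group(1))  # remember which epoch the next lines belong to
        continue
    m = re.match(r"val loss:([\d.]+) val accuracy:([\d.]+)", line)
    if m:
        acc = float(m.group(2))
        if acc > best[0]:  # strict '>' keeps the earliest epoch on ties
            best = (acc, epoch)

print(f"best val accuracy {best[0]:.4f} at epoch {best[1]}")
```

Run on the full log, this reports epoch 44 (the first of the two 0.9474 epochs); saving a checkpoint whenever `acc > best[0]` is the usual way to keep that model instead of the epoch-50 one.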