# 4. Decaying the learning rate
```tf.train.exponential_decay(learning_rate, global_step, decay_steps, decay_rate, staircase=False, name=None)```
***Exponential decay of the learning rate***
When training a model, it is often useful to lower the learning rate as training progresses.
This function applies an exponential decay to the learning rate.
It is computed as:
```
decayed_learning_rate = learning_rate *
                        decay_rate ^ (global_step / decay_steps)
```
If the argument ```staircase``` is True, ```global_step / decay_steps``` is an integer division, and the decayed learning rate follows a staircase function.
```
# Example: decay every 100000 steps with a base of 0.96:
...
global_step = tf.Variable(0, trainable=False)
starter_learning_rate = 0.1
learning_rate = tf.train.exponential_decay(starter_learning_rate, global_step,
                                           100000, 0.96, staircase=True)
# Passing global_step to minimize() will increment it at each step.
learning_step = (
    tf.train.GradientDescentOptimizer(learning_rate)
    .minimize(...my loss..., global_step=global_step)
)
```
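To make the schedule concrete, below is a minimal plain-Python sketch of the formula above, using the same numbers as the example (the ```exponential_decay``` helper here is a hypothetical stand-in, not the TensorFlow op): with ```staircase=True``` the rate stays at 0.1 until step 100000 and then drops to 0.096, while the continuous version decays smoothly.
```
import math

def exponential_decay(learning_rate, global_step, decay_steps, decay_rate, staircase=False):
    # Hypothetical re-implementation of the formula above, for illustration only.
    # With staircase=True the exponent is floored, so the rate drops in discrete jumps.
    exponent = global_step / float(decay_steps)
    if staircase:
        exponent = math.floor(exponent)
    return learning_rate * decay_rate ** exponent

for step in (0, 50000, 100000, 150000, 200000):
    print(step,
          exponential_decay(0.1, step, 100000, 0.96, staircase=True),
          exponential_decay(0.1, step, 100000, 0.96, staircase=False))
```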
Args:
* learning_rate: A scalar float32 or float64 Tensor or a Python number. The initial learning rate.
* global_step: A scalar int32 or int64 Tensor or a Python number. Global step to use for the decay computation. Must not be negative.
* decay_steps: A scalar int32 or int64 Tensor or a Python number. Must be positive. See the decay computation above.
* decay_rate: A scalar float32 or float64 Tensor or a Python number. The decay rate.
* staircase: Boolean. If True, decay the learning rate at discrete intervals.
* name: String. Optional name of the operation. Defaults to 'ExponentialDecay'.
Returns:
A scalar Tensor of the same type as learning_rate. The decayed learning rate.
Raises:
* ValueError: if global_step is not supplied.
```tf.train.inverse_time_decay(learning_rate, global_step, decay_steps, decay_rate, staircase=False, name=None)```
***Inverse time decay of the learning rate***
It is computed as:
```
decayed_learning_rate = learning_rate / (1 + decay_rate * t)
```
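As a quick sanity check of the formula, here is a minimal plain-Python sketch (not TensorFlow code; it assumes ```t = global_step / decay_steps```, which matches the function's arguments):
```
def inverse_time_decay(learning_rate, global_step, decay_steps, decay_rate):
    # Hypothetical illustration of the 1/t decay; t is taken to be
    # global_step / decay_steps (an assumption based on the signature above).
    t = global_step / float(decay_steps)
    return learning_rate / (1 + decay_rate * t)

for step in (0, 1, 2, 10):
    print(step, inverse_time_decay(0.1, step, 1.0, 0.5))
```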
```
# Example: decay 1/t with a rate of 0.5:
...