homework3
February 6, 2019
1 Homework 3 - Berkeley STAT 157
Handout 2/5/2019, due 2/12/2019 by 4pm in Git by committing to your repository.
Formatting: please include both an .ipynb and a .pdf file in your homework submission,
named homework3.ipynb and homework3.pdf. You can export your notebook to a PDF either
by File -> Download as -> PDF via LaTeX (you may need LaTeX installed), or by simply printing
to a PDF from your browser (you may want to do File -> Print Preview in Jupyter first). Please
don’t change the filenames.
In [1]: from mxnet import nd, autograd, gluon
import matplotlib.pyplot as plt
2 1. Logistic Regression for Binary Classification
In multiclass classification we typically use the exponential model

$$p(y|o) = \mathrm{softmax}(o)_y = \frac{\exp(o_y)}{\sum_{y'} \exp(o_{y'})}$$
1.1. Show that this parametrization has a spurious degree of freedom. That is, show that both $o$
and $o + c$ with $c \in \mathbb{R}$ lead to the same probability estimate.

1.2. For binary classification, i.e. whenever we have only two classes $\{-1, 1\}$, we can arbitrarily
set $o_{-1} = 0$. Using the shorthand $o = o_1$, show that this is equivalent to

$$p(y = 1|o) = \frac{1}{1 + \exp(-o)}$$
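As a quick numerical sanity check for 1.1 and 1.2 (an illustration, not a substitute for the derivations), both claims can be verified with mxnet's built-in nd.softmax and nd.sigmoid operators:

from mxnet import nd

# 1.1: softmax is invariant to shifting all logits by a constant c
o = nd.array([1.0, -2.0, 0.5])
print(nd.softmax(o))        # identical to the next line
print(nd.softmax(o + 3.0))

# 1.2: with o_{-1} = 0, the class-1 probability equals sigmoid(o_1)
logits = nd.array([[0.0, 0.5]])      # (o_{-1}, o_1)
print(nd.softmax(logits))            # second entry matches the sigmoid
print(nd.sigmoid(nd.array([0.5])))   # 1 / (1 + exp(-0.5))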
1.3. Show that the log-likelihood loss (often called logistic loss) for labels $y \in \{-1, 1\}$ is thus
given by

$$-\log p(y|o) = \log(1 + \exp(-y \cdot o))$$
1.4. Show that for $y = 1$ the logistic loss asymptotes to $-o$ for $o \to -\infty$ and to $\exp(-o)$ for $o \to \infty$.
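A quick numerical check of these asymptotes (again an illustration rather than the requested proof), using moderate values of o so that float32 precision does not hide the comparison:

from mxnet import nd

# For y = 1 the loss is log(1 + exp(-o)); compare it against the
# linear asymptote -o (tight for very negative o) and the
# exponential asymptote exp(-o) (tight for very positive o).
o = nd.array([-10.0, -5.0, 5.0, 10.0])
print(nd.log(1 + nd.exp(-o)))
print(-o)
print(nd.exp(-o))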
3 2. Logistic Regression and Autograd
1. Implement the binary logistic loss $l(y, o) = \log(1 + \exp(-y \cdot o))$ in Gluon.
2. Plot its values for $y \in \{-1, 1\}$ over the range $o \in [-5, 5]$ (a sketch covering both steps follows below).
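One possible approach (a minimal sketch, not the official solution): the loss can be written directly with nd operations; the helper name binary_logistic_loss is our own choice, not part of the Gluon API. For numerical stability it uses the equivalent form max(0, -y*o) + log(1 + exp(-|y*o|)), which avoids overflow in exp.

from mxnet import nd
import matplotlib.pyplot as plt

def binary_logistic_loss(y, o):
    # l(y, o) = log(1 + exp(-y * o)), computed in the numerically
    # stable form max(z, 0) + log(1 + exp(-|z|)) with z = -y * o
    z = -y * o
    return nd.maximum(z, 0) + nd.log(1 + nd.exp(-nd.abs(z)))

o = nd.arange(-5, 5, 0.1)
for y in (-1, 1):
    plt.plot(o.asnumpy(), binary_logistic_loss(y, o).asnumpy(),
             label='y = %d' % y)
plt.xlabel('o')
plt.ylabel('logistic loss')
plt.legend()
plt.show()

A more idiomatic Gluon variant would wrap this in a subclass of gluon.loss.Loss; the plain nd version above is enough to produce the plot.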