Some useful functions
import torch
import matplotlib.pyplot as plt
from nbdev.showdoc import *
t = torch.randn((50,50))
t[:5,:5]
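stats isn't defined in this excerpt; a minimal sketch of what such a helper might look like, assuming it simply reports a tensor's mean and standard deviation:

def stats(x):
    # Print the mean and standard deviation of the tensor.
    print(f"mean: {x.mean().item():.4f}, std: {x.std().item():.4f}")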
stats(t)
Cross Entropy Loss
The softmax of our activations is defined by:

$$\hbox{softmax(x)}_{i} = \frac{e^{x_{i}}}{e^{x_{0}} + e^{x_{1}} + \cdots + e^{x_{n-1}}}$$

or more concisely:

$$\hbox{softmax(x)}_{i} = \frac{e^{x_{i}}}{\sum_{0 \leq j \leq n-1} e^{x_{j}}}$$

where $n$ is the number of classes.
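As a sanity check, the formula translates directly into a few lines of PyTorch (a sketch, assuming the classes lie along the last dimension):

def softmax(x):
    # Exponentiate, then normalize so each row sums to 1.
    e = x.exp()
    return e / e.sum(dim=-1, keepdim=True)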
In practice, we will need the log of the softmax when we calculate the loss, since cross entropy is computed from log probabilities. Taking the log of the definition above gives:

$$\log\left(\hbox{softmax(x)}_{i}\right) = x_{i} - \log\left(\sum_{0 \leq j \leq n-1} e^{x_{j}}\right)$$
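log_softmax as used below isn't shown in this excerpt; a minimal sketch, assuming it implements the identity above with the log-sum-exp trick (subtracting the row-wise max) so that exp() cannot overflow:

def log_softmax(x):
    # Shift by the row-wise max first; softmax is shift-invariant,
    # so the result is unchanged but exp() stays in a safe range.
    m = x.max(dim=-1, keepdim=True).values
    return x - m - (x - m).exp().sum(dim=-1, keepdim=True).log()

PyTorch also ships a stable built-in, torch.nn.functional.log_softmax, which can be used to check the result.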
log_softmax(t)
Plotting
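plotdist isn't defined in this excerpt either; a minimal sketch, assuming it histograms the tensor's values, with the showsigmas flag toggling markers at the mean and at ±1, 2, and 3 standard deviations (the flag's exact behavior is a guess from its name):

def plotdist(x, showsigmas=True):
    # Histogram of every value in the tensor.
    vals = x.flatten()
    plt.hist(vals, bins=50)
    if showsigmas:
        m, s = vals.mean().item(), vals.std().item()
        # Dashed lines at the mean and at +/-1, 2, 3 sigmas.
        for k in range(-3, 4):
            plt.axvline(m + k * s, color='red', linestyle='--', alpha=0.5)
    plt.show()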
plotdist(t)
plotdist(t, showsigmas=False)