1.3. Activation Function#

Non-linear activation functions are an essential part of deep neural networks: they are what allow networks to learn non-linear problems. Activation functions are applied elementwise (e.g., to each pixel independently).

PyTorch implements several non-linear activation functions; please check the documentation for the full list.
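Because activation functions are applied elementwise, the output always has the same shape as the input. A minimal sketch using `torch.nn.Tanh` (any activation would behave the same way):

```python
import torch

# Activation functions act elementwise, so the output tensor
# has exactly the same shape as the input tensor.
x = torch.tensor([[-2.0, -0.5], [0.5, 2.0]])
tanh = torch.nn.Tanh()
y = tanh(x)
print(y.shape)  # same shape as x: torch.Size([2, 2])
```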

0. Preparation#

Packages#

Let’s start with all the necessary packages to implement this tutorial.

  • numpy is the main package for scientific computing with Python. It’s often imported with the np shortcut.

  • matplotlib is a library to plot graphs in Python.

  • torch is a deep learning framework that allows us to define networks, handle datasets, optimise a loss function, etc.

# importing the necessary packages/libraries
import numpy as np
from matplotlib import pyplot as plt
import torch

Input tensor#

In this notebook, we compute different activation functions for an input tensor with values in \([-10, 10]\).

input_tensor = torch.arange(-10, 11).float()
print(input_tensor)
tensor([-10.,  -9.,  -8.,  -7.,  -6.,  -5.,  -4.,  -3.,  -2.,  -1.,   0.,   1.,
          2.,   3.,   4.,   5.,   6.,   7.,   8.,   9.,  10.])
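`torch.arange` gives integer-spaced samples. If you want smoother plots, a denser grid can be built with `torch.linspace` (an assumption here: 201 points, i.e. steps of 0.1, is fine for visualisation):

```python
import torch

# torch.linspace samples `steps` evenly spaced points over the
# closed interval [-10, 10]; 201 points gives a spacing of 0.1.
fine_input = torch.linspace(-10, 10, steps=201)
print(fine_input[:3])  # tensor([-10.0000,  -9.9000,  -9.8000])
```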

1. ReLU (Rectified Linear Unit)#

ReLU is probably the most widely used non-linear activation function. It clips negative values to zero:

\(\mathrm{ReLU}(x) = x^{+} = \max(0, x)\)
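The definition above can be checked directly: `torch.clamp` with `min=0` computes exactly \(\max(0, x)\), so it reproduces `torch.nn.ReLU`:

```python
import torch

# ReLU written by hand: clamp clips every value below 0 up to 0,
# which is exactly max(0, x) applied elementwise.
x = torch.arange(-10, 11).float()
manual_relu = torch.clamp(x, min=0)
print(torch.equal(manual_relu, torch.nn.ReLU()(x)))  # True
```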

relu = torch.nn.ReLU()
plt.plot(input_tensor, relu(input_tensor))
plt.xlabel('Input')
plt.ylabel('Output')
plt.title("ReLU")
plt.show()
(Figure: plot of the ReLU activation, output vs. input.)

2. Other Non-linear Functions#

non_linear_funs = {
    'ELU': torch.nn.ELU(),
    'LeakyReLU': torch.nn.LeakyReLU(),
    'PReLU': torch.nn.PReLU(),
    'ReLU6': torch.nn.ReLU6(), 
    'SELU': torch.nn.SELU(),
    'CELU': torch.nn.CELU(),
}
fig = plt.figure(figsize=(12, 10))
for i, (name, fun) in enumerate(non_linear_funs.items()):
    ax = fig.add_subplot(2, 3, i+1)
    # detach() is needed because some activations (e.g., PReLU)
    # have learnable parameters, so their output requires grad
    ax.plot(input_tensor, fun(input_tensor).detach().numpy())
    ax.set_title(name)
    ax.set_xlabel('Input')
    ax.set_ylabel('Output')
fig.tight_layout()
plt.show()
(Figure: 2×3 grid of activation plots — ELU, LeakyReLU, PReLU, ReLU6, SELU, CELU.)
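Among the functions above, `PReLU` stands out because its negative slope is a learnable parameter rather than a fixed constant; that is why the plotting loop calls `.detach()` before converting to NumPy. A small sketch of that parameter:

```python
import torch

# PReLU's negative slope is a torch.nn.Parameter (default 0.25),
# so its output is part of the autograd graph and must be
# detached before conversion to NumPy.
prelu = torch.nn.PReLU()
print(prelu.weight)       # Parameter containing tensor([0.2500], ...)

x = torch.tensor([-2.0, 2.0])
print(prelu(x).detach())  # tensor([-0.5000,  2.0000])
```

During training, this slope is updated by the optimiser along with the rest of the network's weights.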