Activation Functions

Published on: September 20, 2021


Table of Contents

  • Sigmoid Function
  • Tanh Function
  • Rectified Linear Unit (ReLU)
  • Leaky ReLU
  • Parametric ReLU
  • Exponential Linear Unit (ELU)
  • Scaled Exponential Linear Unit (SELU)
  • Gaussian Error Linear Unit (GELU)
  • Sigmoid-Weighted Linear Unit (SiLU) / Swish
  • Softmax Function
  • Softplus
  • Mish

Sigmoid Function

The sigmoid squashes any real input into the interval (0, 1), which makes it a natural choice for binary classification outputs. Its gradient vanishes for large |x|, which can slow learning in deep networks.

Function:

σ(x) = 1 / (1 + e^(-x))

Derivative:

σ'(x) = σ(x) (1 - σ(x))
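A minimal NumPy sketch of the function and its derivative (the function names are my own, not from the post):

```python
import numpy as np

def sigmoid(x):
    # 1 / (1 + e^(-x)); maps any real input into (0, 1)
    return 1.0 / (1.0 + np.exp(-x))

def sigmoid_grad(x):
    # derivative reuses the forward value: s * (1 - s)
    s = sigmoid(x)
    return s * (1.0 - s)
```

Note the derivative peaks at 0.25 (at x = 0) and decays toward zero on both sides, which is the vanishing-gradient behavior mentioned above.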

Tanh Function

Tanh maps inputs into (-1, 1) and is zero-centered, which often makes optimization easier than with the sigmoid, though it still saturates for large |x|.

Function:

tanh(x) = (e^x - e^(-x)) / (e^x + e^(-x))

Derivative:

tanh'(x) = 1 - tanh²(x)
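In code, NumPy already provides the forward function, so only the derivative needs defining (a sketch; the name is illustrative):

```python
import numpy as np

def tanh_grad(x):
    # derivative of tanh: 1 - tanh(x)^2
    return 1.0 - np.tanh(x) ** 2
```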

Rectified Linear Unit (ReLU)

ReLU passes positive inputs through unchanged and zeroes out negative ones. It is cheap to compute and does not saturate for positive inputs, but units can "die" if they get stuck in the negative region, where the gradient is zero.

Function:

f(x) = max(0, x)

Derivative:

f'(x) = 1 if x > 0, else 0
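A quick sketch in NumPy (names are my own):

```python
import numpy as np

def relu(x):
    # max(0, x) elementwise
    return np.maximum(0.0, x)

def relu_grad(x):
    # 1 for positive inputs, 0 otherwise (the kink at 0 is conventionally 0)
    return np.where(x > 0, 1.0, 0.0)
```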

Leaky ReLU

Leaky ReLU addresses the dying-ReLU problem by allowing a small, fixed slope (typically 0.01) for negative inputs, so the gradient never becomes exactly zero.

Function:

f(x) = x if x > 0, else 0.01x

Derivative:

f'(x) = 1 if x > 0, else 0.01
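Sketched in NumPy, with the slope exposed as a keyword argument (an illustrative choice, not from the post):

```python
import numpy as np

def leaky_relu(x, alpha=0.01):
    # identity for positive inputs, small fixed slope for negative ones
    return np.where(x > 0, x, alpha * x)

def leaky_relu_grad(x, alpha=0.01):
    return np.where(x > 0, 1.0, alpha)
```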

Parametric ReLU

Parametric ReLU (PReLU) generalizes Leaky ReLU by making the negative slope α a learnable parameter, trained jointly with the network weights.

Function:

f(x) = x if x > 0, else αx

Derivative:

f'(x) = 1 if x > 0, else α
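Because α is learned, training also needs the gradient with respect to α, which is x on the negative side and 0 elsewhere. A sketch (function names are my own):

```python
import numpy as np

def prelu(x, alpha):
    # alpha is a learned parameter rather than a fixed constant
    return np.where(x > 0, x, alpha * x)

def prelu_grad_x(x, alpha):
    # gradient with respect to the input
    return np.where(x > 0, 1.0, alpha)

def prelu_grad_alpha(x):
    # gradient with respect to the learnable slope alpha
    return np.where(x > 0, 0.0, x)
```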

Exponential Linear Unit (ELU)

ELU behaves like the identity for positive inputs and smoothly saturates to -α for negative inputs, which pushes mean activations closer to zero while keeping nonzero gradients on the negative side.

Function:

f(x) = x if x > 0, else α(e^x - 1)

Derivative:

f'(x) = 1 if x > 0, else α e^x
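A NumPy sketch with the common default α = 1 (names are illustrative):

```python
import numpy as np

def elu(x, alpha=1.0):
    # identity for x > 0, alpha * (e^x - 1) otherwise
    return np.where(x > 0, x, alpha * (np.exp(x) - 1.0))

def elu_grad(x, alpha=1.0):
    # 1 for x > 0; alpha * e^x on the negative side
    return np.where(x > 0, 1.0, alpha * np.exp(x))
```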

Scaled Exponential Linear Unit (SELU)

SELU scales the ELU by fixed constants λ ≈ 1.0507 and α ≈ 1.6733, chosen so that activations self-normalize toward zero mean and unit variance in deep fully connected networks.

Function:

f(x) = λx if x > 0, else λα(e^x - 1)

Derivative:

f'(x) = λ if x > 0, else λα e^x
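Sketched in NumPy with the standard self-normalizing constants:

```python
import numpy as np

# fixed constants from the SELU paper (Klambauer et al., 2017)
ALPHA = 1.6732632423543772
LAMBDA = 1.0507009873554805

def selu(x):
    # scaled ELU: lambda * x for x > 0, lambda * alpha * (e^x - 1) otherwise
    return LAMBDA * np.where(x > 0, x, ALPHA * (np.exp(x) - 1.0))

def selu_grad(x):
    return LAMBDA * np.where(x > 0, 1.0, ALPHA * np.exp(x))
```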

Gaussian Error Linear Unit (GELU)

GELU weights the input by the probability that a standard normal variable falls below it. It is the default activation in Transformer models such as BERT and GPT.

Function:

f(x) = x Φ(x), where Φ is the standard normal CDF

Derivative:

f'(x) = Φ(x) + x φ(x), where φ is the standard normal PDF
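The exact form can be written with the error function, since Φ(x) = 0.5 (1 + erf(x / √2)). A scalar sketch (names are my own):

```python
import math

def gelu(x):
    # exact GELU: x * Phi(x), with Phi the standard normal CDF via erf
    return x * 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def gelu_grad(x):
    # Phi(x) + x * phi(x), with phi the standard normal PDF
    phi = math.exp(-x * x / 2.0) / math.sqrt(2.0 * math.pi)
    Phi = 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))
    return Phi + x * phi
```

In practice many frameworks use the tanh approximation 0.5 x (1 + tanh(√(2/π) (x + 0.044715 x³))), which is close to the exact form but cheaper on some hardware.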

Sigmoid-Weighted Linear Unit (SiLU) / Swish

SiLU (also called Swish) multiplies the input by its own sigmoid. It is smooth and non-monotonic, with a small dip for moderately negative inputs.

Function:

f(x) = x σ(x)

Derivative:

f'(x) = σ(x) + x σ(x) (1 - σ(x))
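A NumPy sketch (function names are illustrative):

```python
import numpy as np

def silu(x):
    # x * sigmoid(x)
    s = 1.0 / (1.0 + np.exp(-x))
    return x * s

def silu_grad(x):
    # product rule: sigma(x) + x * sigma(x) * (1 - sigma(x))
    s = 1.0 / (1.0 + np.exp(-x))
    return s + x * s * (1.0 - s)
```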

Softmax Function

Softmax turns a vector of real-valued scores into a probability distribution: each output is positive and the outputs sum to 1. It is typically used in the final layer of a multi-class classifier.

Function:

softmax(xᵢ) = e^(xᵢ) / Σⱼ e^(xⱼ)
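A sketch in NumPy; subtracting the maximum before exponentiating is the standard trick to avoid overflow and does not change the result:

```python
import numpy as np

def softmax(x):
    # shift by the max for numerical stability, then normalize
    z = np.exp(x - np.max(x))
    return z / np.sum(z)
```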

Softplus

Softplus is a smooth approximation of ReLU; its derivative is exactly the sigmoid.

Function:

f(x) = ln(1 + e^x)

Derivative:

f'(x) = 1 / (1 + e^(-x)) = σ(x)
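In NumPy, `np.logaddexp(0, x)` computes ln(e⁰ + e^x) = ln(1 + e^x) stably for large |x|. A sketch (names are my own):

```python
import numpy as np

def softplus(x):
    # ln(1 + e^x), computed stably via logaddexp
    return np.logaddexp(0.0, x)

def softplus_grad(x):
    # the derivative of softplus is the sigmoid
    return 1.0 / (1.0 + np.exp(-x))
```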

Mish

Mish is a smooth, non-monotonic activation built from tanh and softplus.

Function:

f(x) = x tanh(softplus(x)) = x tanh(ln(1 + e^x))

Derivative:

f'(x) = tanh(softplus(x)) + x sech²(softplus(x)) σ(x)
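A NumPy sketch, applying the chain rule with sech²(z) = 1 - tanh²(z) and softplus'(x) = σ(x) (function names are illustrative):

```python
import numpy as np

def mish(x):
    # x * tanh(softplus(x)), with softplus via stable logaddexp
    return x * np.tanh(np.logaddexp(0.0, x))

def mish_grad(x):
    sp = np.logaddexp(0.0, x)          # softplus(x)
    t = np.tanh(sp)
    sig = 1.0 / (1.0 + np.exp(-x))     # softplus'(x) = sigmoid(x)
    # product + chain rule: tanh(sp) + x * sech^2(sp) * sigmoid(x)
    return t + x * (1.0 - t ** 2) * sig
```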
