activation_relu.Rd

Activation functions can either be used through layer_activation(), or through the activation argument supported by all forward layers.
activation_relu(x, alpha = 0, max_value = NULL, threshold = 0)
activation_elu(x, alpha = 1)
activation_selu(x)
activation_hard_sigmoid(x)
activation_linear(x)
activation_sigmoid(x)
activation_softmax(x, axis = -1)
activation_softplus(x)
activation_softsign(x)
activation_tanh(x)
activation_exponential(x)
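A minimal sketch of both usage styles, assuming the keras R package is loaded; the layer sizes and input shape here are arbitrary:

```r
library(keras)

# An activation can be passed via the `activation` argument of a forward
# layer, or added as its own layer with layer_activation().
model <- keras_model_sequential() %>%
  layer_dense(units = 64, input_shape = 100, activation = "relu") %>%  # argument form
  layer_dense(units = 64) %>%
  layer_activation("relu") %>%                                         # layer form
  layer_dense(units = 10, activation = "softmax")
```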
| Argument | Description |
|---|---|
| x | Tensor |
| alpha | Alpha value (slope of the negative section for activation_relu(); scale for the negative factor for activation_elu()) |
| max_value | Maximum output value (saturation threshold) |
| threshold | Threshold value for thresholded activation |
| axis | Integer, axis along which the softmax normalization is applied |
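As an informal illustration of how alpha, max_value, and threshold shape the output of activation_relu(); the input values are arbitrary and the expected results are noted in the comments:

```r
library(keras)

x <- k_constant(c(-10, -1, 0, 1, 5, 10))

# Standard ReLU: negative values are clipped to 0.
activation_relu(x)                               # 0, 0, 0, 1, 5, 10

# Leaky slope (alpha) below the threshold, saturation at max_value.
activation_relu(x, alpha = 0.1, max_value = 6)   # -1, -0.1, 0, 1, 5, 6
```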
Tensor with the same shape and dtype as x.
activation_selu() is to be used together with the initialization "lecun_normal" and the dropout variant "AlphaDropout" (see the sketch after the reference below).
References: activation_selu(): Self-Normalizing Neural Networks (Klambauer et al., 2017).
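A minimal sketch of that combination, assuming the keras R package; the layer sizes, input shape, and dropout rate are illustrative only:

```r
library(keras)

# SELU is paired with the "lecun_normal" initializer and AlphaDropout,
# which preserve the self-normalizing property of the activations.
model <- keras_model_sequential() %>%
  layer_dense(units = 128, input_shape = 20,
              activation = "selu",
              kernel_initializer = "lecun_normal") %>%
  layer_alpha_dropout(rate = 0.1) %>%
  layer_dense(units = 1, activation = "sigmoid")
```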