layer_activation_relu: Rectified Linear Unit activation function
layer_activation_relu(
  object,
  max_value = NULL,
  negative_slope = 0,
  threshold = 0,
  input_shape = NULL,
  batch_input_shape = NULL,
  batch_size = NULL,
  dtype = NULL,
  name = NULL,
  trainable = NULL,
  weights = NULL
)
| Argument | Description |
|---|---|
| object | Model or layer object. |
| max_value | Float, the maximum output value. |
| negative_slope | Float >= 0. Negative slope coefficient. |
| threshold | Float. Threshold value for thresholded activation. |
| input_shape | Input shape (list of integers, does not include the samples axis) which is required when using this layer as the first layer in a model. |
| batch_input_shape | Shapes, including the batch size. For instance, `batch_input_shape = list(10, 32)` indicates that the expected input will be batches of 10 32-dimensional vectors; `batch_input_shape = list(NULL, 32)` indicates batches of an arbitrary number of 32-dimensional vectors. |
| batch_size | Fixed batch size for the layer. |
| dtype | The data type expected by the input, as a string (e.g. `"float32"`, `"float64"`, `"int32"`). |
| name | An optional name string for the layer. Should be unique in a model (do not reuse the same name twice). It will be autogenerated if it isn't provided. |
| trainable | Whether the layer weights will be updated during training. |
| weights | Initial weights for the layer. |
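
With default arguments the layer computes element-wise `max(x, 0)`. The three tuning arguments generalize this: inputs at or above `max_value` are clipped to `max_value`, inputs between `threshold` and `max_value` pass through unchanged, and inputs below `threshold` are mapped to `negative_slope * (x - threshold)`. A minimal sketch of the layer used standalone after a dense layer follows; the layer sizes, input shape, and argument values are illustrative assumptions, not taken from this page.

```r
library(keras)

# Minimal sketch: a sequential model with a standalone ReLU layer.
# Units, input shape, and the ReLU arguments below are illustrative only.
model <- keras_model_sequential() %>%
  layer_dense(units = 32, input_shape = 16) %>%
  # Clip outputs at 6 and leak with slope 0.1 below the threshold of 0
  layer_activation_relu(max_value = 6, negative_slope = 0.1, threshold = 0) %>%
  layer_dense(units = 1)

summary(model)
```

Using `layer_activation_relu()` as a separate layer, rather than passing `activation = "relu"` to `layer_dense()`, is what exposes `max_value`, `negative_slope`, and `threshold` as configurable options.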