layer_dropout.Rd
Dropout consists of randomly setting a fraction rate of the input units to 0 at each update during training time, which helps prevent overfitting.
layer_dropout(object, rate, noise_shape = NULL, seed = NULL,
  input_shape = NULL, batch_input_shape = NULL, batch_size = NULL,
  name = NULL, trainable = NULL, weights = NULL)
Argument | Description
---|---
object | Model or layer object.
rate | Float between 0 and 1. Fraction of the input units to drop.
noise_shape | 1D integer tensor representing the shape of the binary dropout mask that will be multiplied with the input. For instance, if your inputs have shape (batch_size, timesteps, features) and you want the dropout mask to be the same for all timesteps, you can use noise_shape = c(batch_size, 1, features).
seed | Integer to use as random seed.
input_shape | Dimensionality of the input (integer), not including the samples axis. This argument is required when using this layer as the first layer in a model.
batch_input_shape | Shape, including the batch size. For instance, batch_input_shape = c(10, 32) indicates that the expected input will be batches of 10 32-dimensional vectors.
batch_size | Fixed batch size for the layer.
name | An optional name string for the layer. Should be unique in a model (do not reuse the same name twice). It will be autogenerated if not provided.
trainable | Whether the layer weights will be updated during training.
weights | Initial weights for the layer.
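A minimal usage sketch: layer_dropout placed between two dense layers in a sequential keras model. The layer sizes, input shape, and dropout rate below are illustrative assumptions, not values taken from this page.

library(keras)

# Small MLP with dropout as a regularizer between dense layers.
# Units, input_shape, and rate are illustrative only.
model <- keras_model_sequential() %>%
  layer_dense(units = 128, activation = "relu", input_shape = c(784)) %>%
  layer_dropout(rate = 0.5) %>%
  layer_dense(units = 10, activation = "softmax")

summary(model)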
Other core layers: layer_activation, layer_activity_regularization, layer_dense, layer_flatten, layer_input, layer_lambda, layer_masking, layer_permute, layer_repeat_vector, layer_reshape
Other dropout layers: layer_spatial_dropout_1d, layer_spatial_dropout_2d, layer_spatial_dropout_3d