Alpha Dropout is a dropout variant that keeps the mean and variance of its inputs at their original values, in order to preserve the self-normalizing property even after dropout is applied.

layer_alpha_dropout(object, rate, noise_shape = NULL, seed = NULL)
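
As a minimal sketch (layer sizes and rate here are illustrative only), the layer is typically placed after dense layers that use the selu activation and lecun_normal initialization, the setting in which the self-normalizing property holds:

library(keras)

model <- keras_model_sequential() %>%
  layer_dense(units = 64, activation = "selu",
              kernel_initializer = "lecun_normal",
              input_shape = c(100)) %>%
  layer_alpha_dropout(rate = 0.1) %>%
  layer_dense(units = 64, activation = "selu",
              kernel_initializer = "lecun_normal") %>%
  layer_alpha_dropout(rate = 0.1) %>%
  layer_dense(units = 10, activation = "softmax")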

Arguments

object

Model or layer object

rate

float, drop probability (as with layer_dropout()). The multiplicative noise will have standard deviation sqrt(rate / (1 - rate)). See the example following the argument list.

noise_shape

1D integer tensor representing the shape of the randomly generated keep/drop flags; by default the noise has the same shape as the input.

seed

An integer to use as random seed.
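
As a concrete illustration of rate and seed (values chosen arbitrarily, and assuming a standalone layer instance can be called directly on an R matrix), with rate = 0.2 the multiplicative noise has standard deviation sqrt(0.2 / 0.8) = 0.5, and in training mode the layer approximately preserves the input's mean and variance:

library(keras)

# standalone layer instance (no model object supplied)
drop <- layer_alpha_dropout(rate = 0.2, seed = 42)

# standard-normal inputs: mean ~ 0, variance ~ 1
x <- matrix(rnorm(1000 * 64), nrow = 1000)

# training = TRUE enables the dropout noise
y <- as.array(drop(x, training = TRUE))

sqrt(0.2 / (1 - 0.2))  # noise standard deviation: 0.5
mean(y)                # stays close to 0
var(as.vector(y))      # stays close to 1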

Details

Alpha Dropout fits well with Scaled Exponential Linear Units (SELU) because it randomly sets activations to the negative saturation value of the activation rather than to zero, then applies an affine correction so that the output keeps the mean and variance of the input.
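
The negative saturation value is the limit of the SELU activation for large negative inputs, -scale * alpha; a quick sketch using the fixed-point constants from the SELU paper:

# SELU fixed-point constants (alpha and scale) from the SELU paper
alpha <- 1.6732632423543772
scale <- 1.0507009873554805

# selu(x) tends to -scale * alpha as x -> -Inf;
# alpha dropout sets dropped activations to this value
alpha_p <- -scale * alpha
alpha_p  # approximately -1.7581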

Input shape

Arbitrary. Use the keyword argument input_shape (list of integers, does not include the samples axis) when using this layer as the first layer in a model.

Output shape

Same shape as input.

References

Klambauer, G., Unterthiner, T., Mayr, A., and Hochreiter, S. (2017). Self-Normalizing Neural Networks. https://arxiv.org/abs/1706.02515

See also

Other noise layers: layer_gaussian_dropout(), layer_gaussian_noise()