A helper interface for native dropout implementations.
IDropout instances operate on an activations array, modifying or dropping values at training time only.
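The contract described above might be sketched roughly as follows. This is an illustrative assumption, not the library's actual API: the method name `applyDropout`, the `training` flag, and the `NoOpDropout` class are all hypothetical.

```java
// Hypothetical sketch of a dropout interface; names are illustrative
// assumptions, not the library's actual API.
public interface IDropout {
    /**
     * Applies dropout to the activations in place at training time;
     * at test time the activations pass through unchanged.
     * Returns the (possibly modified) activations for convenience.
     */
    double[] applyDropout(double[] activations, boolean training);
}

/** Trivial implementation that never drops anything (hypothetical). */
class NoOpDropout implements IDropout {
    @Override
    public double[] applyDropout(double[] activations, boolean training) {
        return activations; // no modification in either mode
    }
}
```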
AlphaDropout is a dropout technique proposed by Klambauer et al. 2017 - Self-Normalizing Neural Networks (https://arxiv.org/abs/1706.02515)
This dropout technique was designed specifically for self-normalizing neural networks - i.e., networks using the SELU activation function and the associated weight initialization.
In conjunction with that activation function and weight initialization, AlphaDropout attempts to keep both the mean and variance of the activations the same (in expectation) after dropout as they were before dropout was applied.
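The mechanism can be sketched as follows (a minimal illustration of the technique from the paper, not the library's implementation). Dropped units are set to the SELU saturation value α' = -λα, and an affine correction a·x + b is then applied so that standard-normal inputs keep mean 0 and variance 1 in expectation:

```java
import java.util.Random;

public class AlphaDropoutSketch {
    // SELU fixed-point constants from Klambauer et al. 2017
    static final double LAMBDA = 1.0507009873554805;
    static final double ALPHA  = 1.6732632423543772;
    static final double ALPHA_PRIME = -LAMBDA * ALPHA; // value assigned to dropped units

    /** Applies alpha dropout in place; p is the drop probability. */
    public static double[] apply(double[] x, double p, Random rng) {
        double q = 1.0 - p; // keep probability
        // Affine correction: for mean-0, variance-1 inputs, the masked signal has
        // mean (1-q)*alpha' and variance q + alpha'^2 * q * (1-q); a and b undo both.
        double a = 1.0 / Math.sqrt(q + ALPHA_PRIME * ALPHA_PRIME * q * (1.0 - q));
        double b = -a * (1.0 - q) * ALPHA_PRIME;
        for (int i = 0; i < x.length; i++) {
            double v = (rng.nextDouble() < q) ? x[i] : ALPHA_PRIME;
            x[i] = a * v + b;
        }
        return x;
    }
}
```

For standard-normal inputs, the post-dropout sample mean stays near 0 and the sample variance near 1, which is the self-normalizing property the technique is designed to preserve.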
Implements standard (inverted) dropout.
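A minimal, self-contained sketch of inverted dropout (illustrative only, not the library's code): each value is zeroed with drop probability p and the survivors are scaled by 1/(1-p), so the expected activation is unchanged and no rescaling is needed at inference time.

```java
import java.util.Random;

public class InvertedDropoutSketch {
    /** Applies inverted dropout in place; p is the drop probability (0 <= p < 1). */
    public static double[] apply(double[] x, double p, Random rng) {
        double keep = 1.0 - p;
        for (int i = 0; i < x.length; i++) {
            // Zero with probability p, otherwise scale by 1/keep so E[output] = input
            x[i] = (rng.nextDouble() < keep) ? x[i] / keep : 0.0;
        }
        return x;
    }
}
```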
Applies additive, mean-zero Gaussian noise to the input - i.e., x = x + N(0, stddev²), where stddev is the specified standard deviation.
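As a sketch of the idea (illustrative, not the library's code), each activation simply has an independent draw from N(0, stddev²) added to it at training time:

```java
import java.util.Random;

public class GaussianNoiseSketch {
    /** Adds i.i.d. mean-zero Gaussian noise with the given standard deviation, in place. */
    public static double[] apply(double[] x, double stddev, Random rng) {
        for (int i = 0; i < x.length; i++) {
            x[i] += stddev * rng.nextGaussian(); // nextGaussian() ~ N(0, 1)
        }
        return x;
    }
}
```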
Spatial dropout: drops entire feature maps (channels) rather than individual activations. Can only be applied to 3D (time series), 4D (convolutional 2D) or 5D (convolutional 3D) activations.
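The motivation is that neighbouring values within a feature map are strongly correlated, so dropping individual activations removes little information; dropping whole maps is a stronger regularizer. A minimal sketch for the 4D case, assuming a [batch, channels, height, width] layout (illustrative, not the library's code):

```java
import java.util.Random;

public class SpatialDropoutSketch {
    /**
     * Drops whole channels with probability p and scales the survivors by
     * 1/(1-p), in place. Layout assumed: [batch][channel][height][width].
     */
    public static double[][][][] apply(double[][][][] x, double p, Random rng) {
        double keep = 1.0 - p;
        for (int b = 0; b < x.length; b++) {
            for (int c = 0; c < x[b].length; c++) {
                // One Bernoulli draw per (example, channel): the whole map shares its fate
                double scale = (rng.nextDouble() < keep) ? 1.0 / keep : 0.0;
                for (double[] row : x[b][c]) {
                    for (int w = 0; w < row.length; w++) row[w] *= scale;
                }
            }
        }
        return x;
    }
}
```

After applying it, every channel is either entirely zero or entirely rescaled, never a mix, which is what distinguishes spatial dropout from standard element-wise dropout.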
Copyright © 2020. All rights reserved.