DropBlock: A Better Dropout for CNNs
Dropout is a regularization technique for neural network models proposed by Srivastava et al. in their 2014 paper Dropout: A Simple Way to Prevent Neural Networks from Overfitting.
Dropout is a technique where randomly selected neurons are ignored during training: they are "dropped out" at random. Their contribution to the activation of downstream neurons is temporarily removed on the forward pass, and no weight updates are applied to those neurons on the backward pass.
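As a minimal sketch of the idea (assuming the "inverted" formulation used by most modern frameworks, where the scaling happens at training time so inference is a no-op), dropout can be written in a few lines of NumPy:

```python
import numpy as np

def dropout(x, p=0.5, training=True):
    # Inverted dropout: each unit is kept with probability 1 - p,
    # and survivors are scaled by 1 / (1 - p) so the expected
    # activation is unchanged. At inference time, return x as-is.
    if not training or p == 0.0:
        return x
    keep_prob = 1.0 - p
    mask = (np.random.rand(*x.shape) < keep_prob).astype(x.dtype)
    return x * mask / keep_prob
```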
This is what standard Dropout looks like:
Random pixels were indeed dropped, and that’s exactly the issue.
By dropping independent pixels at random, we are not actually removing the semantic information contained in the image: nearby activations carry closely related information, so the network can still infer the dropped values from their surviving neighbors.
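To make this concrete, here is a toy demonstration (my own illustration, not code from the paper): on a spatially smooth feature map, values dropped at random can be estimated almost exactly by averaging their surviving neighbors, so very little information is actually destroyed:

```python
import numpy as np

rng = np.random.default_rng(0)

# A spatially smooth "feature map": nearby values are highly
# correlated, mimicking activations in a convolutional layer.
xs = np.linspace(0, np.pi, 32)
fmap = np.sin(xs)[:, None] * np.sin(xs)[None, :]

# Drop 30% of the pixels independently, as standard dropout would.
mask = rng.random(fmap.shape) >= 0.3
dropped = fmap * mask

# Estimate each dropped pixel from its surviving 4-neighbors.
est = dropped.copy()
for i, j in zip(*np.where(~mask)):
    neigh = [dropped[a, b]
             for a, b in ((i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1))
             if 0 <= a < 32 and 0 <= b < 32 and mask[a, b]]
    if neigh:
        est[i, j] = np.mean(neigh)

err = np.abs(est - fmap)[~mask].mean()
print(f"mean error on dropped pixels after neighbor averaging: {err:.4f}")
```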