

DropBlock: A Better Dropout for CNNs

3 min read · Aug 8, 2022


Dropout is a regularization technique for neural network models proposed by Srivastava et al. in their 2014 paper Dropout: A Simple Way to Prevent Neural Networks from Overfitting.

Dropout is a technique where randomly selected neurons are ignored during training: they are “dropped out” at random. This means their contribution to the activation of downstream neurons is temporarily removed on the forward pass, and no weight updates are applied to those neurons on the backward pass.
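For concreteness, here is a minimal sketch of standard (inverted) dropout in NumPy; the `dropout` helper and its arguments are illustrative, not code from the original post or any particular library:

```python
import numpy as np

def dropout(x, p=0.5, training=True):
    """Inverted dropout: zero each unit independently with probability p,
    then scale the survivors by 1/(1-p) so the expected activation
    matches what the next layer sees at test time."""
    if not training or p == 0.0:
        return x
    keep = np.random.rand(*x.shape) >= p  # True = keep this unit
    return x * keep / (1.0 - p)

# Drop roughly half the activations of a 4x4 feature map
feature_map = np.ones((4, 4))
print(dropout(feature_map, p=0.5))
```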

This is what standard Dropout looks like:

[Figure: standard Dropout applied to an image, with random individual pixels dropped. Image by author]

Random pixels were indeed dropped, and that’s exactly the issue.

Nearby activations in a convolutional feature map contain closely related information, so randomly dropping independent pixels removes very little semantic information from the image: the network can reconstruct whatever was dropped from the surrounding activations, and the regularization effect is weak.
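A quick way to see this: build a smooth, spatially correlated feature map, apply pixel-wise dropout, and check how well a simple 3×3 average of the surviving neighbors recovers the original. This is an illustrative sketch, not from the post:

```python
import numpy as np

rng = np.random.default_rng(0)

# A spatially smooth "feature map": neighboring activations are highly correlated
xs = np.linspace(0, np.pi, 16)
feature_map = np.outer(np.sin(xs), np.sin(xs))

# Standard dropout: zero out ~50% of the units independently
keep = rng.random(feature_map.shape) >= 0.5
dropped = feature_map * keep

# Averaging the surviving units in each 3x3 window nearly recovers the map
recovered = feature_map.copy()
for i in range(1, 15):
    for j in range(1, 15):
        window = dropped[i - 1:i + 2, j - 1:j + 2]
        kept = keep[i - 1:i + 2, j - 1:j + 2]
        recovered[i, j] = window.sum() / max(kept.sum(), 1)

# Mean reconstruction error is small compared with the ~0.4 mean activation
print(np.abs(recovered - feature_map)[1:-1, 1:-1].mean())
```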

Introducing DropBlock
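DropBlock (Ghiasi et al., 2018, DropBlock: A Regularization Method for Convolutional Networks) addresses this by dropping contiguous block_size × block_size regions of the feature map instead of independent units, so the network cannot recover the missing information from immediate neighbors. Below is a minimal PyTorch sketch of the idea, not the paper's reference implementation: it assumes an odd block_size and, for simplicity, samples block seeds over the whole map rather than only over the valid interior region as the paper does.

```python
import torch
import torch.nn.functional as F

def drop_block(x, drop_prob=0.1, block_size=7):
    """A minimal DropBlock sketch for an (N, C, H, W) feature map,
    training mode only. Assumes odd block_size <= H, W."""
    n, c, h, w = x.shape
    # Seed probability gamma, chosen so the expected fraction of
    # dropped activations is roughly drop_prob (Eq. 1 of the paper)
    gamma = (drop_prob / block_size ** 2) * (h * w) / (
        (h - block_size + 1) * (w - block_size + 1))
    # Bernoulli seeds mark the centers of the blocks to drop
    seeds = (torch.rand(n, c, h, w, device=x.device) < gamma).float()
    # Max pooling expands each seed into a block_size x block_size square
    block_mask = 1.0 - F.max_pool2d(
        seeds, kernel_size=block_size, stride=1, padding=block_size // 2)
    # Rescale survivors so the expected total activation is unchanged
    return x * block_mask * (block_mask.numel() / block_mask.sum().clamp(min=1.0))

# Example: regularizing a batch of 28x28 feature maps
x = torch.randn(8, 64, 28, 28)
y = drop_block(x, drop_prob=0.1, block_size=7)
```

With block_size = 1 this degenerates to something close to standard dropout; larger blocks remove whole semantic regions, which is the point.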

Written by Alessandro Lamberti

Machine Learning Engineer | Computer vision, distributed systems, systems thinking
