The Implicit and Explicit Regularization Effects of Dropout

Citation:

C. Wei, S. Kakade, and T. Ma. The Implicit and Explicit Regularization Effects of Dropout. International Conference on Machine Learning (ICML), 2020.

Abstract:

Dropout is a widely-used regularization technique, often required to obtain state-of-the-art performance for a number of architectures. This work demonstrates that dropout introduces two distinct but entangled regularization effects: an explicit effect (also studied in prior work), which arises because dropout modifies the expected training objective, and, perhaps surprisingly, an additional implicit effect from the stochasticity in the dropout training update. This implicit regularization effect is analogous to the effect of stochasticity in small mini-batch stochastic gradient descent. We disentangle these two effects through controlled experiments. We then derive analytic simplifications that characterize each effect in terms of the derivatives of the model and the loss, for deep neural networks. We demonstrate that these simplified, analytic regularizers accurately capture the important aspects of dropout, showing that they can faithfully replace dropout in practice.
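To make the distinction between the two effects concrete, below is a minimal illustrative sketch (not code from the paper; the model, data, dropout rate, and sample count are assumptions). The usual stochastic dropout update carries both effects: the expected objective is modified (explicit) and each step uses a single random mask (implicit, noise-driven). Averaging the loss over many independently sampled masks per step approximates the expected dropout objective and damps the implicit effect, which is one way to probe the two effects separately.

```python
import torch
import torch.nn as nn

def dropout_loss(model, x, y, p=0.5):
    """One stochastic dropout forward pass (a single sampled mask)."""
    mask = (torch.rand_like(x) > p).float() / (1.0 - p)  # inverted-dropout scaling
    return nn.functional.mse_loss(model(x * mask), y)

def expected_dropout_loss(model, x, y, p=0.5, n_samples=64):
    """Monte Carlo estimate of the expected dropout objective (explicit effect only)."""
    return torch.stack([dropout_loss(model, x, y, p) for _ in range(n_samples)]).mean()

# Hypothetical usage: train two identical models, one on dropout_loss
# (explicit + implicit effects), one on expected_dropout_loss (mostly explicit).
model = nn.Linear(10, 1)
opt = torch.optim.SGD(model.parameters(), lr=1e-2)
x, y = torch.randn(32, 10), torch.randn(32, 1)

opt.zero_grad()
expected_dropout_loss(model, x, y).backward()
opt.step()
```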

Publisher's Version
