Glossary
Dropout
AI DEFINITION

Dropout is a regularization technique used in training neural networks. It works by randomly deactivating a percentage of neurons or connections during each iteration. This prevents the model from overfitting by ensuring it does not rely too heavily on specific neurons, encouraging the learning of more generalized and robust features.

How it works:
During training, a binary mask randomly "drops" a fraction of neurons, removing their contribution to both forward propagation and the gradient update. At inference time, all neurons are active; to keep expected activations consistent, outputs are scaled by the keep probability (1 − p). In practice, most frameworks use the equivalent "inverted dropout" variant, which instead scales the surviving activations by 1/(1 − p) during training, so no rescaling is needed at inference.
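The mechanism above can be sketched in a few lines of NumPy. This is a minimal illustration of the inverted-dropout variant (the function name and signature are chosen for this example, not taken from any particular library): a random binary mask zeroes a fraction p of activations during training and pre-scales the survivors by 1/(1 − p), while inference leaves the input untouched.

```python
import numpy as np

def dropout_forward(x, p=0.5, training=True, rng=None):
    """Inverted dropout on an activation array x.

    During training, each element is zeroed with probability p, and
    surviving elements are scaled by 1/(1 - p) so that the expected
    activation matches inference. At inference time, the input passes
    through unchanged. (Illustrative sketch, not a library API.)
    """
    if not training or p == 0.0:
        return x, None
    rng = rng or np.random.default_rng()
    # Boolean keep-mask, pre-scaled: entries are 0 or 1/(1 - p).
    mask = (rng.random(x.shape) >= p) / (1.0 - p)
    return x * mask, mask

# Training mode: roughly half the activations are zeroed, the rest doubled.
x = np.ones((4, 5))
y_train, mask = dropout_forward(x, p=0.5, training=True,
                                rng=np.random.default_rng(0))

# Inference mode: all neurons active, no scaling applied.
y_eval, _ = dropout_forward(x, p=0.5, training=False)
```

The same mask must also be applied to the gradient in the backward pass, which is why frameworks implement dropout as a layer rather than a one-off masking step.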

Benefits & use cases:
Dropout is widely used to improve generalization in convolutional neural networks for computer vision and to stabilize the training of natural language processing models. It also helps models, especially those trained on limited data, avoid memorizing the training set too rigidly.

Related internal links:
https://www.innovatiana.com/en/glossary/overfitting

https://www.innovatiana.com/en/glossary/regularization

https://www.innovatiana.com/en/blog