Epoch

In machine learning, an epoch is one complete pass through the entire training dataset during the learning process. After a single epoch, the model has seen every sample in the dataset exactly once.

Why it matters
Training usually requires multiple epochs to achieve good performance. Too few epochs may result in underfitting, where the model has not learned enough, while too many can lead to overfitting, where the model memorizes the training data instead of generalizing.

Example
With a dataset of 10,000 records and a batch size of 100, one epoch consists of 100 iterations (10,000 ÷ 100).
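
In code, this is just a ceiling division. The snippet below reproduces the arithmetic; using math.ceil also covers datasets whose size is not an exact multiple of the batch size:

```python
import math

num_samples = 10_000
batch_size = 100

# One epoch = enough mini-batch iterations to see every sample once
iterations_per_epoch = math.ceil(num_samples / batch_size)
print(iterations_per_epoch)  # 100
```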

Practical use cases

An epoch can be thought of as the heartbeat of training: each one represents a full cycle in which the model sees the entire dataset, adjusts its parameters, and prepares for the next round. Progress across epochs is often monitored with learning curves, which track metrics like training loss and validation accuracy.
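
A minimal sketch of this cycle, using synthetic linear-regression data and plain NumPy gradient descent (all names, sizes, and hyperparameters here are illustrative, not a prescribed setup):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic regression data: 10,000 samples, 5 features
X = rng.normal(size=(10_000, 5))
true_w = rng.normal(size=5)
y = X @ true_w + 0.1 * rng.normal(size=10_000)

w = np.zeros(5)               # model parameters, updated every mini-batch
batch_size, lr = 100, 0.01
epoch_losses = []             # one point per epoch -> a learning curve

for epoch in range(5):
    perm = rng.permutation(len(X))          # reshuffle the data each epoch
    for start in range(0, len(X), batch_size):
        idx = perm[start:start + batch_size]
        grad = 2 * X[idx].T @ (X[idx] @ w - y[idx]) / batch_size  # MSE gradient
        w -= lr * grad                      # one iteration (mini-batch update)
    loss = float(np.mean((X @ w - y) ** 2))  # full pass done: log epoch loss
    epoch_losses.append(loss)
    print(f"epoch {epoch + 1}: training loss {loss:.4f}")
```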

A common practice is to use early stopping, where training halts once validation performance stops improving, even if the planned number of epochs has not been reached. This prevents overfitting and saves computational resources.
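
Early stopping can be expressed with a simple patience counter. Here is a minimal self-contained sketch in the same NumPy style as above, with a held-out validation split; the patience of 3 epochs and the budget of 100 epochs are arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(10_000, 5))
y = X @ rng.normal(size=5) + 0.1 * rng.normal(size=10_000)

# Hold out the last 2,000 samples to measure generalization
X_tr, X_val, y_tr, y_val = X[:8_000], X[8_000:], y[:8_000], y[8_000:]

w = np.zeros(5)
batch_size, lr = 100, 0.01
best_val, best_w, patience, wait = float("inf"), w.copy(), 3, 0

for epoch in range(100):                    # generous planned budget
    perm = rng.permutation(len(X_tr))
    for start in range(0, len(X_tr), batch_size):
        idx = perm[start:start + batch_size]
        w -= lr * 2 * X_tr[idx].T @ (X_tr[idx] @ w - y_tr[idx]) / batch_size
    val_loss = np.mean((X_val @ w - y_val) ** 2)
    if val_loss < best_val:                 # still improving: reset patience
        best_val, best_w, wait = val_loss, w.copy(), 0
    else:                                   # no improvement this epoch
        wait += 1
        if wait >= patience:
            print(f"stopping early at epoch {epoch + 1}")
            break

w = best_w                                  # restore the best checkpoint
```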

Not all epochs contribute equally: early epochs usually bring the largest improvements, while later ones fine-tune the details. In modern workflows, tools like TensorBoard allow practitioners to visualize how models evolve epoch by epoch, making this concept not just theoretical but an essential part of hands-on deep learning practice.
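
As a concrete illustration, here is a minimal sketch assuming TensorFlow/Keras and synthetic data; the TensorBoard callback writes per-epoch metrics that can then be browsed as learning curves. The log directory, model architecture, and data are arbitrary choices for demonstration:

```python
import numpy as np
import tensorflow as tf

# Synthetic binary classification data (shapes are illustrative)
x = np.random.rand(10_000, 20).astype("float32")
y = (x.sum(axis=1) > 10).astype("int32")

model = tf.keras.Sequential([
    tf.keras.layers.Dense(32, activation="relu"),
    tf.keras.layers.Dense(2, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Writes per-epoch scalars that TensorBoard plots as learning curves
tb = tf.keras.callbacks.TensorBoard(log_dir="logs/epoch_demo")

model.fit(x, y, epochs=10, batch_size=100,
          validation_split=0.1, callbacks=[tb])
# Inspect the curves with: tensorboard --logdir logs/epoch_demo
```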
