Epoch

The simple definition of an epoch is:
An epoch is one forward pass and one backward pass over all of the training examples.
Note: an epoch and an iteration are two different things.
For example, with a set of 1000 images and a batch size of 10, each iteration processes 10 images, so it takes 100 iterations to go over the entire set. This is called one epoch. Training can go on for hundreds of epochs. As explained in https://stackoverflow.com/questions/31155388/meaning-of-an-epoch-in-neural-networks-training:
One epoch consists of one full training cycle on the training set. Once every sample in the set is seen, you start again - marking the beginning of the 2nd epoch.
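The relationship above can be sketched in a few lines of Python. This is a minimal illustration, assuming the 1000-sample / batch-size-10 setup from the example; the inner loop is where a real training step (forward pass + backward pass on one batch) would go.

```python
num_samples = 1000   # total training examples
batch_size = 10      # examples processed per iteration
num_epochs = 3       # full passes over the training set

# Iterations needed to see every sample once (one epoch)
iterations_per_epoch = num_samples // batch_size

total_iterations = 0
for epoch in range(num_epochs):
    for start in range(0, num_samples, batch_size):
        # batch = data[start:start + batch_size]
        # forward pass + backward pass on this batch would happen here
        total_iterations += 1

print(iterations_per_epoch)  # 100 iterations make up one epoch
print(total_iterations)      # 300 iterations across 3 epochs
```

So with these numbers, one epoch is 100 iterations, and 3 epochs means 300 iterations in total.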
