Epoch size is the total number of times the training data is passed through the optimizer [18]. With too few epochs, the model stops learning prematurely and never captures the full structure of the data; with too many, training takes longer and the model may keep iterating futilely without learning anything new.
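To make the idea concrete, the toy NumPy sketch below runs gradient descent on synthetic 1-D data, with `epochs` controlling how many full passes the optimizer makes over the data; the data, single-weight model, and hyperparameter values are invented for illustration and are not taken from the report.

```python
import numpy as np

# Synthetic 1-D regression data: y = 3x + noise (illustrative only).
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=200)
y = 3.0 * X + rng.normal(scale=0.1, size=200)

w = 0.0                  # single weight to be learned
learning_rate = 0.1
epochs = 50              # epoch size: full passes over the data

for epoch in range(epochs):
    y_pred = w * X
    loss = np.mean((y_pred - y) ** 2)        # mean squared error
    grad = np.mean(2 * (y_pred - y) * X)     # dLoss/dw
    w -= learning_rate * grad                # gradient-descent update
    print(f"epoch {epoch + 1}: loss = {loss:.4f}")
```

Experimenting with `learning_rate` and `epochs` in this loop exhibits the trade-offs described above: the loss falls quickly at first and then improves by less and less with each additional epoch.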

Figure 8 below shows that increasing the number of epochs raises model accuracy and reduces error (loss) only up to a point: increasing the epoch count indefinitely does not increase accuracy indefinitely. For example, on the green line, which represents a high learning rate, accuracy stays the same no matter how many epochs are run beyond the point marked with the black circle; every additional epoch past that point merely wastes computation and time.
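A common safeguard against that waste is early stopping: halt training once the loss has stopped improving for a few consecutive epochs. The sketch below adds such a check to the toy loop above; the `patience` and `min_delta` thresholds are illustrative assumptions, not values from the report.

```python
import numpy as np

# Same synthetic data and single-weight model as in the sketch above.
rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, size=200)
y = 3.0 * X + rng.normal(scale=0.1, size=200)

w, learning_rate = 0.0, 0.1
best_loss, stale = float("inf"), 0
patience, min_delta = 5, 1e-5       # illustrative plateau thresholds

for epoch in range(1000):           # generous ceiling on the epoch count
    y_pred = w * X
    loss = np.mean((y_pred - y) ** 2)
    w -= learning_rate * np.mean(2 * (y_pred - y) * X)
    if best_loss - loss > min_delta:
        best_loss, stale = loss, 0  # still improving: reset the counter
    else:
        stale += 1
        if stale >= patience:       # loss has plateaued, like the point
            break                   # marked by the black circle in Figure 8
print(f"stopped after {epoch + 1} epochs, loss = {loss:.5f}")
```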

Figure 8: Impact of the learning rate and epoch size on error [19]

Full Document Available at

https://github.com/beekal/MachieneLearningProjects/blob/master/Deep%20Learning%20Projects/Sales%20Demand%20Forecaster/Report-%20Sales%20Demand%20Forcaster.pdf
