Epoch: An epoch means feeding the whole training set through the network for training once.

Batch: The whole dataset might be too large to compute at once, so we divide it into several batches and feed them into the network one at a time. This also means the network may be updated multiple times within a single epoch.

Iteration: The iteration count indicates how many batches we need to feed into the network to go through the whole dataset once, i.e. per epoch: $\text{DatasetSize} = \text{Iterations} \times \text{BatchSize}$.
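To make the relationship concrete, here is a minimal sketch in plain Python of how epochs, batches, and iterations fit together. The dataset size, batch size, and epoch count are made up for illustration, and the actual training step is left as a placeholder comment since no specific framework is assumed here.

```python
# Hypothetical numbers for illustration only.
dataset_size = 1000
batch_size = 100
num_epochs = 3

data = list(range(dataset_size))  # stand-in for the real training set

# Iterations per epoch, assuming dataset_size divides evenly by batch_size.
iterations_per_epoch = dataset_size // batch_size

total_iterations = 0
for epoch in range(num_epochs):
    # One epoch: the whole training set passes through the network once.
    for start in range(0, dataset_size, batch_size):
        batch = data[start:start + batch_size]
        # model.train_step(batch)  # placeholder: one parameter update per batch
        total_iterations += 1

# Per epoch: DatasetSize = Iterations * BatchSize
assert dataset_size == iterations_per_epoch * batch_size
print(f"{iterations_per_epoch} iterations per epoch, "
      f"{total_iterations} updates over {num_epochs} epochs")
```

With these numbers, each epoch takes 10 iterations (1000 / 100), so 3 epochs perform 30 parameter updates in total.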

Quick Stat 1: Go here.