Epoch
An epoch is one complete pass of the entire training set through the network.
Batch
A whole dataset is often too large to process at once, so we divide it into several batches and feed them into the network one at a time. This also means the network may be updated multiple times within a single epoch.
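The batching idea above can be sketched in plain Python (the data and batch size here are hypothetical toy values, not from the text):

```python
# A minimal sketch: split a toy dataset into batches and count
# how many weight updates happen per epoch.
dataset = list(range(10))   # toy "training set" of 10 samples
batch_size = 3

updates = 0
for epoch in range(2):      # two epochs: two full passes over the data
    for start in range(0, len(dataset), batch_size):
        batch = dataset[start:start + batch_size]
        # a real training step would compute the loss on `batch`
        # and update the network's weights here
        updates += 1

print(updates)  # 4 batches per epoch (3+3+3+1 samples) x 2 epochs = 8
```

Note that the last batch may be smaller than `batch_size` when the dataset size is not an exact multiple of it.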
Iteration
The iteration count indicates how many batches must be fed into the network to complete one epoch: $\text{DatasetSize} = \text{Iterations} \times \text{BatchSize}$.