Overfitting
Overfitting is the phenomenon of a model not performing well, i.e., not making good predictions, because it has captured the noise in the training set as well as the signal. In other words, the model generalizes too little: instead of just characterizing and encoding the signal, it also encodes much of the noise found in the training set. (Another way to think about this is that the model is fitting ‘too much’ to the training data.)
This means that the model performs well when it’s shown a training example (resulting in a low training loss), but badly when it’s shown an example it hasn’t seen before (resulting in a high test loss).
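As a minimal sketch of this gap, the snippet below (not from the original text; it assumes NumPy and scikit-learn are available) fits a low-degree and a high-degree polynomial to a handful of noisy samples of a simple signal, then compares training and test error. The high-capacity model can bend itself around the noise in the training points, which is exactly the behavior described above.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures

rng = np.random.default_rng(0)

# Underlying signal plus noise: only the cosine is the "signal",
# the Gaussian term is noise we do NOT want the model to encode.
def sample(n):
    x = rng.uniform(0.0, 1.0, size=n)
    y = np.cos(1.5 * np.pi * x) + rng.normal(scale=0.1, size=n)
    return x.reshape(-1, 1), y

x_train, y_train = sample(20)    # small training set
x_test, y_test = sample(200)     # held-out examples the model never sees

for degree in (4, 15):
    # A degree-15 polynomial has enough capacity to chase the noise
    # in 20 training points; degree 4 mostly captures the signal.
    model = make_pipeline(PolynomialFeatures(degree), LinearRegression())
    model.fit(x_train, y_train)

    train_mse = mean_squared_error(y_train, model.predict(x_train))
    test_mse = mean_squared_error(y_test, model.predict(x_test))
    print(f"degree={degree:2d}  train MSE={train_mse:.4f}  test MSE={test_mse:.4f}")
```

With the high-degree fit, the training error should come out close to zero while the test error is noticeably larger; the lower-degree fit typically shows a much smaller gap between the two.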