Overfitting occurs when a model is optimized to predict the data in the training set at the expense of its performance on previously unseen data.
This happens because:
> models tend to gravitate toward the simplest way to do good predictions (memorization) - Deep Learning for Coders, Chapter 1
In other words, an overfitted model is essentially “memorizing” individual items in the training set rather than extracting the general lessons from the data.
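To make this concrete, here is a minimal sketch (not from the book) that shows overfitting as a gap between training and validation error: a high-degree polynomial fitted to a handful of noisy points nearly memorizes the training set, while a simpler model generalizes better. The data-generating function and the polynomial degrees are arbitrary choices for illustration.

```python
# Sketch: overfitting shows up as low training error but high validation error.
import numpy as np
from numpy.polynomial import polynomial as P

rng = np.random.default_rng(0)

def make_data(n):
    # Noisy samples from an assumed underlying signal (sin curve), for illustration.
    x = rng.uniform(-1, 1, n)
    y = np.sin(3 * x) + rng.normal(0, 0.1, n)
    return x, y

x_train, y_train = make_data(15)    # small training set, easy to memorize
x_valid, y_valid = make_data(100)   # held-out data the model never sees during fitting

for degree in (2, 14):
    coeffs = P.polyfit(x_train, y_train, degree)  # least-squares polynomial fit
    train_mse = np.mean((P.polyval(x_train, coeffs) - y_train) ** 2)
    valid_mse = np.mean((P.polyval(x_valid, coeffs) - y_valid) ** 2)
    print(f"degree {degree:2d}: train MSE {train_mse:.4f}, valid MSE {valid_mse:.4f}")
```

The degree-14 model drives its training error toward zero by bending through every training point, but its validation error is worse than the degree-2 model’s: it has memorized the individual points instead of learning the underlying pattern.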