overfitting
When a model learns training data too well, including noise, and fails to generalize to unseen examples. Detected when train accuracy >> test accuracy.
Syntax
ai-fundamentals
if train_acc >> val_acc: model is overfitting
Example
ai-fundamentals
# Preventing overfitting (PyTorch fragments, assumed inside a training loop):
import torch
import torch.nn as nn

# 1. Dropout regularization (randomly zeroes activations during training):
dropout = nn.Dropout(p=0.5)

# 2. L2 regularization (weight decay penalizes large weights):
optimizer = torch.optim.Adam(model.parameters(), weight_decay=1e-4)

# 3. Early stopping (halt when validation loss stops improving):
if val_loss > best_val_loss:
    patience_counter += 1
    if patience_counter >= patience:
        break  # stop training
else:
    best_val_loss = val_loss
    patience_counter = 0  # reset patience on improvement

# 4. Data augmentation to increase training diversity
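The early-stopping logic above can be isolated into a self-contained sketch. This is plain Python with a made-up loss sequence (the function name `early_stop_epoch` and the example losses are illustrative, not from any library):

```python
def early_stop_epoch(val_losses, patience=3):
    """Return the epoch at which training would stop, or None if it never stops."""
    best = float("inf")
    counter = 0
    for epoch, loss in enumerate(val_losses):
        if loss < best:  # improvement: record it and reset patience
            best = loss
            counter = 0
        else:            # no improvement: spend one unit of patience
            counter += 1
            if counter >= patience:
                return epoch  # stop training here
    return None

# Loss improves for three epochs, then plateaus for three -> stop at epoch 5.
print(early_stop_epoch([1.0, 0.8, 0.7, 0.72, 0.71, 0.73], patience=3))  # → 5
```

Resetting the counter on every improvement is the key detail: without it, a long run with occasional bad epochs would stop prematurely.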