Validation loss vs validation accuracy?
Clear, conceptual explanation:

- Validation loss measures how well the model's predicted probabilities (or values) match the targets on the validation set, as scored by the loss function (e.g., cross-entropy, MSE). Lower is better.
- Validation accuracy measures the percentage of correct class predictions on the validation set (classification tasks). Higher is better.

The two can move in different directions: validation loss can keep decreasing while validation accuracy stays flat or even dips slightly, because the loss is sensitive to the confidence and calibration of the predicted probabilities, whereas accuracy only checks whether the top-predicted class is correct. Monitor both, and watch for overfitting (training loss falling while validation loss rises). Learning curves and confusion matrices are useful diagnostic tools. A small sketch of how the two metrics differ is shown below.
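As a minimal sketch of that divergence, here is a NumPy example with made-up predicted probabilities for a tiny 4-sample, 2-class validation set (the function names, labels, and numbers are purely illustrative, not from any particular framework). Between "epoch A" and "epoch B" the model becomes more confident on the samples it already gets right, so cross-entropy loss drops, while accuracy stays at 75% because the same sample is still misclassified.

```python
import numpy as np

def cross_entropy(probs, labels):
    """Mean negative log-likelihood of the true class (cross-entropy loss)."""
    return -np.mean(np.log(probs[np.arange(len(labels)), labels]))

def accuracy(probs, labels):
    """Fraction of samples whose argmax class matches the true label."""
    return np.mean(np.argmax(probs, axis=1) == labels)

# Hypothetical validation set: 4 samples, 2 classes, true labels below.
labels = np.array([0, 1, 1, 0])

# Epoch A: correct on 3 of 4 samples, but only barely confident.
probs_a = np.array([
    [0.55, 0.45],   # correct, low confidence
    [0.40, 0.60],   # correct, low confidence
    [0.45, 0.55],   # correct, low confidence
    [0.45, 0.55],   # wrong
])

# Epoch B: same 3 of 4 correct, but the correct predictions are more confident.
probs_b = np.array([
    [0.90, 0.10],   # correct, confident
    [0.10, 0.90],   # correct, confident
    [0.15, 0.85],   # correct, confident
    [0.45, 0.55],   # still wrong
])

for name, probs in [("epoch A", probs_a), ("epoch B", probs_b)]:
    print(f"{name}: loss={cross_entropy(probs, labels):.3f}, "
          f"accuracy={accuracy(probs, labels):.2%}")
# Loss drops (~0.63 -> ~0.29) while accuracy stays at 75.00%:
# accuracy ignores confidence, cross-entropy does not.
```

The same effect shows up in real training runs: a validation loop that logs only accuracy can hide calibration improvements (or degradations) that the loss would reveal, which is why tracking both curves is recommended.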