AUC-ROC
Definition
Area Under the Receiver Operating Characteristic Curve — a metric that evaluates a binary classifier's ability to distinguish between classes across all possible classification thresholds.
AUC-ROC provides a threshold-independent measure of a model's discrimination ability. The ROC curve plots the true positive rate (recall) against the false positive rate at every possible classification threshold. AUC (Area Under the Curve) summarizes this curve into a single number between 0 and 1, where 0.5 represents random guessing and 1.0 represents perfect classification.

An AUC of 0.85 means there is an 85% chance that the model will rank a randomly chosen positive example higher than a randomly chosen negative example.

AUC-ROC is widely used in medical diagnostics, credit scoring, fraud detection, and any application with binary outcomes. Its threshold-independence makes it useful for comparing models without committing to a specific operating point, though it can be overly optimistic on highly imbalanced datasets, where AUC-PR (Precision-Recall) may be more informative.
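The ranking interpretation above suggests a direct way to compute AUC without tracing the curve at all: count, over every positive/negative pair, how often the positive example receives the higher score (ties count half). This is the Mann-Whitney U formulation of AUC. A minimal sketch in plain Python, with made-up labels and scores for illustration:

```python
def auc_roc(y_true, scores):
    """AUC as the probability that a random positive outscores a random negative.

    y_true: iterable of 0/1 labels; scores: matching classifier scores.
    O(P*N) pairwise version, fine for small examples; production code
    (e.g. sklearn.metrics.roc_auc_score) uses a sort-based O(n log n) method.
    """
    pos = [s for y, s in zip(y_true, scores) if y == 1]
    neg = [s for y, s in zip(y_true, scores) if y == 0]
    # Each positive/negative pair scores 1 for a win, 0.5 for a tie.
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Toy example: two negatives scored 0.1 and 0.4, two positives 0.35 and 0.8.
print(auc_roc([0, 0, 1, 1], [0.1, 0.4, 0.35, 0.8]))  # 0.75
```

Of the four positive/negative pairs here, three are ranked correctly (only 0.35 vs. 0.4 is not), giving AUC = 3/4 — the same number the trapezoidal area under the ROC curve would produce.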
Related Terms
Accuracy
The proportion of correct predictions out of total predictions made by a model.
Precision
The proportion of positive predictions that are actually correct, measuring how reliable a model's positive predictions are.
Benchmark
A standardized test or dataset used to evaluate and compare the performance of AI models on specific tasks.
Recall
The proportion of actual positive cases that the model correctly identifies, measuring how completely it captures the positive class.