Conformal prediction for classifier calibration
Conformal prediction is a powerful framework for probabilistic prediction that produces valid, well-calibrated prediction sets and prediction intervals. It offers a principled way to quantify and control the uncertainty associated with a model's predictions.
We have already seen how conformal prediction approaches, such as inductive conformal prediction (ICP) and transductive conformal prediction (TCP), generate prediction sets with guaranteed coverage probabilities. To recap, conformal prediction computes a p-value for each candidate label and constructs the prediction set by comparing each p-value against a chosen significance level.
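The recap above can be sketched in code. The following is a minimal inductive (split) conformal example, not a reference implementation: the toy two-class model, the calibration data, and the choice of nonconformity score (one minus the probability assigned to a label) are all illustrative assumptions. A candidate label is kept in the prediction set when its p-value, the fraction of calibration scores at least as large as its own score, exceeds the significance level.

```python
import numpy as np

rng = np.random.default_rng(0)

def predict_proba(X):
    # Hypothetical toy 2-class model: logistic in the first feature.
    p1 = 1.0 / (1.0 + np.exp(-X[:, 0]))
    return np.column_stack([1.0 - p1, p1])

# Calibration data, held out from training (labels loosely follow the toy model).
X_cal = rng.normal(size=(200, 2))
y_cal = (X_cal[:, 0] + 0.3 * rng.normal(size=200) > 0).astype(int)

# Nonconformity score: 1 - probability assigned to the true label.
cal_probs = predict_proba(X_cal)
cal_scores = 1.0 - cal_probs[np.arange(len(y_cal)), y_cal]

def prediction_set(x, alpha=0.1):
    """Labels whose conformal p-value exceeds the significance level alpha."""
    probs = predict_proba(x[None, :])[0]
    n = len(cal_scores)
    kept = []
    for label, p in enumerate(probs):
        score = 1.0 - p  # nonconformity if this label were the true one
        # p-value: proportion of calibration scores >= candidate's score
        # (the +1 terms include the candidate itself, giving finite-sample validity)
        pval = (np.sum(cal_scores >= score) + 1) / (n + 1)
        if pval > alpha:
            kept.append(label)
    return kept

# A confidently classified point keeps label 1; an ambiguous one may keep both.
print(prediction_set(np.array([5.0, 0.0])))
print(prediction_set(np.array([0.1, 0.0])))
```

Note that the set size adapts to difficulty: near the decision boundary both labels typically survive the p-value test, while far from it the set shrinks to a single label.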
Unlike Platt scaling, histogram binning, and isotonic regression, which calibrate the predicted probabilities or scores themselves, conformal prediction takes a more comprehensive approach: it provides prediction sets that capture the uncertainty associated with the predictions, enhancing the reliability...