Hold-One-Out Validation
In this technique, we take k-fold validation to its logical extreme. Instead of creating k partitions, where k is typically 5 or 10, we set the number of partitions equal to the number of available data points, so each partition contains exactly one sample. We train on all samples except one, test the model on the held-out sample, and repeat this n times, where n is the number of training samples. Finally, the average error is computed, just as in k-fold validation. The major drawback of this technique is that the model is trained n times, making it computationally expensive. If we are dealing with a fairly large dataset, this validation method is best avoided.
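As a minimal sketch of the procedure, the following code uses scikit-learn's LeaveOneOut splitter on a small synthetic regression dataset; the dataset, the LinearRegression estimator, and the mean squared error metric are illustrative assumptions rather than part of the original exercise.

```python
# Minimal sketch of hold-one-out (leave-one-out) cross-validation.
# Assumes scikit-learn; the dataset and estimator are illustrative choices.
import numpy as np
from sklearn.datasets import make_regression
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import LeaveOneOut
from sklearn.metrics import mean_squared_error

# Small synthetic dataset: with n samples, LOOCV trains n separate models.
X, y = make_regression(n_samples=50, n_features=3, noise=0.1, random_state=42)

loo = LeaveOneOut()
errors = []

for train_idx, test_idx in loo.split(X):
    # Train on all samples except the one held out...
    model = LinearRegression()
    model.fit(X[train_idx], y[train_idx])
    # ...and test on the single held-out sample.
    y_pred = model.predict(X[test_idx])
    errors.append(mean_squared_error(y[test_idx], y_pred))

# Average the error over all n held-out samples, as in k-fold validation.
print(f"LOOCV mean squared error: {np.mean(errors):.4f}")
```

Because a separate model is fitted for every sample, the loop above runs n times, which illustrates why this approach becomes expensive on large datasets.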
Hold-one-out validation is also called Leave-One-Out Cross-Validation (LOOCV). The following visual demonstrates hold-one-out validation for n samples:
The following exercise performs hold-one-out (leave-one-out) cross-validation...