Warning: DEPRECATED

class sklearn.cross_validation.LeavePOut(n, p)
[source]
Leave-P-Out cross validation iterator
Deprecated since version 0.18: This module will be removed in 0.20. Use sklearn.model_selection.LeavePOut instead.

Provides train/test indices to split data into train/test sets. This results in testing on all distinct samples of size p, while the remaining n - p samples form the training set in each iteration.
Note: LeavePOut(n, p) is NOT equivalent to KFold(n, n_folds=n // p), which creates non-overlapping test sets (see the short sketch below).

Due to the high number of iterations, which grows combinatorially with the number of samples, this cross-validation method can be very costly. For large datasets one should favor KFold, StratifiedKFold or ShuffleSplit.
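To make the difference concrete, here is a minimal doctest-style sketch (illustrative only, using the deprecated 0.18 API documented on this page) contrasting the number of splits the two iterators produce on 4 samples:

>>> from sklearn import cross_validation
>>> # LeavePOut enumerates every possible size-2 test set: C(4, 2) = 6
>>> # overlapping splits (sample 0 appears in three different test sets).
>>> lpo = cross_validation.LeavePOut(4, 2)
>>> len(lpo)
6
>>> # KFold with n // p = 2 folds yields only 2 disjoint test sets.
>>> kf = cross_validation.KFold(4, n_folds=2)
>>> len(kf)
2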
Read more in the User Guide.
Parameters:
n : int
    Total number of elements in dataset.
p : int
    Size of the test sets.
Examples
>>> import numpy as np
>>> from sklearn import cross_validation
>>> X = np.array([[1, 2], [3, 4], [5, 6], [7, 8]])
>>> y = np.array([1, 2, 3, 4])
>>> lpo = cross_validation.LeavePOut(4, 2)
>>> len(lpo)
6
>>> print(lpo)
sklearn.cross_validation.LeavePOut(n=4, p=2)
>>> for train_index, test_index in lpo:
...     print("TRAIN:", train_index, "TEST:", test_index)
...     X_train, X_test = X[train_index], X[test_index]
...     y_train, y_test = y[train_index], y[test_index]
TRAIN: [2 3] TEST: [0 1]
TRAIN: [1 3] TEST: [0 2]
TRAIN: [1 2] TEST: [0 3]
TRAIN: [0 3] TEST: [1 2]
TRAIN: [0 2] TEST: [1 3]
TRAIN: [0 1] TEST: [2 3]
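As a migration sketch (not part of the original example above), the same splits can be produced with the replacement class sklearn.model_selection.LeavePOut recommended by the deprecation note; it takes only p and receives the data through its split method, while get_n_splits(X) reports the number of iterations:

>>> import numpy as np
>>> from sklearn.model_selection import LeavePOut
>>> X = np.array([[1, 2], [3, 4], [5, 6], [7, 8]])
>>> lpo = LeavePOut(p=2)
>>> lpo.get_n_splits(X)
6
>>> for train_index, test_index in lpo.split(X):
...     print("TRAIN:", train_index, "TEST:", test_index)
TRAIN: [2 3] TEST: [0 1]
TRAIN: [1 3] TEST: [0 2]
TRAIN: [1 2] TEST: [0 3]
TRAIN: [0 3] TEST: [1 2]
TRAIN: [0 2] TEST: [1 3]
TRAIN: [0 1] TEST: [2 3]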