Nearest Neighbors regression

Demonstrates how to solve a regression problem with k-Nearest Neighbors, interpolating the target using both barycenter (distance-based) and constant (uniform) weights.
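The two weighting schemes differ only in how the neighbors' targets are averaged. Below is a minimal sketch of both, assuming a single query point with three neighbors at known distances (the numbers are illustrative and not taken from this example):

import numpy as np

distances = np.array([0.5, 1.0, 2.0])  # distances to the 3 nearest neighbors
targets = np.array([1.0, 2.0, 4.0])    # their target values

# Constant ('uniform') weights: plain average of the neighbor targets
uniform_pred = targets.mean()

# Barycenter ('distance') weights: inverse-distance weighted average,
# so closer neighbors contribute more
w = 1.0 / distances
distance_pred = np.sum(w * targets) / np.sum(w)

print(uniform_pred, distance_pred)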

print(__doc__)
 
# Author: Alexandre Gramfort <alexandre.gramfort@inria.fr>
#         Fabian Pedregosa <fabian.pedregosa@inria.fr>
#
# License: BSD 3 clause (C) INRIA

Generate sample data

import numpy as np
import matplotlib.pyplot as plt
from sklearn import neighbors

np.random.seed(0)
X = np.sort(5 * np.random.rand(40, 1), axis=0)  # 40 training points in [0, 5]
T = np.linspace(0, 5, 500)[:, np.newaxis]       # dense grid of query points
y = np.sin(X).ravel()

# Add noise to every 5th target
y[::5] += 1 * (0.5 - np.random.rand(8))
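As a quick sanity check (illustrative, not part of the original script), the shapes below show why X and T are kept as column vectors while y is flattened with ravel(): scikit-learn expects 2-D feature arrays and 1-D regression targets.

print(X.shape, T.shape, y.shape)  # expected: (40, 1) (500, 1) (40,)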

Fit regression model

n_neighbors = 5

for i, weights in enumerate(['uniform', 'distance']):
    # 'uniform': constant weights; 'distance': weight neighbors by inverse distance
    knn = neighbors.KNeighborsRegressor(n_neighbors, weights=weights)
    y_ = knn.fit(X, y).predict(T)

    plt.subplot(2, 1, i + 1)
    plt.scatter(X, y, c='k', label='data')
    plt.plot(T, y_, c='g', label='prediction')
    plt.axis('tight')
    plt.legend()
    plt.title("KNeighborsRegressor (k = %i, weights = '%s')" % (n_neighbors,
                                                                weights))

plt.show()

[Figure: KNeighborsRegressor predictions on the noisy sine data, one panel per weighting scheme (uniform and distance)]
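A possible extension (not part of the original example) is to choose the number of neighbors and the weighting scheme by cross-validation rather than fixing k = 5. A sketch using GridSearchCV on the same X and y:

from sklearn.model_selection import GridSearchCV

param_grid = {'n_neighbors': list(range(1, 11)),
              'weights': ['uniform', 'distance']}
search = GridSearchCV(neighbors.KNeighborsRegressor(), param_grid, cv=5)
search.fit(X, y)
print(search.best_params_)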

Total running time of the script: (0 minutes 0.119 seconds)

Download Python source code: plot_regression.py
Download IPython notebook: plot_regression.ipynb