Use the SelectFromModel meta-transformer along with Lasso to select the two most informative features from the Boston dataset.
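A minimal sketch of this exercise. Note that the Boston dataset has been removed from recent scikit-learn releases, so the diabetes dataset is used here as a stand-in; the `alpha` value and the `threshold=-np.inf` trick (which forces selection purely by `max_features`) are illustrative choices, not part of the original exercise.

```python
import numpy as np
from sklearn.datasets import load_diabetes
from sklearn.feature_selection import SelectFromModel
from sklearn.linear_model import Lasso

# Stand-in regression dataset (Boston is no longer shipped with scikit-learn).
X, y = load_diabetes(return_X_y=True)

# Keep the two features with the largest absolute Lasso coefficients.
# threshold=-np.inf disables the magnitude cutoff so max_features decides alone.
selector = SelectFromModel(Lasso(alpha=0.1), threshold=-np.inf, max_features=2)
X_selected = selector.fit_transform(X, y)
print(X_selected.shape)  # two columns remain
```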
class sklearn.linear_model.PassiveAggressiveRegressor(C=1.0, fit_intercept=True, n_iter=5, shuffle=True, verbose=0, ...)
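A brief usage sketch. The signature above is from an older release; recent scikit-learn versions replaced `n_iter` with `max_iter` and `tol`, which is what this example assumes.

```python
from sklearn.datasets import make_regression
from sklearn.linear_model import PassiveAggressiveRegressor

# Synthetic linear regression problem for illustration.
X, y = make_regression(n_samples=200, n_features=5, noise=0.1, random_state=0)

# Recent releases use max_iter/tol in place of the older n_iter parameter.
reg = PassiveAggressiveRegressor(C=1.0, max_iter=1000, tol=1e-3, random_state=0)
reg.fit(X, y)
print(reg.predict(X[:3]))
```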
Pipelining
We have seen that some estimators can transform data and that some can predict a target variable. We can also chain them into a single combined estimator:
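A sketch of such a combined estimator, assuming the common transformer-plus-predictor pattern: PCA reduces the digits data, then logistic regression classifies it; the component count and solver settings are illustrative.

```python
from sklearn.datasets import load_digits
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import Pipeline

X, y = load_digits(return_X_y=True)

# Chain a transformer (PCA) with a predictor (logistic regression).
pipe = Pipeline([
    ('pca', PCA(n_components=20)),
    ('logistic', LogisticRegression(max_iter=1000)),
])
pipe.fit(X, y)
print(pipe.score(X, y))
```

The pipeline behaves as one estimator: `fit` runs each transform step then fits the final predictor, and `predict` applies the same transforms before predicting.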
sklearn.preprocessing.robust_scale(X, axis=0, with_centering=True, with_scaling=True, quantile_range=(25.0, 75.0), copy=True)
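A small sketch of what this function does: it centers on the median and scales by the interquartile range, so a single outlier barely moves the result. The toy column below is an illustrative choice.

```python
import numpy as np
from sklearn.preprocessing import robust_scale

# One column with an outlier; median = 3, IQR = 4 - 2 = 2.
X = np.array([[1.0], [2.0], [3.0], [4.0], [100.0]])
X_scaled = robust_scale(X, quantile_range=(25.0, 75.0))
print(X_scaled.ravel())  # (x - median) / IQR
```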
class sklearn.cluster.AgglomerativeClustering(n_clusters=2, affinity='euclidean', memory=Memory(cachedir=None), connectivity=None, ...)
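A minimal usage sketch. The signature above is from an older release: in recent versions `affinity` has been renamed `metric` and `memory` defaults to `None`, so this example relies only on defaults plus `linkage`.

```python
from sklearn.cluster import AgglomerativeClustering
from sklearn.datasets import make_blobs

# Two well-separated blobs for illustration.
X, _ = make_blobs(n_samples=60, centers=2, random_state=0)

# Ward linkage merges clusters to minimize within-cluster variance.
clustering = AgglomerativeClustering(n_clusters=2, linkage='ward')
labels = clustering.fit_predict(X)
print(set(labels))  # {0, 1}
```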
sklearn.metrics.hinge_loss(y_true, pred_decision, labels=None, sample_weight=None)
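A short sketch of this metric in use. The key point is that `pred_decision` takes decision-function values (signed margins), not class predictions; the classifier and dataset below are illustrative.

```python
from sklearn.datasets import make_classification
from sklearn.metrics import hinge_loss
from sklearn.svm import LinearSVC

X, y = make_classification(n_samples=100, random_state=0)
clf = LinearSVC(random_state=0).fit(X, y)

# hinge_loss expects decision function values, not predicted labels.
loss = hinge_loss(y, clf.decision_function(X))
print(loss)
```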
Sometimes looking at the learned coefficients of a neural network can provide insight into the learning behavior. For example, if weights look unstructured, maybe some were not used at all, or if very large coefficients exist, maybe regularization was too low or the learning rate too high.
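A sketch of how to get at those learned coefficients, assuming `MLPClassifier`: each entry of `coefs_` is the weight matrix of one layer, which is what one would plot or inspect for structure. The network size and iteration count are illustrative.

```python
from sklearn.datasets import load_digits
from sklearn.neural_network import MLPClassifier

X, y = load_digits(return_X_y=True)

# Small illustrative network: one hidden layer of 16 units.
mlp = MLPClassifier(hidden_layer_sizes=(16,), max_iter=200, random_state=0)
mlp.fit(X / 16.0, y)  # digits pixels range 0-16; scale to [0, 1]

# coefs_[i] is the weight matrix between layer i and layer i + 1.
print([w.shape for w in mlp.coefs_])  # [(64, 16), (16, 10)]
```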
sklearn.metrics.silhouette_samples(X, labels, metric='euclidean', **kwds)
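A brief sketch: unlike `silhouette_score`, which averages, this function returns one silhouette coefficient per sample, each in [-1, 1]. The clustering setup below is illustrative.

```python
from sklearn.cluster import KMeans
from sklearn.datasets import make_blobs
from sklearn.metrics import silhouette_samples

X, _ = make_blobs(n_samples=50, centers=3, random_state=0)
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)

# One silhouette coefficient per sample, each in [-1, 1].
scores = silhouette_samples(X, labels, metric='euclidean')
print(scores.shape)  # (50,)
```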
class sklearn.decomposition.KernelPCA(n_components=None, kernel='linear', gamma=None, degree=3, coef0=1, kernel_params=None, alpha=1, ...)
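A minimal sketch on the classic concentric-circles example, where a nonlinear kernel pays off; the RBF kernel and `gamma` value here are illustrative guesses, not recommended defaults.

```python
from sklearn.datasets import make_circles
from sklearn.decomposition import KernelPCA

# Concentric circles are not linearly separable in the input space.
X, _ = make_circles(n_samples=100, factor=0.3, noise=0.05, random_state=0)

# An RBF kernel can unfold the circles; gamma=10 is an illustrative choice.
kpca = KernelPCA(n_components=2, kernel='rbf', gamma=10)
X_kpca = kpca.fit_transform(X)
print(X_kpca.shape)  # (100, 2)
```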
sklearn.covariance.graph_lasso(emp_cov, alpha, cov_init=None, mode='cd', tol=0.0001, enet_tol=0.0001, max_iter=100, verbose=False, ...)
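A usage sketch. In scikit-learn 0.20 this function was renamed `graphical_lasso`, which is what recent releases expose and what this example assumes; the data and `alpha` value are illustrative.

```python
import numpy as np
from sklearn.covariance import empirical_covariance, graphical_lasso

rng = np.random.RandomState(0)
X = rng.randn(200, 5)

# Estimate a sparse precision matrix from the empirical covariance.
emp_cov = empirical_covariance(X)
# graph_lasso was renamed graphical_lasso in scikit-learn 0.20.
covariance, precision = graphical_lasso(emp_cov, alpha=0.1)
print(precision.shape)  # (5, 5)
```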