Class reference¶
Support Vector Machines¶
svm.SVC([C, kernel, degree, gamma, coef0, ...]) | C-Support Vector Classification. |
svm.LinearSVC([penalty, loss, dual, eps, C, ...]) | Linear Support Vector Classification. |
svm.NuSVC([nu, kernel, degree, gamma, ...]) | Nu-Support Vector Classification. |
svm.SVR([kernel, degree, gamma, coef0, ...]) | Support Vector Regression. |
svm.NuSVR([nu, C, kernel, degree, gamma, ...]) | Nu-Support Vector Regression; the regression counterpart of NuSVC. |
svm.OneClassSVM([kernel, degree, gamma, ...]) | Unsupervised outlier detection. |
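A minimal classification sketch with svm.SVC. The examples on this page assume the classes are importable under a scikits.learn package prefix and follow the standard fit/predict estimator interface; treat them as illustrative sketches, not verbatim reference code:

    import numpy as np
    from scikits.learn import svm   # package prefix assumed

    # Four linearly separable points in two classes.
    X = np.array([[-1.0, -1.0], [-2.0, -1.0], [1.0, 1.0], [2.0, 1.0]])
    y = np.array([0, 0, 1, 1])

    # C and kernel are constructor parameters listed in the table above.
    clf = svm.SVC(kernel='linear', C=1.0)
    clf.fit(X, y)
    print(clf.predict(np.array([[1.5, 1.0]])))   # expected: class 1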
For sparse data¶
svm.sparse.SVC([kernel, degree, gamma, ...]) | SVC for sparse matrices (csr) |
svm.sparse.NuSVC([nu, kernel, degree, ...]) | NuSVC for sparse matrices (csr) |
svm.sparse.SVR([kernel, degree, gamma, ...]) | SVR for sparse matrices (csr) |
svm.sparse.NuSVR([nu, C, kernel, degree, ...]) | NuSVR for sparse matrices (csr) |
svm.sparse.OneClassSVM([kernel, degree, ...]) | OneClassSVM for sparse matrices (csr) |
svm.sparse.LinearSVC([penalty, loss, dual, ...]) | Linear Support Vector Classification, Sparse Version |
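The svm.sparse variants accept scipy.sparse CSR matrices directly. A sketch, assuming they share the constructor parameters and fit/predict methods of the dense classes above (same assumed scikits.learn prefix):

    import numpy as np
    import scipy.sparse as sp
    from scikits.learn.svm.sparse import SVC   # submodule path as listed above

    # Same toy problem as before, stored as a CSR matrix.
    X = sp.csr_matrix(np.array([[-1.0, -1.0], [-2.0, -1.0], [1.0, 1.0], [2.0, 1.0]]))
    y = np.array([0, 0, 1, 1])

    clf = SVC(kernel='linear')
    clf.fit(X, y)
    print(clf.predict(sp.csr_matrix([[1.5, 1.0]])))   # expected: class 1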
Logistic Regression¶
logistic.LogisticRegression([penalty, eps, ...]) | Logistic Regression. |
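A sketch of L2-penalized logistic regression, assuming the logistic module path shown above and the usual fit/predict interface:

    import numpy as np
    from scikits.learn.logistic import LogisticRegression   # prefix assumed

    X = np.array([[0.0], [1.0], [2.0], [3.0]])
    y = np.array([0, 0, 1, 1])

    # penalty is among the constructor parameters listed above.
    clf = LogisticRegression(penalty='l2')
    clf.fit(X, y)
    print(clf.predict(np.array([[2.5]])))   # expected: class 1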
Generalized Linear Models¶
glm.LinearRegression([fit_intercept]) | Ordinary least squares Linear Regression. |
glm.Ridge([alpha, fit_intercept]) | Ridge regression. |
glm.Lasso([alpha, fit_intercept, coef_]) | Linear Model trained with L1 prior as regularizer (aka the Lasso) |
glm.LassoCV([eps, n_alphas, alphas]) | Lasso linear model with iterative fitting along a regularization path |
glm.ElasticNet([alpha, rho, coef_, ...]) | Linear Model trained with L1 and L2 prior as regularizer |
glm.ElasticNetCV([rho, eps, n_alphas, alphas]) | Elastic Net model with iterative fitting along a regularization path |
glm.LARS(n_features[, normalize]) | Least Angle Regression model a.k.a. LAR |
glm.LassoLARS([alpha, max_features, ...]) | Lasso model fit with Least Angle Regression a.k.a. LARS |
glm.lars_path(X, y[, Gram, max_features, ...]) | Compute Least Angle Regression and LASSO path |
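A sketch comparing ordinary least squares with the Lasso on a toy problem (glm module path as listed above; alpha is the L1 regularization strength):

    import numpy as np
    from scikits.learn import glm   # prefix assumed

    # y depends only on the first feature.
    X = np.array([[0.0, 0.1], [1.0, -0.2], [2.0, 0.3], [3.0, -0.1]])
    y = np.array([0.0, 1.0, 2.0, 3.0])

    ols = glm.LinearRegression(fit_intercept=True)
    ols.fit(X, y)

    lasso = glm.Lasso(alpha=0.1)
    lasso.fit(X, y)

    # The L1 penalty tends to drive uninformative coefficients toward zero.
    print(ols.coef_)
    print(lasso.coef_)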
For sparse data¶
glm.sparse.Lasso([alpha, coef_, fit_intercept]) | Linear Model trained with L1 prior as regularizer |
glm.sparse.ElasticNet([alpha, rho, coef_, ...]) | Linear Model trained with L1 and L2 prior as regularizer |
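The glm.sparse variants take scipy.sparse CSR input. A sketch, assuming alpha and rho keep the same meaning as in the dense ElasticNet above:

    import numpy as np
    import scipy.sparse as sp
    from scikits.learn.glm.sparse import ElasticNet   # submodule path as listed

    X = sp.csr_matrix(np.array([[0.0, 1.0], [1.0, 0.0], [2.0, 2.0], [3.0, 2.0]]))
    y = np.array([0.0, 1.0, 2.0, 3.0])

    enet = ElasticNet(alpha=0.1, rho=0.5)   # rho balances the L1/L2 penalties
    enet.fit(X, y)
    print(enet.coef_)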
Bayesian Regression¶
glm.BayesianRidge([n_iter, eps, alpha_1, ...]) | Bayesian ridge regression |
glm.ARDRegression([n_iter, eps, alpha_1, ...]) | Bayesian ARD regression. |
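A sketch of Bayesian ridge regression with the hyperparameters (alpha_1, ...) left at their defaults, using the glm module path from the table:

    import numpy as np
    from scikits.learn import glm   # prefix assumed

    X = np.array([[0.0], [1.0], [2.0], [3.0]])
    y = np.array([0.1, 1.1, 1.9, 3.2])

    reg = glm.BayesianRidge()
    reg.fit(X, y)
    print(reg.coef_)                          # posterior mean of the weights
    print(reg.predict(np.array([[4.0]])))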
Naive Bayes¶
naive_bayes.GNB() | Gaussian Naive Bayes (GNB) |
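A Gaussian Naive Bayes sketch with naive_bayes.GNB, assuming the standard fit/predict interface:

    import numpy as np
    from scikits.learn.naive_bayes import GNB   # prefix assumed

    X = np.array([[-2.0, -1.0], [-1.0, -2.0], [1.0, 2.0], [2.0, 1.0]])
    y = np.array([0, 0, 1, 1])

    clf = GNB()
    clf.fit(X, y)
    print(clf.predict(np.array([[1.5, 1.5]])))   # expected: class 1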
Nearest Neighbors¶
neighbors.Neighbors([k, window_size]) | Classifier implementing the k-Nearest Neighbors algorithm. |
ball_tree.BallTree | Ball tree for fast nearest-neighbor searches. |
ball_tree.knn_brute(x, pt[, k]) | Brute-Force k-nearest neighbor search. |
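A k-nearest-neighbors classification sketch with neighbors.Neighbors; k is the constructor parameter listed above (the BallTree listed here supports the underlying search):

    import numpy as np
    from scikits.learn.neighbors import Neighbors   # prefix assumed

    X = np.array([[0.0, 0.0], [0.0, 1.0], [3.0, 3.0], [3.0, 4.0]])
    y = np.array([0, 0, 1, 1])

    clf = Neighbors(k=3)
    clf.fit(X, y)
    print(clf.predict(np.array([[2.5, 3.0]])))   # expected: class 1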
Clustering¶
cluster.KMeans([k, init, n_init, max_iter]) | K-Means clustering |
cluster.MeanShift([bandwidth]) | MeanShift clustering |
cluster.SpectralClustering([k, mode]) | Spectral clustering: apply k-means to a projection of the graph Laplacian to find normalized graph cuts. |
cluster.AffinityPropagation([damping, ...]) | Perform Affinity Propagation Clustering of data |
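A K-Means sketch on two well-separated blobs; exposing the fitted assignments as labels_ is an assumption based on the usual estimator convention:

    import numpy as np
    from scikits.learn.cluster import KMeans   # prefix assumed

    X = np.array([[1.0, 1.0], [1.2, 0.8], [0.9, 1.1],
                  [5.0, 5.0], [5.2, 4.8], [4.9, 5.1]])

    km = KMeans(k=2)       # k is the number of clusters, as listed above
    km.fit(X)
    print(km.labels_)      # assumed attribute: cluster index per sample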
Covariance estimators¶
covariance.Covariance([store_covariance]) | Basic covariance estimator |
covariance.ShrunkCovariance([...]) | Covariance estimator with shrinkage |
covariance.LedoitWolf([store_covariance]) | Ledoit-Wolf shrinkage covariance estimator |
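A shrinkage covariance sketch with LedoitWolf; the covariance_ attribute name is an assumption based on the estimator convention:

    import numpy as np
    from scikits.learn.covariance import LedoitWolf   # prefix assumed

    rng = np.random.RandomState(0)
    X = rng.randn(50, 3)          # 50 samples, 3 features

    lw = LedoitWolf()
    lw.fit(X)
    print(lw.covariance_)         # assumed attribute: shrunk covariance matrix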
Cross-validation¶
cross_val.LeaveOneOut(n) | Leave-One-Out cross-validation iterator. |
cross_val.LeavePOut(n, p) | Leave-P-Out cross-validation iterator. |
cross_val.KFold(n, k) | K-Folds cross-validation iterator. |
cross_val.StratifiedKFold(y, k) | Stratified K-Folds cross-validation iterator. |
cross_val.LeaveOneLabelOut(labels) | Leave-One-Label-Out cross-validation iterator. |
cross_val.LeavePLabelOut(labels, p) | Leave-P-Label-Out cross-validation iterator. |
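The cross-validation objects are iterators over train/test splits. A 3-fold sketch with cross_val.KFold(n, k):

    import numpy as np
    from scikits.learn import cross_val   # prefix assumed

    n_samples, n_folds = 6, 3
    for train, test in cross_val.KFold(n_samples, n_folds):
        # train and test index the samples on each side of the split
        print(train, test)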
Feature Selection¶
feature_selection.rfe.RFE([estimator, ...]) | Feature ranking with recursive feature elimination |
feature_selection.rfe.RFECV([estimator, ...]) | Feature ranking with recursive feature elimination and cross-validation |
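An RFE sketch wrapping a linear SVC; the n_features parameter name and the support_ attribute are assumptions about this release, not taken from the table:

    import numpy as np
    from scikits.learn.feature_selection.rfe import RFE   # prefix assumed
    from scikits.learn.svm import SVC

    rng = np.random.RandomState(0)
    X = rng.randn(30, 5)
    y = (X[:, 0] + X[:, 1] > 0).astype(int)   # only 2 of 5 features matter

    rfe = RFE(estimator=SVC(kernel='linear'), n_features=2)   # assumed parameter name
    rfe.fit(X, y)
    print(rfe.support_)    # assumed attribute: boolean mask of kept features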
Feature Extraction¶
feature_extraction.image.img_to_graph(img[, ...]) | Create a graph of pixel-to-pixel connections with the gradient of the image as the edge value. |
feature_extraction.text.WordNGramAnalyzer([...]) | Simple analyzer: transform a text document into a sequence of word tokens |
feature_extraction.text.CharNGramAnalyzer([...]) | Compute character n-grams features of a text document |
feature_extraction.text.TermCountVectorizer([...]) | Convert a document collection to a document-term matrix. |
feature_extraction.text.TfidfTransformer([...]) | Transform a count matrix to a TF (term-frequency) or TF-IDF representation. |
feature_extraction.text.TfidfVectorizer([...]) | Convert a document collection to a TF-IDF weighted document-term matrix. |
feature_extraction.text.SparseHashingVectorizer([...]) | Compute term-frequency vectors using a hashed term space, stored as a sparse matrix |
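A sketch of img_to_graph, which turns a small image into a pixel-adjacency matrix whose edge weights are the image gradient, as described above:

    import numpy as np
    from scikits.learn.feature_extraction.image import img_to_graph   # prefix assumed

    img = np.arange(16, dtype=np.float64).reshape(4, 4)
    graph = img_to_graph(img)
    print(graph.shape)     # (16, 16): one node per pixel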
Pipeline¶
pipeline.Pipeline(steps) | Pipeline of transforms with a final estimator |
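Pipeline chains named (name, estimator) steps and exposes the fit/predict interface of its final estimator. A single-step sketch keeps the example within the classes on this page (same assumed scikits.learn prefix as above):

    import numpy as np
    from scikits.learn.pipeline import Pipeline   # prefix assumed
    from scikits.learn.svm import SVC

    X = np.array([[-1.0, -1.0], [-2.0, -1.0], [1.0, 1.0], [2.0, 1.0]])
    y = np.array([0, 0, 1, 1])

    # steps is a list of (name, estimator) pairs; the last step is the estimator.
    pipe = Pipeline([('svc', SVC(kernel='linear'))])
    pipe.fit(X, y)
    print(pipe.predict(np.array([[1.5, 1.0]])))   # expected: class 1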