Warning: This documentation is for scikits.learn version 0.6.0.

6. Class reference

6.1. Support Vector Machines

svm.SVC([C, kernel, degree, gamma, coef0, ...]) C-Support Vector Classification.
svm.LinearSVC([penalty, loss, dual, eps, C, ...]) Linear Support Vector Classification.
svm.NuSVC([nu, kernel, degree, gamma, ...]) Nu-Support Vector Classification.
svm.SVR([kernel, degree, gamma, coef0, ...]) Support Vector Regression.
svm.NuSVR([nu, C, kernel, degree, gamma, ...]) Nu Support Vector Regression.
svm.OneClassSVM([kernel, degree, gamma, ...]) Unsupervised outlier detection.
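
All of these estimators share the fit/predict interface. A minimal sketch with svm.SVC on a toy two-class problem (the data and parameter values are illustrative only):

    import numpy as np
    from scikits.learn import svm

    # Four 2-D points in two classes.
    X = np.array([[-1., -1.], [-2., -1.], [1., 1.], [2., 1.]])
    y = np.array([1, 1, 2, 2])

    clf = svm.SVC(kernel='linear', C=1.0)          # C-support vector classification
    clf.fit(X, y)
    print(clf.predict(np.array([[-0.8, -1.]])))    # should predict class 1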

6.1.7. For sparse data

svm.sparse.SVC([kernel, degree, gamma, ...]) SVC for sparse matrices (csr).
svm.sparse.NuSVC([nu, kernel, degree, ...]) NuSVC for sparse matrices (csr).
svm.sparse.SVR([kernel, degree, gamma, ...]) SVR for sparse matrices (csr).
svm.sparse.NuSVR([nu, C, kernel, degree, ...]) NuSVR for sparse matrices (csr).
svm.sparse.OneClassSVM([kernel, degree, ...]) OneClassSVM for sparse matrices (csr).
svm.sparse.LinearSVC([penalty, loss, dual, ...]) Linear Support Vector Classification, sparse version.
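
The sparse variants take scipy.sparse CSR input directly. A minimal sketch, assuming the class is importable from scikits.learn.svm.sparse as the listing above suggests:

    import numpy as np
    from scipy import sparse
    from scikits.learn.svm.sparse import SVC

    # Same toy problem as above, stored as a CSR matrix.
    X = sparse.csr_matrix([[-1., -1.], [-2., -1.], [1., 1.], [2., 1.]])
    y = np.array([1, 1, 2, 2])

    clf = SVC(kernel='linear')
    clf.fit(X, y)
    print(clf.predict(sparse.csr_matrix([[2., 2.]])))   # should predict class 2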

6.2. Generalized Linear Models

linear_model.LinearRegression([fit_intercept]) Ordinary least squares Linear Regression.
linear_model.Ridge([alpha, fit_intercept]) Ridge regression.
linear_model.Lasso([alpha, fit_intercept]) Linear Model trained with L1 prior as regularizer (aka the Lasso)
linear_model.LassoCV([eps, n_alphas, ...]) Lasso linear model with iterative fitting along a regularization path
linear_model.ElasticNet([alpha, rho, ...]) Linear Model trained with L1 and L2 prior as regularizer
linear_model.ElasticNetCV([rho, eps, ...]) Elastic Net model with iterative fitting along a regularization path
linear_model.LARS([fit_intercept, verbose]) Least Angle Regression model a.k.a. LAR
linear_model.LassoLARS([alpha, ...]) Lasso model fit with Least Angle Regression a.k.a. LARS
linear_model.LogisticRegression([penalty, ...]) Logistic Regression.
linear_model.SGDClassifier([loss, penalty, ...]) Linear model fitted by minimizing a regularized empirical loss with SGD.
linear_model.SGDRegressor([loss, penalty, ...]) Linear model fitted by minimizing a regularized empirical loss with SGD
linear_model.lasso_path(X, y[, eps, ...]) Compute Lasso path with coordinate descent
linear_model.lars_path(X, y[, Xy, Gram, ...]) Compute Least Angle Regression and LASSO path
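
These regressors also follow the fit/predict pattern and expose the fitted coefficients as coef_. A minimal Lasso sketch (the value of alpha is illustrative):

    import numpy as np
    from scikits.learn import linear_model

    # One informative feature and one noise feature.
    X = np.array([[0., 0.1], [1., -0.2], [2., 0.3], [3., -0.1]])
    y = np.array([0., 1., 2., 3.])

    lasso = linear_model.Lasso(alpha=0.1)
    lasso.fit(X, y)
    print(lasso.coef_)                            # L1 penalty drives small coefficients to zero
    print(lasso.predict(np.array([[4., 0.]])))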

6.2.14. For sparse data

linear_model.sparse.Lasso([alpha, fit_intercept]) Linear Model trained with L1 prior as regularizer
linear_model.sparse.ElasticNet([alpha, rho, ...]) Linear Model trained with L1 and L2 prior as regularizer
linear_model.sparse.SGDClassifier([loss, ...]) Linear model fitted by minimizing a regularized empirical loss with SGD
linear_model.sparse.SGDRegressor([loss, ...]) Linear model fitted by minimizing a regularized empirical loss with SGD

6.3. Bayesian Regression

linear_model.BayesianRidge([n_iter, eps, ...]) Bayesian ridge regression
linear_model.ARDRegression([n_iter, eps, ...]) Bayesian ARD regression.
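
A minimal sketch of BayesianRidge on the same kind of toy data, with the hyperparameters left at their defaults:

    import numpy as np
    from scikits.learn import linear_model

    X = np.array([[0., 0.], [1., 1.], [2., 2.], [3., 3.]])
    y = np.array([0., 1., 2., 3.])

    reg = linear_model.BayesianRidge()
    reg.fit(X, y)
    print(reg.coef_)                              # posterior mean of the weights
    print(reg.predict(np.array([[4., 4.]])))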

6.4. Naive Bayes

naive_bayes.GNB() Gaussian Naive Bayes (GNB)
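
A minimal sketch of GNB, which fits one Gaussian per class and feature:

    import numpy as np
    from scikits.learn.naive_bayes import GNB

    X = np.array([[-1., -1.], [-2., -1.], [1., 1.], [2., 1.]])
    y = np.array([1, 1, 2, 2])

    clf = GNB()
    clf.fit(X, y)
    print(clf.predict(np.array([[0.8, 1.]])))     # should predict class 2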

6.5. Nearest Neighbors

neighbors.Neighbors([n_neighbors, window_size]) Classifier implementing the k-Nearest Neighbor algorithm.
neighbors.NeighborsBarycenter([n_neighbors, ...]) Regression based on the k-Nearest Neighbor algorithm.
ball_tree.BallTree Ball Tree for fast nearest-neighbor searches.
ball_tree.knn_brute(x, pt[, k]) Brute-Force k-nearest neighbor search.
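
A minimal sketch of the Neighbors classifier; the n_neighbors value is illustrative:

    import numpy as np
    from scikits.learn import neighbors

    X = np.array([[0.], [1.], [2.], [3.]])
    y = np.array([0, 0, 1, 1])

    clf = neighbors.Neighbors(n_neighbors=2)
    clf.fit(X, y)
    print(clf.predict(np.array([[0.9]])))   # nearest neighbors are 1. and 0. -> class 0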

6.6. Gaussian Mixture Models

mixture.GMM([n_states, cvtype]) Gaussian Mixture Model
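
A minimal GMM sketch on two well-separated 1-D blobs; the name of the fitted-means attribute is an assumption for this release:

    import numpy as np
    from scikits.learn import mixture

    np.random.seed(0)
    # Two well-separated 1-D blobs.
    obs = np.concatenate((np.random.randn(100, 1),
                          10. + np.random.randn(100, 1)))

    g = mixture.GMM(n_states=2)
    g.fit(obs)
    print(g.means)   # fitted component means (attribute name assumed)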

6.7. Hidden Markov Models

hmm.GaussianHMM([n_states, cvtype, ...]) Hidden Markov Model with Gaussian emissions
hmm.MultinomialHMM([n_states, startprob, ...]) Hidden Markov Model with multinomial (discrete) emissions
hmm.GMMHMM([n_states, n_mix, startprob, ...]) Hidden Markov Model with Gaussian mixture emissions
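
A minimal GaussianHMM sketch, assuming fit takes a list of observation sequences and predict returns the most likely state sequence; both calling conventions are assumptions for this release:

    import numpy as np
    from scikits.learn import hmm

    np.random.seed(0)
    # One 1-D observation sequence that switches regime halfway through.
    obs = np.concatenate((np.random.randn(50, 1),
                          5. + np.random.randn(50, 1)))

    model = hmm.GaussianHMM(n_states=2)
    model.fit([obs])            # list of sequences (assumed calling convention)
    print(model.predict(obs))   # most likely hidden state for each observation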

6.8. Clustering

cluster.KMeans([k, init, n_init, max_iter, ...]) K-Means clustering
cluster.MeanShift([bandwidth]) MeanShift clustering
cluster.SpectralClustering([k, mode]) Spectral clustering: apply k-means to a projection of the graph Laplacian to find normalized graph cuts.
cluster.AffinityPropagation([damping, ...]) Perform Affinity Propagation Clustering of data
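
A minimal KMeans sketch on two 2-D blobs; the attribute names queried after fitting are assumptions for this release:

    import numpy as np
    from scikits.learn import cluster

    np.random.seed(0)
    # Two 2-D blobs.
    X = np.concatenate((np.random.randn(50, 2),
                        5. + np.random.randn(50, 2)))

    km = cluster.KMeans(k=2)
    km.fit(X)
    print(km.labels_)            # cluster index per sample (attribute names assumed)
    print(km.cluster_centers_)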

6.9. Covariance Estimators

covariance.Covariance([store_covariance]) Basic covariance estimator
covariance.ShrunkCovariance([...]) Covariance estimator with shrinkage
covariance.LedoitWolf([store_covariance]) Ledoit-Wolf covariance estimator
covariance.ledoit_wolf(X[, return_shrinkage]) Estimates the shrunk Ledoit-Wolf covariance matrix.
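
A minimal LedoitWolf sketch; the covariance_ attribute name is an assumption for this release:

    import numpy as np
    from scikits.learn.covariance import LedoitWolf

    np.random.seed(0)
    X = np.random.randn(40, 5)    # 40 samples, 5 features

    lw = LedoitWolf()
    lw.fit(X)
    print(lw.covariance_)         # shrunk covariance estimate (attribute name assumed)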

6.10. Signal Decomposition

pca.PCA([n_components, copy, whiten]) Principal component analysis (PCA)
pca.ProbabilisticPCA([n_components, copy, ...]) Probabilistic principal component analysis
fastica.FastICA([n_components, algorithm, ...]) FastICA: a fast algorithm for Independent Component Analysis
fastica.fastica(X[, n_components, ...]) Perform Fast Independent Component Analysis.
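
A minimal PCA sketch projecting random data onto its two leading components:

    import numpy as np
    from scikits.learn.pca import PCA

    np.random.seed(0)
    X = np.random.randn(100, 5)

    pca = PCA(n_components=2)
    pca.fit(X)
    X_reduced = pca.transform(X)   # project onto the two leading components
    print(X_reduced.shape)         # (100, 2)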

6.11. Cross Validation

cross_val.LeaveOneOut(n) Leave-One-Out cross validation iterator
cross_val.LeavePOut(n, p) Leave-P-Out cross validation iterator
cross_val.KFold(n, k) K-Folds cross validation iterator
cross_val.StratifiedKFold(y, k) Stratified K-Folds cross validation iterator
cross_val.LeaveOneLabelOut(labels) Leave-One-Label-Out cross-validation iterator
cross_val.LeavePLabelOut(labels, p) Leave-P-Label-Out cross-validation iterator
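
These iterators yield train/test masks that can be used to index the data. A minimal KFold sketch:

    import numpy as np
    from scikits.learn import cross_val

    X = np.arange(8).reshape(4, 2)
    y = np.array([0, 1, 0, 1])

    # Each iteration yields a (train, test) pair of masks over the n samples.
    for train, test in cross_val.KFold(n=4, k=2):
        X_train, X_test = X[train], X[test]
        y_train, y_test = y[train], y[test]
        print(X_train.shape)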

6.13. Feature Selection

feature_selection.rfe.RFE([estimator, ...]) Feature ranking with recursive feature elimination
feature_selection.rfe.RFECV([estimator, ...]) Feature ranking with recursive feature elimination and cross-validation
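
RFE repeatedly fits the supplied estimator and prunes the weakest features, so it needs an estimator exposing coef_, such as a linear SVM. A sketch whose constructor arguments are assumptions for this release:

    import numpy as np
    from scikits.learn import svm
    from scikits.learn.feature_selection.rfe import RFE

    np.random.seed(0)
    X = np.random.randn(30, 5)
    y = (X[:, 0] + X[:, 1] > 0).astype(int)   # only the first two features matter

    # The n_features argument is an assumption; check the signature in this release.
    selector = RFE(estimator=svm.SVC(kernel='linear'), n_features=2)
    selector.fit(X, y)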

6.14. Feature Extraction

feature_extraction.image.img_to_graph(img[, ...]) Create a graph of the pixel-to-pixel connections with the gradient of the image as the edge value.
feature_extraction.text.RomanPreprocessor Fast preprocessor suitable for roman languages.
feature_extraction.text.WordNGramAnalyzer([...]) Simple analyzer: transform a text document into a sequence of word tokens
feature_extraction.text.CharNGramAnalyzer([...]) Compute character n-grams features of a text document
feature_extraction.text.CountVectorizer(...) Convert a collection of raw documents to a matrix of token counts
feature_extraction.text.TfidfTransformer([...]) Transform a count matrix to a TF or TF-IDF representation
feature_extraction.text.Vectorizer(...[, ...]) Convert a collection of raw documents to a matrix
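
A minimal sketch of counting word occurrences with CountVectorizer; the default analyzer and the fit/transform calling convention are assumptions for this release:

    from scikits.learn.feature_extraction.text import CountVectorizer

    docs = ["the cat sat on the mat",
            "the dog ate my homework"]

    vect = CountVectorizer()        # default word n-gram analyzer assumed
    vect.fit(docs)
    counts = vect.transform(docs)   # one row per document, one column per token
    print(counts.shape)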

6.14.8. For sparse data

feature_extraction.text.sparse.TfidfTransformer([...]) Transform a count matrix to a TF or TF-IDF representation (sparse version)
feature_extraction.text.sparse.CountVectorizer(...) Convert a collection of raw documents to a matrix of token counts
feature_extraction.text.sparse.Vectorizer(...) Convert a collection of raw documents to a sparse matrix

6.15. Pipeline

pipeline.Pipeline(steps) Pipeline of transforms with a final estimator
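
Pipeline chains transformers with a final estimator; steps is a list of (name, estimator) pairs. A minimal sketch combining PCA and SVC:

    import numpy as np
    from scikits.learn import svm
    from scikits.learn.pca import PCA
    from scikits.learn.pipeline import Pipeline

    np.random.seed(0)
    X = np.random.randn(20, 5)
    y = np.array([0, 1] * 10)

    # All steps except the last must implement fit/transform.
    pipe = Pipeline([('reduce_dim', PCA(n_components=2)),
                     ('svm', svm.SVC(kernel='linear'))])
    pipe.fit(X, y)
    print(pipe.predict(X[:3]))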