6.2.3. scikits.learn.linear_model.RidgeCV

class scikits.learn.linear_model.RidgeCV(alphas=array([ 0.1, 1., 10. ]), fit_intercept=True, score_func=None, loss_func=None, cv=None)

Ridge regression with built-in cross-validation.

By default, it performs Generalized Cross-Validation, which is a form of efficient Leave-One-Out cross-validation. Currently, only the n_features > n_samples case is handled efficiently.

Parameters :

alphas : numpy array of shape [n_alpha]

Array of alpha values to try. Small positive values of alpha improve the conditioning of the problem and reduce the variance of the estimates. Alpha corresponds to (2*C)^-1 in other linear models such as LogisticRegression or LinearSVC.

fit_intercept : boolean

Whether to calculate the intercept for this model. If set to false, no intercept will be used in calculations (e.g. data is expected to be already centered).

loss_func : callable, optional

Function that takes two arguments and compares them in order to evaluate the performance of the prediction (smaller is better). If None is passed, the score of the estimator is maximized.

score_func : callable, optional

Function that takes two arguments and compares them in order to evaluate the performance of the prediction (larger is better). If None is passed, the score of the estimator is maximized.

See also

Ridge
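
A minimal usage sketch (the import path follows the class name above; the toy data, variable names, and custom alpha grid are illustrative assumptions, not part of this reference):

    import numpy as np
    from scikits.learn.linear_model import RidgeCV

    # Toy regression data: 10 samples, 3 features (illustrative only).
    X = np.random.randn(10, 3)
    y = np.dot(X, [1.0, -2.0, 0.5]) + 0.1 * np.random.randn(10)

    # The default alpha grid is [0.1, 1.0, 10.0]; a custom grid can be supplied.
    clf = RidgeCV(alphas=np.array([0.01, 0.1, 1.0, 10.0]), fit_intercept=True)

    # With cv=None (the default), efficient Generalized Cross-Validation is
    # used to pick the best alpha from the grid.
    clf.fit(X, y)
    predictions = clf.predict(X)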

Methods

fit(X, y[, sample_weight]) Fit Ridge regression model
predict(X) Predict using the linear model
score(X, y) Returns the coefficient of determination of the prediction
__init__(alphas=array([ 0.1, 1., 10. ]), fit_intercept=True, score_func=None, loss_func=None, cv=None)

fit(X, y, sample_weight=1.0, **params)

Fit Ridge regression model

Parameters :

X : numpy array of shape [n_samples, n_features]

Training data

y : numpy array of shape [n_samples] or [n_samples, n_responses]

Target values

sample_weight : float or numpy array of shape [n_samples]

Sample weight. A float applies the same weight to every sample; an array of shape [n_samples] gives individual weights.

cv : cross-validation generator, optional

If None, Generalized Cross-Validation (efficient Leave-One-Out) will be used.

Returns :

self : Returns self.
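
A sketch of a fit call with per-sample weights (the weight values and data below are arbitrary illustrations; only the signature comes from this reference):

    import numpy as np
    from scikits.learn.linear_model import RidgeCV

    X = np.random.randn(20, 4)      # 20 samples, 4 features
    y = np.random.randn(20)         # target values

    clf = RidgeCV(alphas=np.array([0.1, 1.0, 10.0]))

    # The default sample_weight=1.0 weights all samples equally; an array of
    # shape [n_samples] gives individual weights.
    weights = np.ones(20)
    weights[:5] = 2.0               # e.g. up-weight the first five samples
    clf.fit(X, y, sample_weight=weights)   # returns self, so calls can be chained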

predict(X)

Predict using the linear model

Parameters :

X : numpy array of shape [n_samples, n_features]

Returns :

C : array, shape = [n_samples]

Returns predicted values.
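
For illustration, predicting on new samples after fitting (shapes follow the [n_samples, n_features] convention documented above; the data are made up):

    import numpy as np
    from scikits.learn.linear_model import RidgeCV

    X_train = np.random.randn(30, 2)
    y_train = np.random.randn(30)

    clf = RidgeCV().fit(X_train, y_train)

    X_new = np.random.randn(5, 2)   # 5 new samples, same number of features
    y_pred = clf.predict(X_new)     # predicted values, shape [5]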

score(X, y)

Returns the coefficient of determination of the prediction

Parameters :

X : array-like, shape = [n_samples, n_features]

Training set.

y : array-like, shape = [n_samples]

True values for X.

Returns :

z : float

The coefficient of determination of the prediction.
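
A sketch relating score to the coefficient of determination, computed by hand for comparison (data and variable names are illustrative assumptions):

    import numpy as np
    from scikits.learn.linear_model import RidgeCV

    X = np.random.randn(50, 3)
    y = np.dot(X, [2.0, 0.0, -1.0]) + 0.05 * np.random.randn(50)

    clf = RidgeCV().fit(X, y)

    # Coefficient of determination: R^2 = 1 - SS_res / SS_tot
    y_hat = clf.predict(X)
    ss_res = np.sum((y - y_hat) ** 2)
    ss_tot = np.sum((y - np.mean(y)) ** 2)
    r2_manual = 1.0 - ss_res / ss_tot

    r2 = clf.score(X, y)            # should agree with r2_manual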