L2 regularization using glmnet with 10-fold cross-validation, VIF calculation, and shrinkage factor analysis
10-fold CV with glmnet (alpha=0) to select lambda.min and lambda.1se.
Variance Inflation Factor (VIF) calculation to quantify the level of multicollinearity.
Compare Ridge and OLS coefficients via a shrinkage factor; all three steps are sketched below.
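A minimal sketch of these three steps in R, assuming a data frame df with a numeric target column y and numeric predictors (the names are illustrative, not from the original); the shrinkage factor shown is one plausible definition, the ratio of Ridge to OLS coefficient magnitudes, rather than necessarily the one used by the implementation.

```r
library(glmnet)
library(car)  # for vif()

# Assumed input: data frame `df` with numeric target `y` and numeric predictors
x <- scale(as.matrix(df[, setdiff(names(df), "y")]))  # standardize predictors
y <- df$y

# 10-fold CV with pure Ridge (alpha = 0)
set.seed(42)
cvfit <- cv.glmnet(x, y, alpha = 0, nfolds = 10)
cvfit$lambda.min   # lambda minimizing CV error
cvfit$lambda.1se   # largest lambda within 1 SE of the minimum

# VIF from an OLS fit on the same scaled predictors
ols <- lm(y ~ ., data = data.frame(y = y, x))
vif(ols)

# One plausible shrinkage factor: ratio of Ridge to OLS coefficient magnitudes
ridge_coef <- as.numeric(coef(cvfit, s = "lambda.min"))[-1]  # drop intercept
ols_coef   <- coef(ols)[-1]
shrinkage_factor <- sum(abs(ridge_coef)) / sum(abs(ols_coef))
shrinkage_factor
```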
Provide a dataset with a numeric target and a set of numeric features. Features are automatically scaled using scale() before fitting.
Uses glmnet with alpha=0 (pure Ridge), calculates both R² and adjusted R², provides coefficient paths across lambda values, and includes comprehensive residual diagnostics with Q-Q plots.
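A hedged sketch of those fit statistics and the residual Q-Q check, reusing x, y, and cvfit from the sketch above:

```r
# Predictions at lambda.min and residuals
pred <- as.numeric(predict(cvfit, newx = x, s = "lambda.min"))
res  <- y - pred

# R-squared and adjusted R-squared
n <- nrow(x); p <- ncol(x)
r2     <- 1 - sum(res^2) / sum((y - mean(y))^2)
adj_r2 <- 1 - (1 - r2) * (n - 1) / (n - p - 1)

# Q-Q plot of residuals
qqnorm(res); qqline(res)
```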
From preprocessing to robust reporting
Scale predictors using scale() for consistent regularization.
10-fold CV with cv.glmnet to find optimal lambda values.
Calculate VIF and the shrinkage factor, then generate diagnostic plots.
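The coefficient paths and the CV error curve come straight from the glmnet and cv.glmnet objects; a short sketch under the same assumed names as above:

```r
fit <- glmnet(x, y, alpha = 0)             # Ridge fit over the full lambda grid
plot(fit, xvar = "lambda", label = TRUE)   # coefficient paths vs log(lambda)
plot(cvfit)                                # CV error curve; lambda.min and lambda.1se are marked
```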
Ridge regression handles multicollinearity effectively by shrinking coefficients toward zero while keeping all predictors in the model.
The implementation uses glmnet with alpha=0, provides both lambda.min and lambda.1se options, calculates VIF to quantify collinearity, computes a shrinkage factor comparing Ridge to OLS coefficients, and includes comprehensive diagnostics with coefficient paths, CV error curves, and residual analysis.
Note: Ridge keeps all features (no selection). Uses lambda.min by default. For feature selection, use Lasso or Elastic Net.
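For reference, a short sketch of switching to the more conservative lambda.1se, or to an Elastic Net / Lasso fit when feature selection is wanted (alpha = 0.5 is only an illustrative value):

```r
coef(cvfit, s = "lambda.1se")                      # Ridge at the 1-SE rule
enet <- cv.glmnet(x, y, alpha = 0.5, nfolds = 10)  # Elastic Net; alpha = 1 is Lasso
coef(enet, s = "lambda.min")                       # some coefficients shrink exactly to zero
```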
Tune Ridge and improve generalization
Read the article: Ridge Regression