Combine Lasso and Ridge regression with cross-validated lambda selection for automatic feature selection and regularization
Control L1/L2 balance: alpha=0 for Ridge, alpha=1 for Lasso, alpha=0.5 for a balanced Elastic Net (see the sketch below).
Automatic lambda selection using cv.glmnet with lambda.min and lambda.1se options.
Automatic feature selection with coefficient paths and importance rankings.
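A minimal sketch of these alpha settings in R, assuming the glmnet package is installed and using a small simulated matrix x and response y as placeholders (not part of the original description):

library(glmnet)

set.seed(42)
x <- matrix(rnorm(100 * 10), nrow = 100)   # 100 observations, 10 features
y <- x[, 1] - 2 * x[, 2] + rnorm(100)      # target driven by two of the features

ridge_fit <- glmnet(x, y, alpha = 0)       # pure L2 penalty (Ridge)
lasso_fit <- glmnet(x, y, alpha = 1)       # pure L1 penalty (Lasso)
enet_fit  <- glmnet(x, y, alpha = 0.5)     # balanced Elastic Net

cv_enet <- cv.glmnet(x, y, alpha = 0.5)    # cross-validated lambda selection
cv_enet$lambda.min                         # lambda with the lowest mean CV error
cv_enet$lambda.1se                         # largest lambda within one SE of that minimum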
Provide a dataset with a numeric target and multiple features for prediction. Features are standardized by default (configurable).
Uses the glmnet package with cross-validation to select the optimal lambda. Reports selected features, coefficient paths, MSE curves, and diagnostic plots, including Q-Q plots and residual analysis.
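The coefficient-path and MSE-curve plots map directly onto glmnet's built-in plotting methods; a sketch reusing the cv_enet and enet_fit objects from the previous example (assumed names):

plot(cv_enet)                                  # cross-validated MSE curve vs log(lambda)
plot(enet_fit, xvar = "lambda", label = TRUE)  # coefficient paths across regularization strengths

coefs <- coef(cv_enet, s = "lambda.1se")       # sparse column of coefficients at lambda.1se
selected <- rownames(coefs)[coefs[, 1] != 0]   # non-zero terms, i.e. the selected features
selected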
From tuning to interpretable diagnostics
Convert the data to matrix format and optionally standardize features (default = TRUE).
Use cv.glmnet to find the optimal lambda; fit at lambda.1se for a more conservative selection.
Generate coefficient paths, lambda curves, residual plots, and feature importance rankings (see the sketch after these steps).
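An end-to-end sketch of these three steps, assuming a hypothetical data frame df with a numeric target column (the names and simulated data are placeholders):

library(glmnet)

df <- data.frame(matrix(rnorm(100 * 6), nrow = 100))
names(df) <- c("target", paste0("x", 1:5))

# 1. Convert to matrix form; glmnet standardizes features by default
x <- model.matrix(target ~ ., data = df)[, -1]   # drop the intercept column
y <- df$target

# 2. Cross-validate lambda and fit at lambda.1se for conservative selection
cv_fit <- cv.glmnet(x, y, alpha = 0.5, standardize = TRUE)

# 3. Diagnostics: residuals against fitted values and a Q-Q plot
preds <- as.numeric(predict(cv_fit, newx = x, s = "lambda.1se"))
res <- y - preds
plot(preds, res, xlab = "Fitted values", ylab = "Residuals")
qqnorm(res); qqline(res)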
Elastic Net combines the strengths of Ridge and Lasso regression: it handles correlated predictors while performing automatic feature selection.
The implementation uses glmnet with cross-validation, selects lambda via the one-standard-error rule for robust feature selection, reports coefficient paths that show feature importance across regularization strengths, and includes diagnostics with Q-Q plots and residual analysis.
Note: Uses lambda.1se rather than lambda.min because it yields a sparser, more conservative feature set. The alpha parameter controls the L1/L2 mix (default 0.5).
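To see the effect of that choice, a small sketch comparing how many features each lambda retains, reusing the cv_fit object from the workflow sketch above (assumed name):

# Count non-zero coefficients, excluding the intercept
n_selected <- function(cvfit, s) {
  cf <- coef(cvfit, s = s)
  sum(cf[-1, 1] != 0)
}

n_selected(cv_fit, "lambda.min")   # usually keeps more features
n_selected(cv_fit, "lambda.1se")   # usually a sparser, more conservative model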
Tune Elastic Net and get stable, sparse models
Read the article: Elastic Net