Flexible Bayesian penalized regression modelling

Bayesian lasso, horseshoe and horseshoe+ priors for linear, logistic and count regression

This is a comprehensive, user-friendly toolbox implementing state-of-the-art Bayesian linear, logistic and count regression. The toolbox provides highly efficient and numerically stable implementations of ridge, lasso, horseshoe, horseshoe+, log-t and g-prior regression. The lasso, horseshoe, horseshoe+ and log-t priors are recommended for data sets where the number of predictors is greater than the sample size, and the log-t prior additionally adapts to unknown levels of sparsity. The toolbox allows predictors to be assigned to logical groupings (potentially overlapping, so that a predictor can be part of multiple groups). This can be used to exploit a priori knowledge about how predictors relate to each other (for example, grouping genetic data into genes, and collections of genes into pathways).
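As a quick orientation, the sketch below shows a typical call in the p > n regime with a horseshoe prior, and a grouped fit with overlapping groups. The call signature and option names ('nsamples', 'burnin', 'groups') are written from memory of the toolbox's conventions, not taken from this page; check "help bayesreg" for the authoritative interface.

```matlab
% Sketch: horseshoe regression on simulated data with p > n.
% Assumed signature: [beta, beta0, retval] = bayesreg(X, y, model, prior, ...)
% -- verify option names with "help bayesreg".
n = 50; p = 100;
X = randn(n, p);
b = [3; -2; zeros(p-2, 1)];            % sparse true coefficients
y = X*b + randn(n, 1);

% Horseshoe prior, Gaussian errors, 1,000 posterior samples
[beta, beta0, retval] = bayesreg(X, y, 'gaussian', 'hs', ...
                                 'nsamples', 1e3, 'burnin', 1e3);

% Overlapping groups: predictor 10 belongs to both groups
% (the cell-array form of the 'groups' option is an assumption here)
groups = {1:10, 10:20};
[betaG, beta0G] = bayesreg(X, y, 'gaussian', 'hs', 'groups', groups);
```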

Count regression is now supported through implementation of Poisson and geometric regression models. To support analysis of data with outliers, we provide two heavy-tailed error models in our implementation of Bayesian linear regression: Laplace and Student-t distribution errors. Most features are straightforward to use and the toolbox can work directly with MATLAB tables (including automatically handling categorical variables), or you can use standard MATLAB matrices.
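The count-regression and heavy-tailed error models described above might be invoked as follows. The model keywords ('poisson', 't') are recalled from the toolbox's documentation rather than stated on this page, so treat this as a sketch and confirm the exact strings with "help bayesreg".

```matlab
% Sketch: Poisson count regression, then robust linear regression
% with Student-t errors (keywords assumed; see "help bayesreg").
% poissrnd requires the Statistics and Machine Learning Toolbox.
n = 200;
x = randn(n, 2);
y = poissrnd(exp(0.5 + 0.8*x(:,1)));   % simulated Poisson counts

% Poisson regression with a horseshoe prior
[beta, beta0] = bayesreg(x, y, 'poisson', 'hs');

% Robust linear regression: Student-t errors downweight outliers
yc      = 1 + 2*x(:,1) + randn(n, 1);
yc(1:5) = yc(1:5) + 20;                % inject gross outliers
[beta2, beta02] = bayesreg(x, yc, 't', 'hs');
```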

The toolbox is very efficient and can be used with high-dimensional data. Please see the scripts in the directory "examples\" for examples on how to use the toolbox, or type "help bayesreg" within MATLAB. An R version of this toolbox is now available on CRAN. To install the R package, type "install.packages("bayesreg")" within R.

To cite this toolbox:
Makalic, E. & Schmidt, D. F.
High-Dimensional Bayesian Regularised Regression with the BayesReg Package
arXiv:1611.06649 [stat.CO], 2016

UPDATE VERSION 1.9.1 (30/11/2020):
Latest updates:
-Fixed count regression for MATLAB R2020a and R2020b releases.

PLEASE NOTE:
The package now handles logistic regression without the need for MEX files, but big speed-ups can be obtained by using compiled code, so this is recommended. To compile the C++ code, run compile.m from the bayesreg directory within MATLAB; compilation requires Microsoft Visual Studio Professional or the GNU g++ compiler. Alternatively, for convenience, pre-compiled MEX files (MATLAB R2017a) for Windows, Linux and macOS can be downloaded from the following URL:

http://www.emakalic.org/blog/

To use these, simply download them and unzip into the "bayesreg" folder.

Cite As

Enes Makalic and Daniel F. Schmidt (2016). High-Dimensional Bayesian Regularised Regression with the BayesReg Package, arXiv:1611.06649 [stat.CO]

Daniel F. Schmidt and Enes Makalic (2020). Log-Scale Shrinkage Priors and Adaptive Bayesian Global-Local Shrinkage Estimation, arXiv:1801.02321 [math.ST]

Daniel F. Schmidt and Enes Makalic (2019). Bayesian Generalized Horseshoe Estimation of Generalized Linear Models. ECML PKDD 2019: Machine Learning and Knowledge Discovery in Databases. pp 598-613

MATLAB Release Compatibility
Created with R2016a
Compatible with any release
Platform Compatibility
Windows macOS Linux

Version Release Notes
1.9.1.0

-Fixed count regression for MATLAB R2020a and R2020b releases.

1.9.0.2

-Updated bayesreg help file

1.9.0.1

-Updated "Cite As" field

1.9

-Added support for count regression via Poisson and geometric regression models
-Added sparsity adaptive log-t shrinkage prior (option 'logt')
-Improved sparsification [br_sparsify]

1.8.0.1

-Updated the "Cite As" field in the toolbox description

1.8.0.0

-Added function "br_sparsify()" to sparsify posterior coefficient estimates; three sparsification methods currently available (see "br_example15")
-Improved br_summary() printing of categorical data (see "br_example5")
-Minor updates and fixes

1.7.0.0

-Improved sampling speed for large design matrices
-Improved sampling speed when block sampling with Gaussian data
-Improved sampling efficiency of the horseshoe+ sampler

1.6.0.0

-Display the Widely Applicable Akaike's Information Criterion (WAIC) instead of DIC in summary output
-Implemented block sampling of betas for data with large numbers of predictors (options 'blocksample' and 'blocksize')

1.5.0.0

- Wrote a new parallelised C++ implementation of the sampling code for logistic regression
- Included an efficient MATLAB implementation of logistic regression sampling; this works even when MEX files are not available, but is not as fast

1.4.0.0

- Added option 'groups' which allows grouping of variables into potentially overlapping groups
- Grouping works with HS, HS+ and lasso
- Fixed a bug with g priors and logistic models
- Updated examples to demonstrate grouping and toolbox description

1.3.0.0

- Tidied up the summary display
- Added support for MATLAB tables
- Added support for categorical predictors
- Added a prediction function
- Updated and improved the example scripts
- Fixed a bug in the computation of R2

1.2.0.0

-This version implements Zellner's g-prior for linear and logistic regression. The g-prior only works with full rank matrices. The examples in "examples_bayesreg.m" have been updated to include a g-prior example.

1.1.0.0

-Moved all display code to a separate function called "summary()". Now the summary table can be produced on demand after sampling.
-Updated "examples_bayesreg.m" to include examples of the new "summary()" command.

1.0.0.0

Updated description to include links to the full version of the toolbox.