- Telephone No.: **+44 (0)207 594 8568**
- E-Mail Address: **c.holmes@ic.ac.uk**

This page contains Matlab code implementing the methods used in my book with Denison, Mallick and Smith. The code is fully documented and stand-alone, so even those with little or no Matlab experience should be able to follow it.

Instructions on how to run the programs are found at the top of each program file. In Matlab, comments are preceded with a % symbol.

All of the methods are fully automatic, using default priors and default MCMC algorithms, so the code can be run with no input from the user other than the data. Of course, where possible we recommend incorporating subjective prior information and tuning the MCMC algorithm; see the header in each program file for more details.

Please feel free to send me any suggestions for improving the code or the readability of my comments in the programs.

I hope to have the remaining programs up soon; currently missing are decision trees, generalised regression splines and multivariate Gaussian response models.

1. **Bayesian MARS model for Gaussian response data**: Chapters 3 and 4: Here is the code.

a. This program is stand-alone and can be used to produce a prediction on a test set (see the header to the program).

b. You can also use it to store every model from the MCMC chain and then use this program to make forecasts.

2. **Bayesian Radial Basis Function (RBF) model for Gaussian response data**: Chapters 3 and 4: Here is the code.

a. This program is stand-alone and can be used to produce a prediction on a test set (see the header to the program).

b. You can also use it to store every model from the MCMC chain and then use this program to make forecasts.

3. **Bayesian Multivariate Linear Spline (MLS) model for Gaussian response data**: Chapters 3 and 4: Here is the code.

a. This program is stand-alone and can be used to produce a prediction on a test set (see the header to the program). The model is in effect a Bayesian local linear method and produces local linear coefficients at the test points plus credible intervals on these estimates.

b. You can also use it to store every model from the MCMC chain and then use this program to make forecasts.

4. **Bayesian Partition Model (BPM) for Gaussian response data**: Chapter 7: Here is the code.

a. The BPM is a probabilistic version of Learning Vector Quantization. It quantizes the covariate (predictor) space using a Voronoi tiling and then fits a simple (conjugate) Normal-Normal probability model to the data within each region.

b. The program above predicts on a test set, but you can also run it to dump out the model samples and then use this program to make forecasts.
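The Matlab program itself is not reproduced on this page, but the two ingredients described in (a) can be sketched in a few lines of Python. Everything here is illustrative rather than the book's implementation: the function names and toy data are my own, the Voronoi centres are fixed rather than sampled by MCMC, and `mu0`, `tau2`, `sigma2` are placeholder hyperparameters, not the default priors used in the code.

```python
import numpy as np

def voronoi_assign(x, centres):
    """Assign each covariate point to its nearest centre,
    i.e. to its region of the Voronoi tiling."""
    d = np.linalg.norm(x[:, None, :] - centres[None, :, :], axis=2)
    return d.argmin(axis=1)

def region_posterior_mean(y, mu0=0.0, tau2=10.0, sigma2=1.0):
    """Conjugate Normal-Normal update: posterior mean of the
    region-level mean given the responses y falling in that region
    (prior N(mu0, tau2), known observation variance sigma2)."""
    n = len(y)
    precision = 1.0 / tau2 + n / sigma2
    return (mu0 / tau2 + y.sum() / sigma2) / precision

# Toy data: two well-separated clusters with different response levels.
rng = np.random.default_rng(0)
x = np.vstack([rng.normal(0.0, 0.3, (20, 2)), rng.normal(3.0, 0.3, (20, 2))])
y = np.concatenate([rng.normal(-1.0, 0.5, 20), rng.normal(2.0, 0.5, 20)])
centres = np.array([[0.0, 0.0], [3.0, 3.0]])  # fixed for illustration

labels = voronoi_assign(x, centres)
for k in range(len(centres)):
    print(f"region {k}: posterior mean {region_posterior_mean(y[labels == k]):.3f}")
```

In the full BPM the number and location of the centres are themselves unknown and are sampled by MCMC; the conjugacy of the within-region model is what lets the region-level parameters be integrated out.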

For Classification Problems (Multinomial response data):

5. **Bayesian Partition Model (BPM) for q-class classification** (multinomial response data): Chapter 7: Here is the code.

a. The BPM is a probabilistic version of Learning Vector Quantization. It quantizes the covariate (predictor) space using a Voronoi tiling and then fits a simple (conjugate) Multinomial-Dirichlet probability model to the data within each region.

b. The program above predicts on a test set, but you can also run it to dump out the model samples and then use this program to make predictions.
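The within-region model in (a) can be sketched as follows; this is a hedged Python illustration, not the Matlab program. With a symmetric Dirichlet(alpha) prior on the class probabilities of one region (alpha = 1 here is my placeholder, not necessarily the code's default), the posterior predictive probability of class k is the familiar smoothed proportion (n_k + alpha) / (n + q * alpha).

```python
import numpy as np

def class_probs(labels, q, alpha=1.0):
    """Dirichlet-multinomial posterior predictive class probabilities
    for the observations falling in one Voronoi region.
    labels : integer class labels in {0, ..., q-1}
    alpha  : symmetric Dirichlet prior parameter (illustrative default)."""
    counts = np.bincount(labels, minlength=q)
    return (counts + alpha) / (counts.sum() + q * alpha)

# A region containing three class-0 and one class-1 observation, q = 2:
print(class_probs(np.array([0, 0, 0, 1]), q=2))  # -> [0.6667, 0.3333]
```

As in the Gaussian case, conjugacy means the region-level class probabilities can be integrated out analytically, which keeps the MCMC over tilings cheap.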

6. **Bayesian k-Nearest-Neighbour**: Chapter 8: Here is the code.

a. This program also includes a weighted nearest-neighbour version.
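To fix ideas, here is a minimal non-Bayesian sketch in Python of a weighted nearest-neighbour classifier. The exponential distance weighting and the parameter `beta` are my own illustrative choices: in the Bayesian treatment of Chapter 8 the number of neighbours and the weighting are handled within the probability model rather than fixed by hand as below.

```python
import numpy as np

def weighted_knn_predict(x_train, y_train, x_test, k=3, beta=1.0):
    """Classify each test point by a weighted vote of its k nearest
    training points, with weights decaying exponentially in distance
    (an illustrative weighting scheme, fixed rather than inferred)."""
    classes = np.unique(y_train)
    preds = []
    for xt in x_test:
        d = np.linalg.norm(x_train - xt, axis=1)
        nn = np.argsort(d)[:k]              # indices of the k nearest points
        w = np.exp(-beta * d[nn])           # distance-based weights
        votes = [w[y_train[nn] == c].sum() for c in classes]
        preds.append(classes[np.argmax(votes)])
    return np.array(preds)

# Two tight one-dimensional clusters, one per class:
x_train = np.array([[0.0], [0.1], [1.0], [1.1]])
y_train = np.array([0, 0, 1, 1])
print(weighted_knn_predict(x_train, y_train, np.array([[0.05], [1.05]]), k=3))
# -> [0 1]
```

With k = 3 each test point's vote includes one point from the wrong cluster, and it is the distance weighting that keeps the prediction correct; unweighted k-NN with even k can tie, which is one practical motivation for the weighted version.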