The Bayesian framework for model comparison and regularisation is demonstrated by studying interpolation and classification problems modelled with both linear and non-linear models. This framework quantitatively embodies 'Occam's razor': over-complex and under-regularised models are automatically inferred to be less probable, even though their flexibility allows them to fit the data better. The relationship of the Bayesian learning framework to 'active learning' is examined, and objective functions are discussed that measure the expected informativeness of candidate data measurements, in the context of both interpolation and classification problems. The concepts and methods described in this thesis are quite general and will be applicable to other data modelling problems, whether they involve regression, classification or density estimation.
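The automatic Occam's razor described above can be illustrated with a minimal sketch: for a linear-in-the-parameters model with a Gaussian weight prior and Gaussian noise, the marginal likelihood (the 'evidence') is available in closed form, so candidate model orders can be compared directly. The data set, polynomial basis, prior scale `alpha` and noise level `sigma` below are hypothetical illustrative choices, not taken from the thesis.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data set: noisy samples of a quadratic function.
x = np.linspace(-1.0, 1.0, 30)
y = 1.0 - 2.0 * x + 1.5 * x**2 + rng.normal(0.0, 0.1, x.size)

def log_evidence(x, y, degree, alpha=1.0, sigma=0.1):
    """Log marginal likelihood ('evidence') of a polynomial model.

    Weights have prior N(0, 1/alpha); noise is N(0, sigma^2).
    Integrating the weights out gives y ~ N(0, C) with
    C = sigma^2 I + Phi Phi^T / alpha, evaluated in closed form.
    """
    Phi = np.vander(x, degree + 1, increasing=True)
    C = sigma**2 * np.eye(x.size) + Phi @ Phi.T / alpha
    _, logdet = np.linalg.slogdet(C)
    quad = y @ np.linalg.solve(C, y)
    return -0.5 * (x.size * np.log(2.0 * np.pi) + logdet + quad)

# Compare the evidence across model orders; under-complex models
# explain the data poorly, so the evidence favours an intermediate order.
for d in range(1, 7):
    print(f"degree {d}: log evidence = {log_evidence(x, y, d):.1f}")
```

Because the evidence integrates over all weight settings rather than using only the best fit, a model that is flexible enough to fit anything spreads its probability mass thinly, and is penalised automatically without any explicit complexity term.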