Machine learning encompasses probabilistic and statistical techniques that can build models from large quantities of extensional information (examples) with minimal dependence on intensional information (domain knowledge). This focus of machine learning is reflected in the never-ending quest for "off-the-shelf" classifiers. To generalize to unseen data, however, we must make use...
How can an agent generalize its knowledge to new circumstances? To learn
effectively, an agent acting in a sequential decision problem must select actions intelligently based on its available knowledge. This dissertation focuses on Bayesian methods for representing learned knowledge and develops novel algorithms that exploit the represented...
The Gibbs sampling method is an important tool for parameter estimation in many probabilistic models. In many scenarios it is difficult to draw high-dimensional samples directly from the joint distribution. Gibbs sampling instead draws high-dimensional samples via the conditional distributions, which are typically easier to...
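The idea of replacing a hard joint draw with easy conditional draws can be illustrated with a minimal sketch. The example below (an illustrative choice, not taken from the excerpt) targets a zero-mean bivariate normal with correlation rho: sampling the joint directly requires a 2-D draw, but each full conditional is a simple univariate normal, so the sampler just alternates 1-D draws.

```python
import numpy as np

def gibbs_bivariate_normal(n_samples, rho, burn_in=500, seed=0):
    """Gibbs sampler for a zero-mean bivariate normal with correlation rho.

    The full conditionals of this joint are univariate normals:
        x | y ~ N(rho * y, 1 - rho**2)
        y | x ~ N(rho * x, 1 - rho**2)
    so each step of the chain is an easy 1-D draw.
    """
    rng = np.random.default_rng(seed)
    cond_sd = np.sqrt(1.0 - rho ** 2)  # sd of each conditional
    x, y = 0.0, 0.0                    # arbitrary starting state
    samples = np.empty((n_samples, 2))
    for i in range(burn_in + n_samples):
        x = rng.normal(rho * y, cond_sd)  # draw x | y
        y = rng.normal(rho * x, cond_sd)  # draw y | x
        if i >= burn_in:                  # discard warm-up draws
            samples[i - burn_in] = (x, y)
    return samples

draws = gibbs_bivariate_normal(20_000, rho=0.8)
print(np.corrcoef(draws.T)[0, 1])  # empirical correlation, close to 0.8
```

After burn-in, the chain's draws are (approximately) samples from the joint, so the empirical correlation recovers rho even though no 2-D draw was ever made.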