This characteristic allows a model to be updated by simply adding a new data point, without the need for retraining. "Non-parametric" here means that a model's parameters are not explicitly taken into account but are instead inferred from the data. Gaussian process (GP) models (Rasmussen and Williams, 2006) provide a non-parametric framework for Bayesian inference over function spaces. Kernels allow abstracting feature maps to even infinite-dimensional feature spaces without having to deal with the elements of such spaces directly, leading to non-parametric methods. Predictions are then performed via inference over the posterior distribution of the model conditioned on the data.

2.1 Bayesian learning

Bayesian learning quantifies model uncertainty by placing a belief distribution over the model we are trying to learn and updating that belief according to the observed data. GPs are also closely related to the topic of reproducing kernel Hilbert spaces (Section 2.4), which have their own variety of applications.

1.5 Outline

Chapter 5 addresses the problem of planning under localisation uncertainty, providing a method to learn terrain roughness models while safely navigating a mobile robot.

Building on the investigation in the previous chapter, Chapter 6 provides, as a final contribution, a method with theoretical guarantees for solving more general optimisation problems under uncertain inputs. BO algorithms may be able to use probability distributions, instead of point estimates, to learn models of an objective function. In addition, localisation systems can provide estimates of the query location as probability distributions (Thrun et al., 2006).

Rafael Oliveira, "Bayesian Optimisation for Planning under Uncertainty", PhD Thesis, The University of Sydney, 2019.
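The non-parametric GP prediction described above can be sketched in a few lines of NumPy. This is a minimal illustration under assumed choices (a zero-mean prior, a squared-exponential kernel with unit length scale, and a small noise term); the function names and hyperparameters are illustrative, not from the thesis. Note how "updating" the model is just appending an observation and re-conditioning, with no retraining step:

```python
import numpy as np

def rbf_kernel(a, b, length_scale=1.0):
    """Squared-exponential kernel k(a, b) = exp(-(a - b)^2 / (2 l^2))."""
    d2 = (a[:, None] - b[None, :]) ** 2
    return np.exp(-0.5 * d2 / length_scale**2)

def gp_posterior(X, y, X_star, noise=1e-6):
    """Posterior mean and variance of a zero-mean GP at test points X_star,
    obtained by conditioning on the observed data (X, y)."""
    K = rbf_kernel(X, X) + noise * np.eye(len(X))
    K_star = rbf_kernel(X, X_star)
    K_ss = rbf_kernel(X_star, X_star)
    mean = K_star.T @ np.linalg.solve(K, y)
    cov = K_ss - K_star.T @ np.linalg.solve(K, K_star)
    return mean, np.diag(cov)

# The data themselves define the model: no parametric form is fitted.
X = np.array([0.0, 1.0, 2.0])
y = np.sin(X)
mean, var = gp_posterior(X, y, np.array([1.5]))

# Updating the model = appending a new data point and re-conditioning.
X2 = np.append(X, 1.5)
y2 = np.append(y, np.sin(1.5))
mean2, var2 = gp_posterior(X2, y2, np.array([1.5]))
```

After the new observation at x = 1.5 is added, the posterior variance at that point collapses towards the noise level and the posterior mean interpolates the new value, which is the "update by adding a data point" behaviour the text refers to.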