Title | Heteroscedastic Gaussian Processes and Random Features: Scalable Motion Primitives with Guarantees |
Publication Type | Conference Paper |
Year of Publication | 2023 |
Authors | Caldarelli, E, Chatalic, A, Colomé, A, Rosasco, L, Torras, C |
Conference Name | 7th Conference on Robot Learning (CoRL 2023)
Date Published | 11/2023 |
Conference Location | Atlanta, GA
Abstract | Heteroscedastic Gaussian processes (HGPs) are kernel-based, non-parametric models that can be used to infer nonlinear functions with time-varying noise. In robotics, they can be employed for learning from demonstration as motion primitives, i.e. as a model of the trajectories to be executed by the robot. HGPs provide variance estimates around the reference signal modeling the trajectory, capturing both the predictive uncertainty and the motion variability. However, similarly to standard Gaussian processes, they suffer from a cubic complexity in the number of training points, due to the inversion of the kernel matrix. The uncertainty can be leveraged for more complex learning tasks, such as inferring the variable impedance profile required from a robotic manipulator. However, suitable approximations are needed to make HGPs scalable, at the price of potentially worsening the posterior mean and variance profiles. Motivated by these observations, we study the combination of HGPs and random features, which are a popular, data-independent approximation strategy of kernel functions. In a theoretical analysis, we provide novel guarantees on the approximation error of the HGP posterior due to random features. Moreover, we validate this scalable motion primitive on real robot data, related to the problem of variable impedance learning. In this way, we show that random features offer a viable and theoretically sound alternative for speeding up the trajectory processing, without sacrificing accuracy. |
URL | https://proceedings.mlr.press/v229/caldarelli23a/caldarelli23a.pdf |
Associated Module:
CBMM Relationship:
- CBMM Funded
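
The random-features approximation the abstract refers to can be illustrated with the classic random Fourier features construction (Rahimi and Recht) for an RBF kernel. This is a minimal generic sketch, not the paper's implementation: the function names, the lengthscale, and the feature count `D` are illustrative choices, and the paper's HGP-specific machinery (heteroscedastic noise model, posterior error guarantees) is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)

def rff_features(X, D, lengthscale=1.0, rng=rng):
    """Map inputs X (n, d) to D random Fourier features whose inner
    products approximate an RBF kernel with the given lengthscale."""
    d = X.shape[1]
    # Frequencies sampled from the kernel's spectral density (Gaussian).
    W = rng.normal(scale=1.0 / lengthscale, size=(d, D))
    b = rng.uniform(0.0, 2.0 * np.pi, size=D)
    return np.sqrt(2.0 / D) * np.cos(X @ W + b)

def rbf_kernel(X, Y, lengthscale=1.0):
    """Exact RBF kernel, for comparison with the approximation."""
    sq = np.sum((X[:, None, :] - Y[None, :, :]) ** 2, axis=-1)
    return np.exp(-sq / (2.0 * lengthscale ** 2))

X = rng.normal(size=(50, 2))
Z = rff_features(X, D=5000)
K_approx = Z @ Z.T            # (50, 50), costs O(n * D) memory, no n^3 inverse
K_exact = rbf_kernel(X, X)
err = np.max(np.abs(K_approx - K_exact))
```

Because the feature map is data-independent, downstream Gaussian-process computations can work with the n-by-D matrix `Z` instead of inverting the full n-by-n kernel matrix, which is the source of the cubic cost mentioned in the abstract; the approximation error shrinks as `D` grows.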