Multi-Task Gaussian Process Movement Primitives for Kinesthetic Teaching
Abstract:
Movement primitives are a popular approach for learning robot movements from demonstrations. Nevertheless, they usually depend on a large number of hyper-parameters, which may also vary with the complexity of the demonstration. In this paper, we address this challenge by presenting a novel, self-tuned representation of movement primitives that is compact in terms of hyper-parameters. The method models a demonstrated movement with Gaussian Process (GP) models, which can learn movements of varying complexity with a minimal, fixed number of hyper-parameters. This is achieved by defining the mean function of the GP as a PD controller that attracts the movement to the desired state. Furthermore, a locally periodic covariance function is used as the kernel, which guarantees stability and can also model complex movements. Additionally, we employ multi-task learning on the GP to model all of the robot's joints with a single model, which significantly facilitates data-driven learning. The method has been evaluated on its ability to learn and generate movements with various starting states, goal states, and execution time horizons. Furthermore, it is compared with Dynamic Movement Primitives (DMPs) on two manufacturing assembly tasks, snap-fitting and screwing, where it performs better with 5 and 12 times fewer hyper-parameters, respectively.
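To make the core idea concrete, below is a minimal sketch (not the authors' implementation) of the method for a single joint: the GP prior mean is the trajectory of a critically damped PD attractor toward the goal, and the kernel is a locally periodic covariance, i.e., a periodic term multiplied by a squared-exponential envelope. All function names and hyper-parameter values (kp, period, lp, lse) are illustrative assumptions; the multi-task formulation over all joints is omitted here.

```python
import numpy as np

def pd_mean(t, y0, goal, kp=25.0):
    """Prior mean: roll out y'' = kp*(goal - y) - kd*y' with critical
    damping (kd = 2*sqrt(kp)), so the mean trajectory converges to goal.
    Assumes t is a uniformly spaced grid starting at the demonstration onset."""
    kd = 2.0 * np.sqrt(kp)
    dt = t[1] - t[0]
    y, dy, out = float(y0), 0.0, []
    for _ in t:
        out.append(y)
        ddy = kp * (goal - y) - kd * dy  # PD attractor dynamics
        dy += ddy * dt
        y += dy * dt
    return np.array(out)

def locally_periodic_kernel(t1, t2, sigma=1.0, period=1.0, lp=1.0, lse=1.0):
    """k(t,t') = sigma^2 * exp(-2 sin^2(pi|t-t'|/period) / lp^2)
                         * exp(-(t-t')^2 / (2 lse^2))
    The squared-exponential envelope localizes the periodic similarity."""
    d = t1[:, None] - t2[None, :]
    per = np.exp(-2.0 * np.sin(np.pi * np.abs(d) / period) ** 2 / lp ** 2)
    se = np.exp(-d ** 2 / (2.0 * lse ** 2))
    return sigma ** 2 * per * se

def gp_posterior_mean(t_train, y_train, t_query, y0, goal, noise=1e-4):
    """Condition the PD-mean GP on demonstrated samples (t_train, y_train)
    and return the posterior mean at t_query (standard GP regression)."""
    K = locally_periodic_kernel(t_train, t_train) + noise * np.eye(len(t_train))
    Ks = locally_periodic_kernel(t_query, t_train)
    alpha = np.linalg.solve(K, y_train - pd_mean(t_train, y0, goal))
    return pd_mean(t_query, y0, goal) + Ks @ alpha

# Example: fit one joint's demonstration and regenerate it with a new goal.
t = np.linspace(0.0, 2.0, 200)
demo = np.sin(3 * t) * np.exp(-t) + t          # stand-in for a kinesthetic demo
replay = gp_posterior_mean(t, demo, t, y0=demo[0], goal=demo[-1])
retarget = gp_posterior_mean(t, demo, t, y0=demo[0], goal=demo[-1] + 0.5)
```

Because the PD attractor carries the goal-convergence behavior, changing the goal argument shifts where the generated movement settles while the kernel-conditioned residual preserves the demonstrated shape; only the handful of kernel hyper-parameters above need tuning, regardless of movement complexity.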
Downloads: