Learning Movement Assessment Primitives for Force Interaction Skills
We present a novel, reusable, and task-agnostic primitive for assessing the outcome of a force-interaction robotic skill, useful for applications such as quality control in industrial manufacturing. The proposed method is easily programmed by kinesthetic teaching, and the desired adaptability and reusability are achieved through machine learning models. The primitive records sensory data during both demonstrations and reproductions of a movement; recordings include the end-effector's Cartesian pose and exerted wrench at each time step. The collected data are used to train Gaussian processes that model the wrench as a function of the robot's pose. The similarity between the wrench models of the demonstration and of the movement's reproduction is quantified by their Hellinger distance. This comparison yields features that are fed to a Naive Bayes classifier, which estimates the movement's probability of success. The evaluation is performed on two diverse robotic assembly tasks, snap-fitting and screwing, with a total of 5 use cases, 11 demonstrations, and more than 200 movement executions. The resulting performance metrics demonstrate the proposed method's ability to generalize across different demonstrations and movements.
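The pipeline described in the abstract can be illustrated with a minimal sketch: fit a Gaussian process mapping pose to wrench for the demonstration and for each reproduction, compare the two predictive distributions with the closed-form Hellinger distance between univariate Gaussians, and feed the resulting features to a Naive Bayes classifier. This is not the paper's implementation; it is a toy one-dimensional example using scikit-learn, and the wrench profiles, kernel choices, and feature set are all illustrative assumptions.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel
from sklearn.naive_bayes import GaussianNB


def hellinger(mu1, s1, mu2, s2):
    """Closed-form Hellinger distance between two univariate Gaussians."""
    var_sum = s1 ** 2 + s2 ** 2
    h2 = 1.0 - np.sqrt(2.0 * s1 * s2 / var_sum) * np.exp(
        -((mu1 - mu2) ** 2) / (4.0 * var_sum)
    )
    return np.sqrt(np.clip(h2, 0.0, 1.0))


def fit_wrench_gp(pose, wrench):
    """GP model of the wrench as a function of a (here 1-D) pose coordinate."""
    gp = GaussianProcessRegressor(
        kernel=RBF(length_scale=0.2) + WhiteKernel(noise_level=1e-2),
        normalize_y=True,
    )
    gp.fit(pose.reshape(-1, 1), wrench)
    return gp


def similarity_features(gp_demo, gp_repro, query):
    """Pointwise Hellinger distances between predictive Gaussians -> features."""
    mu_d, sd_d = gp_demo.predict(query.reshape(-1, 1), return_std=True)
    mu_r, sd_r = gp_repro.predict(query.reshape(-1, 1), return_std=True)
    d = hellinger(mu_d, sd_d, mu_r, sd_r)
    return np.array([d.mean(), d.max()])  # illustrative feature choice


rng = np.random.default_rng(0)
pose = np.linspace(0.0, 1.0, 60)
query = np.linspace(0.0, 1.0, 100)


def wrench_profile(success):
    """Synthetic wrench trace; a failure adds a spurious force step."""
    base = np.sin(2 * np.pi * pose)
    if not success:
        base = base + 1.5 * (pose > 0.5)  # e.g. a missed snap-fit engagement
    return base + 0.05 * rng.normal(size=pose.size)


gp_demo = fit_wrench_gp(pose, wrench_profile(success=True))

# Label a few reproductions and train the Naive Bayes success classifier.
X, y = [], []
for label in (True, True, True, False, False, False):
    gp_rep = fit_wrench_gp(pose, wrench_profile(success=label))
    X.append(similarity_features(gp_demo, gp_rep, query))
    y.append(int(label))
clf = GaussianNB().fit(np.array(X), y)

# Estimate the success probability of a new, unlabeled reproduction.
gp_new = fit_wrench_gp(pose, wrench_profile(success=True))
p_success = clf.predict_proba(
    similarity_features(gp_demo, gp_new, query)[None, :]
)[0, 1]
```

A successful reproduction yields small Hellinger distances to the demonstration's wrench model, so its features land near the "success" cluster and `p_success` is high; the 1-D pose stands in for the 6-D Cartesian pose and 6-D wrench of the actual setting.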