Learning from demonstrations without parameter tuning for an industrial cobot
Description
State-of-the-art learning from demonstration (LfD) methods for collaborative robots, which transfer and generalize skills from a set of demonstrations, require manual tuning of intrinsic parameters. Hence, LfD cannot readily be used in industrial contexts without experts. We propose a method free of manual parameter tuning, based on probabilistic movement primitives, in which all parameters are pre-determined using the Jensen-Shannon divergence and Bayesian optimization, so users perform no tuning at all. The method learns motions from a small dataset of user demonstrations and generalizes them to various scenarios and conditions. We evaluate our method in field tests in which the cobot collaborates with Schindler workers. Errors between the cobot's end-effector and target positions ranged from 0 to 1.479±0.351 mm, with no task failures across all tests. Questionnaires completed by Schindler workers highlighted the method's ease of use, perceived safety, and the accuracy of the reproduced motion.
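The abstract mentions using the Jensen-Shannon divergence as part of pre-determining the method's parameters. The paper's actual formulation is not given here, but as a minimal illustrative sketch (not the authors' implementation), the JS divergence between two discrete probability distributions can be computed as follows; the function name and discretization are assumptions for illustration only:

```python
import numpy as np

def js_divergence(p, q, eps=1e-12):
    """Jensen-Shannon divergence between two discrete distributions.

    Illustrative sketch only; the paper's exact usage (e.g. over
    trajectory distributions from movement primitives) may differ.
    """
    p = np.asarray(p, dtype=float) + eps  # avoid log(0)
    q = np.asarray(q, dtype=float) + eps
    p = p / p.sum()  # renormalize to valid distributions
    q = q / q.sum()
    m = 0.5 * (p + q)  # mixture distribution

    def kl(a, b):
        # Kullback-Leibler divergence KL(a || b)
        return float(np.sum(a * np.log(a / b)))

    return 0.5 * kl(p, m) + 0.5 * kl(q, m)
```

The JS divergence is symmetric and bounded above by ln 2 (for the natural logarithm), which makes it a convenient similarity measure to minimize inside a Bayesian optimization loop.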
Files
- file.pdf (869.7 kB, md5:554eef101e8e82d237b5e6e4c3e4f654)