Towards Collaborative Robots as Intelligent Co-workers in Human-Robot Joint Tasks.
Recently, there has been increasing demand for collaborative robots able to interact and cooperate with people in human environments, sharing physical space and working closely with humans on joint tasks.
Endowing robots with learning and cognitive capabilities is key to natural and efficient cooperation with a human co-worker. In particular, these abilities improve and facilitate the use of collaborative robots in joint assembly tasks, especially in smart manufacturing contexts.
In this paper, we report the results of implementing a neuro-inspired model—based on Dynamic Neural Fields—for action selection in a human-robot joint-action scenario.
We test the model in a real construction scenario in which the robot Sawyer selects and verbalizes, at each step, the next part to be mounted and outputs an appropriate action to insert it together with its human partner. The two-dimensional Action Execution Layer represents both the object and the action components in the same field. The results show that the robot computes valid decisions for different workspace layouts and for situations where pieces are missing.
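To illustrate the kind of decision dynamics involved, the following is a minimal sketch of a two-dimensional Dynamic Neural Field of the Amari type, where one field dimension stands for objects and the other for actions. All sizes, kernel parameters, and input values are hypothetical illustrations, not the parameters used in the paper; the field relaxes so that a peak forms at the most strongly supported object-action pair, which models the selection step.

```python
import numpy as np

def dnf_step(u, inputs, kernel, h=-2.0, tau=10.0, dt=1.0, beta=4.0):
    """One Euler step of an Amari-style field equation:
    tau * du/dt = -u + h + S + conv(w, f(u)),
    with a sigmoid output function f (parameters are assumptions)."""
    f = 1.0 / (1.0 + np.exp(-beta * u))  # sigmoid firing rate
    # circular convolution of the field output with the kernel via FFT
    interaction = np.real(np.fft.ifft2(np.fft.fft2(f) * np.fft.fft2(kernel)))
    return u + (dt / tau) * (-u + h + inputs + interaction)

def make_kernel(shape, a_exc=3.0, s_exc=2.0, g_inh=0.5):
    """Local Gaussian excitation plus global inhibition (toy parameters)."""
    ny, nx = shape
    y = np.minimum(np.arange(ny), ny - np.arange(ny))
    x = np.minimum(np.arange(nx), nx - np.arange(nx))
    d2 = y[:, None] ** 2 + x[None, :] ** 2
    return a_exc * np.exp(-d2 / (2 * s_exc ** 2)) - g_inh

# Toy action-execution field: rows index objects, columns index actions.
shape = (8, 8)
u = np.full(shape, -2.0)   # field starts at the resting level h
S = np.zeros(shape)
S[2, 5] = 6.0              # strong evidence for object 2 with action 5
S[6, 1] = 4.0              # weaker competing object-action pair
K = make_kernel(shape)
for _ in range(200):       # relax the field until a decision forms
    u = dnf_step(u, S, K)
obj, act = np.unravel_index(np.argmax(u), shape)
print(obj, act)            # peak forms at the more strongly supported pair
```

The lateral-interaction kernel (local excitation, global inhibition) is what enforces a winner-takes-all competition: the pair receiving the strongest combined input builds a self-sustaining peak and suppresses its competitors.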