Interacting with a Social Robot Affects Visual Perception of Space
Description
Human partners are very effective at coordinating in space and
time. This ability is particularly remarkable considering that visual
perception of space is a complex inferential process, affected by
individual prior experience (e.g., the history of previous stimuli).
As a result, two partners might perceive the same stimulus
differently. Yet they find a way to align their perception, as
demonstrated by the high degree of coordination observed in
sports or even in everyday gestures such as shaking hands. Robots
would need a similar ability to align with their partner's
perception. However, to date there is no knowledge of how the
inferential mechanism supporting visual perception operates
during social interaction. In the current work, we use a humanoid
robot to address this question. We replicate a standard protocol
for the quantification of perceptual inference in an HRI setting.
Participants estimated the lengths of a set of segments presented
by the humanoid robot iCub. In one condition the robot behaved
as a mechanical arm driven by a computer; in the other, as an
interactive, social partner. Even though the stimuli presented
were identical in the two conditions, length perception differed
when the robot was judged as an interactive agent rather than a
mechanical tool. When playing with the social robot, participants
relied significantly less on stimulus history. This result suggests
that the brain changes optimization strategies during interaction
and lays the foundations for designing human-aware robot visual
perception.
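The abstract's measure of "reliance on stimulus history" can be illustrated with a common analysis from the serial-dependence literature: regress each trial's length estimate on the current and the previous stimulus, and read the weight on the previous stimulus as the history index. The sketch below is purely illustrative (not the authors' analysis code); the simulated observer, the weight `w_history`, and the noise level are all hypothetical assumptions.

```python
import numpy as np

# Illustrative sketch, not the paper's analysis: quantify reliance on
# stimulus history by regressing length estimates on the current and the
# previous stimulus. All parameters below are hypothetical.
rng = np.random.default_rng(0)

n_trials = 500
lengths = rng.uniform(5.0, 15.0, n_trials)  # presented segment lengths (cm)

# Simulated observer: each estimate blends the current stimulus with the
# previous one; w_history is the (assumed) weight on stimulus history.
w_history = 0.2
noise = rng.normal(0.0, 0.5, n_trials)
estimates = np.empty(n_trials)
estimates[0] = lengths[0] + noise[0]
estimates[1:] = (1 - w_history) * lengths[1:] + w_history * lengths[:-1] + noise[1:]

# Recover the weights by least squares on [current, previous, intercept].
X = np.column_stack([lengths[1:], lengths[:-1], np.ones(n_trials - 1)])
coef, *_ = np.linalg.lstsq(X, estimates[1:], rcond=None)
print(f"weight on current stimulus:  {coef[0]:.2f}")
print(f"weight on previous stimulus: {coef[1]:.2f}")
```

Under this reading, the paper's finding would correspond to a smaller recovered history weight in the social-partner condition than in the mechanical-arm condition.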
Files

Name | Size
---|---
[02]Interacting with a Social Robot Affects Visual Perception of Space.pdf (md5:741f874f299f188c88797411b3e83752) | 1.7 MB