Published September 1, 2020 | Version v1
Conference paper (Open Access)

Interacting with a Social Robot Affects Visual Perception of Space

Description

Human partners are very effective at coordinating in space and time. This ability is particularly remarkable considering that visual perception of space is a complex inferential process, affected by individual prior experience (e.g., the history of previous stimuli). As a result, two partners might perceive the same stimulus differently. Yet they find a way to align their perception, as demonstrated by the high degree of coordination observed in sports or even in everyday gestures such as shaking hands. Robots would need a similar ability to align with their partner's perception. However, to date little is known about how the inferential mechanism supporting visual perception operates during social interaction. In the current work, we use a humanoid robot to address this question. We replicate a standard protocol for the quantification of perceptual inference in a human–robot interaction (HRI) setting. Participants estimated the lengths of a set of segments presented by the humanoid robot iCub. In one condition the robot behaved as a mechanical arm driven by a computer; in the other, as an interactive, social partner. Even though the stimuli were identical in the two conditions, length perception differed when the robot was judged to be an interactive agent rather than a mechanical tool: when playing with the social robot, participants relied significantly less on stimulus history. This result suggests that the brain changes its optimization strategy during social interaction, and it lays the foundation for designing human-aware robot visual perception.
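
The abstract does not specify the analysis, but reliance on stimulus history is commonly quantified in this literature by regressing each trial's estimation error on the preceding stimulus; the regression slope then serves as a serial-dependence (history) index. The Python sketch below illustrates that standard approach on simulated data; the length range, noise level, and history weight are hypothetical choices for illustration, not values from the study.

    import numpy as np

    # Hypothetical data: true segment lengths (cm) and a participant's estimates.
    rng = np.random.default_rng(0)
    true_len = rng.uniform(6.0, 14.0, size=200)      # presented stimuli
    history_w = 0.25                                 # assumed pull toward the previous stimulus
    prev_len = np.roll(true_len, 1)                  # previous trial's stimulus
    prev_len[0] = true_len.mean()                    # no predecessor on the first trial
    estimates = ((1 - history_w) * true_len + history_w * prev_len
                 + rng.normal(0.0, 0.5, size=200))   # sensory noise

    # Quantify reliance on stimulus history: regress the estimation error
    # on the (centered) previous stimulus. The slope is the history index;
    # a smaller slope means weaker serial dependence.
    error = estimates - true_len
    x = prev_len - prev_len.mean()
    slope = (x @ error) / (x @ x)
    print(f"history index (regression slope): {slope:.3f}")

With these settings the recovered slope is close to the simulated history weight of 0.25; a lower slope in one condition than the other would correspond to the reduced reliance on stimulus history reported above for the social condition.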

Files

Interacting with a Social Robot Affects Visual Perception of Space.pdf

Additional details

Funding

wHiSPER – investigating Human Shared PErception with Robots (grant no. 804388)
European Commission