Published March 15, 2019 | Version v1
Journal article | Open Access

Do We Adopt the Intentional Stance Toward Humanoid Robots?

  • 1. Social Cognition in Human-Robot Interaction Unit, Istituto Italiano di Tecnologia, Genoa, Italy; School of Computer Science, Faculty of Science and Engineering, Manchester University, Manchester, United Kingdom
  • 2. Social Cognition in Human-Robot Interaction Unit, Istituto Italiano di Tecnologia, Genoa, Italy; Dipartimento di Informatica, Bioingegneria, Robotica e Ingegneria dei Sistemi, Università di Genova, Genoa, Italy
  • 3. Social Cognition in Human-Robot Interaction Unit, Istituto Italiano di Tecnologia, Genoa, Italy

Description

In daily social interactions, we need to be able to navigate efficiently through our social environment. According to Dennett (1971), explaining and predicting others’ behavior with reference to mental states (adopting the intentional stance) allows efficient social interaction. Today we also routinely interact with artificial agents, from Apple’s Siri to GPS navigation systems. In the near future, we might start casually interacting with robots. This paper addresses the question of whether the intentional stance can also be adopted toward artificial agents. We propose a new tool to explore whether people adopt the intentional stance toward an artificial agent (a humanoid robot). The tool consists of a questionnaire that probes participants’ stance by asking them to judge the likelihood of a mentalistic versus a mechanistic explanation for a behavior of the iCub robot depicted in a naturalistic scenario (a sequence of photographs). The results of the first study conducted with this questionnaire showed that although the explanations were somewhat biased toward the mechanistic stance, a substantial number of mentalistic explanations were also given. This suggests that it is possible to induce adoption of the intentional stance toward artificial agents, at least in some contexts.
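For readers who download the accompanying dataset (listed under Files below), the following is a minimal, illustrative sketch of how questionnaire responses might be summarized. The column names ("participant_id", "score") and the slider-style scoring convention are assumptions for illustration, not the record's documented schema; inspect the CSV header before adapting it.

    import pandas as pd

    # Assumed layout: one row per participant-item response; column names are hypothetical.
    df = pd.read_csv("dataset_INSTANCE N=106.csv")

    # Hypothetical scoring convention: each item rated on a scale anchored at a
    # mechanistic explanation (low end) and a mentalistic explanation (high end).
    per_participant = df.groupby("participant_id")["score"].mean()

    print(f"Participants: {per_participant.size}")
    print(f"Mean score: {per_participant.mean():.2f} "
          f"(values below the scale midpoint would indicate a mechanistic bias)")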

Files (40.1 kB)

dataset_INSTANCE N=106.csv

Checksum                               Size
md5:c0242d69dc192d31ec5ad207886f8316   31.2 kB
md5:da4932adc12a486514e0034910efa4da   3.3 kB
md5:05a5f094e30e7534df4b308668f07215   5.6 kB

Additional details

Funding

InStance – Intentional stance for social attunement (Grant 715058)
European Commission