Conference paper | Open Access
Adrian B. Latupeirissa; Claudio Panariello; Roberto Bresin
**Exploring emotion perception in sonic HRI**

Published: 2020-06-17 | DOI: 10.5281/zenodo.3898928 | License: Creative Commons Attribution 3.0 Unported (CC BY)

Abstract: Despite the fact that sounds produced by robots can affect their interaction with humans, sound design is often an overlooked aspect of Human-Robot Interaction (HRI). This paper explores how different sets of sounds designed for expressive gestures of a humanoid Pepper robot can influence the perception of emotional intentions. In the pilot study presented in this paper, participants were asked to rate different stimuli in terms of perceived affective states. The stimuli were audio, audio-video, and video-only, and contained either Pepper's original servomotor noises, sawtooth waves, or more complex designed sounds. The preliminary results show a preference for the more complex sounds, confirming the need for further exploration in sonic HRI.

File: SMCCIM_2020_paper_174.pdf (https://zenodo.org/record/3898928/files/SMCCIM_2020_paper_174.pdf)
| | All versions | This version |
|---|---|---|
| Views | 160 | 160 |
| Downloads | 136 | 136 |
| Data volume | 1.3 GB | 1.3 GB |
| Unique views | 144 | 144 |
| Unique downloads | 116 | 116 |