TY - UNPB
T1 - Consequential sound induces illusory distortions in the perception and prediction of robot motion
AU - Currie, Joel
AU - Giannaccini, Maria Elena
AU - Bach, Patric
N1 - Data Availability. The experimental files, analysis scripts, and datasets generated and/or analyzed during the current study are available from the corresponding author's GitHub repository: https://github.com/jwgcurrie/Robot-action-perception-consequential-sound
PY - 2024/2/17
Y1 - 2024/2/17
AB - For efficient human-robot interaction, human operators need to be able to represent the robot's movements in space and predict its next steps. However, according to frameworks of Bayesian multisensory integration, features outside the motion itself – like the consequential sounds a robot makes while it moves – should affect how otherwise identical motions are perceived. Here, we translate an established psychophysical task from experimental psychology to a human-robot interaction context to measure these distortions of motion perception. In two series of preregistered studies, participants watched a humanoid robot make (forward and backward) reaching movements. When the robot hand suddenly disappeared, they reported its last seen location, either with the mouse cursor (Experiments 1a and 1b) or by matching it to probe stimuli in different locations (Experiments 2a and 2b). The results revealed that even small changes to the robot's consequential sound robustly affect participants' visuospatial representation of its motions, so that the motion appeared to extend further in space when accompanied by slightly longer sounds (100 ms longer) than by slightly shorter sounds (100 ms shorter). Moreover, these sound changes affect not only where people currently locate the robot's motion, but also where they anticipate its future steps. These findings show that sound design is an effective medium for manipulating how people represent otherwise identical robot actions and coordinate their interactions with the robot. The study serves as a proof of concept that psychophysical tasks provide a promising tool for measuring how design parameters influence the perception and prediction of robot motion.
UR - https://doi.org/10.31219/osf.io/47uam
U2 - 10.31219/osf.io/47uam
DO - 10.31219/osf.io/47uam
M3 - Preprint
BT - Consequential sound induces illusory distortions in the perception and prediction of robot motion
PB - OSF
ER -