
investigating Human Shared PErception with Robots
Abstract

Perception is a complex process in which prior knowledge exerts a fundamental influence over what we see. Incorporating previous experience, or priors, into the current percept helps the brain cope with the uncertainty resulting from sensory and neural noise and from ambiguity. As a consequence, the very same physical stimulus can be perceived differently by two observers with different internal priors, which can be rapidly formed through brief sensory experience; this phenomenon may be exacerbated in the elderly, who suffer from reduced sensory acuity. Although interest in the effect of recent experience on visual perception has grown considerably, little is known about how perceptual inference shapes visual perception during social interaction. Yet this is a crucial question: during interaction the brain faces two potentially conflicting goals, maximizing individual perceptual stability by relying on internal priors, or maximizing perceptual alignment with the partner to facilitate coordination, which requires limiting that reliance. Perceptual alignment underlies everyday joint activities, such as passing an object, and it becomes even more important in contexts with high coordination demands, such as sports, dance, and music, where temporal and spatial precision in perceiving external stimuli and the partner’s actions is fundamental for task achievement. wHiSPER studies, for the first time, how basic mechanisms of perceptual inference are modified during interaction, by moving the investigation from an individual, passive approach to an interactive, shared context in which two agents dynamically influence each other. To allow for scrupulous and systematic control, wHiSPER uses a humanoid robot as an interactive agent serving as an investigation probe, whose perceptual and motor decisions are fully under the experimenter’s control. One of the crucial limits to the study of perception during interaction has so far been the impossibility of maintaining rigorous control over the stimulation while allowing direct involvement in a dynamic exchange with another agent. The robotic platform makes it possible to port the stimuli used in perceptual investigations to the domain of online collaboration, bringing controllability and repeatability to an embodied, interactive context. The robot acts either as the stimulus presenter or as a co-actor in experiments with increasing levels of interactivity, complementing more traditional screen-based investigations adopted for baseline experiments. In summary, wHiSPER exploits rigorous psychophysical methods, Bayesian modeling, and humanoid technologies to provide the first comprehensive account of how the visual perception of spatial and temporal properties of the world changes during interaction with both humans and robots.
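To make the prior-integration mechanism described above concrete, the sketch below shows standard Gaussian Bayesian cue combination, in which the percept is a reliability-weighted average of the noisy sensory measurement and an internal prior. It is only an illustration of the general principle: the function name and all numerical values (interval duration, noise levels, prior means) are assumptions chosen for the example, not parameters or methods from the wHiSPER experiments.

```python
import numpy as np

def bayes_percept(stimulus, sensory_sigma, prior_mean, prior_sigma):
    """Combine a Gaussian likelihood with a Gaussian prior.

    The posterior mean (taken here as the percept) is a reliability-weighted
    average of the sensory measurement and the prior mean; the posterior
    spread is narrower than either source alone.
    """
    w_sens = prior_sigma**2 / (prior_sigma**2 + sensory_sigma**2)  # weight on the senses
    percept = w_sens * stimulus + (1.0 - w_sens) * prior_mean
    posterior_sigma = np.sqrt((sensory_sigma**2 * prior_sigma**2) /
                              (sensory_sigma**2 + prior_sigma**2))
    return percept, posterior_sigma

# The same 800 ms interval is judged differently by two observers whose priors
# were formed by different recent stimulus histories (illustrative values only).
stimulus_ms = 800.0
observer_a, _ = bayes_percept(stimulus_ms, sensory_sigma=80.0,
                              prior_mean=600.0, prior_sigma=120.0)
observer_b, _ = bayes_percept(stimulus_ms, sensory_sigma=80.0,
                              prior_mean=1000.0, prior_sigma=120.0)
print(f"Observer A perceives ~{observer_a:.0f} ms, observer B ~{observer_b:.0f} ms")
```

Under these assumed values the identical 800 ms stimulus yields percepts of roughly 740 ms and 860 ms for the two observers, illustrating how different priors alone can produce different percepts of the same physical input.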

Project information
Acronym: wHiSPER
Start date: 01/03/2019
End date: 31/12/2024
Role: Coordinator
Funds: European
People involved: Alessandra Sciutti (COgNiTive Architecture for Collaborative Technologies)
Budget: total budget 1.749.375,00€; total contribution 1.749.375,00€
Link