InStance

H2020 ERC - Starting Grant 2017-2022

Intentional stance for social attunement - InStance

Summary: The InStance project focuses on the question of whether (and under what conditions) people adopt an intentional mindset towards robots, a mindset that is typically adopted towards other humans. An intentional mindset is what the philosopher Daniel Dennett termed the "intentional stance" - predicting and explaining behaviour with reference to the agent's mental states, such as beliefs, desires and intentions. To give an example: when I see a person gazing at a glass of water and extending their arm towards it, I automatically surmise that the person intends to grasp it, because they feel thirsty, believe that water will ease their thirst, and hence want to drink from the glass. The terms "intend", "feel" and "believe" all refer to mental states, and the assumption is that by referring to mental states I can understand and explain someone else's behaviour. For non-intentional systems (such as man-made artefacts), however, we often adopt the design stance - assuming that the system has been designed to behave in a particular way (for example, a car slows down when the brake pedal is pushed not because it intends to be slower, but because it has been designed to slow down when the pedal is pushed).

Adopting either the intentional stance or the design stance is crucial not only for predicting others' behaviour but presumably also for becoming engaged in social interaction. When I adopt the intentional stance, I direct my attention to where somebody is pointing; we thereby establish a joint focus of attention and become socially attuned. By contrast, if I see that a machine's artificial arm is pointing somewhere, I might be unwilling to attend there, as I do not believe that the machine wants to show me something, i.e., there is no intentional communicative content in the gesture. This raises the question: to what extent are humans ready to adopt the intentional stance towards robots with human-like appearance, and to attune socially to them? It might be that once a robot imitates human-like behaviour at the level of subtle (and often implicit) social signals, humans automatically perceive its behaviour as reflecting mental states. This would presumably evoke social cognition mechanisms to the same (or a similar) extent as in human-human interactions, allowing social attunement.

By social attunement we mean the collection of social cognition mechanisms that the brain employs during interactions with others, for example joint attention or visuo-spatial perspective taking. Joint attention is the mechanism through which two or more individuals attend to the same event or object in the environment; it is often established by directing another's attention to where one is attending, for example through gaze direction or a pointing gesture. Visuo-spatial perspective taking is the mechanism that allows us to take someone else's perspective in representing space (for example, I understand that my "right" is "left" for an interaction partner sitting opposite me). In daily interactions with other humans we employ such mechanisms automatically. But would we employ similar mechanisms in interaction with humanoid robots?

The objectives of the InStance project are to understand the various factors that contribute to activating mechanisms of social attunement in interaction with humanoid robots, with a special focus on the intentional stance. Does adopting the intentional stance affect social attunement? What are the conditions for adopting the intentional stance, and for social attunement? The project focuses specifically on subtle characteristics of the robot's behaviour, such as the human-likeness of its movement parameters, mutual gaze, the contingency of its gaze behaviour, and the social intentions carried in communicative signals. In addition, we are interested in how cultural embedding and familiarity with artificial agents affect adoption of the intentional stance and social attunement. We use interactive protocols with the humanoid robot iCub, together with cognitive neuroscience methods and well-controlled experiments: we measure how the human brain responds to the robot's behaviour and whether behavioural and neural markers of social attunement (joint attention, visuo-spatial perspective taking, theory of mind) can be observed in interaction with iCub.

Total budget: €1,499,937.00

Total contribution: €1,499,937.00