Paper out: how people understand the goals of other people’s actions

Observers translate information about other agents' higher-order goals into expectations about their forthcoming action kinematics.

Social perception relies on the ability to understand the higher-order goals that drive other people's behaviour. Under predictive coding views, this ability relies on a Bayesian-like hypothesis-testing mechanism, which translates prior higher-order information about another agent's goals into perceptual predictions of the actions with which these goals can be realised and tests these predictions against the actual behaviour. We tested this hypothesis in three preregistered experiments. Participants viewed an agent's hand next to two possible target objects (e.g., donut, hammer) and heard the agent state a higher-order goal, which could be fulfilled by one of the two objects (e.g., “I'm really hungry!”). The hand then reached towards the objects and disappeared at an unpredictable point mid-motion, and participants reported its last seen location. The results revealed the hypothesized integration of prior goals and observed hand trajectories. Reported hand disappearance points were predictively shifted towards the object with which the goal could be best realised. These biases were stronger when goal statements were explicitly processed (Experiment 1) than when passively heard (Experiment 2), more robust for more ambiguous reaches, and they could not be explained by attentional shifts towards the objects or participants' awareness of the experimental hypotheses. Moreover, similar biases were not elicited (Experiment 3) when the agent's statements referred to the same objects but did not specify them as action goals (e.g., “I'm really not hungry!”). These findings link action understanding to predictive/Bayesian mechanisms of social perception and Theory of Mind and provide the first evidence that prior knowledge about others' higher-level goals cascades to lower-level action expectations, which ultimately influence the visuospatial representation of others' behaviour.

McDonough, K., Parrotta, E., Enwereuzor, C., & Bach, P. (2025). Observers translate information about other agents’ higher-order goals into expectations about their forthcoming action kinematics. Cognition, 1061112. Open Access Version
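
The proposed mechanism lends itself to a simple illustration. In a Gaussian cue-combination scheme, a goal-based prior pulls the perceived disappearance point toward the goal-congruent object, and more strongly the more precise the prior is (as when the goal statement is explicitly processed). The Python sketch below is a toy illustration of this idea, not the paper's model; the locations, variances, and the posterior_mean helper are all invented for the example.

# Toy sketch (not the paper's analysis): reported disappearance points as a
# precision-weighted combination of a goal-based prior and the observed
# hand position. All numbers are illustrative assumptions.

def posterior_mean(prior_mu, prior_var, obs_mu, obs_var):
    """Mean of the posterior from a Gaussian prior and Gaussian likelihood."""
    w_prior = (1 / prior_var) / (1 / prior_var + 1 / obs_var)
    return w_prior * prior_mu + (1 - w_prior) * obs_mu

goal_object = 10.0   # position of the goal-congruent object ("I'm really hungry!" -> donut)
vanish_point = 6.0   # where the hand actually disappeared (same axis, arbitrary units)

# Explicit goal processing (Exp. 1) ~ tighter prior; passive listening (Exp. 2) ~ looser one.
for label, prior_var in [("explicit goal", 4.0), ("passive goal", 16.0)]:
    report = posterior_mean(goal_object, prior_var, vanish_point, obs_var=4.0)
    print(f"{label}: reported {report:.2f}, shifted {report - vanish_point:+.2f} toward the goal")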

Multiple commentaries on our theoretical paper about a new view of motor imagery

Special issue on "Special Issue on the Neurocognitive Mechanisms of Motor Imagery Practice” in Psychological Research

Hommel, B. Exorcizing the homunculus from ideomotor/simulation theory: a commentary on Bach et al. (2022), Frank et al. (2023), and Rieger et al. (2023). Link

O’Shea, H., Bek, J. The complex interplay between perception, cognition, and action: a commentary on Bach et al. (2022). Link

Grosprêtre, S. Motor imagery from brain to muscle: a commentary on Bach et al. (2022). Psychological Research, 88, 1805–1807 (2024). Link

Vannuscorps, G. When does imagery require motor resources? A commentary on Bach et al. (2022). Link

Nalborczyk, L., Longcamp, M., Gajdos, T. et al. Towards formal models of inhibitory mechanisms involved in motor imagery: a commentary on Bach et al. (2022). Link

Our original article is here:

Bach, P., Frank, C., & Kunde, W. (2022). Why motor imagery is not really motoric: Towards a reconceptualization in terms of effect-based action control. Psychological Research. https://doi.org/10.1007/s00426-022-01773-w. Publisher — PDF

Proceedings paper on perspective taking with a robot

More Than Meets the Eye? An Experimental Design to Test Robot Visual Perspective-Taking Facilitators Beyond Mere-Appearance.

Visual Perspective Taking (VPT) underpins human social interaction, from joint action to predicting others’ future actions and mentalizing about their goals and affective/mental states. Substantial progress has been made in developing artificial VPT capabilities in robots. However, as conventional VPT tasks rely on the (non-situated, disembodied) presentation of robots on computer screens, it is unclear how a robot’s socially reactive and goal-directed behaviours prompt people to take its perspective. We provide a novel experimental paradigm that robustly measures the extent to which human interaction partners take a robot’s visual perspective during face-to-face human–robot interactions, by measuring how much a robot’s visual perspective is spontaneously integrated with one’s own. The experimental task design of our upcoming user study allows us to investigate the role of robot features beyond its human-like appearance, which have driven research so far, targeting instead its socially reactive behaviour and task engagement with the human interaction partner.

Currie, J., Mcdonough, K. L., Wykowska, A., Giannaccini, M. E., & Bach, P. (2024). More Than Meets the Eye? An Experimental Design to Test Robot Visual Perspective-Taking Facilitators Beyond Mere-Appearance. In 2024 ACM/IEEE International Conference on Human-Robot Interaction (pp. 359-363). Open Access Version

Paper out on a new interoceptive illusion

Heart is deceitful above all things: Threat expectancy induces the illusory perception of increased heart rate

It has been suggested that our perception of the internal milieu, or the body's internal state, is shaped by our beliefs and prior knowledge about the body's expected state, rather than being based solely on actual interoceptive experiences. This study investigated whether heartbeat perception could be illusorily distorted towards prior subjective beliefs, such that threat expectations suffice to induce a misperception of heartbeat frequency. Participants were instructed to focus on their cardiac activity and report their heartbeat, either by tapping along to it (Experiment 1) or by silently counting it (Experiment 2), while ECG was recorded. While they completed this task, different cues provided valid predictive information about the intensity of an upcoming cutaneous stimulation (high- vs. low-pain). Results showed that participants expected a heart-rate increase during the anticipation of high- vs. low-pain stimuli and that this belief was perceptually instantiated, as suggested by their interoceptive reports. Importantly, the perceived increase was not mirrored by the real heart rate. Perceptual modulations were absent when participants executed the same task with an exteroceptive stimulus (Experiment 3). The findings reveal, for the first time, an interoceptive illusion of increased heartbeats elicited by threat expectancy and shed new light on interoceptive processes through the lens of Bayesian predictive processing, providing tantalizing insights into how such illusory phenomena may intersect with the recognition and regulation of people's internal states.

Parrotta, E., Bach, P., Perrucci, M. G., Costantini, M., & Ferri, F. (2024). The Heart Is Deceitful Above All Things: illusory perception of heartbeat is induced by pain expectation. Cognition, 245, 105719. Open Access Version
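
For readers unfamiliar with heartbeat-counting tasks, the standard accuracy score (Schandry, 1981) compares counted with ECG-recorded beats; a threat-induced over-report shows up as a drop in accuracy without any change in the recorded heart rate. A minimal sketch of this scoring, with invented numbers and without claiming it is the paper's exact analysis:

# Standard heartbeat-counting accuracy: 1 = perfect, lower = larger mismatch
# between felt and recorded beats. All values below are made up.

def counting_accuracy(recorded_beats, counted_beats):
    return 1 - abs(recorded_beats - counted_beats) / recorded_beats

recorded = 70  # ECG-recorded beats in the counting window (unchanged by the cue)
counted = {"low-pain cue": 69, "high-pain cue": 78}  # hypothetical reports

for cue, n in counted.items():
    print(f"{cue}: counted {n} vs recorded {recorded}, accuracy {counting_accuracy(recorded, n):.2f}")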

Paper out on multisensory integration when observing actions of robot agents

Sonic Sleight of Hand: Sound Induces Illusory Distortions in the Perception and Prediction of Robot Action

For efficient human–robot interaction, human operators need to be able to efficiently represent the robot’s movements in space and predict its next steps. However, according to frameworks of Bayesian multisensory integration, features outside the motion itself—like the sounds a robot makes while it moves—should affect how otherwise identical motions are perceived. Here, we translate an established psychophysical task from experimental psychology to a human–robot interaction context, which can measure these distortions of motion perception. In two series of preregistered studies, participants watched a humanoid robot make forward and backward reaching movements. When the robot hand suddenly disappeared, they reported its last seen location, either with the mouse cursor (Experiments 1a and 1b) or by matching it to probe stimuli in different locations (Experiments 2a and 2b). The results revealed that even small changes to the robot’s sound robustly affect participants’ visuospatial representation of its motions, so that the motion appeared to extend further in space when accompanied by slightly longer sounds (100 ms longer) than by slightly shorter ones (100 ms shorter). Moreover, these sound changes affect not only where people currently locate the robot’s motion, but also where they anticipate its future steps. These findings show that sound design is an effective medium for manipulating how people represent otherwise identical robot actions and coordinate their interactions with it. The study acts as proof of concept that psychophysical tasks provide a promising tool for measuring how design parameters influence the perception and prediction of robot motion.

Currie, J., Giannaccini, M. E., & Bach, P. (2024). Sonic Sleight of Hand: Sound Induces Illusory Distortions in the Perception and Prediction of Robot Action. International Journal of Social Robotics, 1-19. Publisher — PDF
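
The dependent measure in such disappearance-point tasks is typically the signed displacement of the reported location relative to the true vanish point, taken along the motion direction. A minimal sketch of this scoring, with invented trial data (the actual study's coordinate system and analysis pipeline may differ):

import statistics

def forward_displacement(reported_x, true_x, direction):
    """Positive = reported ahead of the true vanish point (motion perceptually extended)."""
    return (reported_x - true_x) * direction  # direction: +1 forward reach, -1 backward

trials = [  # (reported_x, true_x, direction, sound condition) -- hypothetical data
    (105.0, 100.0, +1, "long"), (103.5, 100.0, +1, "long"),
    (101.0, 100.0, +1, "short"), (99.0, 100.0, +1, "short"),
]

for sound in ("long", "short"):
    d = [forward_displacement(r, t, s) for r, t, s, cond in trials if cond == sound]
    print(f"{sound} sound: mean displacement {statistics.mean(d):+.2f}")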

Two more preprints

Parrotta, E., Bach, P., Pezzulo, G., Costantini, M., & Ferri, F. (2023). Exposure to false cardiac feedback alters pain perception and anticipatory cardiac frequency. eLife, 12:RP90013. Preprint

Parrotta, E., McDonough, K., Enwereuzor, C., & Bach, P. (2023). Observers translate information about other agents’ higher-order goals into expectations about their forthcoming action kinematics. OSF. Preprint

Four new lab preprints

Currie, J., Giannaccini, M. E., & Bach, P. (2023). Consequential sound induces illusory distortions in the perception and prediction of robot motion. OSF Preprints. Preprint.

Parrotta, E., McDonough, K., & Bach, P. (2023). Imagery as predicted perception: imagery predictively biases perceptual judgments of action kinematics. PsyArXiv. Preprint.

Parrotta, E., Bach, P., Perrucci, M. G., Costantini, M., & Ferri, F. (2022). Heart Is Deceitful Above All Things: illusory perception of heartbeat is induced by pain expectation. bioRxiv, 2022-09. Preprint.

Pulling, V., Phillips, L.H., Bach, P., Newlands, A. & Jackson, M. (2023). Using Predictions to Resolve Emotional Ambiguity: Facial Expression Intensity Influences the Reliance on Prior Expectation. PsyArXiv, DOI: 10.31234/osf.io/f75cv. Preprint.

Our pre-registered replication of our prior paper is out

Expectations of efficient actions bias social perception: a pre-registered online replication

Humans take a teleological stance when observing others' actions, interpreting them as intentional and goal-directed. In predictive processing accounts of social perception, this teleological stance would be mediated by a perceptual prediction of an ideal, energy-efficient reference trajectory with which a rational actor would achieve their goals within the current environmental constraints. Hudson and colleagues (2018, Proc. R. Soc. B, 285, 20180638; doi:10.1098/rspb.2018.0638) tested this hypothesis in a series of experiments in which participants reported the perceived disappearance points of hands reaching for objects. They found that these judgements were biased towards the expected efficient reference trajectories. Observed straight reaches were reported as higher when an obstacle needed to be overcome than when the path was clear. By contrast, unnecessarily high reaches over empty space were perceptually flattened. Moreover, these perceptual biases increased the more the environmental constraints and expected action trajectories were explicitly processed. These findings provide an important advance in our understanding of the mechanisms underlying social perception. The current replication tests the robustness of these findings and whether they hold up in an online setting.

McDonough, K., & Bach, P. (2023). Expectations of efficient actions bias social perception: a preregistered online replication. Royal Society Open Science, 10(2). Publisher — PDF — Data

A Twitter thread on the paper can be found here.
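
The "energy-efficient reference trajectory" at the heart of this account is often formalised as a minimum-jerk movement, the classic model of smooth, efficient reaching; whether the original studies used exactly this model is an assumption here. A short sketch of the minimum-jerk position profile, with a crude two-segment arch standing in for a reach over an obstacle:

def minimum_jerk(x0, x1, t):
    """Position at normalised time t in [0, 1] under a minimum-jerk profile."""
    tau = min(max(t, 0.0), 1.0)
    return x0 + (x1 - x0) * (10 * tau**3 - 15 * tau**4 + 6 * tau**5)

# Reference height of an efficient reach arching 15 cm over an obstacle and back:
# rises to clear the obstacle mid-reach, then returns to table height.
for t in (0.0, 0.25, 0.5, 0.75, 1.0):
    up = minimum_jerk(0.0, 15.0, t * 2) if t <= 0.5 else minimum_jerk(15.0, 0.0, (t - 0.5) * 2)
    print(f"t={t:.2f}: height {up:4.1f} cm")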

Our theoretical paper on the mechanisms of motor imagery is out

Why motor imagery isn’t really motoric: Towards a reconceptualization in terms of effect-based action control.

Overt and imagined action seem inextricably linked. Both have similar timing, activate shared brain circuits, and motor imagery influences overt action and vice versa. Motor imagery is, therefore, often assumed to recruit the same motor processes that govern action execution, and which allow one to play through or simulate actions offline. Here, we advance a very different conceptualization. Accordingly, the links between imagery and overt action do not arise because action imagery is intrinsically motoric, but because action planning is intrinsically imaginistic and occurs in terms of the perceptual effects one wants to achieve. Seen like this, the term ‘motor imagery’ is a misnomer of what is more appropriately portrayed as ‘effect imagery’. In this article, we review the long-standing arguments for effect-based accounts of action, which are often ignored in motor imagery research. We show that such views provide a straightforward account of motor imagery. We review the evidence for imagery-execution overlaps through this new lens and argue that they indeed emerge because every action we execute is planned, initiated and controlled through an imagery-like process. We highlight findings that this new view can now explain and point out open questions.

Bach, P., Frank, C., & Kunde, W. (2022). Why motor imagery is not really motoric: Towards a reconceptualization in terms of effect-based action control. Psychological Research. https://doi.org/10.1007/s00426-022-01773-w. Publisher — PDF

Special issue on predictive processing

Predictive Mechanisms in Action, Perception, Cognition, and Clinical Disorders

Patric, Anila D’Mello, Phil Corlett and Liron Rozenkrantz organized a special issue on predictive processing. It is now published. Check out the eight articles here — see below for our editorial.

D'Mello, A. M., Bach, P., Corlett, P. R., & Rozenkrantz, L. (2022). Predictive mechanisms in action, perception, cognition, and clinical disorders. Frontiers in Human Neuroscience, 598. Publisher

Preprint on the links between motor imagery and action planning

Why motor imagery isn’t really motoric: Towards a reconceptualization in terms of effect-based action control.

Overt and imagined action seem inextricably linked. Both follow similar timings, activate shared brain circuits, and motor imagery influences overt action and vice versa. Motor imagery is therefore often assumed to rely on the motor processes governing action execution itself, which allow one to play through or simulate actions offline. Here, we advance a very different conceptualization. In this view, the links between imagery and overt action do not arise because action imagery is intrinsically motoric, but because action planning is intrinsically imaginistic and occurs in terms of the perceptual effects we want to achieve. Viewed like this, the term 'motor imagery' is a misnomer of what is more appropriately portrayed as 'effect imagery'. In this article, we review the evidence for imagery-execution overlaps through this new lens and argue that they indeed emerge because every action we execute is planned, initiated and controlled through an imagery-like process. We highlight findings that this new view can now explain and point out open questions.

Bach, P., Frank, C., & Kunde, W. (2021, October 23). Why motor imagery isn’t really motoric: Towards a reconceptualization in terms of effect-based action control. PsyArXiv.

Special issue on action affordances

Behavioral and Neural Bases of Object Affordance Processing and Its Clinical Implications.

Patric, Sanjay Kumar and Dimitrios Kourtis organized a special issue on affordance processing. It is now published. Check out the eight articles here — see below for our editorial.

Kumar, S., Bach, P., & Kourtis, D. (2021). Editorial to the Special Issue: Behavioral and Neural Bases of Object Affordance Processing and its Clinical Implications. Frontiers in Human Neuroscience. Publisher

We're recruiting a postdoc


Come work with us in Aberdeen!

We are currently recruiting for a 42-month postdoc position, to work on the Leverhulme Trust funded project “Social perception as Bayesian Hypothesis Testing and Revision”. The deadline for applications is the 27th of April.

The project investigates how predictions help people make sense of the behaviour of others, and which neuro-cognitive mechanisms underlie these abilities. The work will be led by Patric Bach in Aberdeen, in collaboration with Elsa Fouragnan and Giorgio Ganis in Plymouth and Paul Downing in Bangor. The project will run for 42 months.

Dr. Katrina McDonough already works as a postdoctoral researcher on the grant and leads the behavioral research stream. We are looking to recruit a second full-time postdoctoral researcher with expertise in neuroimaging methods (EEG and/or fMRI) and good programming skills. If interested, please email Patric Bach and have a look at the role description and project description.

Our paper on predictive social perception in autism is out


Predictive action perception from explicit intention information in autism


Social difficulties in autism spectrum disorder (ASD) may originate from a reduced top-down modulation of sensory information that prevents the spontaneous attribution of intentions to observed behaviour. However, although people with autism are able to explicitly reason about others’ mental states, the effect of abstract intention information on perceptual processes has remained untested. ASD participants (n = 23) and a neurotypical (NT) control group (n = 23) observed a hand either reaching for an object or withdrawing from it. Prior to action onset, the participant either instructed the actor to “Take it” or “Leave it”, or heard the actor state “I’ll take it” or “I’ll leave it”, which provided an explicit intention that was equally likely to be congruent or incongruent with the subsequent action. The hand disappeared before completion of the action, and participants reported the last seen position of the tip of the index finger by touching the screen. NT participants exhibited a predictive bias in response to action direction (reaches perceived nearer the object, withdrawals perceived farther away), and in response to prior knowledge of the actor’s intentions (nearer the object after “Take it”, farther away after “Leave it”). However, ASD participants exhibited a predictive perceptual bias only in response to the explicit intentions, but not in response to the motion of the action itself. Perception in ASD is not immune from top-down modulation. However, the information must be explicitly presented independently from the stimulus itself, and not inferred from cues inherent in the stimulus.

Hudson, M., Nicholson, T., Kharko, A., McKenzie, R., & Bach, P. (2021). Predictive Action Perception from Explicit Intention Information in Autism. Psychonomic Bulletin and Review. Publisher — PDF — Data

New paper on perspective taking


Is Implicit Level-2 Visual perspective taking embodied? Perceptual simulation of others’ perspectives is not impaired by motor restriction.

Embodied accounts of visual perspective taking suggest that judgements from another person’s perspective are less effortful if one’s own body position aligns with that of the other person, indicating a causal role of posture in visual perspective taking. Using our adapted mental rotation paradigm, we tested whether movement has a causal role in perspective taking by restricting participants’ movement in half of the experimental trials. We show, using our previously validated task, that the perceptual representation of another’s visual perspective is not influenced by participants’ ability to move. These data therefore rule out active physical movement as a causal explanation of visual perspective taking and instead suggest that the postural readjustments seen when making judgements from another’s perspective are a bodily consequence of mentally transforming one’s actual position in space to the other’s imagined position.

Ward, E., Bach, P., McDonough, K., & Ganis, G. (2022). Is Implicit Level-2 Visual perspective taking embodied? Perceptual simulation of others’ perspectives is not impaired by motor restriction. Quarterly Journal of Experimental Psychology. PDFPreprintData

New paper out, with Kim Schenke, Natalie Wyer, and Steve Tipper


Predictive person models elicit motor biases: The face-inhibition effect revisited

Using an established paradigm, we tested whether people derive motoric predictions about an actor’s forthcoming actions from prior knowledge about them and the context in which they are seen. In two experiments, participants identified famous tennis and soccer players using either hand or foot responses. Athletes were shown either carrying out or not carrying out their associated actions (swinging, kicking), either in the context where these actions are typically seen (tennis court, soccer pitch) or outside these contexts (beach, awards ceremony). Replicating prior work, identifying non-acting athletes revealed negative compatibility effects: viewing tennis players led to faster responses with a foot than a hand, and vice versa for viewing soccer players. Consistent with the idea that negative compatibility effects result from the absence of a predicted action, these effects were eliminated (or reversed) when the athletes were seen carrying out actions typically associated with them. Strikingly, however, these motoric biases were not limited to In-Context trials but were, if anything, more robust in the Out-of-Context trials. This pattern held even when attention was drawn specifically to the context (Experiment 2). These results confirm that people hold motoric knowledge about the actions that others typically carry out and that these actions are part of the perceptual representations that are accessed when those others are re-encountered, possibly in order to resolve uncertainty in person perception.


Schenke, K. C., Wyer, N., Tipper, S., & Bach, P. (2020). Predictive person models elicit motor biases: the face-inhibition effect revisited. Quarterly Journal of Experimental Psychology, 74(1), 54-67. Publisher — PDF — Data
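
The effect being measured here is scored from response times as RT(incompatible) minus RT(compatible); a negative value marks the face-inhibition (negative compatibility) pattern. A minimal sketch of this scoring with invented RTs, not the study's data or analysis code:

import statistics

trials = [  # (athlete type, response effector, RT in ms) -- hypothetical data
    ("tennis", "hand", 620), ("tennis", "foot", 585),
    ("soccer", "foot", 640), ("soccer", "hand", 590),
]
compatible = {"tennis": "hand", "soccer": "foot"}  # effector of the athlete's own action

def mean_rt(is_compatible):
    return statistics.mean(rt for athlete, effector, rt in trials
                           if (compatible[athlete] == effector) == is_compatible)

effect = mean_rt(False) - mean_rt(True)  # negative -> negative compatibility effect
print(f"compatibility effect: {effect:+.1f} ms")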

The third paper from Katrina's PhD is out, published in JEP:HPP


Affordance Matching Predictively Shapes the Perceptual Representation of Others’ Ongoing Actions

Predictive processing accounts of social perception argue that action observation is a predictive process, in which inferences about others’ goals are tested against the perceptual input, inducing a subtle perceptual confirmation bias that distorts observed action kinematics toward the inferred goals. Here we test whether such biases are induced even when goals are not explicitly given but have to be derived from the unfolding action kinematics. In 2 experiments, participants briefly saw an actor reach ambiguously toward a large object and a small object, with either a whole-hand power grip or an index-finger and thumb precision grip. During its course, the hand suddenly disappeared, and participants reported its last seen position on a touch-screen. As predicted, judgments were consistently biased toward apparent action targets, such that power grips were perceived closer to large objects and precision grips closer to small objects, even if the reach kinematics were identical. Strikingly, these biases were independent of participants’ explicit goal judgments. They were of equal size when action goals had to be explicitly derived in each trial (Experiment 1) or not (Experiment 2) and, across trials and across participants, explicit judgments and perceptual biases were uncorrelated. This provides evidence, for the first time, that people make online adjustments of observed actions based on the match between hand grip and object goals, distorting their perceptual representation toward implied goals. These distortions may not reflect high-level goal assumptions, but emerge from relatively low-level processing of kinematic features within the perceptual system.


McDonough, K.L., Costantini, M., Hudson, M., Ward, E., & Bach, P. (2020). Affordance matching predictively shapes the perceptual representation of others’ ongoing actions. Journal of Experimental Psychology: Human Perception and Performance. Publisher — PDF — Data

Grant from the Leverhulme Trust!


We are grateful to the Leverhulme Trust for awarding us £462,995 to investigate how predictions help people make sense of the behaviour of others, and which neuro-cognitive mechanisms underlie these abilities. The work will be led by Patric Bach in Aberdeen, in collaboration with Elsa Fouragnan and Giorgio Ganis in Plymouth and Paul Downing in Bangor.

The project will run from May 2020 to December 2023. Please see here for a project overview.

Dr. Katrina McDonough will work as a postdoctoral researcher and lead the behavioral research stream. We are looking to recruit a second full-time postdoctoral researcher with expertise in neuroimaging methods (EEG and/or fMRI) and good programming skills.

While recruitment is currently on hold until the COVID-19 situation is clearer, please email Patric Bach (patric.bach@abdn.ac.uk) if you are interested in the position.