
Real-world social cognition requires processing and adapting to multiple dynamic information streams. Interpreting neural activity under such ecological conditions remains a key challenge for neuroscience. This study leverages advances in denoising techniques and multivariate modeling to extract interpretable EEG signals from pairs of participants (male-male, female-female, and male-female) engaged in spontaneous dyadic dance. Using multivariate temporal response functions (mTRFs), we investigated how music acoustics, self-generated kinematics, other-generated kinematics, and social coordination each uniquely contributed to EEG activity. Electromyogram recordings from ocular, face, and neck muscles were also modeled to control for artifacts. The mTRFs effectively disentangled neural signals associated with four processes: (I) auditory tracking of music, (II) control of self-generated movements, (III) visual monitoring of partner movements, and (IV) visual tracking of social coordination. We show that the first three neural signals are driven by event-related potentials: the P50-N100-P200 complex triggered by acoustic events, central lateralized movement-related cortical potentials triggered by movement initiation, and the occipital N170 triggered by movement observation. Notably, the previously unknown neural marker of social coordination encodes the spatiotemporal alignment between dancers, surpassing the encoding of self- or partner-related kinematics taken alone. This marker emerges when partners can see each other, exhibits a topographical distribution over occipital areas, and is driven specifically by movement observation rather than initiation. Using data-driven kinematic decomposition, we further show that vertical bounce movements best drive observers' EEG activity. These findings highlight the potential of real-world neuroimaging, combined with multivariate modeling, to uncover the mechanisms underlying complex yet natural social behaviors.
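For readers unfamiliar with mTRFs, the sketch below illustrates the core estimation step: EEG at each time point is modeled as a weighted sum of time-lagged stimulus features, with weights fit by ridge regression. This is a minimal, self-contained illustration on synthetic single-channel data, not the authors' analysis pipeline (which would typically rely on dedicated tools such as the mTRF-Toolbox); all function and variable names here are hypothetical.

```python
import numpy as np

def lagged_design(x, lags):
    """Build a design matrix of time-lagged copies of the stimulus features.

    x:    (n_times, n_features) stimulus array
    lags: iterable of integer sample lags (positive = stimulus precedes EEG)
    """
    n_times, n_feats = x.shape
    X = np.zeros((n_times, n_feats * len(lags)))
    for i, lag in enumerate(lags):
        shifted = np.roll(x, lag, axis=0)
        if lag > 0:
            shifted[:lag] = 0       # zero out samples rolled in from the end
        elif lag < 0:
            shifted[lag:] = 0       # zero out samples rolled in from the start
        X[:, i * n_feats:(i + 1) * n_feats] = shifted
    return X

def fit_mtrf(x, y, lags, lam=1.0):
    """Ridge-regression TRF estimate: w = (X'X + lam*I)^-1 X'y."""
    X = lagged_design(x, lags)
    XtX = X.T @ X
    w = np.linalg.solve(XtX + lam * np.eye(XtX.shape[0]), X.T @ y)
    return w.reshape(len(lags), x.shape[1])  # (n_lags, n_features)

# Toy example: one EEG channel responding to a univariate stimulus envelope.
rng = np.random.default_rng(0)
stim = rng.standard_normal((5000, 1))
true_kernel = np.array([0.0, 0.5, 1.0, 0.3, -0.2])   # response over 5 lags
eeg = np.convolve(stim[:, 0], true_kernel, mode="full")[:5000]
eeg += 0.5 * rng.standard_normal(5000)               # additive noise

w = fit_mtrf(stim, eeg, lags=range(5), lam=10.0)
print(np.round(w.ravel(), 2))  # approximately recovers true_kernel
```

In the study's multivariate setting, the stimulus array would concatenate acoustic, self-kinematic, partner-kinematic, and coordination features, so that each regressor's unique contribution to the EEG can be assessed.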
Significance statement
Real-world brain function involves integrating multiple information streams simultaneously. However, for lack of suitable computational methods, laboratory-based neuroscience often examines neural processes in isolation. Using multivariate modeling of EEG data from pairs of participants freely dancing to music, we demonstrate that it is possible to tease apart physiologically established neural processes associated with music perception, motor control, and observation of a partner's movements. Crucially, we identify a previously unknown neural marker of social coordination that encodes the spatiotemporal alignment between dancers, beyond self- or partner-related kinematics alone. These findings highlight the potential of computational neuroscience to uncover the biological mechanisms underlying real-world social and motor behaviors, advancing our understanding of how the brain supports dynamic and interactive activities.