Social, Cognitive and Affective Neuroscience
Research Vision
The Centre for Social, Cognitive and Affective Neuroscience (cSCAN) is an interdisciplinary UofG Research Centre that brings together complementary expertise in human behavioural science, cognitive psychology, and computer science to understand the fundamental mechanisms of human social interaction, communication, perception, and cognition, including testing their validity in explicit models of human–digital agent interaction. To do so, our research teams collectively address key questions via a fully integrated hypothesize-formalize-validate loop:
Hypothesize-Formalize
- Social Interaction. How do humans communicate using coordinated, dynamic multimodal signals (face, body, voice, language) to navigate their social environments across different cultural, social, and physical contexts?
- Social Perception. How does the brain process these complex signals for perception, decision-making, and action?
- Social Cognition. How do inter/intrapersonal factors (culture, language, emotion, bias, disease, dysfunction) affect these processes?
Validate
- Social Robotics/AI. How can we formalize and validate fundamental knowledge as “digital twins” (socially interactive virtual humans/social robots) in real-world settings, thereby closing the hypothesize-formalize-validate loop?
Unique selling points (USPs). Our research is driven by a unique combination of state-of-the-art technologies that enable the real-time tracking, modelling, generation, and transformation of multimodal human social signals, plus high-powered data-driven methods and statistical frameworks that can derive causal explanations of human social interaction. Our research environment reflects our deep appreciation of the importance of interdisciplinarity, teamwork, collaboration, and ambition.
Team
Prof Rachael Jack
Professor of Computational Social Cognition
Head of cSCAN
Prof Stacy Marsella
Professor of Interdisciplinary Study of Social Interactions
Wolfson Chair of Excellence
Prof Hui Yu
Dr Pablo Arias Sarah
Dr Chen Zhou
Select Grants
- 2023–2027, EU Horizon MSCA Staff Exchanges, Affective Computing Models: from Facial Expression to Mind-Reading, YU, €1.65M total (€825k to Yu, UK PI)
- 2022–2024, Marie Skłodowska-Curie Individual Fellowship, Studying Social Interactions with Audiovisual Transformations, ARIAS, £190k
- 2021–2024, Leverhulme Early Career Fellowship, Equipping Artificial Agents with Psychologically-Derived Dynamic Facial Expressions, CHEN, £180k
- 2020–2023, Engineering and Physical Sciences Research Council, From Data and Theory to Computational Models of More Effective Virtual Human Gestures, MARSELLA, £435k
- 2019–2027, UKRI Centre for Doctoral Training, Socially Intelligent Artificial Agents (SOCIAL AI), MARSELLA + colleagues, £4.9M
- 2018–2024, European Research Council Starting Grant, Computing the Face Syntax of Social Communication, JACK, £1.9M
- 2018–2023, Marie Curie Innovative Training Network, ENTWINE: The European Network on Informal Care, CROSS + LABAN, €4M
- 2018–2022, Philip Leverhulme Prize in Psychology, Using Art & Neuroscience to Inform Next Generation AI, CROSS, £100k
- 2016–2023, European Research Council Starting Grant, Mechanisms and Consequences of Attributing Socialness to Artificial Agents, CROSS, €1.8M
Selected Publications
- Arias-Sarah, P., Bedoya, D., Daube, C., Aucouturier, J. J., Hall, L., & Johansson, P. (2024). Aligning the smiles of dating dyads causally increases attraction. Proceedings of the National Academy of Sciences, 121(45), e2400369121
- Raviv, L., Jacobson, S. L., Plotnik, J. M., Bowman, J., Lynch, V., & Benítez-Burraco, A. (2023). Elephants as an animal model for self-domestication. Proceedings of the National Academy of Sciences, 120(15), e2208607120. https://doi.org/10.1073/pnas.2208607120
- Snoek, L., Jack, R. E., Schyns, P. G., Garrod, O. G. B., Mittenbühler, M., Chen, C., Oosterwijk, S., & Scholte, H. S. (2023). Testing, explaining, and exploring models of facial expressions of emotions. Science Advances, 9(6), eabq8421. https://doi.org/10.1126/sciadv.abq8421
- Raviv, L., Lupyan, G., & Green, S. C. (2022). How variability shapes learning and generalization. Trends in Cognitive Sciences. https://doi.org/10.1016/j.tics.2022.03.007
- Liu, M., Duan, Y., Ince, R. A. A., Chen, C., Garrod, O. G. B., Schyns, P. G., & Jack, R. E. (2022). Facial expressions elicit multiplexed perceptions of emotion categories and dimensions. Current Biology, 32(1), 200–209. https://doi.org/10.1016/j.cub.2021.10.035
- Cross, E. S., & Ramsey, R. (2021). Mind Meets Machine: Towards a Cognitive Science of Human-Machine Interactions. Trends in Cognitive Sciences, 25(3), 200–212. https://doi.org/10.1016/j.tics.2020.11.009
- Barrett, L. F., Adolphs, R., Marsella, S., Martinez, A. M., & Pollak, S. D. (2019). Emotional expressions reconsidered: Challenges to inferring emotion from human facial movements. Psychological Science in the Public Interest, 20(1), 1–68. https://doi.org/10.1177/1529100619832930
Technology
cSCAN facilities include a wide range of state-of-the-art equipment:
- Head-mounted eye-tracker with real-world tracking
- Vicon-based motion capture lab
- Qualisys motion capture lab
- Xsens motion capture suit for portable motion capture
- VR labs with Vive Pro Eye and Oculus Quest headsets
- Character animation system for fully articulated, interactive characters
- DI3D face capture system with software to analyse images
- DI4D stereo photogrammetry and facial motion capture system
- Real-time 3D facial animation and rendering system
- 3D facial identity database and synthetic facial identity generation system
- Suite of humanoid and non-humanoid social robots
Research Partnerships
Joint East China Normal University (ECNU)-UofG Centre for Intelligent Agent Research
Aims to advance interdisciplinary research at the cutting edge of Psychology and AI.
Directors: Prof. Shuguang Kuai (ECNU) and Prof. Rachael Jack (UofG)
Multimodal Social Interaction Research (MOSAIC)
International monthly online event to discuss grand challenges in multimodal social interaction research
Organizers: Prof. Rachael Jack, Dr Pablo Arias Sarah, UofG; Dr Limor Raviv, Dr Anita Slominska, Max Planck Institute (MPI) for Psycholinguistics, Netherlands
Illusion of the Fortnight
International online event for a broad audience, showcasing illusions (any species, any sensory channel)
Organizers: Prof. Rachael Jack and Yuening Yan, cSCAN, UofG; Prof Fiona Macpherson, Dr Derek Brown and Tammy-Ann Husselman, Centre for the Study of Perceptual Experience (CSPE), Philosophy, UofG