Comparing Affective Feedback Styles for Social Robots using Augmented Reality Prototyping
Supervisor: Dr Shaun Macdonald
School: Computing Science
Description:
The need for people to communicate effectively with robots is an imminent and crucial societal challenge. The companion social robot market in healthcare is valued at $3B (20% CAGR), while the rise of Industry 4.0 has led to the widespread adoption of robots and drones across sectors, from automotive assembly to warehouse management and even the service industry. As emotional expression and interpretation are fundamental to human interaction, affective feedback will be a core aspect of enhancing robot-user communication. Robots can use affective feedback to enrich this communication in many ways, such as indicating a robot's internal state, facilitating natural interaction, and providing empathetic responses.
Despite its potential, there is no clear consensus on how best to implement emotional feedback in robotics. For example, should robots use affective sound in the form of abstract tones, music, words, or animal-like sounds? Should their facial expressions be emoji-like, realistic, human-like, animal-like, or stylized in other ways? The practical constraints of robot development have limited the ability of prior work to comparatively evaluate such alternatives, with most prototypes adopting a single expressive approach per modality, leaving the design space unclear.
This project will address this gap by systematically evaluating different types of affective feedback in a social robotics scenario, from emotive audio cues to facial expressions and emojis. To enable this broad comparative evaluation without the practical limitations of robot development and modification, the student will use our state-of-the-art Augmented Reality (AR) robot prototyping tool, which projects different audio and visuals onto an existing social robot in real time, to develop and evaluate affective cues. This will provide critical design insights for emotionally intelligent robots, from care companions and household assistants to industrial co-workers.
Project Timeline:
- Weeks 1-2: Catalogue prior implementations of robot affective feedback, drawing on supervisor expertise.
- Weeks 3-5: Design and deploy a range of alternative styles of audiovisual affective cues onto a simple social robot using the AR prototyping framework.
- Weeks 6-7: Conduct a comparative user study in which participants interact with the robot under different AR-delivered affective feedback conditions.
- Weeks 8-10: Analyse results and contribute findings to an academic manuscript.
This study will help shape the future of human-robot interaction by informing the design of expressive, emotionally aware robots.