Recognising the Deafblind Manual Fingerspelling Alphabet using Optical Sensors
Supervisor: Dr Euan Freeman
School: Computing Science
Description:
Almost 500,000 people in the UK have dual sensory loss (i.e., loss of both vision and hearing), a number that will increase as the population ages. People with dual sensory loss often encounter difficulty communicating with others, leading to loss of independence, increased feelings of isolation, and negative effects on mental health and wellbeing. Touch can be used to communicate with people with dual sensory loss, e.g., through tactile alphabets like Deafblind Manual. Deafblind Manual is an adapted version of the British Sign Language fingerspelling alphabet, in which the individual letters of words are spelled out directly onto the hand of another person. Because it relies on touch rather than sight or hearing, Deafblind Manual can be an effective means of communication for people with dual sensory loss.
Academics in the School of Computing Science are beginning to investigate interactive systems that use the Deafblind Manual fingerspelling alphabet for communication, using hand-tracking sensors to recognise symbols in the alphabet and a mid-air haptics device to present tactile sensations to the hand. Such a system could enhance social connectedness by enabling people with dual sensory loss to communicate with others remotely, without needing to be in the same place. It could improve access to information and services by providing a tactile representation of digital information. It could also help people independently learn to communicate using Deafblind Manual, helping them integrate into the deafblind community and enhancing their ability to communicate with others with dual sensory loss.
The aim of this internship is to develop an approach for recognising Deafblind Manual symbols using optical hand-tracking sensors. There are novel challenges to overcome in recognising these symbols: (i) two hands must be tracked independently, (ii) the signing hand partially occludes the receiving hand (i.e., its fingers visually mask part of that hand), and (iii) variations in hand size and signing style need to be accounted for. This project will establish a dataset of features extracted from Deafblind Manual symbols and explore the feasibility of using machine learning to recognise these symbols.
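As a rough illustration of what such features might look like, the sketch below normalises tracked joint positions relative to the palm and a per-hand scale factor, giving some invariance to hand size (challenge iii), and concatenates features from both hands. The joint layout, array shapes, and function names are assumptions for illustration, not a prescribed design:

    import numpy as np

    def hand_features(joints, palm):
        # joints: (N, 3) array of tracked joint positions in sensor
        # coordinates; palm: (3,) palm centre. The exact joint set depends
        # on the sensor, so this layout is an assumption.
        rel = np.asarray(joints) - np.asarray(palm)   # palm-relative positions
        scale = np.linalg.norm(rel, axis=1).max()     # crude per-hand size estimate
        return (rel / max(scale, 1e-6)).flatten()     # normalise away hand size

    def pair_features(sign_joints, sign_palm, recv_joints, recv_palm):
        # Deafblind Manual involves both hands, so combine per-hand features
        # with the direction from the receiving palm to the signing palm.
        offset = np.asarray(sign_palm) - np.asarray(recv_palm)
        offset = offset / max(np.linalg.norm(offset), 1e-6)
        return np.concatenate([
            hand_features(sign_joints, sign_palm),
            hand_features(recv_joints, recv_palm),
            offset,
        ])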
Project outline:
• Learn about development using the Leap Motion and Intel RealSense sensors (2 weeks)
• Implement a hand-tracking feature set for data collection (2 weeks)
• Collect a dataset of people performing Deafblind Manual symbols (1-2 weeks; see the capture sketch after this list)
• Implement a deep learning classifier that recognises Deafblind Manual symbols, trained on the collected dataset (3-4 weeks; see the classifier sketch after this list)
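To give a flavour of the data collection step, the sketch below polls a Leap Motion controller for palm and fingertip positions. It assumes the legacy Leap Motion SDK v2 Python bindings (the current Ultraleap tracking API exposes a different interface), and the sampling rate and record format are illustrative only:

    import time
    import Leap  # legacy Leap Motion SDK v2 Python bindings (assumption)

    def capture_samples(duration_s=2.0, rate_hz=30):
        # Poll the controller and log palm and fingertip positions for each
        # visible hand.
        controller = Leap.Controller()
        samples = []
        end = time.time() + duration_s
        while time.time() < end:
            frame = controller.frame()
            for hand in frame.hands:
                tips = [finger.bone(Leap.Bone.TYPE_DISTAL).next_joint
                        for finger in hand.fingers]
                samples.append({
                    "is_left": hand.is_left,
                    "palm": (hand.palm_position.x, hand.palm_position.y,
                             hand.palm_position.z),
                    "fingertips": [(t.x, t.y, t.z) for t in tips],
                })
            time.sleep(1.0 / rate_hz)
        return samples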
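For the final step, one simple starting point is a small multilayer perceptron over fixed-length feature vectors, sketched below in PyTorch. The 26-class assumption (one class per letter), layer sizes, and full-batch training loop are illustrative, not a prescribed architecture:

    import torch
    import torch.nn as nn

    class LetterClassifier(nn.Module):
        # Small MLP mapping a fixed-length feature vector to letter logits.
        def __init__(self, n_features, n_classes=26):
            super().__init__()
            self.net = nn.Sequential(
                nn.Linear(n_features, 128), nn.ReLU(), nn.Dropout(0.3),
                nn.Linear(128, 64), nn.ReLU(),
                nn.Linear(64, n_classes),
            )

        def forward(self, x):
            return self.net(x)

    def train(model, features, labels, epochs=50, lr=1e-3):
        # Full-batch training sketch: features is (n_samples, n_features),
        # labels is (n_samples,) holding integer letter indices.
        opt = torch.optim.Adam(model.parameters(), lr=lr)
        loss_fn = nn.CrossEntropyLoss()
        for _ in range(epochs):
            opt.zero_grad()
            loss_fn(model(features), labels).backward()
            opt.step()
        return model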