PROGRAM 2

Speech & Language Intelligence

RP2-2

Augmenting Non-Verbal Language Usage for People with Complex Communication Needs

The ability to communicate in daily life is critically important to individuals and to society. People with complex communication needs lack the cognitive abilities and/or motor skills required for verbal conversation. They often rely on augmentative and alternative communication (AAC), i.e. approaches other than speech and text that compensate for communication impairments, to supplement their verbal interactions. They also undergo regular treatments and interventions to acquire non-verbal daily communication skills.

The aim of this research project is to apply recent and ongoing breakthroughs in AI and robotics to support non-verbal language use by people with complex communication needs, through the design, implementation, and validation of new AI-based speech & language technologies for universal usability, especially for users with communicative impairments. Our objectives are:

  • To construct a core language model (in Hong Kong’s daily communication context) and its personalized sub-models of AAC symbols for non-verbal AAC users in Hong Kong.

 

  • To implement neurolinguistics-inspired deep learning algorithms for AAC symbol prediction and recommendation.

 

  • To adapt, develop, and validate multimodal human–computer interaction (HCI) methods (such as gestural input, sign language, and eye gaze) that accommodate users' physical or cognitive limitations when interacting with our AAC applications.

 

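The symbol prediction and recommendation objective above can be illustrated with a deliberately simple sketch. The class below is a hypothetical toy, not the project's actual model: it counts symbol bigrams in a (made-up) corpus of AAC symbol sequences and suggests the most frequent followers of the last symbol selected, which is the basic idea a neural language model would refine.

```python
from collections import Counter, defaultdict

class SymbolRecommender:
    """Toy next-symbol recommender: counts bigrams over AAC symbol
    sequences and suggests the most frequent followers of the last
    symbol the user selected."""

    def __init__(self):
        self.bigrams = defaultdict(Counter)

    def train(self, sequences):
        # Tally how often each symbol follows each other symbol.
        for seq in sequences:
            for prev, nxt in zip(seq, seq[1:]):
                self.bigrams[prev][nxt] += 1

    def recommend(self, last_symbol, k=3):
        # Return up to k candidate next symbols, most frequent first.
        return [s for s, _ in self.bigrams[last_symbol].most_common(k)]

# Hypothetical symbol sequences; IDs stand in for AAC pictograms.
corpus = [
    ["I", "want", "drink", "water"],
    ["I", "want", "eat", "rice"],
    ["I", "want", "drink", "tea"],
]
rec = SymbolRecommender()
rec.train(corpus)
print(rec.recommend("want"))  # "drink" ranks first (seen twice), then "eat"
```

In a deployed system the recommender would be a trained language model hosted on a server and personalized per user, but the interface, i.e. mapping the user's partial symbol sequence to a ranked list of candidate symbols, is the same.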
In this demonstration, we present one of our AAC client applications, which targets users who have cerebral palsy and normal intelligence. We have integrated a Tobii eye tracker into the application as a gaze-based input method, so that users with severe physical limitations can express themselves through eye movement. The client connects to our cloud AAC server, which hosts a language model and provides intelligent AAC symbol recommendations based on the patterns in users' input. We also collaborate with a local company to develop AAC clients on social robots for young children with special needs.
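A common way to turn eye-tracker output into selections is dwell-time activation: a symbol is chosen once the user's gaze rests on it long enough. The sketch below assumes nothing about the Tobii SDK; it operates on a hypothetical stream of (timestamp in ms, symbol under gaze) samples and illustrates the selection logic only.

```python
def detect_dwell_selection(gaze_samples, dwell_ms=800):
    """Dwell-based selection: a symbol is 'selected' once gaze stays on
    it continuously for at least dwell_ms milliseconds.

    gaze_samples: iterable of (timestamp_ms, symbol_id or None),
    where None means the gaze is not on any symbol."""
    selections = []
    current, start, fired = None, None, False
    for t, sym in gaze_samples:
        if sym != current:
            # Gaze moved to a new target (or off all targets): restart timer.
            current, start, fired = sym, t, False
        elif sym is not None and not fired and t - start >= dwell_ms:
            # Dwell threshold reached; fire once per continuous fixation.
            selections.append(sym)
            fired = True
    return selections

# Hypothetical sample stream: a sustained fixation on "yes",
# then brief glances too short to trigger a selection.
samples = [(0, "yes"), (300, "yes"), (900, "yes"),
           (1000, None), (1200, "no"), (1400, "yes")]
print(detect_dwell_selection(samples))  # only "yes" is selected
```

The dwell threshold is typically tuned per user, trading selection speed against accidental activations ("Midas touch"); a real client would also give visual feedback, such as a shrinking ring, while the timer runs.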