Computers That Can Read Minds May Help Fight Autism
By Roger Highfield, for the Telegraph.co.uk.
Visitors to a major science exhibition are to help teach computers how to read confusion, mirth and other expressions. It is hoped that this will lead to the development of ways to help people with autism recognise emotions.
The thousands of visitors to next week’s Royal Society summer science exhibition are being invited to take part in research with “emotionally aware” computers designed to mind-read by analysing facial expressions.
The developers hope that one day smart adverts based on their technology will be able to tell when passers-by look glum and try to sell them something cheering, from an anti-depressant to a holiday.
They are also working with colleagues in America to develop a headset version of the system to help people who find it difficult to read others’ facial expressions and emotions, as happens with autism and Asperger’s syndrome. The headset would interpret other people’s moods and communicate them to the wearer.
A prototype has been developed at the University of Cambridge and will be unveiled at the exhibition in London.
Peter Robinson, professor of computer technology at the university, said: “Imagine a computer that could pick the right emotional moment to try to sell you something, a future in which mobile telephones, cars and websites could read our minds and react to our moods.”
The computer program takes data from a camera to locate and track 24 facial “feature points”, such as the edge of the nose, the eyebrows and the corners of the mouth.
So far, 20 key facial movements, such as a nod or shake of the head, a raise of the eyebrow or pull on the corner of the mouth, have been linked to underlying emotions, using actors to indicate different facial expressions to the computer.
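The pipeline described above — track facial feature points frame to frame, classify their displacements as named movements, then look up the emotions those movements have been linked to — can be sketched in miniature. This is purely illustrative and is not the Cambridge team’s system: the point names, thresholds, movement labels and emotion mappings below are all invented for demonstration.

```python
from typing import Dict, List

# Displacements of tracked facial feature points between two video frames,
# keyed by a (hypothetical) point name, measured in pixels.
FrameDelta = Dict[str, float]

def detect_movements(delta: FrameDelta, threshold: float = 2.0) -> List[str]:
    """Translate raw feature-point displacements into named facial movements."""
    movements = []
    if delta.get("inner_brow_y", 0.0) > threshold:
        movements.append("eyebrow_raise")
    if delta.get("mouth_corner_x", 0.0) > threshold:
        movements.append("mouth_corner_pull")
    if abs(delta.get("head_x", 0.0)) > threshold:
        movements.append("head_shake")
    return movements

# A hand-written lookup linking movements to underlying emotions, standing in
# for the associations the article says were learned from actors' expressions.
EMOTION_RULES = {
    "eyebrow_raise": "surprise",
    "mouth_corner_pull": "happiness",
    "head_shake": "disagreement",
}

def infer_emotions(delta: FrameDelta) -> List[str]:
    """Map detected facial movements to candidate emotions via the lookup table."""
    return [EMOTION_RULES[m] for m in detect_movements(delta) if m in EMOTION_RULES]
```

A real system of this kind would replace the hand-written rules with associations learned from many examples — which is exactly why the exhibition visitors’ data is valuable.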
The team is refining the system with “real” people’s expressions and hopes that the exhibition will generate valuable new data to improve the program’s ability to read faces.
Prof Robinson said: “The system can cope with the variation in people’s facial composition, for example if you have a round or thin face or if you wear glasses or have a beard.
“However, there are small variations in the way people express the same emotion. My colleagues working at the Massachusetts Institute of Technology are fine-tuning the system by testing it with real people’s reactions to everyday life using cameras attached to neck-braces.”
The technology is also being developed for use in cars to improve driver safety. The team is recording the faces of volunteers in driving conditions and monitoring facial movements to identify more complex expressions linked to confusion, boredom or tiredness.
“We are working with a major car company and it is possible that this technology could feature in cars within five years,” Prof Robinson said.
The exhibition, which is held at the Royal Society, is free and runs from July 3 to 6.