Mind-reading computer deciphers words from brainwaves before they are spoken!
TOKYO, Japan (PNN) - January 6, 2016 - A mind-reading device that can decipher words from brainwaves without them being spoken has been developed by Japanese scientists, raising the prospect of telepathic communication.
Researchers have found that the electrical activity in the brain is the same whether words are spoken aloud or left unsaid.
By looking for the distinct wave forms produced before speaking, the team was able to identify words such as 'goo', 'scissors' and 'par' when spoken in Japanese.
The scientists behind the technology said they can identify brainwaves associated with syllables or characters of the Japanese syllabary, meaning it may be possible to decode entire words and sentences without any of them being physically spoken.
To “listen” to the unspoken words, the researchers used a method called electroencephalography, or EEG.
This technology records electrical activity from the brain using an array of electrodes on the scalp to detect the brainwaves.
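The paper itself is not quoted on the recording pipeline, but as a rough illustration, a multi-channel scalp recording is typically digitized and band-pass filtered before any pattern matching is attempted. The Python sketch below assumes a hypothetical 8-electrode montage, a 256 Hz sampling rate and a 1-40 Hz band; none of these figures come from the study.

import numpy as np
from scipy.signal import butter, filtfilt

# Assumed acquisition parameters (hypothetical, not from the study).
FS = 256            # sampling rate in Hz
N_CHANNELS = 8      # scalp electrodes in the assumed montage
EPOCH_SECONDS = 2   # the team reports activity up to ~2 s before utterance

def bandpass(epoch, low=1.0, high=40.0, fs=FS, order=4):
    # Keep only the 1-40 Hz band commonly retained in EEG analysis (band edges assumed).
    b, a = butter(order, [low / (fs / 2), high / (fs / 2)], btype="band")
    return filtfilt(b, a, epoch, axis=-1)

# Simulated raw epoch, shaped channels x samples; a real system would stream this from the amplifier.
raw_epoch = np.random.randn(N_CHANNELS, FS * EPOCH_SECONDS)
clean_epoch = bandpass(raw_epoch)
print(clean_epoch.shape)   # (8, 512)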
The team focused on a part of the brain known as Broca's area, which is thought to be involved in language processing and speech.
Lead author Professor Yamazaki Toshimasa, an expert in brain-computer interfaces at Kyushu Institute of Technology in Japan's Fukuoka Prefecture, and his team asked 12 men, women and children to recite a series of words, measuring their brainwaves as they did so.
They found each syllable produced a distinct pattern of brainwave activity from the initial thought to the actual utterance. Activity could be seen up to two seconds before a word was spoken.
By compiling a database of different sounds, the researchers found it is possible to match these brainwave patterns to words, even if they are not spoken.
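The article does not describe the team's matching algorithm, so the following is only a sketch of one common approach, nearest-template classification: average each word's practice trials into a template, then label a new, unspoken epoch with the word whose template it correlates with best. The word labels, trial counts and array shapes are illustrative assumptions.

import numpy as np

def build_templates(epochs_by_word):
    # epochs_by_word maps a word label to an array of shape (trials, channels, samples).
    return {word: trials.mean(axis=0) for word, trials in epochs_by_word.items()}

def classify(epoch, templates):
    # Return the word whose averaged template correlates best with the new epoch.
    def score(template):
        return np.corrcoef(epoch.ravel(), template.ravel())[0, 1]
    return max(templates, key=lambda word: score(templates[word]))

# Hypothetical database: 20 spoken practice trials per word, 8 channels, 512 samples each.
rng = np.random.default_rng(0)
database = {word: rng.standard_normal((20, 8, 512)) for word in ["haru", "natsu"]}
templates = build_templates(database)

unspoken_epoch = rng.standard_normal((8, 512))   # a new, silent trial
print(classify(unspoken_epoch, templates))       # best-matching word label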
According to a paper presented at a conference organized by the Institute of Electronics, Information and Communication Engineers, the team's algorithms were able to correctly identify the Japanese words 'haru' and 'natsu', meaning spring and summer, 25% and 47% of the time, respectively.
They found it could identify single characters up to 88% of the time.
Professor Yamazaki believes the technology could be used to help people who have lost the ability to speak, or have become paralyzed, to communicate.
So far they have trained the system to recognize seven Japanese words, but hope to expand it in the future.
He said, “It could help with communication with aged people.”
He said the technology could also be adapted to allow people to control robots through the power of thought by helping the machines interpret instructions from brain activity.
Elsewhere, astronauts or deep-sea divers could use it to improve communication in outer space or underwater, where sounds can be distorted or difficult to transmit.
Professor Yamazaki explained, “Applications such as manipulating robots also become possible.”
The technology was also able to successfully identify the brainwaves associated with the Japanese words for 'will', 'one', 'turning' and 'do' with between 80% and 90% accuracy.