California-based researchers are showing off the latest version of their mind-reading artificial intelligence (AI) algorithm. According to a Digital Trends article, the deep-learning AI can read a person’s brain activity to identify the song playing from their device – and in their head.
Apps like Shazam employ similar machine learning that lets them identify a song by listening to it. But this is on a wholly different level of intelligence.
Researchers from the University of California, Berkeley (UC Berkeley) started working on their AI in 2014. Study author Brian Pasley and his teammates attached electrodes to the heads of volunteers and measured brain activity while the participants were speaking.
After identifying the connection between brain activity and speech, they combined the accumulated brain data with a deep-learning algorithm. The AI then turned a person’s thoughts into digitally synthesized speech with some accuracy.
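The article does not describe the team’s actual model, but the core idea – learning a mapping from recorded neural signals to audio features – can be sketched with entirely hypothetical toy data. The electrode counts, feature sizes, and the simple linear decoder below are illustrative assumptions, not the researchers’ method:

```python
import numpy as np

# Hypothetical toy data: 200 time windows of neural activity
# (64 electrode channels) paired with 32 audio spectrogram bins.
rng = np.random.default_rng(0)
n_samples, n_channels, n_bins = 200, 64, 32
true_map = rng.normal(size=(n_channels, n_bins))
brain = rng.normal(size=(n_samples, n_channels))
audio = brain @ true_map + 0.1 * rng.normal(size=(n_samples, n_bins))

# Fit a linear decoder (least squares) mapping brain activity
# to audio features; real systems use deep networks instead.
decoder, *_ = np.linalg.lstsq(brain, audio, rcond=None)

# Reconstruct audio features from new, unseen brain activity.
test_brain = rng.normal(size=(10, n_channels))
predicted = test_brain @ decoder
print(predicted.shape)  # (10, 32)
```

In practice the decoded spectrogram features would then be passed to a vocoder to produce audible, digitally synthesized speech.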
In 2018, the UC Berkeley team brought their AI to the next level of mind-reading. The improved deep-learning AI demonstrated 50 percent greater accuracy than its predecessor: it could read the brain activity of a pianist and predict which sounds the musician was thinking of.