Reading minds with YouTube
Be prepared: this will be the most amazing thing you have heard this week. Or ever.
Scientists at UC Berkeley have found a way to reconstruct moving images that are in the minds of their subjects. In other words, they’ve made a mind-reading machine. How? With an fMRI scanner and YouTube.
Subjects at the university’s Gallant Lab were first shown a random selection of movie trailers while their brain activity was recorded in an fMRI scanner. Scientists then generated a reconstruction of the moving image as it appeared in each subject’s own mind! The two images are compared below:
The reconstructed image was generated by using computer models to associate the shapes, edges and motion of the moving images on screen with the corresponding brain activity. Once these models had been refined and tested, they were let loose on 18 million seconds of random YouTube clips, selecting the 100 clips predicted to produce brain activity most similar to that evoked by the original footage (the top ones are outlined in blue below, for each of the three subjects). These top 100 clips were then averaged to generate the reconstruction (outlined in green below).
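To make the pipeline concrete, here is a minimal sketch of the select-and-average step in Python. Everything in it is a stand-in: the linear encoding model `W`, the tiny clip library, and the dimensions are all hypothetical toy values, not the lab's actual models or data. The idea it illustrates is just the one described above: predict the brain activity each candidate clip would evoke, rank candidates by similarity to the observed activity, and average the top 100.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy dimensions (the real study used ~18M seconds of clips).
N_VOXELS = 50        # fMRI voxels in visual cortex
N_CANDIDATES = 1000  # stand-in for the YouTube clip library
N_PIXELS = 64        # flattened frame of a candidate clip
TOP_K = 100          # number of clips averaged into the reconstruction

# Assumed encoding model: a linear map from clip pixels to brain activity.
W = rng.normal(size=(N_VOXELS, N_PIXELS))

# Random stand-ins for the candidate clips and the observed brain activity.
candidate_clips = rng.normal(size=(N_CANDIDATES, N_PIXELS))
observed = rng.normal(size=N_VOXELS)

# Step 1: predict the activity each candidate clip would evoke.
predicted_activity = candidate_clips @ W.T  # shape (N_CANDIDATES, N_VOXELS)

# Step 2: score each candidate by correlation with the observed activity.
def corr(a, b):
    a = a - a.mean()
    b = b - b.mean()
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

scores = np.array([corr(p, observed) for p in predicted_activity])

# Step 3: take the top 100 clips and average them into the reconstruction.
top = np.argsort(scores)[-TOP_K:]
reconstruction = candidate_clips[top].mean(axis=0)
```

The averaging is why the published reconstructions look blurry: 100 different clips, each only roughly matching the predicted activity, are blended into a single frame.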
“This is a major leap toward reconstructing internal imagery,” said Professor Jack Gallant, a UC Berkeley neuroscientist and coauthor of the study published in the journal Current Biology. “We are opening a window into the movies in our minds.”
As research progresses, it may one day be possible to watch movies of people’s dreams or memories, or even to see inside the minds of coma patients. It’s truly the stuff of science fiction – but, of course, we expect nothing less amazing from the place that brought us the eLEGS.