The Biggest Questions: Is it possible to really understand someone else’s mind?

Published November 15, 2023
Technically speaking, neuroscientists have been able to read your mind for decades. It’s not easy, mind you. First, you must lie motionless within the narrow bore of a hulking fMRI scanner, perhaps for hours, while you watch films or listen to audiobooks. Meanwhile, the machine will bang and knock as it records the shifting patterns of blood flow within your brain, a proxy for neural activity. The researchers, for whose experiment you have volunteered, will then feed the moment-to-moment pairings of blood flow and movie frames or spoken words to software that will learn the particularities of how your brain responds to the things it sees and hears.
None of this, of course, can be done without your consent; for the foreseeable future, your thoughts will remain your own, if you so choose. But if you do elect to endure those claustrophobic hours in the scanner, the software will learn to generate a bespoke reconstruction of what you were seeing or listening to, just by analyzing how blood moves through your brain.
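At its simplest, the pairing of blood-flow patterns with movie frames described above is a regression problem: learn weights that map voxel responses back to a stimulus feature. Here is a minimal sketch on simulated data; the voxel count, noise level, and ridge penalty are arbitrary illustrations, not any lab’s actual pipeline.

```python
import numpy as np

# Toy illustration (simulated data, not a real fMRI pipeline):
# decode a stimulus feature from noisy "voxel responses" via ridge regression.
rng = np.random.default_rng(0)

n_timepoints, n_voxels = 200, 50
stimulus = rng.standard_normal(n_timepoints)   # e.g. brightness of each movie frame
sensitivity = rng.standard_normal(n_voxels)    # each voxel's (unknown) tuning
responses = np.outer(stimulus, sensitivity) \
    + 0.5 * rng.standard_normal((n_timepoints, n_voxels))  # add measurement noise

# Fit on the first half (the hours spent in the scanner), decode the second half.
train, test = slice(0, 100), slice(100, 200)
X, y = responses[train], stimulus[train]
alpha = 1.0  # ridge penalty, an arbitrary choice here
W = np.linalg.solve(X.T @ X + alpha * np.eye(n_voxels), X.T @ y)

decoded = responses[test] @ W
corr = np.corrcoef(decoded, stimulus[test])[0, 1]
print(f"correlation between decoded and true stimulus: {corr:.2f}")
```

Real reconstruction systems replace the single scalar feature with thousands of image or text features and the linear model with far richer ones, but the train-on-paired-data, decode-on-new-data structure is the same.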
Back in 2011, UC Berkeley neuroscientists trained such a program to create ethereal doubles of the videos their subjects had been watching. More recently, researchers have deployed generative AI tools, like Stable Diffusion and GPT, to create far more realistic, if not entirely accurate, reconstructions of films and podcasts based on neural activity. Given the hype and financial investment that generative AI has attracted, this kind of stimulus-reconstruction technology will inevitably continue to improve, especially if Elon Musk’s Neuralink succeeds in bringing brain implants to the masses.

But as exciting as the idea of extracting a movie from someone’s brain activity may be, it is a highly limited form of “mind reading.” To really experience the world through your eyes, scientists would have to be able to infer not just what film you are watching but also what you think about it, how it makes you feel, and what it reminds you of. These interior thoughts and feelings are far more difficult to access. Scientists have managed to infer which specific object, out of two possibilities, someone was dreaming about, but in less constrained settings such approaches struggle.