Our thoughts are private – or at least they were. Breakthroughs in neuroscience and artificial intelligence are eroding that assumption while raising new questions about ethics, privacy, and the horizons of brain-computer interaction.
Research published last week by Queen Mary University of London describes a deep neural network that can infer a person’s emotional state by analyzing wireless signals used like radar. In this study, participants watched a video while radio signals were transmitted toward them, and the reflections were measured. Analysis of body movements in the reflected signals revealed “hidden” information about an individual’s heart and breathing rates. From these measurements, the algorithm can classify one of four basic emotions: anger, sadness, joy, and pleasure. The researchers suggest the work could support health and well-being monitoring, for instance by detecting depressive states.
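The pipeline described above runs in two stages: recover vital signs (heart and breathing rates) from the reflected signal, then classify an emotion from those vitals. As a hypothetical illustration of the second stage, the sketch below uses a nearest-centroid rule over two invented vital-sign features in place of the study’s deep neural network; the centroid values are assumptions for illustration, not figures from the paper.

```python
import numpy as np

# The four emotion classes named in the study.
EMOTIONS = ["anger", "sadness", "joy", "pleasure"]

# Invented per-emotion feature centroids (heart rate in beats/min,
# breathing rate in breaths/min) -- placeholder values, not from the study.
CENTROIDS = {
    "anger":    np.array([95.0, 20.0]),
    "sadness":  np.array([65.0, 12.0]),
    "joy":      np.array([85.0, 17.0]),
    "pleasure": np.array([72.0, 14.0]),
}

def classify_emotion(heart_rate: float, breathing_rate: float) -> str:
    """Return the emotion whose centroid lies nearest the measured vitals."""
    x = np.array([heart_rate, breathing_rate])
    return min(CENTROIDS, key=lambda e: float(np.linalg.norm(x - CENTROIDS[e])))
```

For example, `classify_emotion(94.0, 19.0)` lands nearest the “anger” centroid. A real system would learn such decision boundaries from labeled radio reflections rather than hand-set centroids.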
Ahsan Noor Khan, a Ph.D. student and first author of the study, said: “We’re now looking to investigate how we could use low-cost existing systems, such as Wi-Fi routers, to detect emotions of a large number of people gathered, for instance in an office or work environment.” This could be useful for HR departments to assess how new policies introduced in a meeting are being received, regardless of what the recipients might say. Outside of an office, police could use this technology to look for emotional changes in a crowd that might lead to violence.
The research team plans to examine the public acceptance and ethical concerns around the use of this technology. Such concerns would not be surprising: the technology conjures up the ‘thought police’ of Orwell’s 1984. In that novel, the thought police are experts at reading people’s faces to ferret out beliefs unsanctioned by the state, though they never master learning precisely what a person is thinking.
This is not the only thought technology on the horizon with dystopian potential. In “Crocodile,” an episode of Netflix’s series Black Mirror, a memory-reading technique is used to investigate accidents for insurance purposes. The “corroborator” device, a square node placed on a victim’s temple, displays their memories of an event on screen. The investigator explains that the memories “may not be accurate, and they’re often emotional. But by collecting a range of recollections from yourself and any witnesses, we can help build a corroborative picture.”
If this seems far-fetched, consider that researchers at Kyoto University in Japan developed a method to “see” inside people’s minds using an fMRI scanner, which detects changes in blood flow in the brain. Using a neural network, they correlated the scan data with images shown to the individuals and reconstructed those images on a screen. Though far from polished, the output was essentially a reconstruction of what the participants were seeing. One prediction estimates this technology could be in use by the 2040s.
Brain-computer interfaces (BCIs) are making steady progress on several fronts. In 2016, researchers at Arizona State University showed a student wearing a swim cap fitted with nearly 130 sensors connected to a computer that detected his brain waves. The device let him control the flight of three drones simply by thinking directional commands: up, down, left, right.
Advance a few years to 2019, and the headgear is far more streamlined. Now there are brain-drone races.
Beyond the flight examples, BCIs are being developed for medical applications. MIT researchers have built a computer interface that can transcribe words the user verbalizes internally but does not speak aloud. A wearable device with electrodes picks up neuromuscular signals in the jaw and face that are triggered by these internal verbalizations, also referred to as sub-vocalizations. The signals are fed to a neural network trained to correlate them with particular words. The idea behind the project is to meld humans and machines “such that computing, the internet, and AI would weave into human personality as a ‘second self.’” People who cannot speak could use the technology to communicate, as the sub-vocalizations could drive a synthesizer that says the words aloud.
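The matching step described above — correlating electrode signals with particular words — can be sketched in miniature. The MIT system uses a trained neural network; as a stand-in, the hypothetical example below correlates a window of signal samples against per-word templates and picks the best match. The vocabulary, template shapes, and values are all invented for illustration.

```python
import numpy as np

# Invented per-word signal templates (4 samples each) -- placeholders for
# the learned representations a real neural network would produce.
WORD_TEMPLATES = {
    "up":   np.array([0.1, 0.9, 0.2, 0.1]),
    "down": np.array([0.8, 0.1, 0.1, 0.7]),
}

def decode_word(signal_window: np.ndarray) -> str:
    """Return the word whose template correlates best with the window."""
    def corr(a: np.ndarray, b: np.ndarray) -> float:
        # Pearson correlation between the measured window and a template.
        return float(np.corrcoef(a, b)[0, 1])
    return max(WORD_TEMPLATES, key=lambda w: corr(signal_window, WORD_TEMPLATES[w]))
```

A signal window shaped like the “up” template (a single spike in the middle) decodes to “up”; one high at the edges decodes to “down.” In the real system, the network learns these correspondences from recorded sub-vocalization data rather than fixed templates.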
Chip implants could be coming soon.
The ultimate BCI may be the one proposed by Neuralink, the company founded by Elon Musk. Unlike the previous examples, Neuralink promises direct implants into the brain. The near-term goal of Neuralink and others is to build a BCI that can treat a wide variety of diseases. Longer term, Musk has a grander vision: he believes such an interface will be necessary for humans to keep pace with increasingly powerful AI. Just last week, Musk announced that human trials of the implants could begin later this year. He claims the company already has a monkey with “a wireless implant in [his] skull with tiny wires” that can play video games with his mind.
The advancements in BCI are beginning to match what science fiction authors have imagined. In The Resisters, a new novel by Gish Jen, a “RegiChip” is implanted at birth into all those deemed “Surplus,” meaning there will be no work for them in the aftermath of mass automation. Instead, they are issued a universal basic income and have no responsibilities but to consume, keeping the automated economy operating efficiently. The RegiChip tracks everyone’s physical location and activities, completing a surveillance society. And of course the RegiChip, like all digital technologies, has the potential to be hacked.
Cognitive scientists have said that the mind is the software of the brain. Increasingly, actual software can meld with and augment the human mind. If today’s AI-enabled BCI achievements seem unbelievable, breakthroughs in the not-too-distant future could be genuinely momentous. Will the technology be harnessed for positive uses, such as curing disease, or for mind control? As with most technology, there will likely be both good and bad. Software is poised to eat the mind. For now, our unexpressed thoughts remain private, but that may not be true for much longer.