
Linguistics

Tapping into People’s Thoughts

Machines that read our thoughts and translate them into spoken language – what sounds like science fiction could be a real possibility in the future, says linguist Balthasar Bickel. Scientists in the NCCR Evolving Language are researching mind-reading, both to gain a better understanding of it and to warn of the dangers.
Roger Nickl; English translation by Caitlin Stephens
Image: Researchers at UZH can work out people's sentence structure plans before they start speaking.


Reading people’s thoughts before they say them is impossible, right? Not anymore – a research team in San Francisco has managed to do just that. Using electrodes placed directly onto the cerebral cortex, the researchers were able to measure brain activity so precisely that they could work out what patients were silently saying to themselves. The possibilities are fascinating, for example in terms of helping people who are unable to speak or who have a severe speech impediment. In the future, they could express themselves via machines that read their thoughts and translate them into spoken language. At the same time, the prospect of neurotechnological devices having access to our innermost secrets is extremely worrying. “Thoughts are free, as the adage goes,” says linguist Balthasar Bickel. “If we were able to directly tap into other people’s thoughts before they said anything, it would be incredibly dangerous.” One aim of the National Center of Competence in Research (NCCR) Evolving Language is therefore to reflect on and contribute to responsible research into mind-reading. Its leader, Balthasar Bickel, explains why in this interview.

Balthasar Bickel, what exactly is being investigated in connection with mind-reading at the NCCR Evolving Language?

Balthasar Bickel: At the moment we don’t even know what is possible in this area. We are therefore running various projects looking at what happens in the brain when a person is planning to say something out loud. You start planning a sentence a few seconds before you speak, and there’s a lot going on in the brain before you open your mouth. We are getting better and better at measuring that activity.

What exactly can be measured?

Bickel: Our colleagues in Geneva can already identify rather precisely which sounds the brain is planning. In our team at UZH, we have been able to detect plans for short sentence structures about two seconds before the person starts speaking. Detecting the planning of meaning is more difficult; we are still quite far from managing that. But more and more progress is being made – mind-reading is moving out of the realm of science fiction. We are at the forefront, and we want to find out what is possible and explore the consequences. We also want to bring the discussion into the political arena. Far too little is known about this at present, and that worries me a lot. When we talk about digitalization, we think of smartphones, databases and Zoom meetings, and how we use them. But those things are nothing compared to the revolution just ahead of us in neurotechnology. It has the potential to change the way we communicate with each other on a much more fundamental level.

In what way?

Bickel: Human communication is based on the fact that we generally give only abstract hints of what we are actually thinking. The aim of a conversation between two people is for them to work out together what each of them really means.

So we are constantly interpreting what we are hearing?

Bickel: Exactly, our communication is based on this interpretation mechanism, a kind of natural mind-reading. If we could read thoughts directly with machines, it would be a radical change of huge evolutionary significance.

What would radically change?

Bickel: The whole design of human language would change. The way speech works is that we use sounds or gestures to convey only very abstract concepts. Everything else has to be figured out by creative interpretation. It is the listener’s job to interpret. The speaker then corrects the listener if they feel they have been misunderstood. These interpretive connections are what our language, our vocabulary and our grammatical structures are based on. All this would be superfluous if we could read thoughts directly.

Because we would be able to grasp the specific content of a thought?

Bickel: Exactly. That is still science fiction, but we are getting ever nearer to it. For example, under certain experimental conditions, using a simple EEG we can now discover whether someone is planning to speak about an agent (“Lisa threw the stone”) or not (“she slept”). Ten years ago that was held to be impossible. Perhaps in the future it will be possible to access the phases that precede the planning of meaning – that is, the interface between concrete thoughts and abstract speech planning – and possibly even outside of experimental conditions.

Philosophically speaking, do thoughts exist without language?

Bickel: Depending on the definition, there is certainly something like extralinguistic thinking, but there are more and more indications that thinking is influenced and formed by the languages we learn. The more our thoughts are linguistically formed, the more easily they can be detected with the same methods we use to detect sentence and word plans.

So will speaking or writing no longer be necessary if we can just exchange neural patterns between our brains?

Bickel: Yes, in principle we could reach a point where we would be able to digitally transmit linguistically formed, or partially formed, thoughts directly to another person. There’s another possible consequence too: Because we are currently forced to use sounds, writing and gestures for communication, our language has become linear. We have to put things in a certain order. But if we transmit thoughts digitally, we can exchange different sentences simultaneously – like computers, which work in parallel.

So we would become computers?

Bickel: It’s possible. The fact is that right now we have no idea whatsoever what the consequences might be. Future developments could even be far more dramatic than what I’ve described to you. But equally, things might take a completely different turn. It is hard to imagine what would happen if several different sentences and their contents were supplied directly to the brain in parallel, without the information being filtered through the eyes, ears or another sense. Research in this area is going to happen whether we get involved or not. The aim of our research is therefore to find out what might even be possible and to raise awareness among the public about these topics. We want to show what is within the realm of possibility and what is absurd science fiction, and in particular to point out where the dangers lie.