Reading Minds

Thanks to neurotechnology, it could soon be possible to read people’s thoughts, says linguist Balthasar Bickel. While fascinating from a medical perspective, it’s also incredibly dangerous. We sat down with the linguistics expert to discuss the future of language and its origins.
Roger Nickl
“Mind reading is still science fiction, but we are getting ever closer to this fiction becoming a reality. The coming dangers are comparable to the development of the atomic bomb”, says Balthasar Bickel.

Balthasar Bickel, you're co-director of the new National Center of Competence in Research (NCCR) project entitled Evolving Language, which researches the development and evolution of language. Currently, digitalization is radically transforming our society. What kind of impact does this have on language?

Balthasar Bickel: Neurotechnology. Today scientists have access to sophisticated computer-assisted methods of decoding neural signals. By analyzing electrical activity in the brain, it will soon be possible to predict what someone is thinking before they say it. This would be fantastic, for example, for aphasia patients who can no longer speak. They could then express themselves via a computerized interface.

Could you tell us what’s already possible today?

Bickel: A team in the United States recently succeeded in using an electrocorticogram – those are electrodes placed directly on the brain during an operation – to reconstruct what a patient said to himself non-vocally. He didn't say anything out loud – he just thought it.

That sounds creepy.

Bickel: It is. As amazing as it is from a therapeutic or medical perspective, it means that we can directly intervene in person-to-person communication. Thoughts are supposedly free, and we express only what we want to. If we can start reading what people silently think to themselves – in words only, without sounds or gestures – this is unbelievably dangerous. Just think of the potential uses in politics or the military. It’s horrifying.

So lie detector tests are obsolete?

Bickel: To a certain extent, yes. For instance, we can find out what someone is planning to say but then refrains from expressing it at the last moment. These are horror scenarios. This will fundamentally change the way we are able to communicate with one another. It would be a quantum leap in communication, which is very significant from an evolutionary perspective.

What's your take on all of this?

Bickel: Within the Evolving Language research group, we believe that the coming dangers are comparable to the development of the atomic bomb. Yes, at the moment it’s still mostly science fiction, but we are getting ever closer to this fiction becoming a reality. The topic of mind reading was an important driver for launching the new NCCR. We still know far too little about it, and this concerns me. Our goal is to raise the public’s awareness of this issue. We want to show where the line is between absurd science fiction and all-too-possible science fact.

How will you achieve this?

Bickel: We have to keep pace with the latest frontiers of brain-computer interface (BCI) research in order to understand it better. For this reason, we are also using neurotechnology – applied ethically and with the goal of making our methods and findings public. We also want to make recommendations to policymakers once we see that a certain development has dangerous implications for society. At the moment, we are running several research projects that analyze what happens in the brain when someone is constructing a sentence.

So you’re reading the grammar of the brain?

Bickel: We are reading how the brain plans grammatical structures. For instance, we can say with a relatively high degree of accuracy what kind of sentence someone is planning to say, and our colleagues in Geneva can say with similar accuracy what sounds someone has in their head. Decoding meaning remains the great challenge at the moment. This is our weakest area, but even here progress is being made. There's a lot more that we can expect to see.

Another aspect of digitalization is our increasing interaction with machines – bots like Siri and Alexa. How is that changing language?

Bickel: We have to ask why communicating with bots, for example, actually doesn't work all that well. Colleagues of mine are researching why people over-enunciate when speaking with Siri.

Like how we talk to older people with hearing loss?

Bickel: Yes, and these computer systems are particularly bad at understanding it, because they are trained to respond to normal language. It’s an interesting phenomenon. We obviously don’t quite put our full faith in machines. But the question is: Do we have to retrain ourselves – or retrain the machines?

Will we get used to machines that talk?

Bickel: I think this will be surprisingly easy for us. We also got used to the telephone. That was a disruption in our way of communicating that shouldn't be underestimated. Suddenly, a central factor of our communication – gestures – was no longer there. This was probably a bigger leap than communicating with talking machines.

Evolving Language is also dealing with the topic of linguistic diversity. Today, languages like English and Mandarin are globally dominant, while many smaller languages are dying out. What does this increasing loss of linguistic diversity mean for us?

Bickel: This is a problematic development that we need to keep an eye on. The constant splitting off into different dialects and languages is a natural, biologically influenced feature of human communication. If we change ourselves in this regard and increasingly lose out on diversity, there will be consequences. Take this example: In New Guinea there are people who have more than 20 words for sugar cane. This nuanced vocabulary is a reflection of an extremely refined body of knowledge about this crop. If this nuance disappears, the culture also loses its nuanced way of interacting with sugar cane. Interactions with the environment become less specialized, more prone to conflict.

Why?

Bickel: Because people then don’t pay attention to the various types of sugar cane. Maybe they then get replaced by one variety of sugar cane that is used for everything. This would be a loss for biodiversity.

So a reduction in vocabulary size and linguistic diversity leads to a reduction of biodiversity?

Bickel: Linguistic diversity and biodiversity go hand in hand. There’s a lot of evidence for this. If linguistic diversity disappears, which is currently the case, this relationship gets impacted, and we start dealing with the natural world in a less natural way.

So our current debates on sustainability also need to take linguistic diversity into consideration?

Bickel: Absolutely. This is extremely important and has been neglected until now. Another consequence of language death: with every language that dies out, people lose an important part of their identity. Wherever languages are going extinct or have recently gone extinct, you can find a heightened risk of ethnic conflict. This has not yet been proven empirically, but there are hints that point in this direction.

What’s the reason for that?

Bickel: One hypothesis is that people who lose their native language seek out new identities for themselves. These new identities often involve political or religious factors, which raises the risk of conflict.

You talked about how mind reading is tied to an impending evolutionary leap in the way we communicate. But how about when we look back at the history of language – what were the other important evolutionary leaps?

Bickel: A decisive leap was surely the explosion of our vocabularies, the possibility of constantly coming up with new terms, thereby changing language permanently. That was a critical development within human evolution.

When did this leap take place?

Bickel: Neanderthals – and also Homo erectus, which came earlier – probably already had language skills similar to those of modern humans. We also know that Neanderthals had very similar hearing abilities to ours. They also had the same FOXP2 gene that we have, which allows for things like sophisticated pronunciation abilities. Taken together, we have to assume that human language was already around in the time of Homo heidelbergensis, so around 500,000 years ago, and quite possibly already much earlier than that.

Flexible language, the ability to constantly coin new terms and link them with one another, sets us apart from other animals. How did it come about?

Bickel: The development of a highly pronounced ability to learn was surely the decisive leap when it comes to inventing words and creating expressions. Scientists are in less agreement when it comes to how complex syntax arose. There are a lot of different theories, but little data. Researchers haven't been able to shed much light on this. That's why we want to approach the topic again and look into the roots of syntax and grammar. To this end, we are investigating how humans and other primates perceive events, for example. We hypothesize that perception is fundamental to how grammar works.

Can you give a specific example?

Bickel: Take the sentence “The gorilla ate a banana.” “Gorilla” is the agent, “ate” is the event, and “banana” is the patient – the target of the action. These three categories are fundamental to the syntax of all languages. We assume that they are already anchored in our pre-linguistic perception of the world. That means we experience the world within these categories. Now, the important question is whether this holds true for our relatives, the other primates. To answer this question, we are currently observing apes. We show apes scenes that are of interest to them and analyze whether they move their eyes and pay attention in similar ways to humans. We hope that this will shed light on the relationship between language and perception and that we will get some clues as to the basis of syntax and grammar.

The new NCCR is bringing together researchers from linguistics, biology and neuroscience to answer these kinds of questions. What is the value of this interdisciplinary approach?

Bickel: The cooperation between linguistics, biology and neuroscience makes it possible to really tackle the big questions about how language arose and where it is evolving to next. We can learn a lot from each other. Traditionally, linguistics has been characterized by grand theories that place great emphasis on the differences between humans and animals. Maybe we’ll have to put that more into perspective. The humanities sometimes treat humans as entities that are completely decoupled from their evolutionary history – as if we have somehow cut the link to our animal origins. I think it is much more productive not to emphasize the human-animal gap, but rather to search for ways to bridge it.

In the best-case scenario, your new NCCR is set to run for 12 years. What do you hope to achieve in this time?

Bickel: Ideally, we will have understood neurotech mind-reading developments so well that we can advise policymakers and are able to implement these new technologies in areas where it’s ethically responsible and sensible. To make progress here, we first have to explore the biology behind the brain’s language abilities and what social conditions led to their development. This means we need to understand how they arose over the course of our evolution. We want to know where they come from, how and when they developed and how they differ from those of other animals. I hope that in twelve years’ time we are able to draw a phylogenetic tree of our capacity for language. Just like we can show the evolutionary development of the eye, I’d like to be able to tell the origin story of language from the animal kingdom all the way up to modern humans.