Privacy on the Web

Modern technologies are dangerous. Does this mean that individuals should be protected from them as comprehensively as possible? UZH professor of law Florent Thouvenin believes that this assumption, which underpins the current approach to data protection, is misguided.
Thomas Müller


“These days you can forget real transparency. Users still have no clear idea of what Google, Facebook & Co. are doing with their data,” says legal expert Florent Thouvenin, associate professor of Information and Communications Law and board member of the UZH Digital Society Initiative (DSI). (Picture: Jos Schmid)


Privacy is important to many people. Despite this, they share their personal photos, mobile number, and current mood on the internet – and not just with their Facebook friends, but with the company itself. They ask Google questions they wouldn’t dare ask their friends or partner. And they voluntarily carry around devices that can be used to track their every move.

Researchers call this phenomenon the privacy paradox. Data protection experts like to explain it by pointing out that many users aren’t aware of all the places where their data is collected. They speak of peer pressure and the generally contradictory nature of human beings. On this reading, that’s simply how it is: people often think and act inconsistently.

Changing need for protection

For Florent Thouvenin, people’s irrationality is only half the story. “There’s more to it than that,” says the professor of Information and Communications Law at UZH. For him, a much more plausible explanation is that those affected have a very different notion of what privacy means – and thus different needs in terms of protection – than the one on which today’s data protection framework is based.

Current data protection law goes back to concepts from the 1970s – in other words, concepts that predate the internet by a quarter of a century. While society and technology have undergone profound changes in the meantime, the regulations, in Thouvenin’s view, have remained more or less the same. What’s more, data protection legislation has repeatedly been shaped by political events. In Switzerland, for example, the “secret files” scandal revealed that the state had been spying on its own citizens since the beginning of the twentieth century. The EU’s new General Data Protection Regulation, which is supposed to create a harmonized legal framework for data protection in the EU for the first time and comes into effect at the end of May this year, was heavily influenced by the Snowden affair. And in Germany, which plays a leading role in European data protection, the trauma of World War II and fears of a totalitarian state continue to resonate.

Flirts and scientific data

The users of online media who are now so generous with their data obviously have very different priorities. Are they too young or naive to grasp the threat? Rather than naivety, Florent Thouvenin puts it down to a different, more nuanced perspective, pointing out that privacy is a highly complex, multifaceted concept. He suggests that protecting privacy may well be important to these users when it concerns people they actually know – their neighbor, their former spouse, co-workers, or boss. They don’t care if chat data about their latest flirt is stored on some server, as long as it doesn’t find its way to their partner. A scientist isn’t bothered if the research they’re doing for their next paper leaves traces on Google or in scientific databases. But they will certainly want to prevent a competitor from getting access to that information and poaching their ideas.

But this is only conjecture on Thouvenin’s part. “For this reason we should be taking a fundamental, interdisciplinary look at the concept of privacy, and gathering empirical evidence of people’s needs in this area,” he concludes. He’s currently working with colleagues to develop a four-year research project with the working title “Rethink Privacy” under the aegis of the Center for Information Technology, Society, and Law (ITSL) and the Digital Society Initiative (DSI) at UZH. They’re particularly interested in the differences between generations and between legal systems.

Vague fears

Florent Thouvenin’s criticism is fundamental, and that has to do with his academic biography. His research originally focused primarily on copyright, patent, and trademark law – areas likewise heavily shaped by technological change. This background allowed Thouvenin, now 42, to look at data protection with a fresh eye and, most importantly, to ask questions to which he has so far found no satisfactory answers.

Unlike many of his colleagues, Thouvenin isn’t guided by the personal conviction that citizens must automatically be protected from data processing by governments and companies. Of course he knows how other people see it, “but I’m not convinced,” he says. His inquiries into where data processing actually causes concrete problems have been fruitless. “Heretically, you could say that the present data protection legislation is mainly based on a vague fear – many people feel somehow uneasy because they don’t understand what happens to their data and what that might mean for them.”

Indeed, the present data protection arrangements don’t work optimally. A key principle is that data processing should be transparent to those affected, the data subjects. But these days you can forget real transparency. Thouvenin criticizes the fact that users still have no clear idea of what Google, Facebook & Co. are doing with their data. Given the vague information available, even experts don’t really understand precisely how and for what purposes these organizations process individual data.

Not only that, but under existing data protection law a person can only give legally valid consent to the processing of their data if they’re aware of the consequences. In practice that remains a dead letter. As we all know from our own experience, in most cases we click a box to confirm that we agree to the privacy policy without having really informed ourselves. Surveys show that up to 90 percent of users don’t read statements of this sort. Informed consent is thus the “biggest lie on the internet,” and the notion of acceptance a “misguided concept.”

Even though he sometimes doesn’t mince his words, Florent Thouvenin isn’t out to demolish established approaches wholesale. His aim is data protection that works – and addresses precisely the points that matter to individual citizens. Some data protection principles he roundly defends. Given the potential for leaks, for example, he sees data security as “fundamental.” He’s also a proponent of the principle of transparency: if you don’t know when and how “your” data are processed, you can’t decide whether to consent or refuse.

Other principles he sees as problematic because they make it more difficult to use data and thus stand in the way of potential opportunities for society, business, and research, without actually protecting those affected. This is the case for data minimization (only the personal data required for a particular purpose may be processed) and storage limitation (data may be stored only for as long as necessary for that purpose), and to a limited extent also for purpose limitation (personal data may be gathered and processed only for specified, clear, and legitimate purposes).

In many settings these three principles hinder the collection, use, or storage of data that could potentially be of great benefit, for example in medical research. The storage limitation requirement, for instance, means that data has to be deleted as quickly as possible, even though it could prove useful in the future. Purpose limitation makes it impossible to analyze data for purposes that nobody had yet thought of when it was gathered.

Individualized prices

While it remains to be seen precisely what potential lies in big data, it’s already clear that the promise of progress in personalized medicine can only be fulfilled if the processing of personal data for research purposes is made much easier. Thouvenin in no way denies that data processing can have negative consequences. In his view, however, the concrete problems first have to be identified before appropriate solutions can be worked out.

One problem that falls into this category is the forms of discrimination that data processing makes possible, for example individualized pricing of staple goods or exclusion from insurance coverage. “Politicians and society at large have to ask these questions and decide, for example, for which kinds of insurance – health insurance, say – solidarity is key, and where insurance companies can also charge their customers different premiums.”

As things stand, this type of approach can be more or less ruled out because, Thouvenin claims, data protection treats data processing itself as a problem that has to be subjected to pervasive regulation on the basis of what are often purely hypothetical risks. This ultra-preventive approach, he says, makes it considerably more difficult for businesses and researchers to use personal data. It thus gives rise to substantial costs without generating any clearly identifiable benefit for the people concerned. The alternative, according to Thouvenin, would be an approach that addresses the areas where problems actually exist and allows the potential that lies in data to be harnessed. This would, for example, make it possible to develop better medical therapies and to treat individual people in a fairer and more differentiated fashion.