25 May 2018 is an important date: that is when the new European privacy legislation finally takes effect. Not a moment too soon, according to Professor Joris van Hoboken, who leads the new research chair Fundamental Rights and the Digital Transformation. “Governments and corporations alike are currently pushing and crossing the boundaries of what they can and should do.”


We meet Joris van Hoboken at the 11th annual privacy and data protection conference at the Hallen van Schaarbeek, where Brussels is presenting itself as ‘the capital of privacy’. It is a cause near and dear to the brand-new VUB professor: he is active at the Instituut voor Informatierecht at the University of Amsterdam and has recently begun work at VUB as head of the research chair Fundamental Rights and the Digital Transformation, which is supported by Microsoft.

“The research chair focuses on the question of how we can protect fundamental rights while, all around us, major digital transformations take place at an ever increasing rate. I’m talking about rights such as privacy, freedom of expression, freedom of information and the right to non-discrimination. My intuition, and the research I carried out earlier, tells me these rights are in peril, or at least that we should be concerned about their future.”

The prevailing idea seems to be that large tech companies and governments are flirting with the boundaries of what is legal and just.

It is inherent to the culture of Silicon Valley to forever look for the boundaries, and to push them even further. That became very clear in the case of Uber. Especially in the tracking of browsing behaviour there are a lot of overzealous ‘cowboys’. The same goes for advertisers. It is as if the Wild West is never far away. Sometimes it is good to push boundaries; it compels you as a society to question things. But on the other side of the spectrum we find fundamental rights, which in Europe are valued dearly. Think, for example, of the ruling of the Court of Justice of the European Union on the right to be forgotten. Or its annulment of the Data Retention Directive, which made short work of the obligation to retain personal data. You now see attempts from the British government to reintroduce that retention obligation; they don’t want to give it up. It is an extremely interesting research field, with constant technological evolutions and the consequent attempts to create a legal framework for them. It is also a hot topic among broad sections of society, which only adds to its appeal.


Europe has a new privacy law, the much-discussed General Data Protection Regulation. How much should we expect of the GDPR?

That is one of the many things we examine within the research chair. A major issue is that a lot of room for interpretation remains, especially where certain key notions of the law are concerned. When do we speak of personal data? And what do we still consider personal data when we know there are abundant technological options to anonymise it? Scores of questions remain unanswered. Another interesting aspect will be what the national privacy commissions focus on in their enforcement of the legislation. It will be impossible to enforce every aspect of it, so choices will need to be made.


The responsibility for compliance with the GDPR will first and foremost lie with corporations and governments themselves.

Yes! They will become responsible for respecting its fundamental principles. One of the advantages of the new law is that it will be easier to penalise those who are non-compliant, for example through very high fines. Many platforms that distribute information nowadays, such as social media, bear a great responsibility and expect a great deal from artificial intelligence to help them counter the spread of illegal content and hate speech. I am very sceptical about that. Algorithms are not as neutral as we’d like to believe. As long as you’re part of the majority, everything is fine. But not so when you are a statistical outlier. Certain groups are systematically disadvantaged by algorithms. I don’t think that is a good thing. That will also be part of our research.


How did VUB come to you for this research chair?

I’ve been active on topics such as privacy, data retention and the governance of platforms and internet services. I also spent three years at New York University, which gives me a fair idea of how these topics are viewed over there as well. For Microsoft, which funds the chair, that is an added bonus. When I received the offer, I didn’t hesitate for a second.

How big is the influence of Microsoft on the research?

We have full academic freedom; that is explicitly laid down in the agreement we have with Microsoft. They actually have a very good reputation in that regard, and they are very active in the debate we cover. VUB also prepared everything meticulously and assessed whether there was enough scientific potential in the research chair. It clearly didn’t come about overnight.