Written by Ine van Zeeland, Heritiana Ranaivoson & Luciano Morganti
At the beginning of 2019, the French Data Protection Authority (CNIL) handed Google a 50 million euro fine for not properly explaining the consequences of its profiling activities. Professionals in the media and advertising industries pricked up their ears. While enforcement actions unnerve many in the media sector, the underlying justifications provide clarity on how contested provisions in data protection legislation should be interpreted. Clarity is sorely needed.
‘Lack of clarity’ emerged as the main theme with regard to personal data protection in a roundtable discussion organized in February with stakeholders from the Belgian media sector (news media, telecom providers, consumer advocates, academia, regulators, law firms, ad tech, and intermediaries). Interpretations of the requirements differ between companies, between companies and consumers/users, and between regulators and companies. The consequence is that nobody knows for sure what is allowed and what is not.
Structural distrust
One problematic aspect in that respect is the lack of clarity among so-called ‘data subjects’ (consumers, Internet users, citizens) about what happens with their personal information. Legislation like the EU’s General Data Protection Regulation (GDPR) and the upcoming California Consumer Privacy Act (CCPA) emphasizes that data subjects should at least know what happens with data about them and, in many cases, consent to the use of those data. Legal aspects aside, growing privacy sensitivity among users provokes a concomitant distrust of data collection practices like cookie tracking, leading to a rise in the use of cookie and advertising blockers.
Industry professionals who took part in the roundtable discussion complained about this distrust: all they want to do, they said, is make life easier for their users with personalized services and relevant advertising. “People want personalization, they just don’t know it” and “People think we do much worse things than we actually do” were among the comments made by participants.
A complex and obscure ecosystem
One could simplistically argue that, to reassure users and comply with the law, media companies just have to put more effort into explaining what happens with personal data. But explaining what happens with personal data in the online environment is not that straightforward. Automated systems like (news) recommendation systems or personalization algorithms are notoriously difficult to explain, especially when they are based on machine learning. And the online advertising ecosystem itself is too complex for many to follow, in particular when it comes to programmatic advertising.
As we have argued in a recent Policy Brief, this last point leads to a disparate impact of GDPR requirements on big tech companies such as Google and Facebook, on the one hand, and smaller players in the media and advertising industry on the other. The effects of requiring explicit consent for user tracking vary in such a way that they give the upper hand to the large multinationals. Users know Facebook and Google and might expect some use of their data by these ‘first party’ players, where they have an account. But research shows that most users do not expect their data to be shared with the large number of third parties in the rest of the online advertising ecosystem; users see this sharing of data with third parties as a norm violation, and if asked for consent, they would refuse it. As a result, the big platforms will own more and richer user data than smaller players whose brands are unknown to the average internet user.
The need for clear delineations
However, being big also means attracting more attention. Data protection authorities all over Europe are looking into the advertising practices of Google and Facebook, often prompted by complaints from advocacy groups like noyb, Panoptykon and La Quadrature du Net, and from privacy browser maker Brave. The 50 million euro fine for Google was the result of one such complaint. The regulator in this case based the fine on the byzantine way Google provided its explanations: users needed several clicks and had to peruse various documents to find out what their data were used for, and it was never clear which specific data were gathered, nor what the consequences of the data processing operations would be. The number of Google services that users interact with was also noted as a relevant factor, as it makes the data processing particularly massive and intrusive.
What we learn from the Google fine is that regulators place great value on clear and accessible explanations to users. Unfortunately, the case does not provide much clarity on how to be clear and accessible. To a certain extent, however, the media industry can devise its own benchmarks and standards, providing yardsticks to regulators still dealing with complaints.
Coming to acceptable explanations
Following up on the roundtable discussion, MediaRoad partner imec-SMIT-Vrije Universiteit Brussel is setting up a study to ascertain what different stakeholders consider to be acceptable explanations when asking for user consent in the online media environment. The aim of the study is to arrive at a consistent approach to explaining personal data usage throughout the media sector or, alternatively, to offer insights into the acceptability of explanations in the eyes of different stakeholder groups.
If you wish to have your opinion heard in this study, please email dataprotectionontheground@vub.be (before 1 July 2019).
—
Details on the Chair
The VUB Chair “Data Protection On the Ground” (DPOG) promotes the investigation of actual data privacy practices in organizations and the dissemination of best practices. Its research focuses on developments in four sectors: smart cities, health, media, and banking. The Chair is coordinated by the research center imec-SMIT (Studies on Media, Innovation & Technology) in collaboration with the research group LSTS (Law, Science, Technology & Society), and is supported by BNP Paribas Fortis. For more: www.dataprotectionontheground.be