Scholarship Reporter Newsletter

December 2018

Why a Right to Legibility of Automated Decision-Making Exists in the General Data Protection Regulation

This paper analyzes the GDPR’s “right to explanation.” The authors draw a clear distinction between different levels of information and of consumers’ awareness, and they propose a new concept, algorithmic “legibility,” which combines transparency and comprehensibility.

The authors argue that a systemic interpretation of Articles 13–15 and 22 GDPR is needed in this field, and they recommend a “legibility test” that data controllers should perform in order to comply with the duty to provide meaningful information about the logic involved in automated decision-making.

Abstract

The aim of this contribution is to analyse the real borderlines of the ‘right to explanation’ in the GDPR and to discretely distinguish between different levels of information and of consumers’ awareness in the ‘black box’ society. In order to combine transparency and comprehensibility we propose the new concept of algorithm ‘legibility’.

We argue that a systemic interpretation is needed in this field, since it can be beneficial not only for individuals but also for businesses. This may be an opportunity for auditing algorithms and correcting unknown machine biases, thus similarly enhancing the quality of decision-making outputs.

Accordingly, we show how a systemic interpretation of Articles 13–15 and 22 GDPR is necessary, considering in particular that:

- the threshold of minimum human intervention required so that the decision-making is ‘solely’ automated (Article 22(1)) can also include nominal human intervention;
- the envisaged ‘significant effects’ on individuals (Article 22(1)) can encompass as well marketing manipulation, price discrimination, etc;
- ‘meaningful information’ that should be provided to data subjects about the logic, significance and consequences of decision-making (Article 15(1)(h)) should be read as ‘legibility’ of ‘architecture’ and ‘implementation’ of algorithmic processing;
- trade secret protection might limit the right of access of data subjects, but there is a general legal favour for data protection rights that should reduce the impact of trade secrets protection.

In addition, we recommend a ‘legibility test’ that data controllers should perform in order to comply with the duty to provide meaningful information about the logic involved in automated decision-making.

"Why a Right to Legibility of Automated Decision-Making Exists in the General Data Protection Regulation" by G. Malgieri and G. Comandé, International Data Privacy Law, Volume 7, Issue 4, 1 November 2017, Pages 243–265