Impossible Explanations? Beyond explainable AI in the GDPR from a COVID-19 use case scenario

Ronan Hamon, Hendrik Junklewitz, Gianclaudio Malgieri, Paul De Hert, Laurent Beslay, Ignacio Sanchez

Research output: Chapter in Book/Report/Conference proceeding › Conference paper

16 Citations (Scopus)

Abstract

We present a case study of a real-life scenario designed to illustrate the application of an AI-based automated decision-making process to the medical diagnosis of COVID-19 patients. The scenario exemplifies the trend towards increasingly complex machine-learning algorithms with growing dimensionality of data and model parameters. Based on this setting, we analyse the challenges of providing human-legible explanations in practice, and we discuss their legal implications under the General Data Protection Regulation (GDPR).
Original language: English
Title of host publication: FAccT 2021 - Proceedings of the 2021 ACM Conference on Fairness, Accountability, and Transparency
Publisher: ACM
Pages: 549-559
Number of pages: 11
ISBN (Electronic): 9781450383097
DOIs
Publication status: Published - 3 Mar 2021
Event: FAccT '21 - 2021 ACM Conference on Fairness, Accountability, and Transparency, March 2021
Duration: 25 Mar 2021 → …

Publication series

Name: FAccT 2021 - Proceedings of the 2021 ACM Conference on Fairness, Accountability, and Transparency

Conference

Conference: FAccT '21 - 2021 ACM Conference on Fairness, Accountability, and Transparency, March 2021
Period: 25/03/21 → …
