Impossible Explanations? Beyond explainable AI in the GDPR from a COVID-19 use case scenario

Ronan Hamon, Hendrik Junklewitz, Gianclaudio Malgieri, Paul De Hert, Laurent Beslay, Ignacio Sanchez

Research output: Chapter in Book/Report/Conference proceeding › Conference paper

4 Citations (Scopus)

Abstract

We present a case study of a real-life scenario designed to illustrate the application of an AI-based automated decision-making process for the medical diagnosis of COVID-19 patients. The scenario exemplifies the trend towards increasingly complex machine-learning algorithms with growing dimensionality of data and model parameters. Based on this setting, we analyse the challenges of providing human-legible explanations in practice and discuss their legal implications under the General Data Protection Regulation (GDPR).
Original language: English
Title of host publication: Proceedings of ACM FAccT. ACM, New York, NY, USA, 2021. https://doi.org/10.1145/1234567890
Number of pages: 2
Publication status: Published - 2021

