Fairness is central to the ethical and responsible development and use of AI systems, and a large number of frameworks and formal notions of algorithmic fairness are available. However, many of the proposed fairness solutions revolve around technical considerations rather than the needs of, and consequences for, the most impacted communities. We therefore shift the focus away from formal definitions and toward the societal and relational aspects that shape how the effects of AI systems are experienced by individuals and social groups. In this paper, we do so by proposing the ACROCPoLis framework for representing allocation processes, with a modeling emphasis on fairness aspects. The framework provides a shared vocabulary that makes explicit the factors relevant to fairness assessments in different situations and procedures, as well as their interrelationships. This enables us to compare analogous situations, to highlight the differences in dissimilar situations, and to capture differing interpretations of the same situation by different stakeholders.
Title of host publication: Proceedings of the 6th ACM Conference on Fairness, Accountability, and Transparency, FAccT 2023
Publisher: Association for Computing Machinery
Number of pages: 12
Publication status: Published - 12 Jun 2023
Event: 6th ACM Conference on Fairness, Accountability, and Transparency, FAccT 2023 - Chicago, United States
Duration: 12 Jun 2023 → 15 Jun 2023
Series: ACM International Conference Proceeding Series
Bibliographical note (Funding Information):
We would like to thank the anonymous reviewers for their thoughtful feedback, which has led us to substantially expand the discussion of ACROCPoLis in this paper.
© 2023 Owner/Author.
- Algorithmic fairness
- Responsible AI
- Social impact of AI
- Socio-technical processes