Careful Explanations: A Feminist Perspective on XAI

State, Laura
2023

Abstract

Explainable artificial intelligence (XAI) is a rapidly growing research field that has received considerable attention in recent years. An important goal of the field is to use its methods to detect (social) bias and discrimination. Despite these positive intentions, aspects of XAI can conflict with feminist approaches and values. Our conceptual contribution therefore brings forward both a careful assessment of current XAI methods and visions for carefully doing XAI from a feminist perspective. We conclude with a discussion of the possibilities for caring XAI and the challenges that might lie along the way.
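
As a purely illustrative sketch of the bias-detection use case named in the abstract (not a method from the paper or this record): one common pattern is to train a model and then inspect feature attributions to see whether a protected attribute drives its predictions. All data, feature names, and the choice of scikit-learn's permutation importance below are hypothetical choices made for this example.

    # Sketch: probing a toy model for bias via feature attributions.
    # All data and feature names are synthetic/hypothetical.
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.inspection import permutation_importance

    rng = np.random.default_rng(0)
    n = 1000
    gender = rng.integers(0, 2, n)       # hypothetical protected attribute
    income = rng.normal(50.0, 10.0, n)   # hypothetical legitimate feature
    # Labels constructed so that they leak the protected attribute.
    y = ((income + 15 * gender + rng.normal(0.0, 5.0, n)) > 60).astype(int)

    X = np.column_stack([gender, income])
    model = LogisticRegression().fit(X, y)

    # High importance on the protected attribute is a signal that the
    # model's decisions may encode discrimination and warrant auditing.
    result = permutation_importance(model, X, y, n_repeats=30, random_state=0)
    for name, imp in zip(["gender", "income"], result.importances_mean):
        print(f"{name}: {imp:.3f}")

On this synthetic data, "gender" receives a clearly non-zero importance, which is the kind of signal XAI methods are used to surface; whether such a signal amounts to discrimination remains a contextual judgement.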
Field: INF/01 - Computer Science
EWAF’23: European Workshop on Algorithmic Fairness (EWAF 2023)
Winterthur, Switzerland
7-9 June 2023
RWTH Aachen University
Keywords: explainable AI; feminism; care work
Funding: NoBIAS - Artificial Intelligence without Bias, European Commission, Horizon 2020 Framework Programme, grant no. 860630
Files in this product:

state_careful_explanations.pdf

Access: open access
Description: Careful Explanations: A Feminist Perspective on XAI
Type: Published version
License: Creative Commons
Size: 189.77 kB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11384/140505
Citations
  • PMC: not available
  • Scopus: 0
  • Web of Science: not available