
Reason to explain: interactive contrastive explanations (REASONX)

State, Laura
2023

Abstract

Many high-performing machine learning models are not interpretable. As they are increasingly used in decision scenarios that can critically affect individuals, it is necessary to develop tools to better understand their outputs. Popular explanation methods include contrastive explanations. However, they suffer from several shortcomings, among others an insufficient incorporation of background knowledge and a lack of interactivity. While (dialogue-like) interactivity is important to better communicate an explanation, background knowledge has the potential to significantly improve its quality, e.g., by adapting the explanation to the needs of the end-user. To close this gap, we present reasonx, an explanation tool based on Constraint Logic Programming (CLP). reasonx provides interactive contrastive explanations that can be augmented by background knowledge, and can operate under a setting of under-specified information, leading to increased flexibility in the provided explanations. reasonx computes factual and contrastive decision rules, as well as closest contrastive examples. It provides explanations for decision trees, which can be the ML models under analysis, or global/local surrogate models of any ML model. While the core of reasonx is built on CLP, we also provide a program layer that allows the explanations to be computed via Python, making the tool accessible to a wider audience. We illustrate the capability of reasonx on a synthetic data set, and on a well-developed example in the credit domain. In both cases, we show how reasonx can be flexibly used and tailored to the needs of the user.
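To give a rough intuition for the "closest contrastive example" notion in the abstract: for an axis-aligned decision tree, each leaf corresponds to a box of feature space, so the closest instance classified as the contrastive class can be found by projecting the original instance onto the box of every leaf of that class and keeping the nearest projection. The sketch below illustrates this idea on a scikit-learn tree; it is a minimal illustration only, not the REASONX interface (which encodes such constraints in CLP and supports background knowledge), and the helper names `leaf_boxes` and `closest_contrastive` are hypothetical.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier

def leaf_boxes(tree):
    """Enumerate (leaf_id, lower, upper) axis-aligned boxes of a fitted tree."""
    t = tree.tree_
    boxes = []
    def recurse(node, lo, hi):
        if t.children_left[node] == -1:          # leaf node
            boxes.append((node, lo, hi))
            return
        f, thr = t.feature[node], t.threshold[node]
        lo_r, hi_l = lo.copy(), hi.copy()
        hi_l[f] = min(hi_l[f], thr)              # left branch: x[f] <= thr
        lo_r[f] = max(lo_r[f], thr)              # right branch: x[f] > thr
        recurse(t.children_left[node], lo, hi_l)
        recurse(t.children_right[node], lo_r, hi)
    n = t.n_features
    recurse(0, np.full(n, -np.inf), np.full(n, np.inf))
    return boxes

def closest_contrastive(tree, x, target_class, eps=1e-6):
    """Closest point (L2 distance) that the tree classifies as target_class,
    found by projecting x onto the box of each leaf of that class.
    Assumes class labels coincide with class indices (e.g., 0/1 labels)."""
    t = tree.tree_
    best, best_dist = None, np.inf
    for leaf, lo, hi in leaf_boxes(tree):
        if np.argmax(t.value[leaf]) != target_class:
            continue
        # Clip onto the box; lower bounds are strict (x[f] > thr), so step by eps.
        z = np.clip(x, np.where(np.isfinite(lo), lo + eps, lo), hi)
        dist = np.linalg.norm(z - x)
        if dist < best_dist:
            best, best_dist = z, dist
    return best, best_dist

# Tiny usage example on synthetic data.
X, y = make_classification(n_samples=200, n_features=4, random_state=0)
clf = DecisionTreeClassifier(max_depth=4, random_state=0).fit(X, y)
x = X[0]
target = 1 - clf.predict(x.reshape(1, -1))[0]
z, dist = closest_contrastive(clf, x, target)
print(f"flip to class {target} at L2 distance {dist:.3f}")
```

REASONX goes beyond this sketch by expressing such leaf constraints, background knowledge, and under-specified instances as linear constraints solved within CLP, which is what enables the interactive, knowledge-augmented explanations described above.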
Academic discipline: INF/01 - Informatica (Computer Science)
1st World Conference on eXplainable Artificial Intelligence, xAI 2023
Explainable Artificial Intelligence: First World Conference, xAI 2023, Lisbon, Portugal, July 26–28, 2023, Proceedings
Springer
ISBN: 9783031440632
Explainable AI; Contrastive Explanations; Background Knowledge; Interactivity; Constraint Logic Programming
Funding:
   Artificial Intelligence without Bias (NoBIAS), European Commission, Horizon 2020 Framework Programme, grant no. 860630
   Science and technology for the explanation of AI decision making (XAI), European Commission, Horizon 2020 Framework Programme, grant no. 834756
Files in this product:

state_reason_to_explain.pdf
Open access
Description: Reason to explain: Interactive contrastive explanations (REASONX)
Type: Submitted version (pre-print)
License: Read-only
Size: 591.4 kB
Format: Adobe PDF

Pages 421-425 from state_reason_to_explain.pdf
Closed access
Type: Published version
License: Not public
Size: 196.92 kB
Format: Adobe PDF (request a copy)

Pages 425-427 from state_reason_to_explain-3.pdf
Closed access
Type: Published version
License: Not public
Size: 114.71 kB
Format: Adobe PDF (request a copy)

Pages 428-429 from state_reason_to_explain-5.pdf
Closed access
Type: Published version
License: Not public
Size: 8.02 MB
Format: Adobe PDF (request a copy)

Pages 430-433 from state_reason_to_explain-6.pdf
Closed access
Type: Published version
License: Not public
Size: 8.08 MB
Format: Adobe PDF (request a copy)

Page 434 from state_reason_to_explain-10.pdf
Closed access
Type: Published version
License: Not public
Size: 79.17 kB
Format: Adobe PDF (request a copy)

Pages 435-437 from state_reason_to_explain-11.pdf
Closed access
Type: Published version
License: Not public
Size: 71.78 kB
Format: Adobe PDF (request a copy)

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11384/140503
Citations
  • Scopus: 0