We investigate the problem of step-wise explaining how to solve constraint satisfaction problems. More specifically, we study how to explain the inference steps that one can take during propagation. The main challenge is finding a sequence of simple explanations, where each explanation should be as cognitively easy as possible for a human to verify and understand. This contrasts with the arbitrary combination of facts and constraints that the solver may use when propagating. We identify the explanation-production problem of finding the best sequence of explanations for the maximal consequence of a CSP. We propose the use of a cost function to quantify how simple an individual explanation of an inference step is. Our proposed algorithm iteratively constructs the explanation sequence, agnostic of the underlying constraint propagation mechanisms, by using an optimistic estimate of the cost function to guide the search for the best explanation at each step. Using reasoning by contradiction, we develop a mechanism to break up the most difficult steps and give the user the ability to zoom in on specific parts of the explanation.
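The greedy, propagation-agnostic construction described in the abstract can be sketched as follows. This is an illustrative sketch only: the Horn-rule encoding of constraints, the `derive` function, and the cost function (fewer used facts means a simpler explanation) are assumptions for the example, not the paper's actual propagators or cost model:

```python
from itertools import combinations

def cheapest_explanation(facts, constraints, derive, cost):
    """Search over (fact subset, constraint) pairs for the lowest-cost
    step that derives at least one new fact from the current facts."""
    best = None
    for c in constraints:
        for r in range(len(facts) + 1):
            for subset in combinations(sorted(facts), r):
                new = derive(set(subset), c) - facts
                if new:
                    step = (set(subset), c, new)
                    if best is None or cost(step) < cost(best):
                        best = step
    return best

def explain_sequence(facts, constraints, derive, cost):
    """Greedily build a sequence of simple explanation steps until
    propagation reaches a fixpoint (the maximal consequence)."""
    facts = set(facts)
    sequence = []
    while True:
        step = cheapest_explanation(facts, constraints, derive, cost)
        if step is None:
            return sequence
        used, constraint, new = step
        sequence.append(step)
        facts |= new

# Toy usage: constraints encoded as Horn rules (premises, conclusion).
rules = [(frozenset({"a"}), "b"), (frozenset({"a", "b"}), "c")]

def derive(subset, rule):
    premises, conclusion = rule
    return {conclusion} if premises <= subset else set()

def cost(step):
    used, rule, new = step
    return len(used) + 1  # explanations that use fewer facts are simpler

seq = explain_sequence({"a"}, rules, derive, cost)
```

Here `seq` contains two steps: first `b` is derived from `a` alone (the cheapest available step), and only then is `c` derived from `a` and `b`, illustrating how the cost function sequences inference steps from simple to harder.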
Period: 13 Sep 2020
Event title: Explainable Logic-Based Knowledge Representation
Location: Rhodes, Greece
Degree of recognition: International