A Logic Formal Validation Model for the Explanations Generation in an Intelligent Assistant
Publication Details
Output type: Other
Author list: Frausto J, Elizalde F, Reyes A
Publication year: 2008
Start page: 9
End page: 14
Number of pages: 6
ISBN: 978-0-7695-3441-1
Languages: English-Great Britain (EN-GB)
Unpaywall Data
Open access status: closed
Abstract
Given that users provided with explanations perform better than those without them, we have developed an explanation generation mechanism based on selecting the most relevant variable. However, explanations generated automatically by an Intelligent Assistant System (IAS) and those generated manually by a human expert may be inconsistent with each other. In this work, we present a formal validation model that uses first-order logic to formalize both the explanations given by the human expert and the output of the IAS. The aim of this validation is to prove the correctness of the IAS and the soundness of its explanations. Experimental results demonstrate that most of the explanations generated automatically in a training plant operator domain are consistent with, and as sound as, those provided by the expert. We consider this method a useful tool for evaluating the precision of the explanation generation mechanism, and it could also be extended to other domains.
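The abstract does not spell out the formalization, but the core idea it describes, comparing an automatically generated explanation against an expert's under a first-order encoding, can be illustrated with a minimal sketch. The Python fragment below is an assumption-laden illustration, not the paper's actual model: explanations are reduced to sets of ground literals, consistency is taken to mean the IAS asserts nothing the expert explicitly negates, and soundness is taken to mean every literal the IAS asserts also appears in the expert's explanation. All predicate and variable names (relevant, drum_pressure, feedwater_flow) are hypothetical.

```python
# Minimal sketch of validating an IAS explanation against an expert's.
# An explanation is a set of ground literals: (predicate, argument, polarity),
# where polarity True means the literal is asserted and False means negated.

def consistent(ias, expert):
    """The IAS asserts no literal that the expert explicitly negates."""
    return not any((p, a, not pol) in expert for (p, a, pol) in ias)

def sound(ias, expert):
    """Every literal asserted by the IAS also appears in the expert's explanation."""
    return all(lit in expert for lit in ias)

# Toy example in a plant-operator-style domain (names are illustrative only):
expert = {("relevant", "drum_pressure", True),
          ("relevant", "feedwater_flow", True)}
ias = {("relevant", "drum_pressure", True)}

print(consistent(ias, expert))  # True: no contradiction with the expert
print(sound(ias, expert))       # True: the IAS explanation is a subset
```

In the paper's setting a first-order theorem prover would play the role of these set-membership checks, deriving entailment rather than testing literal containment; the sketch only fixes the intuition behind "consistent" and "sound".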