A Logic Formal Validation Model for the Explanations Generation in an Intelligent Assistant




Publication Details

Output type: Other

Author list: Frausto J, Elizalde F, Reyes A

Publication year: 2008

Start page: 9

End page: 14

Number of pages: 6

ISBN: 978-0-7695-3441-1

Languages: English-Great Britain (EN-GB)


Unpaywall Data

Open access status: closed


Abstract

Given that users provided with explanations perform better than those without them, we have developed an explanation generation mechanism based on selecting the most relevant variable. However, explanations generated automatically by an Intelligent Assistant System (IAS) and those produced manually by a human expert may be inconsistent with each other. In this work, we present a formal validation model that uses first-order logic to formalize both the explanations given by the human expert and the output of the IAS. The aim of this validation is to prove the correctness of the IAS and the soundness of its explanations. Experimental results demonstrate that most of the explanations generated automatically in a plant-operator training domain are consistent and sound with respect to those provided by the expert. We consider this method a useful tool for evaluating the precision of the explanation generation mechanism, and one that could be extended to other domains.
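The abstract describes checking automatically generated explanations against expert-provided ones for consistency. As a minimal illustrative sketch only (the record does not give the paper's actual first-order-logic encoding, so the set-of-predicates representation and the variable names `Fwl` and `Pd` below are assumptions), one way to test whether two explanations contradict each other is to check that they never assign different values to the same variable:

```python
# Illustrative sketch: an explanation is modeled as a set of
# (variable, value) ground atoms. Two explanations are consistent
# if no variable receives two different values across them.
# Variable names ("Fwl", "Pd") are hypothetical, not from the paper.

def consistent(expert, ias):
    """Return True if the two explanations assign no variable
    two different values (i.e., their union is contradiction-free)."""
    merged = {}
    for var, val in expert | ias:       # union of both atom sets
        if merged.setdefault(var, val) != val:
            return False                # same variable, conflicting values
    return True

expert_expl = {("Fwl", "low"), ("Pd", "high")}   # hypothetical expert explanation
ias_expl = {("Fwl", "low")}                      # hypothetical IAS explanation
print(consistent(expert_expl, ias_expl))         # True: no contradictions
```

Under this toy encoding, an IAS explanation naming the same relevant variable with the same value as the expert passes the check, while one assigning a conflicting value fails it.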




Last updated on 2025-06-29 at 00:02