Explanation for defeasible entailment

dc.contributor.advisor: Meyer, Thomas
dc.contributor.author: Chama, Victoria
dc.date.accessioned: 2020-09-10T07:52:08Z
dc.date.available: 2020-09-10T07:52:08Z
dc.date.issued: 2020
dc.date.updated: 2020-09-10T07:51:46Z
dc.description.abstract: Explanation facilities are an essential part of tools for knowledge representation and reasoning systems. Knowledge representation and reasoning systems allow users to capture information about the world and reason about it. They are useful in understanding entailments, which allow users to derive implicit knowledge that can be made explicit through inference. Explanations also assist users in debugging and repairing knowledge bases when conflicts arise. Understanding the conclusions drawn from logic-based systems is complex and requires expert knowledge, and this holds for both expert and general users, especially when defeasible knowledge bases are taken into account. A defeasible knowledge base contains statements that can be retracted because they refer to information to which there are exceptions; that is, a defeasible statement is one that may be withdrawn upon learning of an exception. For classical logics such as description logics, which are well-known formalisms for reasoning about information in a given domain, explanations are provided through the notion of justifications. In the classical case, simply listing the statements responsible for an entailment is enough to justify it. In the defeasible case, however, where entailed statements can be retracted, this is not adequate, because entailment is computed in a more complicated way than in the classical case. In this dissertation, we combine explanations with a particular approach to defeasible reasoning. We provide an algorithm to compute justification-based explanations for defeasible knowledge bases. We show that, in order to derive justifications for defeasible knowledge bases accurately, we need to establish the point at which conflicts arise by using an algorithm that computes a ranking of the defeasible statements. This means that only a portion of the knowledge base is considered, because the statements that cause conflicts are discarded. The final algorithm consists of two parts: the first establishes the point at which conflicts occur, and the second uses the information obtained from the first to compute justifications for defeasible knowledge bases.
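The abstract outlines a two-part procedure: first rank the defeasible statements so that the point at which conflicts arise can be established (discarding irreparably conflicting statements), then compute justifications over what remains. The dissertation's own algorithms are not reproduced here; the sketch below is a minimal propositional illustration, assuming the ranking follows the Base Rank construction used in rational closure (a rule's antecedent is *exceptional* when the materialised rules classically entail its negation) and that a justification is a minimal entailing subset. The tuple-based formula encoding and all names are our own.

```python
from itertools import combinations, product

# Formulas are nested tuples: ('atom', name), ('not', f),
# ('and', f, g), ('or', f, g), ('implies', f, g).

def atoms(f):
    """Collect the atom names occurring in a formula."""
    if f[0] == 'atom':
        return {f[1]}
    return set().union(*(atoms(sub) for sub in f[1:]))

def holds(f, v):
    """Evaluate a formula under a truth assignment v (dict: atom -> bool)."""
    op = f[0]
    if op == 'atom':
        return v[f[1]]
    if op == 'not':
        return not holds(f[1], v)
    if op == 'and':
        return holds(f[1], v) and holds(f[2], v)
    if op == 'or':
        return holds(f[1], v) or holds(f[2], v)
    if op == 'implies':
        return (not holds(f[1], v)) or holds(f[2], v)
    raise ValueError(f'unknown connective {op!r}')

def entails(premises, goal):
    """Classical entailment by truth-table enumeration (small examples only)."""
    names = sorted(set().union(atoms(goal), *(atoms(p) for p in premises)))
    for bits in product([False, True], repeat=len(names)):
        v = dict(zip(names, bits))
        if all(holds(p, v) for p in premises) and not holds(goal, v):
            return False
    return True

def base_rank(rules):
    """Rank defeasible rules (antecedent, consequent) by exceptionality.

    Returns (ranks, infinite): `ranks` maps each rule to its level;
    `infinite` holds the statements that never stop being exceptional,
    i.e. the conflicting statements that get discarded.
    """
    ranks, current, level = {}, list(rules), 0
    while current:
        materialised = [('implies', a, c) for a, c in current]
        exceptional = [(a, c) for a, c in current
                       if entails(materialised, ('not', a))]
        if len(exceptional) == len(current):   # no progress: infinite rank
            return ranks, current
        for rule in current:
            if rule not in exceptional:
                ranks[rule] = level
        current, level = exceptional, level + 1
    return ranks, []

def justifications(rules, query):
    """All minimal subsets of `rules` whose materialisations entail `query`."""
    mats = {r: ('implies', r[0], r[1]) for r in rules}
    found = []
    for size in range(1, len(rules) + 1):
        for subset in combinations(rules, size):
            if any(set(j) <= set(subset) for j in found):
                continue                       # a smaller justification exists
            if entails([mats[r] for r in subset], query):
                found.append(subset)
    return found

# Classic example: birds fly, penguins are birds, penguins do not fly.
b, p, f = ('atom', 'bird'), ('atom', 'penguin'), ('atom', 'flies')
rules = [(b, f), (p, b), (p, ('not', f))]
ranks, discarded = base_rank(rules)
# ranks: {(b, f): 0, (p, b): 1, (p, ('not', f)): 1}; discarded: []
```

On this example the two penguin statements land at level 1 because the materialised knowledge base classically entails that there are no penguins; the ranking is exactly the information a defeasible justification procedure would consult before deciding which statements may participate in an explanation.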
dc.identifier.apacitation: Chama, V. (2020). <i>Explanation for defeasible entailment</i>. Faculty of Science, Department of Computer Science. Retrieved from http://hdl.handle.net/11427/32206
dc.identifier.chicagocitation: Chama, Victoria. <i>"Explanation for defeasible entailment."</i> Faculty of Science, Department of Computer Science, 2020. http://hdl.handle.net/11427/32206
dc.identifier.citation: Chama, V. 2020. Explanation for defeasible entailment. Faculty of Science, Department of Computer Science. http://hdl.handle.net/11427/32206
dc.identifier.ris:
TY - Master Thesis
AU - Chama, Victoria
DA - 2020
DB - OpenUCT
DP - University of Cape Town
KW - Computer Science
LK - https://open.uct.ac.za
PY - 2020
T1 - Explanation for defeasible entailment
TI - Explanation for defeasible entailment
UR - http://hdl.handle.net/11427/32206
ER -
dc.identifier.uri: http://hdl.handle.net/11427/32206
dc.identifier.vancouvercitation: Chama V. Explanation for defeasible entailment. Faculty of Science, Department of Computer Science; 2020 [cited yyyy month dd]. Available from: http://hdl.handle.net/11427/32206
dc.language.rfc3066: eng
dc.publisher.department: Department of Computer Science
dc.publisher.faculty: Faculty of Science
dc.subject: Computer Science
dc.title: Explanation for defeasible entailment
dc.type: Master Thesis
dc.type.qualificationlevel: Masters
dc.type.qualificationlevel: MSc
Files
Original bundle
Name: thesis_sci_2020_chama victoria.pdf
Size: 953.14 KB
Format: Adobe Portable Document Format
License bundle
Name: license.txt
Size: 0 B
Format: Item-specific license agreed upon to submission