TY - GEN
T1 - Finding small proofs for Description Logic entailments
T2 - 23rd International Conference on Logic for Programming, Artificial Intelligence and Reasoning, LPAR23 2020
AU - Alrabbaa, Christian
AU - Baader, Franz
AU - Borgwardt, Stefan
AU - Koopmann, Patrick
AU - Kovtunova, Alisa
PY - 2020
Y1 - 2020
N2 - Logic-based approaches to AI have the advantage that their behaviour can in principle be explained by providing their users with proofs for the derived consequences. However, if such proofs get very large, then it may be hard to understand a consequence even if the individual derivation steps are easy to comprehend. This motivates our interest in finding small proofs for Description Logic (DL) entailments. Instead of concentrating on a specific DL and proof calculus for this DL, we introduce a general framework in which proofs are represented as labeled, directed hypergraphs, where each hyperedge corresponds to a single sound derivation step. On the theoretical side, we investigate the complexity of deciding whether a certain consequence has a proof of size at most n along the following orthogonal dimensions: (i) the underlying proof system is polynomial or exponential; (ii) proofs may or may not reuse already derived consequences; and (iii) the number n is represented in unary or binary. We have determined the exact worst-case complexity of this decision problem for all but one of the possible combinations of these options. On the practical side, we have developed and implemented an approach for generating proofs for expressive DLs based on a non-standard reasoning task called forgetting. We have evaluated this approach on a set of realistic ontologies and compared the obtained proofs with proofs generated by the DL reasoner ELK, finding that forgetting-based proofs are often better w.r.t. different measures of proof complexity.
UR - http://www.scopus.com/inward/record.url?scp=85091425309&partnerID=8YFLogxK
DO - 10.29007/nhpp
M3 - Conference contribution
VL - 73
T3 - EPiC Series in Computing
SP - 32
EP - 67
BT - 23rd International Conference on Logic for Programming, Artificial Intelligence and Reasoning, LPAR23 2020
A2 - Albert, E.
A2 - Kovács, L.
PB - EasyChair
Y2 - 22 May 2020 through 27 May 2020
ER -