WebPIE: A web-scale parallel inference engine using MapReduce

Jacopo Urbani, Spyros Kotoulas, Jason Maassen, Frank Van Harmelen, Henri Bal

Research output: Contribution to Journal › Article › Academic › peer-review

Abstract

The large amount of Semantic Web data and its fast growth pose a significant computational challenge to efficient and scalable reasoning. At large scale, the resources of a single machine are no longer sufficient, and the process must be distributed to improve performance. The article attached to our submission [1] tackles this problem by proposing a methodology for performing inference, materializing every possible consequence, with the MapReduce programming model. We introduce a number of optimizations that address the issues a naive implementation would raise and improve the overall performance. We have implemented the presented techniques in a prototype called WebPIE, and our evaluation shows that the approach can perform complex inference based on the OWL language over a very large input of about 100 billion triples. To the best of our knowledge, it is the only approach that demonstrates complex inference over an input of a hundred billion triples.
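
To illustrate the general idea (this is a minimal sketch, not WebPIE's actual code), the following self-contained Java program shows how a single RDFS rule, rdfs9 ((c rdfs:subClassOf d) and (x rdf:type c) imply (x rdf:type d)), can be expressed as a map/reduce pair: the map phase groups triples on the class they mention, and the reduce phase performs the join that derives new rdf:type triples. The shuffle is simulated with an in-memory map; in a real deployment this would be a Hadoop job over partitioned triple files. All class and method names here are hypothetical.

import java.util.*;

// Minimal sketch (not WebPIE's actual code) of how one RDFS rule,
// rdfs9: (c rdfs:subClassOf d) AND (x rdf:type c)  =>  (x rdf:type d),
// maps onto a MapReduce job. The shuffle is simulated with an in-memory map.
public class Rdfs9Sketch {

    record Triple(String s, String p, String o) {}

    static final String TYPE = "rdf:type";
    static final String SUBCLASS = "rdfs:subClassOf";

    // Map phase: key each relevant triple on the class it mentions, tagging
    // the value so the reducer can distinguish instances from superclasses.
    static void map(Triple t, Map<String, List<String[]>> shuffle) {
        if (t.p().equals(SUBCLASS)) {
            shuffle.computeIfAbsent(t.s(), k -> new ArrayList<>())
                   .add(new String[]{"super", t.o()});
        } else if (t.p().equals(TYPE)) {
            shuffle.computeIfAbsent(t.o(), k -> new ArrayList<>())
                   .add(new String[]{"inst", t.s()});
        }
    }

    // Reduce phase: join instances and superclasses that share the same key
    // (the class), emitting the derived rdf:type triples.
    static List<Triple> reduce(String clazz, List<String[]> values) {
        List<String> supers = new ArrayList<>(), insts = new ArrayList<>();
        for (String[] v : values) (v[0].equals("super") ? supers : insts).add(v[1]);
        List<Triple> derived = new ArrayList<>();
        for (String x : insts)
            for (String d : supers)
                derived.add(new Triple(x, TYPE, d));
        return derived;
    }

    public static void main(String[] args) {
        List<Triple> input = List.of(
            new Triple(":Dog", SUBCLASS, ":Animal"),
            new Triple(":rex", TYPE, ":Dog"));
        Map<String, List<String[]>> shuffle = new HashMap<>();
        input.forEach(t -> map(t, shuffle));
        shuffle.forEach((k, vs) -> reduce(k, vs)
            .forEach(t -> System.out.println(t.s() + " " + t.p() + " " + t.o())));
        // prints: :rex rdf:type :Animal
    }
}

A naive job of this kind re-derives many duplicate triples and reshuffles the full input on every iteration until a fixpoint is reached; the optimizations mentioned in the abstract target exactly these issues.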

Original language: English
Journal: Belgian/Netherlands Artificial Intelligence Conference
Publication status: Published - 2012

Cite this

@article{34d46a307ff4440a8fb5efa26a3fb321,
title = "WebPIE: A web-scale parallel inference engine using MapReduce",
author = "Jacopo Urbani and Spyros Kotoulas and Jason Maassen and {Van Harmelen}, Frank and Henri Bal",
year = "2012",
language = "English",
journal = "Belgian/Netherlands Artificial Intelligence Conference",
issn = "1568-7805"
}
