Scalable RDF data compression with MapReduce

J. Urbani, J. Maassen, N. Drost, F.J. Seinstra, H.E. Bal

Research output: Contribution to Journal › Article › Academic › peer-review

Abstract

The Semantic Web contains many billions of statements, released using the Resource Description Framework (RDF) data model. To handle these large amounts of data efficiently, high-performance RDF applications must apply a compression technique. Unfortunately, because of the sheer input size, this compression is itself challenging. In this paper, we propose a set of distributed MapReduce algorithms to efficiently compress and decompress large amounts of RDF data. Our approach uses a dictionary encoding technique that preserves the structure of the data. We highlight the problems of distributed data compression and describe the solutions we propose. We have implemented a prototype using the Hadoop framework and evaluate its performance. We show that our approach efficiently compresses large amounts of data and scales linearly with both input size and number of nodes. Copyright © 2012 John Wiley & Sons, Ltd.
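
The record does not include code, but the following is a minimal sketch, using Hadoop's Java MapReduce API, of how dictionary encoding of RDF terms can be expressed as a single MapReduce job: the mapper emits every term of each triple, the shuffle phase deduplicates them, and each reducer assigns numeric IDs drawn from a disjoint range derived from its task number. The class names, the bit-shift ID scheme, and the naive N-Triples tokenization are illustrative assumptions, not the authors' actual algorithm.

// Sketch only: dictionary encoding of RDF terms with Hadoop MapReduce.
// Assumes N-Triples-like input, one triple per line. Not the paper's code.
import java.io.IOException;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.NullWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class DictionaryEncoding {

  // Emit each RDF term of every triple; duplicates collapse in the shuffle.
  public static class TermMapper
      extends Mapper<LongWritable, Text, Text, NullWritable> {
    @Override
    protected void map(LongWritable key, Text value, Context ctx)
        throws IOException, InterruptedException {
      // Naive tokenization; a real parser must handle literals with spaces.
      String[] terms = value.toString().trim().split("\\s+");
      for (int i = 0; i < Math.min(terms.length, 3); i++) {
        ctx.write(new Text(terms[i]), NullWritable.get());
      }
    }
  }

  // Assign each distinct term a numeric ID. The high bits encode the reducer
  // task number, so IDs assigned by different reducers never collide
  // (an assumption of this sketch, not necessarily the paper's scheme).
  public static class IdReducer
      extends Reducer<Text, NullWritable, Text, LongWritable> {
    private long taskId;
    private long counter;

    @Override
    protected void setup(Context ctx) {
      taskId = ctx.getTaskAttemptID().getTaskID().getId();
      counter = 0;
    }

    @Override
    protected void reduce(Text term, Iterable<NullWritable> ignored,
        Context ctx) throws IOException, InterruptedException {
      long id = (taskId << 40) | counter++;  // disjoint per-reducer ID ranges
      ctx.write(term, new LongWritable(id));
    }
  }

  public static void main(String[] args) throws Exception {
    Job job = Job.getInstance(new Configuration(), "rdf-dictionary-encoding");
    job.setJarByClass(DictionaryEncoding.class);
    job.setMapperClass(TermMapper.class);
    job.setReducerClass(IdReducer.class);
    job.setMapOutputKeyClass(Text.class);
    job.setMapOutputValueClass(NullWritable.class);
    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(LongWritable.class);
    FileInputFormat.addInputPath(job, new Path(args[0]));
    FileOutputFormat.setOutputPath(job, new Path(args[1]));
    System.exit(job.waitForCompletion(true) ? 0 : 1);
  }
}

A second job would then join the triples against this term-to-ID dictionary to produce the encoded dataset, and decompression would run the inverse join; the paper's algorithms additionally address the load-balancing and coordination problems that this sketch ignores.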
Original language: English
Pages (from-to): 24-39
Journal: Concurrency and Computation: Practice and Experience
Volume: 25
Issue number: 1
DOIs
Publication status: Published - 2013
