Characterization of a big data storage workload in the cloud

Sacheendra Talluri, Cristina L. Abad, Alicja Łuszczak, Alexandru Iosup

Research output: Chapter in Book / Report / Conference proceeding › Conference contribution › Academic › peer-review


Abstract

The proliferation of big data processing platforms has led to radically different system designs, such as MapReduce and the newer Spark. Understanding the workloads of such systems facilitates tuning and could foster new designs. However, whereas MapReduce workloads have been characterized extensively, relatively little public knowledge exists about the characteristics of Spark workloads in representative environments. To address this problem, in this work we collect and analyze a 6-month Spark workload from a major provider of big data processing services, Databricks. Our analysis focuses on a number of key features, such as the long-term trends of reads and modifications, the statistical properties of reads, and the popularity of clusters and of file formats. Overall, we present numerous findings that could form the basis of new systems studies and designs. Our quantitative evidence and its analysis suggest the existence of daily and weekly load imbalances, of heavy-tailed and bursty behaviour, of the relative rarity of modifications, and of the proliferation of big-data-specific file formats.
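As an illustrative aside (not the paper's method), one common way to probe a workload trace for the heavy-tailed behaviour mentioned above is to examine the empirical complementary CDF (CCDF) of read sizes and estimate a tail index, e.g. with a Hill estimator. The sketch below uses synthetic, Pareto-distributed read sizes purely for demonstration; the variable names and the 1024-byte scale factor are assumptions, not values from the study.

```python
# Minimal sketch: empirical CCDF and a crude Hill tail-index estimate
# on synthetic (Pareto-distributed) read sizes. Illustrative only; the
# data, scale factor, and 5% tail fraction are assumptions for the demo.
import math
import random

random.seed(42)

# Synthetic read sizes in bytes: Pareto with tail index alpha = 1.5
# (a heavy-tailed distribution), scaled by a hypothetical 1 KiB unit.
alpha = 1.5
reads = [1024.0 * random.paretovariate(alpha) for _ in range(10000)]

def ccdf(samples):
    """Return (value, P[X > value]) pairs of the empirical CCDF."""
    xs = sorted(samples)
    n = len(xs)
    return [(x, 1.0 - (i + 1) / n) for i, x in enumerate(xs)]

points = ccdf(reads)

# Hill estimator over the top 5% of samples: alpha_hat = k / sum(log(x_i / x_min)),
# where x_min is the smallest sample in the tail. Multiplicative scaling
# (the 1024 factor) cancels in the log-ratios.
k = len(reads) // 20
tail = sorted(reads)[-k:]
x_min = tail[0]
hill = k / sum(math.log(x / x_min) for x in tail)
print(f"Hill tail-index estimate: {hill:.2f} (true alpha = {alpha})")
```

On a log-log plot, the CCDF of a heavy-tailed sample is approximately a straight line with slope -alpha, whereas a light-tailed (e.g. exponential) sample drops off much faster; the Hill estimate should land near the true alpha for Pareto data.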

Original language: English
Title of host publication: ICPE 2019 - Proceedings of the 2019 ACM/SPEC International Conference on Performance Engineering
Place of publication: New York, NY
Publisher: Association for Computing Machinery, Inc
Pages: 33-44
Number of pages: 12
ISBN (electronic): 9781450362399
DOIs
Publication status: Published - 4 Apr 2019
Event: 10th ACM/SPEC International Conference on Performance Engineering, ICPE 2019 - Mumbai, India
Duration: 7 Apr 2019 - 11 Apr 2019

Conference

Conference: 10th ACM/SPEC International Conference on Performance Engineering, ICPE 2019
Country: India
City: Mumbai
Period: 7/04/19 - 11/04/19

