Methodological Principles for Reproducible Performance Evaluation in Cloud Computing

Alessandro Vittorio Papadopoulos*, Laurens Versluis, Andre Bauer, Nikolas Herbst, Jóakim von Kistowski, Ahmed Ali-Eldin, Cristina L. Abad, José Nelson Amaral, Petr Tůma, Alexandru Iosup

*Corresponding author for this work

Research output: Contribution to Journal › Article › Academic › peer-review

Abstract

The rapid adoption and the diversification of cloud computing technology heighten the importance of a sound experimental methodology for this domain. This work investigates how to measure and report performance in the cloud, and how well the cloud research community is already doing it. We propose a set of eight important methodological principles that combine best practices from nearby fields with concepts applicable only to clouds, and with new ideas about the time-accuracy trade-off. We show how these principles can be applied in a practical use-case experiment. To this end, we analyze the ability of the newly released SPEC Cloud IaaS benchmark to follow the principles, and showcase real-world experimental studies in common cloud environments that meet the principles. Last, we report on a systematic literature review covering top conferences and journals in the field from 2012 to 2017, analyzing whether the practice of reporting cloud performance measurements follows the proposed eight principles. Worryingly, this systematic survey and the subsequent two rounds of human review reveal that few of the published studies follow the eight experimental principles. We conclude that, although these principles are simple and basic, the cloud community has yet to adopt them broadly to deliver sound measurements of cloud environments.
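To give a concrete sense of the kind of reporting such principles encourage, the sketch below summarizes repeated benchmark runs with a median and a nonparametric bootstrap confidence interval instead of a single number. This is an illustration only, not the paper's prescribed procedure; the function name, parameters, and sample response times are hypothetical.

# Illustrative sketch: summarizing repeated cloud benchmark runs with a median
# and a bootstrap percentile confidence interval, rather than a single run.
# All sample values below are hypothetical.
import random
import statistics

def bootstrap_ci(samples, stat=statistics.median, iterations=10_000, alpha=0.05):
    """Nonparametric bootstrap confidence interval for a summary statistic."""
    estimates = []
    for _ in range(iterations):
        # Resample with replacement and recompute the statistic.
        resample = [random.choice(samples) for _ in samples]
        estimates.append(stat(resample))
    estimates.sort()
    lo = estimates[int((alpha / 2) * iterations)]
    hi = estimates[int((1 - alpha / 2) * iterations) - 1]
    return lo, hi

# Hypothetical response-time measurements (ms) from repeated runs of a workload.
runs = [118.2, 121.5, 119.9, 130.4, 122.7, 120.1, 145.3, 119.4, 123.8, 121.0]

median = statistics.median(runs)
low, high = bootstrap_ci(runs)
print(f"median = {median:.1f} ms, 95% CI = [{low:.1f}, {high:.1f}] ms over {len(runs)} runs")

Reporting the number of repetitions together with a dispersion estimate, as in this output, is one simple way to make a cloud measurement reproducible and comparable across studies.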

Original language: English
Article number: 8758926
Pages (from-to): 1528-1543
Number of pages: 16
Journal: IEEE Transactions on Software Engineering
Volume: 47
Issue number: 8
Early online date: 10 Jul 2019
DOIs
Publication status: Published - 1 Aug 2021

Bibliographical note

Funding Information:
Cristina L. Abad received the MS and PhD degrees from the Computer Science Department, University of Illinois at Urbana-Champaign, where she was a recipient of the Computer Science Excellence Fellowship and a Fulbright Scholarship. She is a professor at Escuela Superior Politecnica del Litoral, ESPOL, in Guayaquil–Ecuador. From 2011 through 2014, she was a member of the Hadoop Core team at Yahoo, Inc. She has been the recipient of two Google Faculty Research Awards (2015 and 2017). Her main research interests lie in the area of distributed systems. She is a member of the IEEE.

Funding Information:
This work was also supported by the Swedish Foundation for Strategic Research under the project “Future factories in the cloud (FiC)” with grant number GMT14-0032, by the Knowledge Foundation (KKS), by the German Research Foundation (DFG) under grant No. KO 3445/11-1, by the US NSF grant No. 1836752, by the Wallenberg Foundation, by the Swedish strategic research programme eSSENCE, by a Google Faculty Research Award, by the Natural Sciences and Engineering Research Council of Canada (NSERC), by the ECSEL Joint Undertaking (JU) grant No. 783162, by the Dutch NWO Vidi grant MagnaData, by generous donations from Oracle and Intel Labs, both USA, and from Solvinity, the Netherlands, and by SPEC Research.

Publisher Copyright:
© 1976-2012 IEEE.

Keywords

  • Experimental evaluation
  • experimentation
  • observation study
