A compression algorithm for the combination of PDF sets

Stefano Carrazza, Jose I. Latorre, Juan Rojo, Graeme Watt

Research output: Contribution to Journal › Article › Academic › peer-review

Abstract

The current PDF4LHC recommendation to estimate uncertainties due to parton distribution functions (PDFs) in theoretical predictions for LHC processes involves the combination of separate predictions computed using PDF sets from different groups, each of which comprises a relatively large number of either Hessian eigenvectors or Monte Carlo (MC) replicas. While many fixed-order and parton shower programs allow the evaluation of PDF uncertainties for a single PDF set at no additional CPU cost, this feature is not universal, and moreover the a posteriori combination of the predictions using at least three different PDF sets is still required. In this work, we present a strategy for the statistical combination of individual PDF sets, based on the MC representation of Hessian sets, followed by a compression algorithm for the reduction of the number of MC replicas. We illustrate our strategy with the combination and compression of the recent NNPDF3.0, CT14 and MMHT14 NNLO PDF sets. The resulting Compressed Monte Carlo PDF (CMC-PDF) sets are validated at the level of parton luminosities and LHC inclusive cross-sections and differential distributions. We determine that around 100 replicas provide an adequate representation of the probability distribution for the original combined PDF set, suitable for general applications to LHC phenomenology.
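The two steps described in the abstract — building Monte Carlo replicas from Hessian eigenvector sets, and then selecting a reduced subset of replicas that preserves the statistics of the full combined ensemble — can be illustrated with a minimal sketch. The sketch below runs on toy arrays rather than real PDF grids; the function names (hessian_to_mc, compress), the grid sizes, and the simple mean/variance error function with a random-swap search are all illustrative assumptions, not the released CMC-PDF code. The Hessian-to-MC step follows the standard prescription of adding Gaussian-weighted symmetrized eigenvector shifts to the central member; the paper's actual compression minimizes a more complete error function over statistical estimators of the replica ensemble.

import numpy as np

rng = np.random.default_rng(0)

def hessian_to_mc(f0, f_plus, f_minus, n_rep):
    """Generate MC replicas from a symmetric-Hessian PDF set.

    f0      : central PDF values on a grid of (x, Q) points, shape (n_points,)
    f_plus  : plus-direction eigenvector members, shape (n_eig, n_points)
    f_minus : minus-direction eigenvector members, shape (n_eig, n_points)

    Each replica is the central value plus a sum of symmetrized
    eigenvector shifts, each weighted by a standard-normal random number.
    """
    n_eig = f_plus.shape[0]
    r = rng.standard_normal((n_rep, n_eig))   # one Gaussian per eigenvector per replica
    delta = 0.5 * (f_plus - f_minus)          # symmetrized eigenvector shifts
    return f0 + r @ delta                     # shape (n_rep, n_points)

def compression_error(full, subset_idx):
    """Distance between low-order moments of the full ensemble and a
    candidate subset. A simplified stand-in for the paper's error
    function, which compares further estimators as well."""
    sub = full[subset_idx]
    return (np.mean((sub.mean(0) - full.mean(0)) ** 2)
            + np.mean((sub.std(0) - full.std(0)) ** 2))

def compress(full, n_keep, n_iter=5000):
    """Stochastic search for the subset of replicas that best preserves
    the statistics of the full set (random-swap hill climbing here,
    purely for brevity)."""
    n_rep = full.shape[0]
    best = rng.choice(n_rep, size=n_keep, replace=False)
    best_err = compression_error(full, best)
    for _ in range(n_iter):
        cand = best.copy()
        # swap one kept replica for one currently outside the subset
        out = np.setdiff1d(np.arange(n_rep), cand)
        cand[rng.integers(n_keep)] = rng.choice(out)
        err = compression_error(full, cand)
        if err < best_err:
            best, best_err = cand, err
    return best

# Toy example: 20 eigenvectors on a 50-point grid, 1000 -> 100 replicas
n_eig, n_pts = 20, 50
f0 = rng.normal(size=n_pts)
f_plus = f0 + 0.1 * rng.normal(size=(n_eig, n_pts))
f_minus = f0 - 0.1 * rng.normal(size=(n_eig, n_pts))

replicas = hessian_to_mc(f0, f_plus, f_minus, n_rep=1000)
kept = compress(replicas, n_keep=100)
print("compressed set:", replicas[kept].shape)

Even this crude search keeps the mean and standard deviation of the 100-replica subset close to those of the full 1000-replica ensemble, which is the qualitative point behind the paper's conclusion that around 100 compressed replicas adequately represent the combined probability distribution; the actual algorithm also controls higher moments and correlations.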
Original language: English
Article number: 474
Journal: European Physical Journal C. Particles and Fields
Volume: 75
Issue number: 10
DOI: 10.1140/epjc/s10052-015-3703-3
Publication status: Published - 24 Apr 2015

Bibliographical note

45 pages, 24 figures, version accepted for publication in the European Physical Journal C

Keywords

  • hep-ph
  • hep-ex
  • nucl-ex
  • nucl-th

Cite this

Carrazza, Stefano; Latorre, Jose I.; Rojo, Juan; Watt, Graeme. / A compression algorithm for the combination of PDF sets. In: European Physical Journal C. Particles and Fields. 2015; Vol. 75, No. 10, 474.
@article{d39ff95774734ac2aa3e4cfa96e364d1,
title = "A compression algorithm for the combination of PDF sets",
abstract = "The current PDF4LHC recommendation to estimate uncertainties due to parton distribution functions (PDFs) in theoretical predictions for LHC processes involves the combination of separate predictions computed using PDF sets from different groups, each of which comprises a relatively large number of either Hessian eigenvectors or Monte Carlo (MC) replicas. While many fixed-order and parton shower programs allow the evaluation of PDF uncertainties for a single PDF set at no additional CPU cost, this feature is not universal, and moreover the a posteriori combination of the predictions using at least three different PDF sets is still required. In this work, we present a strategy for the statistical combination of individual PDF sets, based on the MC representation of Hessian sets, followed by a compression algorithm for the reduction of the number of MC replicas. We illustrate our strategy with the combination and compression of the recent NNPDF3.0, CT14 and MMHT14 NNLO PDF sets. The resulting Compressed Monte Carlo PDF (CMC-PDF) sets are validated at the level of parton luminosities and LHC inclusive cross-sections and differential distributions. We determine that around 100 replicas provide an adequate representation of the probability distribution for the original combined PDF set, suitable for general applications to LHC phenomenology.",
keywords = "hep-ph, hep-ex, nucl-ex, nucl-th",
author = "Stefano Carrazza and Latorre, {Jose I.} and Juan Rojo and Graeme Watt",
note = "45 pages, 24 figures, version accepted for publication in the European Physical Journal C",
year = "2015",
month = "4",
day = "24",
doi = "10.1140/epjc/s10052-015-3703-3",
language = "English",
volume = "75",
journal = "European Physical Journal C. Particles and Fields",
issn = "1434-6044",
publisher = "Springer New York",
number = "10",
}
