An asymptotic analysis of distributed nonparametric methods

Botond Szabó, Harry Van Zanten

Research output: Contribution to Journal › Article › Academic › peer-review

Abstract

We investigate and compare the fundamental performance of several distributed learning methods that have been proposed recently. We do this in the context of a distributed version of the classical signal-in-Gaussian-white-noise model, which serves as a benchmark for studying performance in this setting. The results show how the design and tuning of a distributed method can have a great impact on convergence rates and on the validity of uncertainty quantification. Moreover, we highlight the difficulty of designing nonparametric distributed procedures that automatically adapt to smoothness.
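The benchmark described above can be illustrated with a small simulation. The sketch below is an assumption-laden toy version, not the paper's actual procedures: data in the sequence formulation of the signal-in-white-noise model are split over `m` machines, each machine computes a simple projection estimator, and a central machine aggregates the local estimates by averaging. The truth `theta`, the smoothness `beta`, and the cutoff rule are all illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

n = 10_000   # total "sample size": the global noise level is 1/sqrt(n)
m = 10       # number of machines the data are distributed over
d = 200      # number of observed coefficients (truncated sequence model)
beta = 1.0   # assumed smoothness of the truth (illustrative)

# A Sobolev-type truth: coefficients decaying like i^{-(beta + 1/2)}.
theta = np.arange(1, d + 1, dtype=float) ** (-(beta + 0.5))

# In the sequence formulation, each machine sees the signal plus white
# noise with inflated variance m/n, reflecting its 1/m share of the data.
local_estimates = []
for _ in range(m):
    y = theta + np.sqrt(m / n) * rng.standard_normal(d)
    # Local projection estimator: keep only the first k coefficients.
    k = int(n ** (1 / (1 + 2 * beta)))  # oracle-tuned cutoff (assumes beta known)
    est = np.zeros(d)
    est[:k] = y[:k]
    local_estimates.append(est)

# The central machine averages the local estimates, which brings the
# noise variance on the kept coordinates back down towards 1/n.
aggregated = np.mean(local_estimates, axis=0)

risk = float(np.sum((aggregated - theta) ** 2))
print(f"squared error of averaged estimator: {risk:.5f}")
```

Note that the cutoff `k` is tuned using the true smoothness `beta`; the difficulty of choosing such tuning parameters adaptively, without knowing the smoothness, is exactly the issue the abstract highlights.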

Original language: English
Journal: Journal of Machine Learning Research
Volume: 20
Publication status: Published - 1 Jun 2019
Externally published: Yes

Keywords

  • Convergence rates
  • Distributed learning
  • Gaussian processes
  • High-dimensional models
  • Nonparametric models
