Abstract
Conditional Neural Processes (CNPs) are a class of meta-learning models popular for combining the runtime efficiency of amortized inference with reliable uncertainty quantification. Many relevant machine learning tasks, such as spatio-temporal modeling, Bayesian optimization, and continuous control, inherently contain equivariances (for example, to translation) which the model can exploit for maximal performance. However, prior attempts to include equivariances in CNPs do not scale effectively beyond two input dimensions. In this work, we propose Relational Conditional Neural Processes (RCNPs), an effective approach to incorporating equivariances into any neural process model. Our proposed method extends the applicability and impact of equivariant neural processes to higher dimensions. We empirically demonstrate the competitive performance of RCNPs on a large array of tasks naturally containing equivariances.
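To illustrate the core idea behind the relational approach, the sketch below shows one way translation equivariance can be built into an encoder: representing each (context, target) pair by the difference of their inputs, which is unchanged when all inputs are shifted by the same vector. This is a minimal, hedged illustration of the general principle, not the paper's implementation; the function name `relational_encode` and the concatenation layout are assumptions made here for clarity.

```python
import numpy as np

def relational_encode(x_context, y_context, x_target):
    """Translation-invariant relational encoding (illustrative sketch).

    Rather than feeding absolute input locations to the encoder,
    each (target, context) pair is represented by the difference
    x_context - x_target, paired with the context output value.
    The differences are unchanged when every input is shifted by
    the same vector t, so the encoding is translation invariant.
    """
    # Pairwise differences: shape (n_target, n_context, input_dim)
    diffs = x_context[None, :, :] - x_target[:, None, :]
    # Pair each difference with its context output value
    y = np.broadcast_to(
        y_context[None, :, :],
        diffs.shape[:2] + (y_context.shape[-1],),
    )
    # Relational representation: shape (n_target, n_context, input_dim + output_dim)
    return np.concatenate([diffs, y], axis=-1)

# Check translation invariance: shifting all inputs by t leaves the encoding unchanged.
rng = np.random.default_rng(0)
xc = rng.normal(size=(5, 3))   # 5 context inputs in 3 dimensions
yc = rng.normal(size=(5, 1))   # corresponding context outputs
xt = rng.normal(size=(4, 3))   # 4 target inputs
t = rng.normal(size=(1, 3))    # arbitrary translation

e1 = relational_encode(xc, yc, xt)
e2 = relational_encode(xc + t, yc, xt + t)
assert np.allclose(e1, e2)
```

A downstream neural process can then aggregate this pairwise representation (e.g., by summing over the context axis) before decoding, so the whole model inherits the invariance of the encoding.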
| Original language | English |
|---|---|
| Title of host publication | Advances in Neural Information Processing Systems 36 (NeurIPS 2023) |
| Subtitle of host publication | [Proceedings] |
| Editors | A. Oh, T. Naumann, A. Globerson, K. Saenko, M. Hardt, S. Levine |
| Publisher | NeurIPS |
| Pages | 1-38 |
| Number of pages | 38 |
| Publication status | Published - 2023 |
| Event | 37th Conference on Neural Information Processing Systems, NeurIPS 2023 - New Orleans, United States. Duration: 10 Dec 2023 → 16 Dec 2023 |
Conference
| Conference | 37th Conference on Neural Information Processing Systems, NeurIPS 2023 |
|---|---|
| Country/Territory | United States |
| City | New Orleans |
| Period | 10/12/23 → 16/12/23 |
Bibliographical note
Publisher Copyright: © 2023 Neural Information Processing Systems Foundation. All rights reserved.