Compositionality in Computational Linguistics

Lucia Donatelli, Alexander Koller

Research output: Contribution to Journal › Review article › Academic › peer-review

Abstract

Neural models greatly outperform grammar-based models across many tasks in modern computational linguistics. This raises the question of whether linguistic principles, such as the Principle of Compositionality, still have value as modeling tools. We review the recent literature and find that while an overly strict interpretation of compositionality makes it hard to achieve broad coverage in semantic parsing tasks, compositionality is still necessary for a model to learn the correct linguistic generalizations from limited data. Reconciling both of these qualities requires the careful exploration of a novel design space; we also review some recent results that may help in this exploration.

© 2023 by the author(s). This work is licensed under a Creative Commons Attribution 4.0 International License.
Original language: English
Pages (from-to): 463-481
Journal: Annual Review of Linguistics
Volume: 9
DOIs
Publication status: Published - 17 Jan 2023
Externally published: Yes
