Abstract
Cross-lingual transfer has become an effective way of transferring knowledge between languages. In this paper, we explore an often-overlooked aspect of this domain: the influence of a language model's source language on cross-lingual transfer performance. We consider the case where neither the target language nor its script is part of the pre-trained model. We conduct a series of experiments on monolingual and multilingual models pre-trained with different tokenization methods to determine the factors that affect cross-lingual transfer to a new language with a unique script. Our findings reveal that the tokenizer is a stronger factor than shared script, language similarity, and model size.
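To make the setup the abstract describes more concrete, below is a minimal sketch (not taken from the paper) of adapting a pre-trained model to a target language whose script is absent from its vocabulary: a tokenizer is trained on target-language text, the unseen tokens are added, and the embedding matrix is resized before continued training. The model name, corpus path, and vocabulary size are illustrative assumptions, not the paper's actual configuration.

```python
from transformers import AutoModelForMaskedLM, AutoTokenizer

model_name = "xlm-roberta-base"  # assumed multilingual source model
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForMaskedLM.from_pretrained(model_name)

# Train a small vocabulary on target-language text (hypothetical corpus path,
# illustrative vocabulary size) to cover the new script.
with open("target_language_corpus.txt", encoding="utf-8") as f:
    new_tokenizer = tokenizer.train_new_from_iterator(
        (line.strip() for line in f), vocab_size=8000
    )

# Add only the tokens the source tokenizer does not already have.
existing_vocab = tokenizer.get_vocab()
new_tokens = [tok for tok in new_tokenizer.get_vocab() if tok not in existing_vocab]
tokenizer.add_tokens(new_tokens)

# Give the added tokens trainable embedding rows; continued pre-training or
# fine-tuning on the target language would follow from here.
model.resize_token_embeddings(len(tokenizer))
```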
Original language | English |
---|---|
Title of host publication | Proceedings of the 2024 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies |
Subtitle of host publication | Volume 4: Student Research Workshop |
Editors | Yang (Trista) Cao, Isabel Papadimitriou, Anaelia Ovalle, Marcos Zampieri, Francis Ferraro, Swabha Swayamdipta |
Publisher | ACL Anthology |
Pages | 124-129 |
Number of pages | 6 |
Volume | 4 |
ISBN (Electronic) | 9798891761179 |
DOIs | |
Publication status | Published - 2024 |
Event | 2024 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, NAACL 2024 - Hybrid, Mexico City, Mexico. Duration: 16 Jun 2024 → 21 Jun 2024 |
Conference
Conference | 2024 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, NAACL 2024 |
---|---|
Country/Territory | Mexico |
City | Hybrid, Mexico City |
Period | 16/06/24 → 21/06/24 |
Bibliographical note
Publisher Copyright: © 2024 Association for Computational Linguistics.