Improving Cross-lingual Transfer through Subtree-aware Word Reordering

Ofir Arviv, Dmitry Nikolaev, Taelin Karidi, Omri Abend

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

Despite the impressive growth of the abilities of multilingual language models, such as XLM-R and mT5, it has been shown that they still face difficulties when tackling typologically-distant languages, particularly in the low-resource setting. One obstacle to effective cross-lingual transfer is variability in word-order patterns. It can potentially be mitigated via source- or target-side word reordering, and numerous approaches to reordering have been proposed. However, they rely on language-specific rules, work on the level of POS tags, or only target the main clause, leaving subordinate clauses intact. To address these limitations, we present a new powerful reordering method, defined in terms of Universal Dependencies, that is able to learn fine-grained word-order patterns conditioned on the syntactic context from a small amount of annotated data and can be applied at all levels of the syntactic tree. We conduct experiments on a diverse set of tasks and show that our method consistently outperforms strong baselines over different language pairs and model architectures. This performance advantage holds true in both zero-shot and few-shot scenarios.
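To make the idea of subtree-aware reordering over Universal Dependencies concrete, the sketch below shows one possible way to (a) estimate head–dependent ordering preferences from a small annotated target-language sample and (b) recursively relinearize every subtree of a source parse according to those preferences. This is not the authors' implementation: the Token structure, the pairwise (head relation, dependent relation) statistics, and the 0.5 decision threshold are illustrative assumptions only.

```python
# Minimal, self-contained sketch of subtree-aware reordering over a UD parse.
# All names and modelling choices here are illustrative, not the paper's method.

from dataclasses import dataclass
from collections import defaultdict
from typing import Dict, List, Tuple


@dataclass
class Token:
    idx: int      # 1-based position in the source sentence
    form: str     # surface form
    head: int     # index of the syntactic head (0 = root)
    deprel: str   # UD dependency relation to the head


def estimate_order(trees: List[List[Token]]) -> Dict[Tuple[str, str], float]:
    """Estimate, from a small annotated sample, how often a dependent with a
    given relation precedes its head, keyed (as a simplification) by the pair
    (head's deprel, dependent's deprel)."""
    before = defaultdict(int)
    total = defaultdict(int)
    for tree in trees:
        by_idx = {t.idx: t for t in tree}
        for tok in tree:
            if tok.head == 0:
                continue
            head = by_idx[tok.head]
            key = (head.deprel, tok.deprel)
            total[key] += 1
            if tok.idx < head.idx:
                before[key] += 1
    return {k: before[k] / total[k] for k in total}


def reorder(tree: List[Token], order: Dict[Tuple[str, str], float]) -> List[str]:
    """Recursively linearize each subtree, placing a dependent before its head
    whenever the estimated probability of that order exceeds 0.5; unseen pairs
    keep their original source-side order."""
    children = defaultdict(list)
    for tok in tree:
        children[tok.head].append(tok)

    def linearize(tok: Token) -> List[str]:
        pre, post = [], []
        for child in children[tok.idx]:
            default = 1.0 if child.idx < tok.idx else 0.0
            p = order.get((tok.deprel, child.deprel), default)
            (pre if p > 0.5 else post).append(child)
        out = []
        for child in sorted(pre, key=lambda c: c.idx):
            out.extend(linearize(child))
        out.append(tok.form)
        for child in sorted(post, key=lambda c: c.idx):
            out.extend(linearize(child))
        return out

    root = next(t for t in tree if t.head == 0)
    return linearize(root)
```

Because the ordering decision is made independently at every head, the same preference table applies to main and subordinate clauses alike, which is the point the abstract makes about operating at all levels of the syntactic tree.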
Original language: English
Title of host publication: Findings of the Association for Computational Linguistics: EMNLP 2023
Editors: Houda Bouamor, Juan Pino, Kalika Bali
Publisher: Association for Computational Linguistics
Pages: 718–736
DOIs
Publication status: Published - Dec 2023
Externally published: Yes
Event: Conference on Empirical Methods in Natural Language Processing, Singapore
Duration: 8 Dec 2023 – 10 Dec 2023

Conference

Conference: Conference on Empirical Methods in Natural Language Processing
Country/Territory: Singapore
Period: 8/12/23 – 10/12/23
