
doi: 10.31224/4536
Cross-lingual transfer learning has become a cornerstone of multilingual NLP, yet performance disparities persist for low-resource languages, particularly those whose typological features diverge from high-resource source languages. This paper investigates how explicit typological constraints, derived from databases such as the World Atlas of Language Structures (WALS; Dryer and Haspelmath, 2013), can guide parameter sharing and alignment in multilingual models. Building on recent work in typologically informed neural architectures (Ponti et al., 2020; Bjerva and Augenstein, 2021), we propose a novel adapter-based framework that conditions layer-wise transformations on syntactic and morphological features. Our experiments on three low-resource languages (Arapaho, Uyghur, and Tsez) demonstrate that typological guidance reduces negative interference and improves transfer accuracy by up to 12% compared to unconstrained baselines. We further analyze the interplay between feature granularity and model performance, drawing on insights from linguistic typology (Bickel and Nichols, 2017) and low-resource NLP (Joshi et al., 2020).
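The abstract describes conditioning layer-wise adapter transformations on typological features. As a rough illustration of what such conditioning might look like, the sketch below implements a bottleneck adapter whose hidden activation is modulated by a language's binary WALS feature vector via a learned scale and shift (FiLM-style conditioning). This is a minimal NumPy sketch under assumptions of ours, not the paper's architecture: the function name, the FiLM-style mechanism, and all weight matrices and dimensions are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def typological_adapter(h, t, W_down, W_up, W_gamma, W_beta):
    """Bottleneck adapter whose hidden activation is scaled and shifted
    by projections of a binary WALS feature vector t (hypothetical sketch)."""
    z = np.maximum(h @ W_down, 0.0)   # down-project + ReLU
    gamma = t @ W_gamma               # typology-dependent scale
    beta = t @ W_beta                 # typology-dependent shift
    z = gamma * z + beta              # condition hidden state on typology
    return h + z @ W_up               # up-project + residual connection

# Toy dimensions: model width 16, bottleneck 4, six WALS features.
d_model, d_bottleneck, n_feats = 16, 4, 6
h = rng.normal(size=(2, d_model))               # two token representations
t = np.array([1., 0., 1., 1., 0., 0.])          # hypothetical feature vector
W_down = rng.normal(scale=0.1, size=(d_model, d_bottleneck))
W_up = rng.normal(scale=0.1, size=(d_bottleneck, d_model))
W_gamma = rng.normal(scale=0.1, size=(n_feats, d_bottleneck))
W_beta = rng.normal(scale=0.1, size=(n_feats, d_bottleneck))

out = typological_adapter(h, t, W_down, W_up, W_gamma, W_beta)
print(out.shape)  # (2, 16): same shape as the input, as the residual requires
```

Because the adapter is residual, zeroing its up-projection recovers the identity mapping, which is one way such designs limit negative interference when typological features carry no useful signal for a layer.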
