
Dufter, Philipp and Schütze, Hinrich (1 May 2020): Identifying Necessary Elements for BERT’s Multilinguality. [PDF, 631kB]


Abstract

It has been shown that multilingual BERT (mBERT) yields high quality multilingual representations and enables effective zero-shot transfer. This is surprising given that mBERT does not use any kind of crosslingual signal during training. While recent literature has studied this effect, the exact reason for mBERT’s multilinguality is still unknown. We aim to identify architectural properties of BERT as well as linguistic properties of languages that are necessary for BERT to become multilingual. To allow for fast experimentation we propose an efficient setup with small BERT models and synthetic as well as natural data. Overall, we identify six elements that are potentially necessary for BERT to be multilingual. Architectural factors that contribute to multilinguality are underparameterization, shared special tokens (e.g., “[CLS]”), shared position embeddings and replacing masked tokens with random tokens. Factors related to training data that are beneficial for multilinguality are similar word order and comparability of corpora.
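Two of the architectural factors named in the abstract can be made concrete in a few lines. The sketch below is a hypothetical illustration, not the authors’ experimental code: it instantiates a deliberately small (underparameterized) BERT via Hugging Face’s BertConfig, and implements the standard BERT masked-language-model corruption, whose random-replacement branch is the “replacing masked tokens with random tokens” factor. All model sizes and the vocabulary here are illustrative assumptions; the 80/10/10 masking split is BERT’s documented default.

    import random
    from transformers import BertConfig, BertForMaskedLM

    # Underparameterized BERT: sizes are illustrative, not the paper's exact setup.
    tiny_config = BertConfig(
        vocab_size=2048,
        hidden_size=64,
        num_hidden_layers=1,
        num_attention_heads=4,
        intermediate_size=256,
    )
    tiny_bert = BertForMaskedLM(tiny_config)

    def mlm_corrupt(tokens, vocab, mask_prob=0.15, rng=random.Random(0)):
        """Standard BERT MLM corruption: each selected position becomes
        [MASK] (80%), a random vocabulary token (10%), or stays as-is (10%).
        The random-replacement branch is one of the factors the paper
        links to multilinguality."""
        corrupted, targets = list(tokens), [None] * len(tokens)
        for i, tok in enumerate(tokens):
            if rng.random() < mask_prob:
                targets[i] = tok           # predict the original token here
                r = rng.random()
                if r < 0.8:
                    corrupted[i] = "[MASK]"
                elif r < 0.9:
                    corrupted[i] = rng.choice(vocab)  # random token
                # else: keep the original token unchanged
        return corrupted, targets

Ablating either piece, for example growing the model or dropping the random-replacement branch, is the kind of controlled intervention the paper’s small-scale setup is designed to make fast.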
