
Liu, Yihong; Ye, Haotian; Weissweiler, Leonie; Wicke, Philipp; Pei, Renhao; Zangenfeind, Robert and Schütze, Hinrich (July 2023): A Crosslingual Investigation of Conceptualization in 1335 Languages. 61st Annual Meeting of the Association for Computational Linguistics (ACL 2023), Toronto, Canada, July 2023. In: Rogers, Anna; Boyd-Graber, Jordan and Okazaki, Naoaki (eds.): Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pp. 12969-13000 [PDF, 6MB]

Abstract

Languages differ in how they divide up the world into concepts and words; e.g., in contrast to English, Swahili has a single concept for ‘belly’ and ‘womb’. We investigate these differences in conceptualization across 1,335 languages by aligning concepts in a parallel corpus. To this end, we propose Conceptualizer, a method that creates a bipartite directed alignment graph between source language concepts and sets of target language strings. In a detailed linguistic analysis across all languages for one concept (‘bird’) and an evaluation on gold standard data for 32 Swadesh concepts, we show that Conceptualizer has good alignment accuracy. We demonstrate the potential of research on conceptualization in NLP with two experiments. (1) We define crosslingual stability of a concept as the degree to which it has 1-1 correspondences across languages, and show that concreteness predicts stability. (2) We represent each language by its conceptualization pattern for 83 concepts, and define a similarity measure on these representations. The resulting measure for the conceptual similarity between two languages is complementary to standard genealogical, typological, and surface similarity measures. For four out of six language families, we can assign languages to their correct family based on conceptual similarity with accuracies between 54% and 87%.
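
The two experiments rest on two quantities: the crosslingual stability of a concept and a conceptual similarity between languages. The Python sketch below is a minimal illustration under simplified assumptions, not the authors' implementation: the data structure alignments[language][concept] -> set of aligned target-language strings stands in for the Conceptualizer's bipartite alignment graph, and the function names (stability, pattern, conceptual_similarity) are hypothetical.

from itertools import combinations

def stability(concept, alignments):
    """Crosslingual stability (simplified): share of languages in which the
    concept is aligned to exactly one target string. The paper defines
    stability via 1-1 correspondences in the bipartite alignment graph; this
    sketch only checks the source-to-target direction."""
    langs = [l for l, d in alignments.items() if concept in d]
    if not langs:
        return 0.0
    return sum(len(alignments[l][concept]) == 1 for l in langs) / len(langs)

def pattern(language, concepts, alignments):
    """Represent a language by which concept pairs it conflates, i.e. which
    pairs share at least one aligned target string (an illustrative proxy for
    the paper's conceptualization pattern over 83 concepts)."""
    d = alignments[language]
    return {(a, b): bool(d.get(a, set()) & d.get(b, set()))
            for a, b in combinations(concepts, 2)}

def conceptual_similarity(lang_a, lang_b, concepts, alignments):
    """Conceptual similarity (simplified): fraction of concept pairs on which
    the two languages agree about conflation."""
    pa = pattern(lang_a, concepts, alignments)
    pb = pattern(lang_b, concepts, alignments)
    if not pa:
        return 0.0
    return sum(pa[k] == pb[k] for k in pa) / len(pa)

# Toy usage, mirroring the abstract's 'belly'/'womb' example:
alignments = {
    "eng": {"belly": {"belly"}, "womb": {"womb"}},
    "swh": {"belly": {"tumbo"}, "womb": {"tumbo"}},  # one Swahili word covers both
}
print(stability("belly", alignments))                                      # 1.0
print(conceptual_similarity("eng", "swh", ["belly", "womb"], alignments))  # 0.0

Under these toy assumptions, English and Swahili disagree on the single concept pair, so their conceptual similarity is 0; the paper's actual measure operates on the full alignment graphs over 83 concepts.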
