Abstract
There has been little work on modeling the morphological well-formedness (MWF) of derivatives, a problem judged to be complex and difficult in linguistics (Bauer, 2019). We present a graph auto-encoder that learns embeddings capturing information about the compatibility of affixes and stems in derivation. The auto-encoder models MWF in English surprisingly well by combining syntactic and semantic information with associative information from the mental lexicon.
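The core idea of a graph auto-encoder on a stem–affix graph can be illustrated with a minimal sketch. This is not the paper's actual model: the toy vocabulary, the one-layer GCN-style encoder, the inner-product decoder, and all hyperparameters below are illustrative assumptions in the spirit of standard graph auto-encoders.

```python
import numpy as np

# Hypothetical toy graph: nodes are stems and affixes; an edge marks an
# attested derivative (e.g. "read" + "-able" = "readable").
nodes = ["read", "drink", "move", "-able", "-er"]
edges = [(0, 3), (0, 4), (1, 4), (2, 3)]  # readable, reader, drinker, movable
n = len(nodes)
A = np.zeros((n, n))
for i, j in edges:
    A[i, j] = A[j, i] = 1.0

# GCN-style propagation matrix: symmetrically normalized adjacency
# with self-loops, A_hat = D^{-1/2} (A + I) D^{-1/2}.
A_tilde = A + np.eye(n)
deg = A_tilde.sum(axis=1)
A_hat = A_tilde / np.sqrt(np.outer(deg, deg))

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# One-layer linear encoder Z = A_hat @ W and inner-product decoder
# P = sigmoid(Z Z^T), trained with binary cross-entropy on the edges.
rng = np.random.default_rng(0)
W = 0.1 * rng.standard_normal((n, 2))
lr = 1.0
losses = []
for _ in range(500):
    Z = A_hat @ W
    P = sigmoid(Z @ Z.T)                  # reconstructed edge probabilities
    eps = 1e-9
    loss = -np.mean(A * np.log(P + eps) + (1 - A) * np.log(1 - P + eps))
    losses.append(loss)
    # Gradient of the BCE loss through the symmetric decoder:
    # dL/dZ = 2 (P - A) Z / N, then dL/dW = A_hat^T dL/dZ.
    grad_Z = 2.0 * (P - A) @ Z / A.size
    W -= lr * A_hat.T @ grad_Z
```

After training, high entries of `P` for unattested stem–affix pairs can be read as the model judging those derivatives relatively well-formed, which is the intuition behind using reconstruction for MWF.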
| Item Type: | Conference |
|---|---|
| EU Funded Grant Agreement Number: | 740516 |
| EU Projects: | Horizon 2020 > ERC Grants > ERC Advanced Grant > ERC Grant 740516: NonSequeToR - Non-sequence models for tokenization replacement |
| Research Centers: | Center for Information and Language Processing (CIS) |
| Subjects: | 000 Computer science, information and general works > 000 Computer science, knowledge, and systems; 400 Language > 410 Linguistics |
| URN: | urn:nbn:de:bvb:19-epub-72197-4 |
| Place of Publication: | Stroudsburg, USA |
| Language: | English |
| Item ID: | 72197 |
| Date Deposited: | 20. May 2020 09:39 |
| Last Modified: | 04. Nov 2020 13:53 |

