Abstract
Language is central to human intelligence. We review recent breakthroughs in machine language processing and consider what remains to be achieved. Recent approaches rely on domain-general principles of learning and representation captured in artificial neural networks. Most current models, however, focus too closely on language itself. In humans, language is part of a larger system for acquiring, representing, and communicating about objects and situations in the physical and social world, and future machine language models should emulate such a system. We describe existing machine models linking language to concrete situations, and point toward extensions to address more abstract cases. Human language processing exploits complementary learning systems, including a deep neural network-like learning system that learns gradually as machine systems do, as well as a fast-learning system that supports learning new information quickly. Adding such a system to machine language models will be an important further step toward truly human-like language understanding.
Item Type: | Paper
---|---
EU Funded Grant Agreement Number: | 740516
EU Projects: | Horizon 2020 > ERC Grants > ERC Advanced Grant > ERC Grant 740516: NonSequeToR - Non-sequence models for tokenization replacement
Research Centers: | Center for Information and Language Processing (CIS)
Subjects: | 000 Computer science, information and general works > 000 Computer science, knowledge, and systems; 400 Language > 410 Linguistics
URN: | urn:nbn:de:bvb:19-epub-72201-1
Language: | English
Item ID: | 72201
Date Deposited: | 20 May 2020, 08:17
Last Modified: | 4 Nov 2020, 13:53