
Yue, Zhenrui; Kratzwald, Bernhard and Feuerriegel, Stefan ORCID logoORCID: https://orcid.org/0000-0001-7856-8729 (2021): Contrastive Domain Adaptation for Question Answering using Limited Text Corpora. EMNLP 2021, Online, 07.11.2021-11.11.2021. In: Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing, pp. 9575-9593



Question generation has recently shown impressive results in customizing question answering (QA) systems to new domains. These approaches circumvent the need for manually annotated training data from the new domain and instead generate synthetic question-answer pairs that are used for training. However, existing methods for question generation rely on large synthetically generated datasets and costly computational resources, which renders these techniques widely inaccessible when the text corpus is of limited size. This is problematic, as many niche domains rely on small text corpora, which naturally restricts the amount of synthetic data that can be generated. In this paper, we propose a novel framework for domain adaptation called contrastive domain adaptation for QA (CAQA). Specifically, CAQA combines techniques from question generation and domain-invariant learning to answer out-of-domain questions in settings with limited text corpora. Here, we train a QA system on both source data and generated data from the target domain, with a contrastive adaptation loss incorporated into the training objective. By combining techniques from question generation and domain-invariant learning, our model achieves considerable improvements over state-of-the-art baselines.
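The abstract describes a training objective that augments the standard QA loss with a contrastive adaptation term aligning source-domain and target-domain representations. The sketch below is a hypothetical, simplified illustration of that idea, not the paper's actual loss: it approximates domain alignment by the squared distance between the centroids of source and target feature arrays (a moment-matching stand-in for the full contrastive formulation). The function names, the `lam` weight, and the use of NumPy arrays are all assumptions for illustration.

```python
import numpy as np

def adaptation_loss(src_feats, tgt_feats):
    """Hypothetical stand-in for a contrastive adaptation term:
    squared Euclidean distance between the mean (centroid) of
    source-domain features and the mean of target-domain features.
    Identical distributions of features yield a loss of zero."""
    src_mean = src_feats.mean(axis=0)
    tgt_mean = tgt_feats.mean(axis=0)
    return float(np.sum((src_mean - tgt_mean) ** 2))

def training_objective(qa_loss, src_feats, tgt_feats, lam=0.1):
    """Total loss = standard QA loss + weighted adaptation term.
    `lam` (assumed hyperparameter) trades off task accuracy against
    domain invariance of the learned representations."""
    return qa_loss + lam * adaptation_loss(src_feats, tgt_feats)
```

In this toy form, perfectly aligned domains contribute nothing to the objective, so training reduces to ordinary QA loss minimization; misaligned domains add a penalty that pushes the encoder toward domain-invariant features.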
