Old Age
"No, they did not": Dialogue response dynamics in pre-trained language models
October 05, 2022 · Entered Twilight · International Conference on Computational Linguistics
Repo contents: .DS_Store, LICENSE, README.md, datasets, src
Authors
Sanghee J. Kim, Lang Yu, Allyson Ettinger
arXiv ID
2210.02526
Category
cs.CL: Computation & Language
Citations
1
Venue
International Conference on Computational Linguistics
Repository
https://github.com/sangheek16/dialogue-response-dynamics
⭐ 6
Last Checked
1 month ago
Abstract
A critical component of competence in language is being able to identify relevant components of an utterance and reply appropriately. In this paper we examine the extent of such dialogue response sensitivity in pre-trained language models, conducting a series of experiments with a particular focus on sensitivity to dynamics involving phenomena of at-issueness and ellipsis. We find that models show clear sensitivity to a distinctive role of embedded clauses, and a general preference for responses that target main clause content of prior utterances. However, the results indicate mixed and generally weak trends with respect to capturing the full range of dynamics involved in targeting at-issue versus not-at-issue content. Additionally, models show fundamental limitations in grasp of the dynamics governing ellipsis, and response selections show clear interference from superficial factors that outweigh the influence of principled discourse constraints.
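The core probing method the abstract describes is a preference comparison: give a pre-trained language model a prior utterance plus alternative responses (one targeting the main clause, one the embedded clause) and check which response the model assigns higher probability. Below is a minimal sketch of that idea, assuming a HuggingFace GPT-2 model and a made-up test item; this is not the authors' code or stimuli, which live in the linked repository.

```python
# Sketch: comparing a pre-trained LM's likelihoods for two candidate
# dialogue responses. Illustrative only; not the paper's actual setup.
import torch
from transformers import GPT2LMHeadModel, GPT2TokenizerFast

tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")
model.eval()

def response_log_prob(context: str, response: str) -> float:
    """Sum of token log-probabilities the model assigns to `response`
    when it follows `context`."""
    # Tokenize separately and concatenate, so we know exactly which
    # positions belong to the response (may differ slightly from
    # tokenizing the joined string, which is fine for a sketch).
    ctx_ids = tokenizer(context, return_tensors="pt").input_ids
    resp_ids = tokenizer(response, return_tensors="pt").input_ids
    input_ids = torch.cat([ctx_ids, resp_ids], dim=1)
    with torch.no_grad():
        logits = model(input_ids).logits
    log_probs = torch.log_softmax(logits, dim=-1)
    # Each token at position `pos` is predicted from position `pos - 1`.
    total = 0.0
    for pos in range(ctx_ids.shape[1], input_ids.shape[1]):
        total += log_probs[0, pos - 1, input_ids[0, pos]].item()
    return total

# Hypothetical item: an utterance with an embedded clause, plus two
# denial responses targeting different clauses.
context = 'A: "Mary thinks the neighbors left town."\nB: "No, '
main_clause_response = 'she doesn\'t."'   # targets the main clause
embedded_response = 'they didn\'t."'      # targets the embedded clause

print("main-clause response:", response_log_prob(context, main_clause_response))
print("embedded-clause response:", response_log_prob(context, embedded_response))
```

Under this comparison, a model that prefers responses targeting main-clause content, as the abstract reports, would assign a higher score to the first candidate.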
Community Contributions
Found the code? Know the venue? Think something is wrong? Let us know!
Similar Papers
In the same crypt · Computation & Language
Old Age · BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding · R.I.P.
👻 Ghosted · Language Models are Few-Shot Learners · R.I.P.
👻 Ghosted · RoBERTa: A Robustly Optimized BERT Pretraining Approach · R.I.P.
👻 Ghosted · BART: Denoising Sequence-to-Sequence Pre-training for Natural Language Generation, Translation, and Comprehension · R.I.P.
👻 Ghosted