No Permanent Friends or Enemies: Tracking Relationships between Nations from News
April 18, 2019 · Entered Twilight · North American Chapter of the Association for Computational Linguistics
"Last commit was 6.0 years ago (โฅ5 year threshold)"
Evidence collected by the PWNC Scanner
Repo contents: README.md, baseline_model.ipynb, change_point_analysis.ipynb, constants.py, data_processing.ipynb, modules.py, our_model.ipynb, utils.py
Authors
Xiaochuang Han, Eunsol Choi, Chenhao Tan
arXiv ID
1904.08950
Category
cs.CL: Computation & Language
Cross-listed
cs.AI, cs.SI
Citations
11
Venue
North American Chapter of the Association for Computational Linguistics
Repository
https://github.com/BoulderDS/LARN
⭐ 5
Last Checked
1 month ago
Abstract
Understanding the dynamics of international politics is important yet challenging for civilians. In this work, we explore unsupervised neural models to infer relations between nations from news articles. We extend existing models by incorporating shallow linguistic information and propose a new automatic evaluation metric that aligns relationship dynamics with manually annotated key events. As understanding international relations requires carefully analyzing complex relationships, we conduct in-person human evaluations with three groups of participants. Overall, humans prefer the outputs of our model and give insightful feedback that suggests future directions for human-centered models. Furthermore, our model reveals interesting regional differences in news coverage. For instance, with respect to US-China relations, Singaporean media focus more on "strengthening" and "purchasing", while US media focus more on "criticizing" and "denouncing".
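The abstract mentions an automatic evaluation metric that aligns inferred relationship dynamics with manually annotated key events, and the repository includes a change_point_analysis.ipynb notebook. Below is a minimal sketch of what such an alignment could look like, assuming a per-nation-pair relation-score time series: flag time steps where the score shifts sharply, then count how many annotated events fall within a tolerance window of a flagged step. The detector, the detect_change_points / alignment_score names, and the tolerance_days parameter are illustrative assumptions, not the paper's actual metric.

```python
# Hypothetical sketch: align detected change points in a nation-pair
# relation-score series with manually annotated key events.
# The simple derivative-threshold detector and the alignment metric
# below are illustrative assumptions, not the method from the paper.
from datetime import date, timedelta
from typing import List

def detect_change_points(scores: List[float], threshold: float = 0.5) -> List[int]:
    """Flag time steps where the relation score jumps by more than `threshold`."""
    return [t for t in range(1, len(scores))
            if abs(scores[t] - scores[t - 1]) > threshold]

def alignment_score(dates: List[date],
                    change_points: List[int],
                    events: List[date],
                    tolerance_days: int = 14) -> float:
    """Fraction of annotated key events matched by a nearby change point."""
    if not events:
        return 0.0
    window = timedelta(days=tolerance_days)
    matched = sum(
        any(abs(dates[t] - e) <= window for t in change_points)
        for e in events
    )
    return matched / len(events)

# Toy usage: a weekly US-China relation series with one annotated event.
weeks = [date(2018, 1, 1) + timedelta(weeks=i) for i in range(6)]
scores = [0.2, 0.25, 0.3, -0.6, -0.55, -0.5]   # sharp drop at week 4
events = [date(2018, 1, 20)]                    # annotated key event
cps = detect_change_points(scores)
print(alignment_score(weeks, cps, events))      # -> 1.0
```

A real evaluation would presumably sweep the detection threshold and tolerance window rather than fix them, but the toy run shows the intended shape: the sharp week-4 drop in the series matches the annotated January 20 event.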
Community Contributions
Found the code? Know the venue? Think something is wrong? Let us know!
Similar Papers
In the same crypt · Computation & Language
BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding · R.I.P. · 👻 Ghosted
Language Models are Few-Shot Learners · R.I.P. · 👻 Ghosted
RoBERTa: A Robustly Optimized BERT Pretraining Approach · R.I.P. · 👻 Ghosted
BART: Denoising Sequence-to-Sequence Pre-training for Natural Language Generation, Translation, and Comprehension · R.I.P. · 👻 Ghosted