IITK-RSA at SemEval-2020 Task 5: Detecting Counterfactuals
July 21, 2020 · Entered Twilight · International Workshop on Semantic Evaluation
"Last commit was 5.0 years ago (โฅ5 year threshold)"
Evidence collected by the PWNC Scanner
Repo contents: .gitignore, CNN-only, Discourse_BiLSTM, LICENSE, README.md, bilstm_cnn_crf, discourse-transformer-lstm, ensembles, transformer-only
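The Twilight flag above comes from a simple recency rule: the newest commit is at least five years old. Below is a minimal sketch of such a check using the public GitHub API; this is an illustration, not the actual PWNC Scanner code, and only the repo slug and the 5-year threshold are taken from the page itself.

```python
# Hypothetical staleness check, sketched after the evidence line above.
# Not the real PWNC Scanner: endpoint usage and output format are assumptions.
from datetime import datetime, timezone

import requests

REPO = "gargrohin/Counterfactuals-NLP"  # repository listed on this page
THRESHOLD_YEARS = 5                      # threshold quoted in the evidence

# The GitHub commits endpoint returns commits newest-first.
resp = requests.get(
    f"https://api.github.com/repos/{REPO}/commits",
    params={"per_page": 1},
    timeout=10,
)
resp.raise_for_status()

# Parse the ISO 8601 timestamp of the most recent commit.
date_str = resp.json()[0]["commit"]["committer"]["date"]
last_commit = datetime.fromisoformat(date_str.replace("Z", "+00:00"))

age_years = (datetime.now(timezone.utc) - last_commit).days / 365.25
verdict = ">=" if age_years >= THRESHOLD_YEARS else "<"
print(f"Last commit was {age_years:.1f} years ago "
      f"({verdict}{THRESHOLD_YEARS} year threshold)")
```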
Authors
Anirudh Anil Ojha, Rohin Garg, Shashank Gupta, Ashutosh Modi
arXiv ID
2007.10866
Category
cs.CL: Computation & Language
Cross-listed
cs.AI, cs.LG
Citations
6
Venue
International Workshop on Semantic Evaluation
Repository
https://github.com/gargrohin/Counterfactuals-NLP
Last Checked
1 month ago
Abstract
This paper describes our efforts in tackling Task 5 of SemEval-2020. The task involved detecting a class of textual expressions known as counterfactuals and separating them into their constituent elements. Counterfactual statements describe events that have not or could not have occurred and the possible implications of such events. While counterfactual reasoning is natural for humans, understanding these expressions is difficult for artificial agents due to a variety of linguistic subtleties. Our final submitted approaches were an ensemble of various fine-tuned transformer-based and CNN-based models for the first subtask and a transformer model with dependency tree information for the second subtask. We ranked 4th and 9th on the overall leaderboard. We also explored various other approaches that involved the use of classical methods, other neural architectures and the incorporation of different linguistic features.
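To make the first-subtask approach concrete, here is a minimal sketch of fine-tuning a single transformer for binary counterfactual detection, assuming the HuggingFace transformers library. The model choice (roberta-base), hyperparameters, and toy sentences are assumptions for illustration; the submitted system ensembled several fine-tuned transformer and CNN models rather than a single classifier.

```python
# Illustrative single-model sketch for Subtask 1 (counterfactual detection).
# Assumptions: roberta-base backbone, two labels, toy data, one training step.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("roberta-base")
model = AutoModelForSequenceClassification.from_pretrained(
    "roberta-base", num_labels=2
)

# Toy examples: label 1 = counterfactual, label 0 = not.
texts = [
    "If the alarm had gone off, I would have caught the train.",
    "The train departed at nine this morning.",
]
labels = torch.tensor([1, 0])

batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)

# One fine-tuning step: cross-entropy loss over the two classes.
model.train()
optimizer.zero_grad()
outputs = model(**batch, labels=labels)
outputs.loss.backward()
optimizer.step()

# Per-sentence predictions after the update.
model.eval()
with torch.no_grad():
    preds = model(**batch).logits.argmax(dim=-1)
print(preds.tolist())
```

For the second subtask, which separates a counterfactual into its constituent elements (antecedent and consequent spans), the abstract reports a transformer augmented with dependency tree information; a span-extraction head would replace the sequence classifier above.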
Similar Papers
In the same crypt · Computation & Language
BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding · R.I.P. 👻 Ghosted
Language Models are Few-Shot Learners · R.I.P. 👻 Ghosted
RoBERTa: A Robustly Optimized BERT Pretraining Approach · R.I.P. 👻 Ghosted
BART: Denoising Sequence-to-Sequence Pre-training for Natural Language Generation, Translation, and Comprehension · R.I.P. 👻 Ghosted