Towards General Error Diagnosis via Behavioral Testing in Machine Translation
October 20, 2023 · Entered Twilight · Conference on Empirical Methods in Natural Language Processing
Repo contents: .DS_Store, .gitignore, LICENSE, README.md, Readme.md, btpg_ner.py, btpg_pos.py, btpg_tense.py, calculate_pass_rate.py, chatgpt_translation.py, compare_comet.py, data, detokenizer.perl, extract_word_phrase_candidates.py, system_compare_comet.py
Authors
Junjie Wu, Lemao Liu, Dit-Yan Yeung
arXiv ID
2310.13362
Category
cs.CL: Computation & Language
Cross-listed
cs.AI,
cs.LG,
cs.SE
Citations
2
Venue
Conference on Empirical Methods in Natural Language Processing
Repository
https://github.com/wujunjie1998/BTPGBT
⭐ 2
Last Checked
1 month ago
Abstract
Behavioral testing offers a crucial means of diagnosing linguistic errors and assessing the capabilities of NLP models. However, applying behavioral testing to machine translation (MT) systems is challenging, as it generally requires human effort to craft references for evaluating the translation quality of such systems on newly generated test cases. Existing works in behavioral testing of MT systems circumvent this by evaluating translation quality without references, but this restricts diagnosis to specific types of errors, such as incorrect translation of single numeric or currency words. In order to diagnose general errors, this paper proposes a new Bilingual Translation Pair Generation based Behavior Testing (BTPGBT) framework for conducting behavioral testing of MT systems. The core idea of BTPGBT is to employ a novel bilingual translation pair generation (BTPG) approach that automates the construction of high-quality test cases and their pseudo-references. Experimental results on various MT systems demonstrate that BTPGBT can provide comprehensive and accurate behavioral testing results for general error diagnosis, which further leads to several insightful findings. Our code and data are available at https://github.com/wujunjie1998/BTPGBT.
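The abstract describes behavioral testing that scores a system's translations of generated test cases against automatically constructed pseudo-references, and the repository includes a `calculate_pass_rate.py` script. Below is a minimal, generic sketch of that idea, not the authors' implementation: the names `TestCase`, `char_f_score`, `pass_rate`, the similarity metric, and the threshold are all illustrative assumptions (the paper's actual pipeline uses learned metrics such as COMET, per `compare_comet.py`).

```python
from dataclasses import dataclass
from collections import Counter


@dataclass
class TestCase:
    source: str            # perturbed source sentence (the behavioral test input)
    pseudo_reference: str  # automatically constructed reference translation


def char_f_score(hypothesis: str, reference: str, n: int = 3) -> float:
    """Crude character n-gram F1 between hypothesis and reference.

    Stands in for a real MT metric; the actual framework would use a
    learned metric instead of this toy similarity.
    """
    def ngrams(s: str) -> Counter:
        s = s.lower()
        return Counter(s[i:i + n] for i in range(max(len(s) - n + 1, 1)))

    hyp, ref = ngrams(hypothesis), ngrams(reference)
    overlap = sum((hyp & ref).values())
    if overlap == 0:
        return 0.0
    precision = overlap / sum(hyp.values())
    recall = overlap / sum(ref.values())
    return 2 * precision * recall / (precision + recall)


def pass_rate(cases: list[TestCase], translate, threshold: float = 0.5) -> float:
    """Fraction of test cases whose system translation scores at least
    `threshold` against the pseudo-reference."""
    passed = sum(
        char_f_score(translate(c.source), c.pseudo_reference) >= threshold
        for c in cases
    )
    return passed / len(cases)
```

A system that translates one of two cases acceptably would score `pass_rate == 0.5` under this sketch; aggregating such rates per error category is what enables the comprehensive diagnosis the abstract claims.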
Similar Papers
In the same crypt · Computation & Language
BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding · R.I.P. · Ghosted
Language Models are Few-Shot Learners · R.I.P. · Ghosted
RoBERTa: A Robustly Optimized BERT Pretraining Approach · R.I.P. · Ghosted
BART: Denoising Sequence-to-Sequence Pre-training for Natural Language Generation, Translation, and Comprehension · R.I.P. · Ghosted