Improving Complex Knowledge Base Question Answering via Question-to-Action and Question-to-Question Alignment

December 26, 2022 · Entered Twilight · 🏛 Conference on Empirical Methods in Natural Language Processing

💤 TWILIGHT: Eternal Rest
Repo abandoned since publication

Repo contents: BFS, README.md, action2text.py, calculate_sample_test_dataset.py, data, model.py, predict_question_rewrite.py, predict_with_beam_search.py, question_decompose.py, requirements.txt, symbolics.py, train.py, train_question_rewrite.py, train_util.py, transform_util.py, utils.py

Authors: Yechun Tang, Xiaoxia Cheng, Weiming Lu
arXiv ID: 2212.13036
Category: cs.CL: Computation & Language
Cross-listed: cs.AI
Citations: 11
Venue: Conference on Empirical Methods in Natural Language Processing
Repository: https://github.com/TTTTTTTTy/ALCQA ⭐ 7
Last Checked: 1 month ago
Abstract
Complex knowledge base question answering can be achieved by converting questions into sequences of predefined actions. However, there is a significant semantic and structural gap between natural language and action sequences, which makes the conversion difficult. In this paper, we introduce an alignment-enhanced complex question answering framework, called ALCQA, which mitigates this gap through question-to-action alignment and question-to-question alignment. We train a question rewriting model to align the question with each action, and utilize a pretrained language model to implicitly align the question with KG artifacts. Moreover, since similar questions correspond to similar action sequences, we retrieve the top-k most similar question-answer pairs at inference time through question-to-question alignment, and propose a novel reward-guided action sequence selection strategy to choose among candidate action sequences. We conduct experiments on the CQA and WQSP datasets, and the results show that our approach outperforms state-of-the-art methods, obtaining a 9.88% improvement in the F1 metric on the CQA dataset. Our source code is available at https://github.com/TTTTTTTTy/ALCQA.
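The question-to-question alignment step described in the abstract boils down to retrieving the stored question-answer pairs whose questions are most similar to the input. A minimal sketch of that retrieval idea, using bag-of-words cosine similarity as a stand-in for the paper's learned question encoder (the function names and the toy QA store are hypothetical, not from the ALCQA codebase):

```python
from collections import Counter
import math

def cosine(a: Counter, b: Counter) -> float:
    # Cosine similarity between two bag-of-words vectors.
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def top_k_similar(query: str, qa_pairs: list[tuple[str, str]], k: int = 2):
    # Score every stored question against the query and keep the top-k
    # question-answer pairs (the "question-to-question alignment" step).
    qv = Counter(query.lower().split())
    scored = [(cosine(qv, Counter(q.lower().split())), q, a) for q, a in qa_pairs]
    scored.sort(key=lambda t: t[0], reverse=True)
    return scored[:k]
```

In the paper the retrieved pairs then feed a reward-guided selection over candidate action sequences; this sketch only covers the retrieval half.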
Community shame:
Not yet rated
Community Contributions

Found the code? Know the venue? Think something is wrong? Let us know!

📜 Similar Papers

In the same crypt – Computation & Language

🌅 🌅 Old Age

Attention Is All You Need

Ashish Vaswani, Noam Shazeer, ... (+6 more)

cs.CL ๐Ÿ› NeurIPS ๐Ÿ“š 166.0K cites 8 years ago