Old Age
ExpNote: Black-box Large Language Models are Better Task Solvers with Experience Notebook
November 13, 2023 · Entered Twilight · Conference on Empirical Methods in Natural Language Processing
Repo contents: .gitignore, README.md, dataloader, datasets, expnote.py, lm.py, main.py, requirements.txt, scripts, utils.py
Authors
Wangtao Sun, Xuanqing Yu, Shizhu He, Jun Zhao, Kang Liu
arXiv ID
2311.07032
Category
cs.CL: Computation & Language
Cross-listed
cs.AI
Citations
3
Venue
Conference on Empirical Methods in Natural Language Processing
Repository
https://github.com/forangel2014/ExpNote
⭐ 5
Last Checked
1 month ago
Abstract
Black-box Large Language Models (LLMs) have shown great power in solving various tasks and are considered general problem solvers. However, LLMs still fail in many specific tasks even though they understand the task instructions. In this paper, we focus on the problem of boosting the ability of black-box LLMs to solve downstream tasks. We propose ExpNote, an automated framework to help LLMs better adapt to unfamiliar tasks by reflecting on and noting experiences from training data, and retrieving them from external memory during testing. We evaluate ExpNote on multiple tasks, and the experimental results demonstrate that the proposed method significantly improves the performance of black-box LLMs. The data and code are available at https://github.com/forangel2014/ExpNote
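The abstract describes a note-at-training-time, retrieve-at-test-time loop around an external memory. Below is a minimal sketch of that pattern; the class, method names, and keyword-overlap retriever are illustrative assumptions, not the repository's actual API (which presumably lives in expnote.py).

```python
# Minimal sketch of the note-and-retrieve loop described in the abstract.
# All names are illustrative assumptions, not the actual ExpNote API.

class ExperienceNotebook:
    """External memory of experience notes distilled from training cases."""

    def __init__(self):
        # Each note is a (keyword set, lesson) pair.
        self.notes = []

    def note(self, case: str, lesson: str) -> None:
        # Training time: store a reflection ("lesson") keyed by the
        # words of the training case it was distilled from.
        self.notes.append((set(case.lower().split()), lesson))

    def retrieve(self, query: str, k: int = 2) -> list[str]:
        # Test time: return the k lessons whose keys overlap most with
        # the new input; these would be prepended to the LLM prompt.
        words = set(query.lower().split())
        ranked = sorted(self.notes,
                        key=lambda n: len(n[0] & words),
                        reverse=True)
        return [lesson for _, lesson in ranked[:k]]


nb = ExperienceNotebook()
nb.note("negation flips the label in sentiment tasks",
        "Check for negation words before assigning sentiment.")
nb.note("dates need iso formatting",
        "Normalize dates to YYYY-MM-DD before comparing.")
prompt_hints = nb.retrieve("does negation change this sentiment label?")
```

In this toy run, the negation lesson ranks first because its keywords overlap most with the query; a real implementation would likely use embedding similarity rather than word overlap, but the store/retrieve structure is the same.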
Community Contributions
Found the code? Know the venue? Think something is wrong? Let us know!
Similar Papers
In the same crypt – Computation & Language
BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding · R.I.P. · 👻 Ghosted
Language Models are Few-Shot Learners · R.I.P. · 👻 Ghosted
RoBERTa: A Robustly Optimized BERT Pretraining Approach · R.I.P. · 👻 Ghosted
BART: Denoising Sequence-to-Sequence Pre-training for Natural Language Generation, Translation, and Comprehension · R.I.P. · 👻 Ghosted