LAD: Language Models as Data for Zero-Shot Dialog

July 28, 2022 · Declared Dead · 🏛 SIGDIAL Conferences

⚰️ CAUSE OF DEATH: The Empty Tomb
GitHub repo is empty
Authors: Shikib Mehri, Yasemin Altun, Maxine Eskenazi
arXiv ID: 2207.14393
Category: cs.CL (Computation & Language)
Cross-listed: cs.AI
Citations: 27
Venue: SIGDIAL Conferences
Repository: https://github.com/Shikib/lad ⭐ 8
Last Checked: 1 month ago
Abstract
To facilitate zero-shot generalization in task-oriented dialog, this paper proposes Language Models as Data (LAD). LAD is a paradigm for creating diverse and accurate synthetic data which conveys the necessary structural constraints and can be used to train a downstream neural dialog model. LAD leverages GPT-3 to induce linguistic diversity. LAD achieves significant performance gains in zero-shot settings on intent prediction (+15%), slot filling (+31.4 F1) and next action prediction (+11 F1). Furthermore, an interactive human evaluation shows that training with LAD is competitive with training on human dialogs. LAD is open-sourced, with the code and data available at https://github.com/Shikib/lad.
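The abstract describes a pipeline in which schema-derived seed utterances are paraphrased by a large LM (GPT-3 in the paper) to inject linguistic diversity while keeping structural constraints intact. A minimal sketch of that idea is below; the helper names, prompt wording, and the slot-preservation check are illustrative assumptions, not the authors' actual implementation (which is no longer available in the repository):

```python
# Sketch of a LAD-style synthetic-data step (assumed details, not the
# paper's code): render a seed utterance from a template, build a
# paraphrase prompt for an LM, and keep only candidates that preserve
# the slot values so the structural annotation stays accurate.

def render_seed(template: str, slots: dict) -> str:
    """Fill a template like 'book a table at {restaurant}' with slot values."""
    return template.format(**slots)

def build_prompt(seed: str) -> str:
    """Construct a paraphrase prompt to send to a large language model."""
    return f"Rephrase the following sentence naturally:\n{seed}\nRephrased:"

def preserves_slots(candidate: str, slots: dict) -> bool:
    """Accept a paraphrase only if every slot value still appears verbatim."""
    return all(v.lower() in candidate.lower() for v in slots.values())

slots = {"restaurant": "Luigi's", "time": "7pm"}
seed = render_seed("book a table at {restaurant} for {time}", slots)
prompt = build_prompt(seed)

# A hypothetical LM completion, shown here in place of a real API call:
candidate = "Could you reserve us a spot at Luigi's around 7pm?"
print(preserves_slots(candidate, slots))
```

Filtering paraphrases against the original slot values is one simple way to keep the synthetic data "accurate" in the sense the abstract uses: the surface form varies, but the labels remain valid.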
Community shame:
Not yet rated
Community Contributions

Found the code? Know the venue? Think something is wrong? Let us know!
