Lemma Dilemma: On Lemma Generation Without Domain- or Language-Specific Training Data

October 08, 2025 · Declared Dead · 🏛 Conference on Empirical Methods in Natural Language Processing

⚰️ CAUSE OF DEATH: The Empty Tomb
GitHub repo is empty
Authors: Olia Toporkov, Alan Akbik, Rodrigo Agerri
arXiv ID: 2510.07434
Category: cs.CL: Computation & Language
Citations: 0
Venue: Conference on Empirical Methods in Natural Language Processing
Repository: https://github.com/oltoporkov/lemma-dilemma
Last Checked: 1 month ago
Abstract
Lemmatization is the task of transforming all words in a given text to their dictionary forms. While large language models (LLMs) have demonstrated their ability to achieve competitive results across a wide range of NLP tasks, there is no prior evidence of how effective they are in the contextual lemmatization task. In this paper, we empirically investigate the capacity of the latest generation of LLMs to perform in-context lemmatization, comparing it to the traditional fully supervised approach. In particular, we consider the setting in which supervised training data is not available for a target domain or language, comparing (i) encoder-only supervised approaches, fine-tuned out-of-domain, and (ii) cross-lingual methods, against direct in-context lemma generation with LLMs. Our experimental investigation across 12 languages of different morphological complexity finds that, while encoders remain competitive in out-of-domain settings when fine-tuned on gold data, current LLMs reach state-of-the-art results for most languages by directly generating lemmas in-context without prior fine-tuning, provided with just a few examples. Data and code available upon publication: https://github.com/oltoporkov/lemma-dilemma
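
Since the repository is empty, we can't see the paper's actual prompts. Below is a minimal sketch of what few-shot in-context lemmatization could look like: a prompt assembled from a handful of (sentence, word, lemma) demonstrations, sent to an instruction-tuned LLM with no fine-tuning. The template wording and the example pairs are illustrative assumptions, not taken from the paper.

```python
# Hypothetical sketch of few-shot in-context lemmatization.
# The template and examples below are assumptions for illustration;
# the paper's real prompts and data are not publicly available.

# (sentence context, target word, gold lemma) -- invented demonstrations
FEW_SHOT_EXAMPLES = [
    ("She was running late for the meeting.", "running", "run"),
    ("The mice escaped from the lab.", "mice", "mouse"),
    ("This model performs better than the baseline.", "better", "good"),
]


def build_lemma_prompt(context: str, target_word: str) -> str:
    """Assemble a few-shot prompt asking an LLM for the dictionary form
    (lemma) of a word in its sentence context."""
    lines = ["Give the dictionary form (lemma) of the target word in context.", ""]
    for sentence, word, lemma in FEW_SHOT_EXAMPLES:
        lines.append(f"Sentence: {sentence}")
        lines.append(f"Word: {word}")
        lines.append(f"Lemma: {lemma}")
        lines.append("")
    # The query instance: the model is expected to complete the final line.
    lines.append(f"Sentence: {context}")
    lines.append(f"Word: {target_word}")
    lines.append("Lemma:")
    return "\n".join(lines)


if __name__ == "__main__":
    # The resulting string would be sent to an LLM; its short completion
    # is taken as the predicted lemma and scored against gold annotations.
    print(build_lemma_prompt("The geese flew south in October.", "geese"))
```

In a setup like this, the only supervision is the handful of in-prompt demonstrations, which is what the abstract contrasts with encoder-only models fine-tuned on gold out-of-domain data.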
Community shame: Not yet rated
Community Contributions

Found the code? Know the venue? Think something is wrong? Let us know!

📜 Similar Papers

In the same crypt — Computation & Language

🌅 Old Age

Attention Is All You Need

Ashish Vaswani, Noam Shazeer, ... (+6 more)

cs.CL 🏛 NeurIPS 📚 166.0K cites 8 years ago

Died the same way — ⚰️ The Empty Tomb