Old Age
Effective Slogan Generation with Noise Perturbation
October 06, 2023 · Entered Twilight · International Conference on Information and Knowledge Management
Repo contents: README.md, crawl_img.png, crawling_slogan.ipynb, infer_model, inference.py, requirements.txt
Authors
Jongeun Kim, MinChung Kim, Taehwan Kim
arXiv ID
2310.04472
Category
cs.CL: Computation & Language
Cross-listed
cs.AI
Citations
2
Venue
International Conference on Information and Knowledge Management
Repository
https://github.com/joannekim0420/SloganGeneration
Last Checked
1 month ago
Abstract
Slogans play a crucial role in building a firm's brand identity. A slogan is expected to reflect the firm's vision and the brand's value propositions in memorable and likeable ways. Automating the generation of slogans with such characteristics is challenging. Previous studies developed and tested slogan generation with syntactic control and summarization models, which are not capable of generating distinctive slogans. We introduce a novel approach that leverages the pre-trained transformer T5 model with noise perturbation on a newly proposed 1:N matching pair dataset. This approach serves as a contributing factor in generating distinctive and coherent slogans. Furthermore, the proposed approach incorporates descriptions of the firm and brand into the generation of slogans. We evaluate generated slogans based on ROUGE-1, ROUGE-L, and cosine similarity metrics, and also assess them with human subjects in terms of distinctiveness, coherence, and fluency. The results demonstrate that our approach yields better performance than baseline models and other transformer-based models.
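The abstract reports evaluation with ROUGE-1, ROUGE-L, and cosine similarity. As an illustrative sketch only (not the authors' implementation, which is in the linked repository), ROUGE-1 F1 between a candidate slogan and a reference can be computed from unigram overlap:

```python
from collections import Counter

def rouge1_f1(reference: str, candidate: str) -> float:
    """ROUGE-1 F1: unigram overlap between reference and candidate."""
    ref = Counter(reference.lower().split())
    cand = Counter(candidate.lower().split())
    # Counter intersection keeps the minimum count per shared token.
    overlap = sum((ref & cand).values())
    if overlap == 0:
        return 0.0
    precision = overlap / sum(cand.values())
    recall = overlap / sum(ref.values())
    return 2 * precision * recall / (precision + recall)
```

For example, `rouge1_f1("just do it", "do it now")` gives precision and recall of 2/3 each, so an F1 of 2/3.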
Similar Papers
In the same crypt · Computation & Language
Old Age
BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding
R.I.P.
Ghosted
Language Models are Few-Shot Learners
R.I.P.
Ghosted
RoBERTa: A Robustly Optimized BERT Pretraining Approach
R.I.P.
Ghosted
BART: Denoising Sequence-to-Sequence Pre-training for Natural Language Generation, Translation, and Comprehension
R.I.P.
Ghosted