Old Age
SETSum: Summarization and Visualization of Student Evaluations of Teaching
July 08, 2022 · Entered Twilight · North American Chapter of the Association for Computational Linguistics
Repo contents: LICENSE, README.md, SETSum - fps15.gif, aspect-extraction, sentiment, summarization, survey, website
Authors
Yinuo Hu, Shiyue Zhang, Viji Sathy, A. T. Panter, Mohit Bansal
arXiv ID
2207.03640
Category
cs.CL: Computation & Language
Cross-listed
cs.AI
Citations
6
Venue
North American Chapter of the Association for Computational Linguistics
Repository
https://github.com/evahuyn/SETSum
⭐ 6
Last Checked
1 month ago
Abstract
Student Evaluations of Teaching (SETs) are widely used in colleges and universities. Typically SET results are summarized for instructors in a static PDF report. The report often includes summary statistics for quantitative ratings and an unsorted list of open-ended student comments. The lack of organization and summarization of the raw comments hinders those interpreting the reports from fully utilizing informative feedback, making accurate inferences, and designing appropriate instructional improvements. In this work, we introduce a novel system, SETSum, that leverages sentiment analysis, aspect extraction, summarization, and visualization techniques to provide organized illustrations of SET findings to instructors and other reviewers. Ten university professors from diverse departments serve as evaluators of the system and all agree that SETSum helps them interpret SET results more efficiently; and 6 out of 10 instructors prefer our system over the standard static PDF report (while the remaining 4 would like to have both). This demonstrates that our work holds the potential to reform the SET reporting conventions in the future. Our code is available at https://github.com/evahuyn/SETSum
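The abstract describes a pipeline over open-ended comments: sentiment analysis, aspect extraction, and per-aspect summarization. A minimal, self-contained sketch of that idea is below; the keyword lexicons and function names are illustrative assumptions, not SETSum's actual models (see the linked repository for the real implementation).

```python
# Toy sketch of a SET-comment pipeline: tag each comment with an aspect
# and a sentiment label, then report sentiment counts per aspect.
# Keyword lists are illustrative stand-ins for learned models.
from collections import Counter, defaultdict

ASPECT_KEYWORDS = {          # hypothetical aspect lexicon
    "lectures": {"lecture", "slides", "explained"},
    "grading":  {"exam", "grading", "homework"},
}
POSITIVE = {"clear", "great", "helpful", "engaging"}
NEGATIVE = {"confusing", "unfair", "boring", "harsh"}

def tag_comment(comment: str):
    """Return an (aspect, sentiment) pair for one free-text comment."""
    words = set(comment.lower().split())
    aspect = next((a for a, kw in ASPECT_KEYWORDS.items() if words & kw), "other")
    score = len(words & POSITIVE) - len(words & NEGATIVE)
    sentiment = "positive" if score > 0 else "negative" if score < 0 else "neutral"
    return aspect, sentiment

def summarize(comments):
    """Group comments by aspect and count sentiment labels per aspect."""
    summary = defaultdict(Counter)
    for c in comments:
        aspect, sentiment = tag_comment(c)
        summary[aspect][sentiment] += 1
    return {a: dict(cnt) for a, cnt in summary.items()}

comments = [
    "The lecture slides were clear and helpful",
    "Exam grading felt harsh and unfair",
    "Great course overall",
]
print(summarize(comments))
# → {'lectures': {'positive': 1}, 'grading': {'negative': 1}, 'other': {'positive': 1}}
```

The per-aspect counts are the kind of organized view that SETSum renders as visualizations; the real system replaces the keyword lookups with trained sentiment and aspect-extraction models.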
Similar Papers
In the same crypt – Computation & Language
Old Age
BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding
R.I.P.
👻
Ghosted
Language Models are Few-Shot Learners
R.I.P.
👻
Ghosted
RoBERTa: A Robustly Optimized BERT Pretraining Approach
R.I.P.
👻
Ghosted
BART: Denoising Sequence-to-Sequence Pre-training for Natural Language Generation, Translation, and Comprehension
R.I.P.
👻
Ghosted