Design and Analysis of the NIPS 2016 Review Process
August 31, 2017 · Declared Dead · Journal of Machine Learning Research
"No code URL or promise found in abstract"
Evidence collected by the PWNC Scanner
Authors
Nihar B. Shah, Behzad Tabibian, Krikamol Muandet, Isabelle Guyon, Ulrike von Luxburg
arXiv ID
1708.09794
Category
cs.DL: Digital Libraries
Cross-listed
cs.LG, cs.SI, stat.ML
Citations
108
Venue
Journal of Machine Learning Research
Last Checked
1 month ago
Abstract
Neural Information Processing Systems (NIPS) is a top-tier annual conference in machine learning. The 2016 edition of the conference comprised more than 2,400 paper submissions, 3,000 reviewers, and 8,000 attendees. This represents a growth of nearly 40% in terms of submissions, 96% in terms of reviewers, and over 100% in terms of attendees as compared to the previous year. The massive scale as well as rapid growth of the conference calls for a thorough quality assessment of the peer-review process and novel means of improvement. In this paper, we analyze several aspects of the data collected during the review process, including an experiment investigating the efficacy of collecting ordinal rankings from reviewers. Our goal is to check the soundness of the review process, and provide insights that may be useful in the design of the review process of subsequent conferences.
Similar Papers
In the same crypt – Digital Libraries
Measuring academic influence: Not all citations are equal
The Open Access Advantage Considering Citation, Article Usage and Social Media Attention
A Bibliometric Review of Large Language Models Research from 2017 to 2023
On the Performance of Hybrid Search Strategies for Systematic Literature Reviews in Software Engineering
A Systematic Identification and Analysis of Scientists on Twitter
Died the same way – 👻 Ghosted
Language Models are Few-Shot Learners
PyTorch: An Imperative Style, High-Performance Deep Learning Library
XGBoost: A Scalable Tree Boosting System