A Multiclass Multiple Instance Learning Method with Exact Likelihood
November 29, 2018 · Entered Twilight
"Last commit was 5.0 years ago (β₯5 year threshold)"
Evidence collected by the PWNC Scanner
Repo contents: CIFAR10_experiment.py, MNIST_Demo.py, MNIST_experiment.py, README.md, SVHN_demo_on_cropped_images.py, TraditionalMIL_ExtendedMNISTExperiment.py, extended_MNIST_experiment.py, misc, mnist_model_test_error_rate_0p0032, svhn, train_svhn.py, utilities.py
Authors
Xi-Lin Li
arXiv ID
1811.12346
Category
stat.ML: Machine Learning (Stat)
Cross-listed
cs.LG
Citations
2
Repository
https://github.com/lixilinx/MCMIL
★ 5
Last Checked
1 month ago
Abstract
We study a multiclass multiple instance learning (MIL) problem where the labels only indicate whether any instance of a class is present or absent in a training sample. No further information, e.g., the number of instances of each class or the relative locations or orders of the instances in a training sample, is exploited. Such a weakly supervised learning problem can be solved exactly by maximizing the model likelihood of the given observations, and it finds applications in tasks such as multiple object detection and localization for image understanding. We discuss its relationship to the classic classification problem, to traditional MIL, and to connectionist temporal classification (CTC). We use image recognition as the example task to develop our method, although it applies to data of higher or lower dimensions without much modification. Experimental results show that our method can learn all-convolutional neural networks that solve real-world multiple object detection and localization tasks with weak annotations, e.g., transcribing house number sequences from the Google Street View imagery dataset.
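To make the bag-level likelihood idea concrete, here is a minimal sketch of a noisy-OR style MIL likelihood: each instance gets per-class presence probabilities, and a class is "present in the bag" if at least one instance carries it. This is a common MIL construction, assumed here purely for illustration; the paper's exact-likelihood formulation (and its CTC-like structure) may differ in detail. The function name `bag_nll` and the logit-based parameterization are hypothetical, not from the paper's code.

```python
import math

def bag_nll(instance_logits, bag_labels):
    """Negative log-likelihood of bag-level presence/absence labels.

    instance_logits: one list of C per-class logits per instance in the bag.
    bag_labels: C ints, 1 if any instance of class c is present, else 0.

    Noisy-OR assumption (an illustrative MIL baseline, not necessarily
    the paper's model): P(class c absent from bag) = prod_i (1 - p_ic).
    """
    C = len(bag_labels)
    log_absent = [0.0] * C  # accumulates log P(no instance of class c)
    for logits in instance_logits:
        for c in range(C):
            p = 1.0 / (1.0 + math.exp(-logits[c]))  # sigmoid presence prob.
            log_absent[c] += math.log(max(1.0 - p, 1e-12))
    nll = 0.0
    for c in range(C):
        p_absent = math.exp(log_absent[c])
        p_label = (1.0 - p_absent) if bag_labels[c] else p_absent
        nll -= math.log(max(p_label, 1e-12))
    return nll

# A bag whose instances strongly support class 0 fits the label [1, 0]
# much better than the label [0, 0]:
nll_match = bag_nll([[5.0, -5.0], [-5.0, -5.0]], [1, 0])
nll_mismatch = bag_nll([[5.0, -5.0], [-5.0, -5.0]], [0, 0])
```

Maximizing the likelihood here is just minimizing this NLL over the instance model's parameters; the paper's contribution is that its multiclass likelihood can be computed exactly under weaker assumptions than this independent noisy-OR sketch.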
Similar Papers
In the same crypt · Machine Learning (Stat)
R.I.P. 👻 Ghosted · Distilling the Knowledge in a Neural Network
R.I.P. 👻 Ghosted · Layer Normalization
R.I.P. 👻 Ghosted · Dropout as a Bayesian Approximation: Representing Model Uncertainty in Deep Learning
R.I.P. 👻 Ghosted · Domain-Adversarial Training of Neural Networks