Robust multi-stage model-based design of optimal experiments for nonlinear estimation
November 11, 2020 · Declared Dead · Computers and Chemical Engineering
"No code URL or promise found in abstract"
Evidence collected by the PWNC Scanner
Authors
Anwesh Reddy Gottu Mukkula, Michal Mateáš, Miroslav Fikar, Radoslav Paulen
arXiv ID
2011.06042
Category
stat.ME
Cross-listed
cs.LG, eess.SY, math.OC, stat.ML
Citations
11
Venue
Computers and Chemical Engineering
Last Checked
1 month ago
Abstract
We study approaches to robust model-based design of experiments in the context of maximum-likelihood estimation. These approaches robustify model-based methodologies for the design of optimal experiments by accounting for the effect of parametric uncertainty. We study the problem of robust optimal design of experiments in the framework of nonlinear least-squares parameter estimation using linearized confidence regions. We investigate several well-known robustification frameworks in this respect and propose a novel methodology based on multi-stage robust optimization. The proposed methodology targets problems where experiments are designed sequentially, with the possibility of re-estimating parameters between experiments. The multi-stage formalism aids in identifying experiments that are better conducted in the early phase of experimentation, when parameter knowledge is poor. We demonstrate the findings and effectiveness of the proposed methodology using four case studies of varying complexity.
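To make the baseline concrete: the classical (non-robust) approach the abstract builds on designs experiments by maximizing a scalar function of the Fisher information matrix evaluated at a nominal parameter estimate. Below is a minimal sketch of greedy local D-optimal design for an illustrative exponential-decay model; the model, parameter values, and function names are assumptions for illustration, not taken from the paper, whose contribution is the robust multi-stage extension of this idea.

```python
import numpy as np

# Sketch of classical (local) D-optimal experiment design for a nonlinear
# least-squares model y = a*exp(-b*x). This is the non-robust baseline:
# the Fisher information is evaluated at a single nominal estimate theta,
# which is exactly the sensitivity to parametric uncertainty that robust
# formulations (including the paper's multi-stage approach) address.

def jacobian(x, theta):
    """Sensitivities of y = a*exp(-b*x) with respect to theta = (a, b)."""
    a, b = theta
    x = np.atleast_1d(x)
    return np.column_stack([np.exp(-b * x), -a * x * np.exp(-b * x)])

def greedy_d_optimal(candidates, theta, n_points):
    """Greedily pick design points maximizing log-det of the Fisher information.

    The FIM is accumulated as sum_i J_i^T J_i (unit noise variance assumed);
    a tiny regularizer keeps the determinant defined before the design spans
    the parameter space.
    """
    chosen = []
    fim = 1e-9 * np.eye(len(theta))
    for _ in range(n_points):
        best_x, best_logdet = None, -np.inf
        for x in candidates:
            j = jacobian(x, theta)
            sign, logdet = np.linalg.slogdet(fim + j.T @ j)
            if sign > 0 and logdet > best_logdet:
                best_x, best_logdet = x, logdet
        j = jacobian(best_x, theta)
        fim = fim + j.T @ j
        chosen.append(best_x)
    return chosen, fim

theta0 = np.array([1.0, 0.5])        # nominal parameter guess (assumed)
grid = np.linspace(0.0, 10.0, 101)   # candidate measurement times (assumed)
design, fim = greedy_d_optimal(grid, theta0, n_points=4)
print(sorted(design))
```

A robust variant would replace the single `theta0` with a set or distribution of plausible parameter values (e.g. worst-case or expected log-det over that set); the multi-stage formulation additionally lets later design stages depend on parameter re-estimates from earlier experiments.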
Community Contributions
Found the code? Know the venue? Think something is wrong? Let us know!
Similar Papers
In the same crypt – stat.ME
Performance Metrics (Error Measures) in Machine Learning Regression, Forecasting and Prognostics: Properties and Typology
External Validity: From Do-Calculus to Transportability Across Populations
Least Ambiguous Set-Valued Classifiers with Bounded Error Levels
Doubly Robust Policy Evaluation and Optimization
Comparison of Bayesian predictive methods for model selection
Died the same way – 👻 Ghosted
Language Models are Few-Shot Learners
PyTorch: An Imperative Style, High-Performance Deep Learning Library
XGBoost: A Scalable Tree Boosting System