Strapdown Attitude Computation: Functional Iterative Integration versus Taylor Series Expansion
September 22, 2019 · Declared Dead · Gyroscopy and Navigation
"No code URL or promise found in abstract"
Evidence collected by the PWNC Scanner
Authors
Yuanxin Wu, Yury A. Litmanovich
arXiv ID
1909.09935
Category
math.NA: Numerical Analysis
Cross-listed
cs.RO
Citations
9
Venue
Gyroscopy and Navigation
Last Checked
1 month ago
Abstract
This paper compares two basic approaches to solving the ordinary differential equations that underlie attitude computation in strapdown inertial navigation systems: the Taylor series expansion approach, whose low-order form was used to derive all mainstream algorithms, and the recently developed functional iterative integration approach. Both are applied to solve the kinematic equations of the major attitude parameters, including the quaternion, the Rodrigues vector, and the rotation vector. Specifically, the mainstream algorithms, which have without exception relied on the simplified rotation vector, are considerably extended by the Taylor series expansion approach using the exact rotation vector and recursive calculation of high-order derivatives. The functional iterative integration approach is implemented with both the normal polynomial and the Chebyshev polynomial. Numerical results under the classical coning motion are reported to assess all derived attitude algorithms. It is revealed that in the range where the coning-to-sampling frequency ratio is below 0.05-0.1 (depending on the chosen polynomial truncation order), all algorithms have the same order of accuracy if the same number of samples is used to fit the angular velocity over the iteration interval; at higher relative frequencies, the group of Quat/Rod/RotFIter algorithms (obtained by the functional iterative integration approach combined with the Chebyshev polynomial) performs best in both accuracy and robustness, thanks to the excellent numerical stability and powerful functional representation capability of the Chebyshev polynomial.
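The abstract contrasts Taylor series expansion with functional (Picard-style) iterative integration of the attitude kinematic equations. As a rough illustration only — this is not the paper's Quat/Rod/RotFIter algorithms, which use a Chebyshev-polynomial representation — the sketch below applies fixed-point Picard iteration with plain trapezoidal quadrature to the quaternion kinematic equation q̇ = ½ q ⊗ (0, ω). All function names here are made up for this example.

```python
import numpy as np

def quat_deriv(q, w):
    """Quaternion kinematic equation: qdot = 0.5 * q ⊗ (0, w)."""
    q0, q1, q2, q3 = q
    wx, wy, wz = w
    return 0.5 * np.array([
        -q1 * wx - q2 * wy - q3 * wz,
         q0 * wx + q2 * wz - q3 * wy,
         q0 * wy - q1 * wz + q3 * wx,
         q0 * wz + q1 * wy - q2 * wx,
    ])

def picard_quat(omega, q0, T, n_grid=64, n_iter=8):
    """Functional (Picard) iteration q_{k+1}(t) = q0 + int_0^t qdot(q_k, omega) ds,
    discretized on a uniform grid with cumulative trapezoidal quadrature."""
    t = np.linspace(0.0, T, n_grid)
    w = np.array([omega(ti) for ti in t])    # sampled angular velocity, shape (n, 3)
    q = np.tile(q0, (n_grid, 1))             # initial guess: q_k(t) constant at q0
    for _ in range(n_iter):
        dq = np.array([quat_deriv(q[i], w[i]) for i in range(n_grid)])
        integ = np.zeros_like(q)
        integ[1:] = np.cumsum(0.5 * (dq[1:] + dq[:-1]) * np.diff(t)[:, None], axis=0)
        q = q0 + integ                       # next iterate of the integral equation
    return t, q

# Constant rotation about z at 1 rad/s: after T = 1 s the exact attitude is
# q(T) = (cos(T/2), 0, 0, sin(T/2)), which the iteration reproduces closely.
t, q = picard_quat(lambda ti: np.array([0.0, 0.0, 1.0]),
                   np.array([1.0, 0.0, 0.0, 0.0]), 1.0)
print(q[-1])
```

For a constant angular velocity the iteration converges rapidly to the known closed-form rotation; the paper's contribution lies in how the iterates are represented (Chebyshev vs. normal polynomials) and how the competing Taylor-series algorithms behave under coning motion, neither of which this toy grid-based sketch attempts.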
Community Contributions
Found the code? Know the venue? Think something is wrong? Let us know!
Similar Papers
In the same crypt – Numerical Analysis
- Deep learning-based numerical methods for high-dimensional parabolic partial differential equations and backward stochastic differential equations (👻 Ghosted)
- PDE-Net: Learning PDEs from Data (👻 Ghosted)
- Efficient tensor completion for color image and video recovery: Low-rank tensor train (👻 Ghosted)
- Tensor Ring Decomposition (👻 Ghosted)
- Machine learning approximation algorithms for high-dimensional fully nonlinear partial differential equations and second-order backward stochastic differential equations (👻 Ghosted)
Died the same way – 👻 Ghosted
- Language Models are Few-Shot Learners
- PyTorch: An Imperative Style, High-Performance Deep Learning Library
- XGBoost: A Scalable Tree Boosting System