
GDOP, Ridge Regression and the Kalman Filter

Published online by Cambridge University Press:  21 October 2009

R. J. Kelly
Affiliation:
(Allied-Signal Aerospace Company, Bendix Communication Division, Baltimore, Maryland)

Abstract

Multicollinearity and its effect on parameter estimators such as the Kalman filter is analysed, using the navigation application as a special example. All position-fix navigation systems suffer a loss of accuracy when their navigation landmarks are nearly collinear. Nearly collinear measurement geometry is termed geometric dilution of precision (GDOP); its presence causes the errors of the position estimates to be highly inflated. In 1970 Hoerl and Kennard developed ridge regression to combat near collinearity when it arises in the predictor matrix of a linear regression model. Since GDOP is mathematically equivalent to a nearly collinear predictor matrix, Kelly suggested using ridge regression techniques in navigation signal processors to reduce the effects of GDOP. The original programme intended to use ridge regression not only to reduce variance inflation but also to reduce bias inflation; reducing bias inflation is Kelly's extension of Hoerl's ridge concept. Preliminary results show that ridge regression will reduce the effects of variance inflation caused by GDOP. However, recent results (Kelly) conclude that it will not reduce bias inflation as it arises in the navigation problem, because GDOP is not a mismatched estimator/model problem. Even with an estimator matched to the model, GDOP may inflate the mean square error (MSE) of the ordinary Kalman filter, whereas the ridge recursive filter chooses a suitably biased estimator that reduces the MSE. The main goal is to obtain a smaller MSE for the estimator, rather than to minimize the residual sum of squares; this is a different operation from tuning the Kalman filter's dynamic process noise covariance Q to compensate for unmodelled errors. Although ridge regression has not yielded a satisfactory solution to the general GDOP problem, it has provided insight into exactly what causes multicollinearity in navigation signal processors such as the Kalman filter, and into the conditions under which an estimator's performance can be improved.
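The trade-off the abstract describes can be illustrated with a small Monte Carlo sketch. The geometry, noise level and biasing parameter below are illustrative assumptions, not values from the paper: three landmarks at nearly identical bearings make the rows of the measurement matrix H almost collinear, GDOP (computed as the square root of the trace of (HᵀH)⁻¹) blows up, and a ridge estimate with a small biasing parameter k achieves a smaller MSE than ordinary least squares despite its bias.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 2-D position fix: each row of H is a unit line-of-sight
# vector to a landmark.  Landmarks at nearly the same bearing make the
# rows of H almost collinear -- the GDOP condition described above.
bearings = np.deg2rad([45.0, 46.0, 47.0])          # nearly collinear geometry
H = np.column_stack([np.cos(bearings), np.sin(bearings)])

# GDOP is the square root of the trace of (H^T H)^{-1};
# poor geometry makes the smallest eigenvalue of H^T H tiny.
gdop = np.sqrt(np.trace(np.linalg.inv(H.T @ H)))

x_true = np.array([1.0, 2.0])   # true position offset (assumed)
sigma = 0.1                     # measurement noise std (assumed)
k = 0.05                        # ridge biasing parameter (assumed)

n_trials = 2000
err_ols = np.empty(n_trials)
err_ridge = np.empty(n_trials)
for t in range(n_trials):
    z = H @ x_true + sigma * rng.standard_normal(3)
    # Ordinary least squares (what a matched Kalman filter computes here):
    x_ols = np.linalg.solve(H.T @ H, H.T @ z)
    # Ridge estimate: bias the normal equations by k*I:
    x_ridge = np.linalg.solve(H.T @ H + k * np.eye(2), H.T @ z)
    err_ols[t] = np.sum((x_ols - x_true) ** 2)
    err_ridge[t] = np.sum((x_ridge - x_true) ** 2)

print(f"GDOP           : {gdop:.1f}")
print(f"OLS   mean MSE : {err_ols.mean():.4f}")
print(f"Ridge mean MSE : {err_ridge.mean():.4f}")
```

With this geometry the OLS error variance is inflated by roughly GDOP², while the ridge estimate trades a small bias along the weak eigendirection of HᵀH for a large variance reduction, which is exactly the smaller-MSE goal the abstract contrasts with minimizing the residual sum of squares.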

Type
Research Article
Copyright
Copyright © The Royal Institute of Navigation 1990


References

1. Belsley, D., Kuh, E. & Welsch, R. (1980). Regression Diagnostics: Identifying Influential Data and Sources of Collinearity. New York: Wiley & Sons.
2. Efron, B. (1975). Biased versus unbiased estimation. Advances in Math. 16.
3. Wax, M. (1985). Position location from sensors with position uncertainty. IEEE Trans. on Aerospace and Electronic Systems, AES-19, 5 (Sept.).
4. Torrieri, D. (1984). Statistical theory of passive location systems. IEEE Trans. on Aerospace and Electronic Systems, AES-20, 2 (March).
5. Foy, W. (1976). Position-location solutions by Taylor-series estimation. IEEE Trans. on Aerospace and Electronic Systems, AES-12, 2 (March).
6. Vinod, H. & Ullah, A. (1981). Recent Advances in Regression Methods, vol. 41. New York: Dekker.
7. Jorgensen, P. (1984). NAVSTAR/Global Positioning System 18-satellite constellations. ION GPS Paper, III.
8. Kailath, T. (1984). Lectures on Wiener and Kalman Filtering, CISM Monograph no. 140. Heidelberg: Springer-Verlag.
9. Theobald, C. M. (1974). Generalizations of mean square error applied to ridge regression. J. Royal Statist. Soc. ser. B, 36.
10. Riley, J. (1955). Solving systems of linear equations with a positive definite, symmetric, but possibly ill-conditioned matrix. Mathematical Tables and Other Aids to Computation, vol. 9.
11. Lindley, D. & Smith, A. (1972). Bayes estimates for the linear model. J. Royal Statist. Soc. ser. B, 34, 1.
12. Draper, N. & Smith, H. (1981). Applied Regression Analysis, 2nd ed. New York: Wiley.
13. Hoerl, A. & Kennard, R. (1970). Ridge regression: biased estimation for nonorthogonal problems. Technometrics, 12, no. 1.
14. Marquardt, D. (1970). Generalized inverses, ridge regression, biased linear estimation and nonlinear estimation. Technometrics, 12.
15. Duncan, D. & Horn, S. (1972). Linear dynamic recursive estimation from the viewpoint of regression analysis. J. American Statist. Assoc. 67.
16. Agee, W. S. & Turner, R. H. The use of ridge regression in trajectory estimation. Proceedings of the Twenty-Sixth Conference on the Design of Experiments in Army Research, Development and Testing, ARO Report 81-2.
17. Diderrich, G. T. (1985). The Kalman filter from the perspective of Goldberger–Theil estimates. American Statistician, 39, no. 3.
18. Kelly, R. (1990). Additional results on reducing geometric dilution of precision using ridge regression. IEEE Trans. on Aerospace and Electronic Systems (Correspondence), 26, 4 (July).
19. Kelly, R. (1989). Omega accuracy improvement using ridge regression. Proceedings of the 14th Annual Meeting, The International Omega Association, Long Beach, CA.
20. Sorenson, H. W. (1980). Parameter Estimation. New York: Marcel Dekker.
21. Kelly, R. J. (1990). Reducing geometric dilution of precision using ridge regression. IEEE Trans. on Aerospace and Electronic Systems, 26, no. 1.
22. Oman, S. (1981). A confidence bound approach to choosing the biasing parameter in ridge regression. J. American Statist. Assoc. 76 (June).
23. Spall, J. (1988). Some preliminary results of inference and confidence intervals in non-Gaussian state-space models. Proceedings of the Business and Economic Statistics Section of the American Statistical Association.