
A Novel Hybrid Unscented Particle Filter based on Firefly Algorithm for Tightly-Coupled Stereo Visual-Inertial Vehicle Positioning

Published online by Cambridge University Press:  11 November 2019

Xiuyuan Li* (School of Instrument and Electronics, North University of China, Taiyuan, China)
Wenxue Gao (School of Instrument and Electronics, North University of China, Taiyuan, China)
Jiashu Zhang (School of Information and Communication Engineering, North University of China, Taiyuan, China)

Abstract

This paper presents a hybrid unscented particle filter (UPF) based on the firefly algorithm for tightly-coupled stereo visual-inertial vehicle positioning systems (VIVPS). Compared with the standard UPF, the proposed approach achieves similar estimation accuracy at much lower computational cost. To reduce the computational complexity, the time update of the hybrid unscented Kalman filter is carried out with the standard linear Kalman filter equations, exploiting the constructed linear/nonlinear mixed filter model. The particle update of the particle filter is optimised by a modified firefly algorithm: particles are moved towards high-likelihood regions through the attraction and movement of fireflies, so fewer particles are needed and the computational complexity is reduced significantly. Experimental results show that the average execution time of the proposed approach is 23·8% of that of the standard UPF at similar accuracy, indicating that the designed method better satisfies the real-time requirement of tightly-coupled stereo VIVPS.

Research Article

Copyright © The Royal Institute of Navigation 2019

