21st IFAC Symposium on Automatic Control in Aerospace
August 27-30, 2019. Cranfield, UK
Available online at www.sciencedirect.com
IFAC PapersOnLine 52-12 (2019) 376–381
Runway Relative Positioning of Aircraft with IMU-Camera Data Fusion

Tamas Grof ∗, Peter Bauer ∗, Antal Hiba ∗∗, Attila Gati ∗∗, Akos Zarandy ∗∗, Balint Vanek ∗

∗ Systems and Control Laboratory, Institute for Computer Science and Control, Hungarian Academy of Sciences, Budapest, Hungary (e-mail: [email protected]).
∗∗ Computational Optical Sensing and Processing Laboratory, Institute for Computer Science and Control, Hungarian Academy of Sciences, Budapest, Hungary (e-mail: [email protected])
[email protected]) Abstract: of providing providing precise precise runway runway relative relative position position and and Abstract: In In this this article article the the challenge challenge of Abstract: In the of precise runway relative position and orientation to aircraft monocular camera and inertial data is Abstract: reference In this this article article the challenge challenge of providing providing precise runway position and orientation reference to a a landing landing aircraft based-on based-on monocular camera and relative inertial sensor sensor data is orientation reference to VISION a landing landing aircraft based-on monocular camera and inertial sensor data is Abstract: In this article the challenge of providing precise runway relative position and targeted in frame of the EU H2020 research project. The sensors provide image positions orientation reference to a aircraft based-on monocular camera and inertial sensor data is targeted in frame of the VISION EU H2020 research project. The sensors provide image positions targeted in frame frame of the the VISION EU H2020 research project. The sensors provide image positions orientation reference to a landing aircraft based-on monocular camera and inertial sensor data is of the corners of the runway and the so-called vanishing point and measured angular rate and targeted in of VISION EU H2020 research project. The sensors provide image positions of the corners of the runway and the so-called vanishing point and measured angular rate and of the corners of the runway and the so-called vanishing point and measured angular rate and targeted in frame of the VISION EU H2020 research project. The sensors provide image positions acceleration of the aircraft. Measured data is fused with an Extended Kalman Filter considering of the corners of the runway and the so-called vanishing point and measured angular rate and acceleration of the aircraft. Measured data is fused with an Extended Kalman Filter considering acceleration of the aircraft. Measured is fused with an Extended Kalman Filter considering of the corners of runway andbiases. the data so-called vanishing andtested measured angular rate and measurement and possible The was off-line with computer acceleration ofnoise thethe aircraft. Measured data fused withmethod anpoint Extended Kalman Filter considering measurement noise and possible biases. Theisdeveloped developed method was tested off-line with computer measurement noise and possible biases. Theisdeveloped developed method was of tested off-line with computer acceleration of the aircraft. Measured data fused with an Extended Kalman Filter considering simulated data from simulation of the aircraft and the processing artificial images. This measurement noise and possible biases. The method was tested off-line with computer simulated data from simulation of the aircraft and the processing of artificial images. This way way simulated data from simulation of the and the processing artificial images. This measurement noise possible biases. The developed was of tested off-line with computer the image generated noise and in processing are considered realistically. simulated fromand simulation of uncertainties the aircraft aircraft and themethod processing of artificial images. This way way considered realistically. the image data generated noise and the the uncertainties in image image processing are the image generated noise and the uncertainties in image processing are considered realistically. 
simulated data from simulation of the aircraft and the processing of artificial images. This Inertial sensor noises and biases are generated artificially in the simulation. A large set of the image generated noise and the uncertainties in image processing are considered realistically. of Inertial sensor noises and biases are generated artificially in the simulation. A large setway Inertial sensor noises and biases are generated artificially in the simulation. A large set of the image generated noise and the uncertainties in image processing are considered realistically. simulation cases was tested. The results are promising so completing instrumental landing Inertial sensor and biases generated artificially the simulation. A largelanding set of simulation casesnoises was tested. The are results are promising so incompleting instrumental simulation cases was tested. tested. The are results are promising so completing instrumental landing Inertial sensor and estimates biases generated artificially incompleting the simulation. A largelanding set of system and GPS with the can be aa next step development. simulation cases was The results promising instrumental system and GPSnoises with the estimates can beare next step of of so development. system GPS the can aa next step development. simulation was tested. The results promising completing instrumental landing system and and cases GPS with with the estimates estimates can be beare next step of of so development. Copyrightand © 2019. Thewith Authors. bycan Elsevier rights system GPS the Published estimates be aLtd. nextAllstep of reserved. development. Keywords: Keywords: Aircraft Aircraft operation, operation, Computer Computer vision, vision, Vision-inertial Vision-inertial data data fusion fusion Keywords: Keywords: Aircraft Aircraft operation, operation, Computer Computer vision, vision, Vision-inertial Vision-inertial data data fusion fusion Keywords: Aircraft operation, Computer vision, Vision-inertial dataGPS-ILS-Camera fusion 1. INTRODUCTION can augment the 1. INTRODUCTION can augment the GPS-ILS-Camera fusion fusion by by integrating integrating 1. can augment augment the GPS-ILS-Camera GPS-ILS-Camera fusion bydata. integrating inertial measurement unit (IMU) and camera Vision 1. INTRODUCTION INTRODUCTION can the fusion by integrating inertial measurement unit (IMU) and camera data. Vision inertial measurement unit (IMU) andfusion camera data. Vision 1. INTRODUCTION can augment the GPS-ILS-Camera by integrating In recent years several projects aimed to provide analytical integrated navigation is an extensively researched topic inertial measurement unit (IMU) and camera data. In recent years several projects aimed to provide analytical integrated navigation is an extensively researchedVision topic In recent years several projects aimed to provide analytical integrated navigation is an extensively researched topic inertial measurement unit (IMU) and camera data. Vision redundancy and additional information sources to onboard so only few relevant references are cited. Strydom et al. In recent years several projects aimed to provide analytical integrated navigation is an extensively researched topic redundancy and additional information sources to onboard so only few relevant references are cited. Strydom et al. redundancy and additional information sources to onboard so only few relevant references are cited. Strydom et al. 
In recent years several projects aimed to provide analytical integrated navigation is an extensively researched topic aircraft systems. As camera sensors become more and more (2014) applies stereo vision and optic flow to position a redundancy and additional sources onboard so onlyapplies few relevant cited. et al. aircraft systems. As camerainformation sensors become moretoand more (2014) stereo references vision andare optic flowStrydom to position a aircraft systems. As camera sensors become more and more (2014) applies stereo vision and andare optic flowStrydom to position position a redundancy and additional information sources to onboard so only few relevant references cited. et al. popular not only on unmanned aerial vehicles (UAVs) but quadcopter along a trajectory. The introduced method aircraft systems. As camera sensors become more and more (2014) applies stereo vision optic flow to a popular not only on unmanned aerial vehicles (UAVs) but quadcopter along a trajectory. The introduced method popular not only on unmanned aerial vehicles (UAVs) but quadcopter along a trajectory. trajectory. The introduced method aircraft systems. As camera sensors become more and more (2014) applies stereo vision and optic flow to position also on passenger airplanes (Gibert et al. (2018)) they can requires several hundred (400 in the example) points to be popular not only on unmanned aerial vehicles (UAVs) but quadcopter along a The introduced method also on passenger airplanes (Gibert et al. (2018)) they can requires several hundred (400 in the example) points to bea also on airplanes (Gibert et al. they requires several hundred (400 in points to be popular not only aerial (UAVs) but along a trajectory. Theexample) introduced method be considered as on anunmanned additional source of(2018)) information. A quadcopter tracked, considers estimated from IMU alsoconsidered on passenger passenger airplanes (Gibert et vehicles al.of (2018)) they can can requires hundred (400orientation in the the example) points toand be be as an additional source information. A tracked, several considers estimated orientation from an an IMU and be considered as an additional source of information. A tracked, considers estimated orientation from an IMU and also on passenger airplanes (Gibert et al. (2018)) they can requires several hundred (400 in the example) points to be research project targeting to explore the possibilities to use is limited to 0-15m altitude range because of the stereo vibe considered an additional source information. A tracked, orientation from an stereo IMU and research projectastargeting to explore the of possibilities to use is limitedconsiders to 0-15mestimated altitude range because of the viresearch project targeting to explore the possibilities to use use is limited limited to 0-15m 0-15mintroduced altitude range range because of the stereo vibe considered as an additional source of information. A tracked, considers estimated orientation from an IMU and camera systems as additional information sources during sion. The method in Conte and Doherty (2009) research project targeting to explore the possibilities to is to altitude because of the stereo vicamera systems as additional information sources during sion. The method introduced in Conte and Doherty (2009) camera as information sources during sion. 
The method method introduced in Conte Conte and of Doherty (2009) research project targeting to explore the possibilities to use is limited to 0-15mintroduced altitude because the aerial stereo viaircraft landing is aaadditional Europe-Japan collaborative project considers monocular vision with geo-referenced imcamera systems systems as additional information sources project during sion. The in and Doherty (2009) aircraft landing is Europe-Japan collaborative considers monocular vision range with geo-referenced aerial imaircraft landing is aaadditional Europe-Japan collaborative project considers monocular vision with geo-referenced aerial imcamera systems as information sources during sion. The method introduced in Conte and Doherty (2009) called VISION (Validation of Integrated Safety-enhanced ages database. Visual odometry with Kalman filter and aircraft landing is Europe-Japan collaborative project considers monocular vision with geo-referenced aerial imcalled VISION (Validation of Integrated Safety-enhanced ages database. Visual odometry with Kalman filter and called of Safety-enhanced ages database. Visual odometry with Kalman filter and aircraft landing is a Europe-Japan collaborative project considers monocular vision with and geo-referenced imIntelligent flight(Validation cONtrol) (see VISION (2016)). VISION acceleration bias state is least identified called VISION VISION (Validation of Integrated Integrated Safety-enhanced ages database. Visual odometry withat filter and Intelligent flight cONtrol) (see VISION (2016)). VISION acceleration bias state is used used and atKalman least 44 aerial identified Intelligent flight cONtrol) (see VISION (2016)). VISION acceleration bias state is used used and and atKalman least 4 filter identified called VISION (Validation of Integrated Safety-enhanced ages database. Visual odometry with and focuses on critical flight scenarios especially on near-earth points in the image are required. Ground relative position, Intelligent flight cONtrol) (see VISION (2016)). VISION acceleration bias state is at least 4 identified focuses on critical flight scenarios especially on near-earth points in the image are required. Ground relative position, focuses flight scenarios especially on points in in the the orientation imagestate are required. required. Ground relative position, Intelligent flight cONtrol) (see VISION (2016)). VISION isare and atwithout least 4 gyroscope identified maneuvers and aims aims to improve improve the overall precision level acceleration velocity estimated focuses on on critical critical flight scenarios especially on near-earth near-earth points image are Ground relative position, maneuvers and to the overall precision level velocity and and bias orientation areused estimated without gyroscope maneuvers and aims to improve the overall precision level velocity and orientation are estimated without gyroscope focuses on critical flight scenarios especially on near-earth points in the image are required. Ground relative of the navigational systems currently used by aircraft. To bias and there can be an observability issue of the heading maneuvers and aims to improve the overall precision level velocity and orientation are estimated without gyroscope of the navigational systems currently used by aircraft. To bias and there can be an observability issue of theposition, heading of navigational currently used by aircraft. 
To bias and there can an observability issue of the heading maneuvers and aimssystems to improve the overall precision level orientation estimated without gyroscope meet these expectations methods with combined GPS-ILS in mode. Real flight test are presented of the thethese navigational systems currently used by aircraft. To velocity bias and and there can be be an are observability issue of heading meet expectations methods with combined GPS-ILS in hovering hovering mode. Real flight test results results arethe presented meet these expectations methods with combined GPS-ILS in hovering hovering mode. Real flight test test on results arethepresented presented of the navigational systems currently used by aircraft. To bias and there can be an observability issue of (Instrumental Landing System)-Camera data are being deapplying three onboard computers the Yamaha Rmax meet these expectations methods with combined GPS-ILS in mode. Real flight results are (Instrumental Landing System)-Camera data are being de- applying three onboard computers on the Yamahaheading Rmax (Instrumental Landing System)-Camera data are being deapplying three onboard computers on theaYamaha Yamaha Rmax meet these expectations methods with combined GPS-ILS in hovering mode. Real flight test results are presented veloped and tested with the goal to preserve the acceptable helicopter. Weiss et al. (2013) introduces method which (Instrumental Landing System)-Camera data are being deapplying three onboard computers on the veloped and tested with the goal to preserve the acceptable helicopter. Weiss et al. (2013) introduces a method Rmax which veloped and tested with the goal to preserve the acceptable helicopter. Weiss et al. (2013) introduces aaenvironment method which (Instrumental Landing System)-Camera data are being deapplying three onboard computers on the Yamaha Rmax level of safety even if one of the three sensory systems uses IMU-Camera fusion in GPS denied to veloped and tested with the goal to preserve the acceptable helicopter. Weiss et al. (2013) introduces method which level of safety even if one of the three sensory systems uses IMU-Camera fusion in GPS denied environment to level of safety even if one of the three sensory systems uses IMU-Camera fusion in GPS denied environment to veloped and tested with the goal to preserve the acceptable helicopter. Weiss et al. (2013) introduces a method which has degraded performance see Watanabe et al. (2019). estimate aircraft position, velocity and orientation in a level of safety even if one of the three sensory systems IMU-Camera fusion invelocity GPS denied environment has degraded performance see Watanabe et al. (2019). uses estimate aircraft position, and orientation in to a has degraded performance see Watanabe et al. (2019). estimate aircraftsystem. position, velocity and image orientation in to a level of safety even if one of the three sensory systems uses IMU-Camera fusion in GPS denied environment In this article an additional method is presented which loosely coupled It develops the processing has degraded performance see Watanabe et al. (2019). estimate aircraft position, velocity and orientation in In this article an additional method is presented which loosely coupled system. It develops the image processinga In this article an additional method is presented which loosely coupled system. It develops the image processing has degraded performance see Watanabe et al. (2019). 
estimate aircraft position, velocity and orientation in a part to provide 6D position information and couples it with In this article an additional method is presented which loosely coupled system. It develops the image processing paper was supported by the J´ anos Bolyai Research Scholarpart to provide 6D position information and couples it with This paper was an supported by themethod J´ anos Bolyai Research Scholarpart to provide 6D position information and couples it with InThis this article additional is presented which loosely coupled system. It develops the image processing an Error State Kalman Filter (ESKF) which considers part to provide 6D position information and couples it with ship of the Hungarian Academy of Sciences. The research presented This paper was supported by the J´ a nos Bolyai Research Scholaran Error State Kalman Filter (ESKF) which considers This paper was supported by the J´ anos Bolyai Researchpresented Scholarship of the Hungarian Academy of Sciences. The research an Error Error State Kalman Filter (ESKF) which considers part provide position information and couples it image and possible drifts of the system. It also an State Kalman Filter which considers in this was by the Higher Education Institutional Exship of paper the Hungarian Academy of Sciences. The research research presented paper was funded supported by the J´ anos Bolyai Research Scholarship of the Hungarian Academy of Sciences. The presented imagetoscale scale and6D possible drifts of(ESKF) the vision vision system. Itwith also in This this paper was funded by the Higher Education Institutional Eximage scale and possible drifts of the vision system. It also an Error State Kalman Filter (ESKF) which considers estimates acceleration and angular rate sensor biases. Noncellence Program. The research leading to these results has received in this paper was funded by the Higher Education Institutional Eximage scale and possible drifts of the vision system. It also ship of the Hungarian Academy of Sciences. The research presented in this paper was funded by theleading Higher to Education Institutional Exestimates acceleration and angular rate sensor biases. Noncellence Program. The research these results has received estimates acceleration and angular rate sensor biases. Nonimage scale and possible drifts of the vision system. It also funding from the European Union’s Horizon 2020results research innocellence Program. The research research leading to these results hasand received linear observability analysis is performed and real flight in this paper was funded by the Higher Education Institutional Exestimates acceleration and angular rate sensor biases. Noncellence Program. The leading to these has received funding from the European Union’s Horizon 2020 research and innolinear observability analysis is performed and real flight vation programme under grant agreement 690811 and Japan funding from the Union’s Horizon 2020 research and innolinear observability analysis is performed and real flight estimates acceleration and angular rate sensor biases. Noncellence Program. The research leading to No. these results hasthe received test results underline the applicability of the method. Gibfunding from the European European Union’s Horizon 2020 research and innolinear observability analysis is performed and real flight vation programme under grant agreement No. 690811 and the Japan test results underline the applicability of the method. 
GibNew Industrial Technology Development Organization vationEnergy programme under grant grant agreement No. 690811 and the Japan funding from and the European Union’s Horizon 2020 research andJapan innotestetresults results underline the applicability applicability of the theand method. Giblinear observability analysis performed realcamera flight vation programme under agreement No. 690811 and the ert (2018) considers the of test underline the method. GibNew Energy and Industrial Technology Development Organization ert et al. al. (2018) considers theisfusion fusion ofofmonocular monocular camera under grant agreement No. 062600 as a part of the EU/Japan joint New Energy Energy and Industrial Industrial Technology Development Organization vation programme under grant agreement No. 690811 and the Japan New and Technology Development Organization ert et etresults al. (2018) (2018) considers the fusion fusion ofofmonocular monocular camera test underline the applicability the method. Gibwith IMU to determine runway relative position assuming under grant agreement No. 062600 as a part of the EU/Japan joint ert al. considers the of camera with IMU to determine runway relative position assuming research project ’Validation under grant agreement No. 062600 as a Integrated part of EU/Japan joint New andentitled Industrial Technology Development Organization underEnergy grant agreement No. 062600 asof part of the the Safety-enhanced EU/Japan joint with IMU to determine runway relative position assuming research project entitled ’Validation ofa Integrated Safety-enhanced ert et al. (2018) considers the fusion of monocular camera that IMU provides ground relative velocity and orientation with IMU to determine runway relative position assuming that IMU provides ground relative velocity and orientation Intelligent flight cONtrol (VISION)’. research project entitledNo. ’Validation ofa Integrated Integrated Safety-enhanced under grant agreement 062600 asof part of the Safety-enhanced EU/Japan joint research project entitled ’Validation Intelligent flight cONtrol (VISION)’. that ground relative velocity and with IMU provides to determine runway relative position assuming that IMU IMU provides ground relative velocity and orientation orientation Intelligent flight (VISION)’. research entitled ’Validation Intelligentproject flight cONtrol cONtrol (VISION)’. of Integrated Safety-enhanced that IMU provides ground relative velocity and orientation Intelligent flight cONtrol (VISION)’. Copyright © 2019 IFAC 2405-8963 Copyright © 2019. The Authors. Published by Elsevier Ltd. 376 All rights reserved. Copyright © 2019 IFAC 376 Copyright ©under 2019 responsibility IFAC 376 Control. Peer review© of International Federation of Automatic Copyright 2019 IFAC 376 10.1016/j.ifacol.2019.11.272 Copyright © 2019 IFAC 376
with sufficient precision and that the runway sizes are unknown. Finally, Watanabe et al. (2019) considers position and orientation estimation relative to a runway with known sizes during landing (in frame of the VISION project) and applies an ESKF for GNSS-Camera loose/tight data fusion, considering the delay in image processing. The estimated parameters are relative position, velocity, orientation and sensor biases. The results of extensive simulations are presented together with the steps toward real flight validation.

In Martinelli (2011) a closed-form solution was proposed for orientation and speed determination by fusing monocular vision and inertial measurements. An algorithm is presented where the position of the features and the vehicle speed in the local frame and also the absolute roll and pitch angles can be calculated. In order to calculate these values the camera needs to observe a feature four times during a very short time interval. The strength of this method is that it is capable of calculating the said states of the aircraft without any initialization and of performing the estimation in a very short time, but the observability analysis done in the paper showed that the aircraft's absolute yaw angle cannot be estimated using this particular method. Huang et al. (2017) focuses on estimating the absolute position, velocity and orientation values for a UAV. That article proposes an absolute navigation system based on landmarks whose positions are known in advance. With the help of these landmarks the paper considers a navigation filter design fusing a monocular camera and an IMU, without considering the possible measurement biases of the latter. They restrict the number of landmarks to three and execute an observability analysis. After this an Extended and an Unscented Kalman Filter (UKF) design are proposed to address the problem. A comparison between these two filter designs is presented in terms of precision levels, which showed that the UKF is superior.

The method presented in this paper focuses on calculating an aircraft's runway relative position, velocity and orientation applying the fusion of IMU measurements and monocular images. Possibly stereo images can be more effective, but resonances can cause more problems (synchronization of the images) and stereo vision also has a range limitation (Strydom et al. (2014)). Our method considers only 3 reference points related to the runway: the corners and the vanishing point in the direction of the runway. This is much less than the several tens or hundreds of points in Strydom et al. (2014) and Weiss et al. (2013), and it does not require geo-referenced images as Conte and Doherty (2009) does. The only required information to be known is the width of the runway; as in Watanabe et al. (2019), there is no need for the absolute position of any point. Though the method published in Gibert et al. (2018) does not require any information about the runway, they assume that the IMU provides precise velocity and orientation information, which is not true for a UAV with a low grade IMU. Compared to Martinelli (2011), our method also targets to estimate the yaw angle (by considering the vanishing point). Compared to Huang et al. (2017), we also target to estimate the angular rate and acceleration biases and consider an EKF, because we assume that the filter can be initialized with data close to the real values from GPS and/or ILS.
The structure of the article is as follows. Section 2 summarizes the system equations and examines the observability of the system, Section 3 introduces the simulation setup, while Section 4 summarizes the performance of the algorithm based on simulation data. Finally, Section 5 concludes the paper.

2. SYSTEM EQUATIONS

Firstly, we have to define the coordinate systems. These are the fixed runway inertial frame (X^E, Y^E, Z^E), whose center point (O^E) is defined as the center of the runway's threshold line; the body frame (X^B, Y^B, Z^B), which is rotated and translated compared to the inertial frame; and lastly the camera frame (X^C, Y^C, Z^C), which is assumed to share its center point with the body frame but has its axes swapped according to Fig. 1.
Fig. 1. Coordinate systems

In this paper the mathematical model is formulated by the aircraft kinematic equations including the following variables:

x = [v_B^T  p^T  q^T]^T   (1)
u = [a_B^T  ω_B^T]^T   (2)
b = [b_a^T  b_ω^T]^T   (3)
η = [η_a^T  η_ω^T  η_ba^T  η_bω^T]^T   (4)
ν = [ν_zL^T  ν_zR^T  ν_zvp^T]^T   (5)
y = [z_L^T  z_R^T  z_vp^T]^T   (6)
Here x is the state vector with v_B the runway relative velocity in the body frame, p the runway relative position in the runway frame and q the quaternion representation of the runway relative orientation. u is the input including the IMU measurements, with the measured acceleration a_B and angular rate ω_B in the body frame, while b contains the bias parameters influencing the accelerometers and gyroscopes. The η vector consists of the process noise variables affecting the IMU measurements and the IMU bias values. Here η_a and η_ω refer to the noise affecting the inertial measurements, while η_ba and η_bω drive the bias values (modeled as first order Markov processes). The measurement noise parameters for the pixel coordinates of the reference points are in the ν vector, where ν_zL, ν_zR, ν_zvp are the measurement noise values that distort the measured pixel coordinates of the left corner, the right corner and the vanishing point, respectively. Finally, y includes the camera measurement data, where z_L, z_R are the image plane coordinates of the left and right corners of the runway and z_vp is the projection of the vanishing point (see (13)).
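As an illustration of this error model (white measurement noise η_a, η_ω plus bias states b_a, b_ω driven as first-order Markov processes by η_ba, η_bω), the following Python sketch generates synthetic IMU readings; the sample rate, time constant and noise magnitudes are illustrative assumptions, not values taken from the paper. Such synthetic measurements correspond to the artificially generated biases and noises mentioned in Section 3.

```python
import numpy as np

def simulate_imu(true_acc, true_gyro, dt, tau=100.0,
                 sigma_a=0.1, sigma_w=0.01,
                 sigma_ba=0.005, sigma_bw=0.0005, rng=None):
    """Corrupt ideal specific force / angular rate with bias + white noise.

    true_acc, true_gyro: (N, 3) arrays of ideal body-frame measurements.
    The biases follow a first-order Markov process
        b[k+1] = (1 - dt/tau) * b[k] + eta_b[k],
    and the measurements are y = x_true + b + eta (cf. eqs. (3)-(4)).
    All numeric values here are illustrative assumptions.
    """
    rng = np.random.default_rng() if rng is None else rng
    n = true_acc.shape[0]
    meas_acc, meas_gyro = np.empty((n, 3)), np.empty((n, 3))
    b_a, b_w = np.zeros(3), np.zeros(3)
    for k in range(n):
        meas_acc[k] = true_acc[k] + b_a + sigma_a * rng.standard_normal(3)
        meas_gyro[k] = true_gyro[k] + b_w + sigma_w * rng.standard_normal(3)
        # first-order Markov bias propagation driven by eta_ba, eta_bw
        b_a = (1.0 - dt / tau) * b_a + sigma_ba * np.sqrt(dt) * rng.standard_normal(3)
        b_w = (1.0 - dt / tau) * b_w + sigma_bw * np.sqrt(dt) * rng.standard_normal(3)
    return meas_acc, meas_gyro
```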
The kinematic equations that describe the aircraft motion are presented in (7) to (10), where V_B is the matrix representation of the v_B × cross product operator, T_EB is the body to runway transformation matrix and Q(q) is the matrix with quaternion terms in the quaternion dynamics, similarly as in Weiss et al. (2013). e^G_B g is the gravitational acceleration in the body frame, I_2 is the two dimensional unit matrix and 0 is a zero matrix with appropriate dimension.

v̇_B = V_B ω_B − V_B (b_ω + η_ω) + a_B − b_a + e^G_B g − η_a = f_vB(x, η) + g_1,vB(x) ω_B + g_2,vB(x) a_B   (7)

ṗ = T_EB v_B = f_p(x, η) + 0   (8)

q̇ = −Q(q) ω_B + Q(q)(b_ω + η_ω) = f_q(x, η) + g_1,q(x) ω_B   (9)

ḃ = [0  I_2] η = f_b(x, η) + 0   (10)
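For illustration, a discrete-time Euler propagation of the mean of (7)-(10), as used in an EKF prediction step, could look like the following Python sketch. The quaternion convention (scalar-first, body-to-runway), the state ordering and the downward Z^E gravity direction are assumptions of this sketch, not details given in the paper.

```python
import numpy as np

def dcm_body_to_runway(q):
    """T_EB for a unit quaternion q = [q0, q1, q2, q3] (scalar-first,
    body-to-runway rotation) -- convention assumed for this sketch."""
    q0, q1, q2, q3 = q
    return np.array([
        [1 - 2*(q2*q2 + q3*q3), 2*(q1*q2 - q0*q3),     2*(q1*q3 + q0*q2)],
        [2*(q1*q2 + q0*q3),     1 - 2*(q1*q1 + q3*q3), 2*(q2*q3 - q0*q1)],
        [2*(q1*q3 - q0*q2),     2*(q2*q3 + q0*q1),     1 - 2*(q1*q1 + q2*q2)]])

def quat_dot(q, w):
    """Quaternion kinematics q_dot = 0.5 * q (x) [0, w], Hamilton convention."""
    q0, q1, q2, q3 = q
    wx, wy, wz = w
    return 0.5 * np.array([-q1*wx - q2*wy - q3*wz,
                            q0*wx + q2*wz - q3*wy,
                            q0*wy - q1*wz + q3*wx,
                            q0*wz + q1*wy - q2*wx])

def ekf_predict_mean(x, u, dt, g=9.81):
    """One Euler step of the mean dynamics (7)-(10) with the noise terms zeroed.
    x = [v_B(3), p(3), q(4), b_a(3), b_w(3)], u = [a_B(3), w_B(3)].
    Z_E is assumed to point downward, so gravity is [0, 0, g] in the runway frame."""
    v, p, q = x[0:3], x[3:6], x[6:10]
    b_a, b_w = x[10:13], x[13:16]
    a_meas, w_meas = u[0:3], u[3:6]
    w = w_meas - b_w                                   # bias-corrected angular rate
    T_EB = dcm_body_to_runway(q)
    g_body = T_EB.T @ np.array([0.0, 0.0, g])          # e^G_B g term of eq. (7)
    v_dot = np.cross(v, w) + (a_meas - b_a) + g_body   # eq. (7), noise = 0
    p_dot = T_EB @ v                                   # eq. (8)
    q_dot = quat_dot(q, w)                             # eq. (9), sign convention assumed
    x_next = x.copy()
    x_next[0:3] = v + dt * v_dot
    x_next[3:6] = p + dt * p_dot
    x_next[6:10] = q + dt * q_dot
    x_next[6:10] /= np.linalg.norm(x_next[6:10])       # keep the quaternion unit-norm
    return x_next                                      # biases stay constant, eq. (10)
```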
The output equations were formulated by using a perspective camera projection model. The first two reference points are the corners of the runway, while the third reference point is the so-called vanishing point aligned with the runway's heading direction (it coincides with the runway system X^E axis). The camera frame coordinates of these points can be obtained as:

r_L/R = T_CB T_BE (f_L/R − p)   (11)

r_vp = T_CB T_BE [1  0  0]^T   (12)

Where T_CB is the rotation matrix from body to camera frame, f_L and f_R are the coordinates of the threshold's left and right corners in the runway frame, while [1 0 0]^T is the direction of the vanishing point in the runway frame. Finally, the relation between the image plane and the camera frame coordinates can be defined as follows, considering subscripts x, y, z as the coordinates of the vectors and including the measurement noises (index j can be L, R or vp for the three points):

z_j = (f / r_j,z) [r_j,x  r_j,y]^T + ν_zj   (13)
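A minimal Python sketch of the measurement model (11)-(13) is given below. The body-to-camera axis permutation T_CB, the corner sign convention and the focal length value are assumptions for illustration (for the 1280×960 pixel, 30° horizontal field-of-view camera described in Section 3, a pinhole focal length of f = 640/tan(15°) ≈ 2389 px would follow).

```python
import numpy as np

# Assumed axis permutation for a forward-looking camera (cf. Fig. 1):
# X_C = Y_B (right), Y_C = Z_B (down), Z_C = X_B (forward).
T_CB = np.array([[0.0, 1.0, 0.0],
                 [0.0, 0.0, 1.0],
                 [1.0, 0.0, 0.0]])

def runway_image_points(p, T_EB, runway_width, f=2389.0):
    """Noise-free image-plane coordinates of the runway corners and the
    vanishing point, cf. eqs. (11)-(13). p is the aircraft position in the
    runway frame (origin at the threshold center); T_EB is the body-to-runway
    rotation, so T_BE = T_EB.T. Corner signs assume Y_E points to the right."""
    T_BE = T_EB.T
    f_L = np.array([0.0, -runway_width / 2.0, 0.0])   # left threshold corner
    f_R = np.array([0.0,  runway_width / 2.0, 0.0])   # right threshold corner

    def project(r_c):                                  # pinhole projection, eq. (13)
        return (f / r_c[2]) * r_c[:2]

    r_L = T_CB @ T_BE @ (f_L - p)                      # eq. (11)
    r_R = T_CB @ T_BE @ (f_R - p)
    r_vp = T_CB @ T_BE @ np.array([1.0, 0.0, 0.0])     # eq. (12), runway heading
    return project(r_L), project(r_R), project(r_vp)
```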
2.1 Observability

Considering the whole system of equations (7) to (10), it is a nonlinear system (14) in input affine form, so observability should be checked accordingly, based on Vidyasagar (1993) for example.

ẋ = f(x, η) + Σ_{i=1..m} g_i(x) u_i,    y = h(x, ν)   (14)
Local observability of such a system can be checked by calculating the observability co-distribution (for details see Vidyasagar (1993)). This calculation can be done by applying the Matlab Symbolic Toolbox. Checking the rank of the symbolic result usually gives full rank, as symbolic variables are considered nonzero by the toolbox. On the contrary, there can be several zero parameters in the co-distribution in a special case, which is why special configurations with zero parameters should be carefully checked. The considered nonzero state and input values were selected to represent realistic values as: v_B = [20, 1, 1]^T (m/s), p = [−1000, 2, −50]^T (m), a_B = −e^G_B g + [1, −0.5, −1]^T (m/s^2), ω_B = [10, 5, −3]^T (°/s), b_a = [−1.5, 1, 2]^T (m/s^2), b_ω = [−1.5, 1, 2]^T (°/s), and q assumes aircraft body alignment with the glideslope (3°). The runway relative position and velocity cannot be zero if we are on approach, however ω_B, b_a, b_ω can all be zero. A zero attitude means alignment with the runway system (q = [1, 0, 0, 0]) and zero acceleration means pure gravity measurement (a_B = −e^G_B g). The rank of the co-distribution was tested for all combinations of zero and nonzero parameters and it resulted in full rank in all cases. So the system is locally observable in every possible case. The examined combinations are summarized in Fig. 2. The horizontal axis shows the possibly zero parameters, the vertical axis shows the examined 32 cases (one case per row), where ×-s mark the zero value of a parameter in the given case.
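The described rank test can be reproduced with any symbolic toolbox. The following SymPy sketch illustrates the procedure on a deliberately small input-affine toy system (position, velocity and accelerometer bias with a position output), not on the full model of this paper: stack the gradients of the output and of its Lie derivatives along f and g_i, then check the rank both symbolically and after substituting the possibly zero parameters.

```python
import sympy as sp

# Toy input-affine system chosen only to illustrate the rank test:
# states: position p, velocity v, accelerometer bias b; input: measured acceleration.
p, v, b = sp.symbols('p v b')
x = sp.Matrix([p, v, b])
f  = sp.Matrix([v, -b, 0])        # drift vector field
g1 = sp.Matrix([0, 1, 0])         # input vector field
h  = sp.Matrix([p])               # output: position

def lie(vec, field, states):
    """Lie derivative of each entry of a column vector along a vector field."""
    return vec.jacobian(states) * field

# Gradients of h and of its Lie derivatives span the observability co-distribution
# (two levels are enough for this small example).
funcs = [h, lie(h, f, x), lie(h, g1, x), lie(lie(h, f, x), f, x)]
codist = sp.Matrix.vstack(*[fn.jacobian(x) for fn in funcs])

print(codist.rank())                      # generic symbolic rank (full: 3)
# Check a "special" operating point with zero parameters as well:
print(codist.subs({v: 0, b: 0}).rank())   # still full rank -> locally observable
```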
Fig. 2. Examined special combinations in the observability check

3. MATLAB AND FLIGHTGEAR SIMULATION

The estimation algorithm was tuned and tested off-line with Matlab/Simulink simulation generated flight data. This consists of the Matlab simulation of the K-50 aircraft synchronized with the FlightGear based generation and processing of runway images, giving finally z_L, z_R, z_vp. The camera modeled in FlightGear has a resolution of 1280×960 pixels with a 30° horizontal field of view. The aircraft runway relative position, velocity and orientation generated inside the K-50 aircraft simulation are considered as real data in the tests of the estimator. The estimator is implemented as an EKF in an off-line script. The considered acceleration and angular rate measurements are also generated by the simulation, including artificial biases and noises if required. The simulation uses an ILS model to guide the aircraft towards the runway independently from
the off-line estimator. The controller that drives the vehicle to the runway consists of a longitudinal and a lateral part. The vehicle's runway relative coordinates are converted into the path deviations that the glideslope and localizer sensors would provide. These deviations are the inputs of the tracking controller. The outputs of the controller manage the vehicle's engine and control surfaces to control the flight path.
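The paper does not give the deviation formulas; one simple way to emulate glideslope and localizer outputs from runway-relative coordinates is sketched below, assuming a 3° glide path to the threshold center and the frame conventions of Fig. 1. These are illustrative assumptions, not the authors' ILS model.

```python
import numpy as np

def ils_like_deviations(p, glide_deg=3.0):
    """Angular deviations of the aircraft from an ILS-like approach path.

    p = [along, cross, down] in the runway frame (origin at threshold center,
    X_E along the runway, Z_E down) -- conventions assumed for illustration.
    Returns (localizer_dev_deg, glideslope_dev_deg); positive values mean
    right of the centerline / above the glide path.
    """
    along, cross, down = p
    dist = max(-along, 1e-3)          # distance to the threshold (along < 0 on approach)
    height = -down                     # altitude above the runway plane
    loc_dev = np.degrees(np.arctan2(cross, dist))
    gs_dev = np.degrees(np.arctan2(height, dist)) - glide_deg
    return loc_dev, gs_dev
```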
Fig. 3. FlightGear image snapshot from Hiba et al. (2018)

It is important to note that the IMU and the image processing work with different measurement frequencies, so when implementing the EKF algorithm the correction step is only applied when the camera has observed the required pixel coordinates, otherwise only the prediction steps are propagated. During the simulation the IMU frequency was 100Hz while the camera frequency was set to 10Hz, as in the real hardware system (see the sketch after Table 2). The delays of image processing (see Watanabe et al. (2019)) are neglected here as they can be considered in a future development. After the implementation and tuning (through the covariance matrices, by trial and error) of the EKF, several test scenarios were defined in order to check if the algorithm works in different cases. The considered initial positions are summarized in Table 1, while for every initial position all the sensor configurations in Table 2 were considered.

Table 1. Simulated initial positions

Case  Description
1     Aircraft starts on the glide slope
2     Vertical or horizontal offset from the glide slope
3     Both vertical and horizontal offset

Table 2. Simulated sensor setups

Setup  Description
1      Simulation without any bias or noise
2      Simulation with process and measurement noise, but no sensor bias
3      Simulation with either acceleration or angular rate bias, but without noise
4      Simulation with all of the sensor biases and process and measurement noises
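The multi-rate logic described above (prediction at the 100 Hz IMU rate, correction only at the 10 Hz camera instants where the image processing returned the three reference points) can be organized as in the following Python sketch; ekf_predict and ekf_update stand for the standard EKF time and measurement updates and are passed in as callables rather than being the paper's implementation.

```python
def run_multirate_ekf(imu_samples, camera_frames, x0, P0,
                      ekf_predict, ekf_update,
                      imu_dt=0.01, cam_every=10):
    """Multi-rate filter loop: predict at the 100 Hz IMU rate, correct only
    at the 10 Hz camera instants with a valid corner/vanishing-point detection.

    imu_samples   : iterable of IMU input vectors u_k (100 Hz)
    camera_frames : dict {k: measurement y_k} for IMU indices with valid images
    ekf_predict, ekf_update : standard EKF steps supplied by the caller
    """
    x, P = x0, P0
    estimates = []
    for k, u in enumerate(imu_samples):
        x, P = ekf_predict(x, P, u, imu_dt)                 # time update, eqs. (7)-(10)
        if k % cam_every == 0 and k in camera_frames:
            x, P = ekf_update(x, P, camera_frames[k])       # measurement update, eq. (13)
        estimates.append(x)
    return estimates
```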
4. RESULTS

All possible scenarios described in Section 3, Tables 1 and 2, were run in Matlab. All of the simulations were initialized with estimation errors, which were set as 5m for position, 1m/s for velocity and 1° for Euler angles (see the figures). In this chapter two scenarios – the best and the worst – are presented in detail, as the estimation results of the other scenarios are very similar. The given figures (Fig. 4 to 9) show the estimation errors with dashed lines and the approximated steady state errors with continuous lines.

In the first case the aircraft starts the landing from a position located on the glide slope and there are no additional noises or sensor biases added to the system (simulated case 1/1). This is considered as a best case scenario (everything known perfectly) and the errors relative to the real values remain small as expected. Figures 4 and 6 show that the difference between the real values and the aircraft's estimated velocity and orientation converges around zero, while Fig. 5 presents the runway relative position errors of the vehicle converging to nonzero values. The K-50 Simulink simulation uses runway relative values in meters, while FlightGear is fed by Latitude-Longitude-Altitude (LLA) coordinates. The WGS-84 geoid model used in Matlab in the conversion of distances to LLA is different from the model of FlightGear, which is why there is a difference between the image-based and the Matlab position. Future tests on real flight data will show if the method works well without model mismatch.

The second case (simulated case 3/4) includes the results for the worst scenario, with initial horizontal and vertical offsets and all sensor biases and noises. The added noise is a zero mean white noise with a variance of 0.1[m/s^2] for accelerations and 0.01[rad/s] for angular rates. The results of this case (Fig. 7 to 9) show larger deviations from the real values before the convergence occurs, but these are also acceptable. The noise causes the estimation errors to continuously alternate around the steady state values with an acceptable amplitude. The rate of convergence is a bit smaller in the worst case than in the best. It is greatly affected by the filter's covariance matrices, which were set the same for all the simulations. Possibly better convergence can be obtained by further tuning the covariances. The introduction of the sensor biases increased the transient errors, but the steady state error levels are similar.

After running all the simulation cases (from 1/1 to 3/4) it can be concluded that the bias parameters have a bigger relevance in the early stages of the estimation, as they cause greater transient errors, while the noises cause some random differences later, after the convergence. The results (see Table 3) show that the velocity and orientation of the aircraft can be estimated with close to zero errors after 10-15s convergence. The steady state error of the along and cross positions can be as large as 3-5m and 1m respectively, but these are acceptable even in precision landing. However, the −2m altitude error could pose problems during the landing if it is present in real flight, as it estimates the UAV position 2m higher than the real value. The sensor bias values [−0.5 0.4 0.8] [m/s^2] for the accelerometer and [−0.3 −0.2 0.1] [rad/s] for the gyroscope are estimated well by the EKF in all cases, as Fig. 10 illustrates.
Fig. 4. Estimated velocity deviation from real data in simulated case 1/1

Fig. 5. Estimated position deviation from real data in simulated case 1/1

Fig. 6. Estimated orientation deviation from real data in simulated case 1/1

Fig. 7. Estimated velocity deviation from real data in simulated case 3/4

Fig. 8. Estimated position deviation from real data in simulated case 3/4

Fig. 9. Estimated orientation deviation from real data in simulated case 3/4
Fig. 10. Estimated accelerometer bias in simulated case 3/4

Table 3. Minimum, Maximum and Mean values of the simulations 1/1 and 3/4

Variable      Minimum        Mean           Maximum
Vx [m/s]      −1.15/−3.58    −0.07/−0.59    1.01/1.01
Vy [m/s]      −0.49/−0.89    −0.01/0.12     1.01/1.88
Vz [m/s]      −0.33/−0.97    0.01/−0.04     1.00/1.38
Along [m]     0.87/1.11      2.79/3.97      7.27/6.44
Cross [m]     1.19/1.25      1.70/1.89      5.11/5.10
Alt [m]       −2.29/−2.05    −1.03/−0.87    5.05/5.13
Roll [deg]    −1.08/−3.55    −0.14/−0.10    0.64/1.37
Pitch [deg]   −0.25/−1.56    0.13/0.13      1.10/0.74
Yaw [deg]     −0.25/−0.52    −0.08/−0.09    0.46/0.60
5. CONCLUSION

In this paper an estimation method is presented for fusing inertial and camera sensors to help aircraft navigation during the landing phase. It applies an Extended Kalman Filter. The proposed algorithm is capable of estimating the aircraft position, velocity and orientation relative to the runway and the biases of the acceleration and angular rate sensors, requiring only the knowledge of the runway width. After showing that the formulated mathematical model is locally state observable, the implemented method was tested off-line on data generated in different landing scenarios in a Matlab/Simulink simulation. The FlightGear software package is connected to Matlab and implements artificial runway image generation and processing, to consider the realistic uncertainties of this process.

The filter showed promising results in estimating the desired states and sensor biases with acceptable precision levels and also with reasonable estimation convergence times. The only questionable result is the −2m offset in the estimated altitude, which could result from the difference between the Matlab and FlightGear geoid models. This should be checked before further applying the method. Testing it with real flight data would provide the required information about the nature of the persistent position error, regarding whether it comes from the simulation's model differences or other sources. Future work will include reformulation as an Error State Kalman Filter and consideration of image processing delays similarly to Watanabe et al. (2019), application considering real flight test data and finally the fusion with ILS and/or SBAS systems, at least in simulation.
REFERENCES

Conte, G. and Doherty, P. (2009). Vision-Based Unmanned Aerial Vehicle Navigation Using Geo-Referenced Information. EURASIP Journal on Advances in Signal Processing, 2009(1), 387308. doi:10.1155/2009/387308. URL https://doi.org/10.1155/2009/387308.

Gibert, V., Plestan, F., Burlion, L., Boada-Bauxell, J., and Chriette, A. (2018). Visual estimation of deviations for the civil aircraft landing. Control Engineering Practice, 75, 17–25. doi:10.1016/j.conengprac.2018.03.004.

Hiba, A., Szabo, A., Zsedrovits, T., Bauer, P., and Zarandy, A. (2018). Navigation data extraction from monocular camera images during final approach. In 2018 International Conference on Unmanned Aircraft Systems (ICUAS), 340–345. doi:10.1109/ICUAS.2018.8453457.

Huang, L., Song, J., and Zhang, C. (2017). Observability analysis and filter design for a vision inertial absolute navigation system for UAV using landmarks. Optik, 149, 455–468. doi:10.1016/j.ijleo.2017.09.060.

Martinelli, A. (2011). Closed-Form Solutions for Attitude, Speed, Absolute Scale and Bias Determination by Fusing Vision and Inertial Measurements. Research Report RR-7530, INRIA. URL https://hal.inria.fr/inria-00569083.

Strydom, R., Thurrowgood, S., and Srinivasan, M.V. (2014). Visual Odometry: Autonomous UAV Navigation using Optic Flow and Stereo.

Vidyasagar, M. (1993). Nonlinear Systems Analysis. Prentice Hall.

VISION (2016). VISION project webpage. URL https://w3.onera.fr/h2020_vision/node/1.

Watanabe, Y., Manecy, A., Hiba, A., Nagai, S., and Aoki, S. (2019). Vision-integrated navigation system for aircraft final approach in case of GNSS/SBAS or ILS failures. AIAA SciTech Forum. American Institute of Aeronautics and Astronautics. doi:10.2514/6.2019-0113. URL https://doi.org/10.2514/6.2019-0113.

Weiss, S., Achtelik, M., Lynen, S., Achtelik, M., Kneip, L., Chli, M., and Siegwart, R. (2013). Monocular vision for long-term micro aerial vehicle state estimation: A compendium. J. Field Robotics, 30, 803–831.