Advanced Visual System for Situation Awareness Enhancement – Prototype
Enhanced Vision Systems (EVS) aim to alleviate restrictions in airspace and airport capacity under low-visibility conditions. EVS relies on weather-penetrating forward-looking sensors that augment the naturally existing visual cues in the environment and provide a real-time image of prominent topographical objects that the pilot can identify. The basic idea behind the technology is to allow VMC operations under IMC. The FAA's final rule on Enhanced Flight Vision Systems (EFVS) for Part 91 aircraft (business and general aviation aircraft), released in spring 2004, clearly acknowledges the operational benefits of such a technology by stating the following: “Use of an EFVS with a HUD may improve the level of safety by improving position awareness, providing visual cues to maintain a stabilized approach, and minimizing missed approach situations“. Moreover, “The pilot would use this enhanced flight visibility … to continue the approach from DH or MDA down to 100 ft above the touchdown zone elevation of the runway of intended landing”. This rule change marks a significant token of confidence in EVS technology and clearly demonstrates that EVS offers the capability to decrease landing minima and thus to increase the accessibility of airports (even non-equipped airports) under low-visibility conditions. One major advantage of EVS is that it can easily be used in combination with other landing aids such as SBAS. By allowing the pilot to “see” under low-visibility conditions, EVS increases safety and offers the possibility to increase accessibility and capacity by reducing landing minima or even separation distances.
It can easily be seen that the performance of an Enhanced Vision System depends strongly on the selection of imaging sensors. At the DH (or MDA) of the flown approach procedure, the pilot has to use the EVS as primary input to continue the approach down to 100 ft, after which visual contact with the runway has to be established. Infrared (IR) and millimeter-wave (MMW) sensors are currently envisaged as EVS support for pilot vision in low visibility. One important benefit of IR sensors is that they generate a perspective image from which the human operator can derive the perceptual cues of depth needed to build a three-dimensional interpretation of the outside world. This is an important feature of the IR sensor because a perspective sensor image can be overlaid on the outside scene on a head-up display (HUD). However, penetration of bad weather in the infrared spectrum is markedly poorer than the weather penetration achievable with MMW radar. An active MMW radar primarily delivers information about the range and angular direction (azimuth) of an object. This range/angle information can be transformed into an “out-of-the-window” view, but information about the objects' height and/or vertical position is still missing. The presentation of such images requires knowledge of the surrounding terrain elevation, which is often approximated by the so-called “flat-earth assumption”.
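The flat-earth transformation mentioned above can be sketched as follows. The snippet below is a minimal illustration, not part of the ADVISE-PRO system: it assumes the aircraft's height above ground is known (e.g. from the radar altimeter), that every radar return lies on a level ground plane, and that the view is rendered with a simple forward-looking pinhole camera model. All function and parameter names are hypothetical.

```python
import math

def radar_return_to_image(slant_range, azimuth_rad, height_agl, focal_px):
    """Project one MMW radar return (range, azimuth) into a forward-looking
    pinhole image under the flat-earth assumption.

    Illustrative sketch only: assumes the return lies on a level ground
    plane height_agl below the sensor; names are not from ADVISE-PRO.
    """
    if slant_range <= height_agl:
        raise ValueError("return cannot lie on the assumed ground plane")
    # Ground distance recovered from slant range and known height (flat earth).
    ground_dist = math.sqrt(slant_range**2 - height_agl**2)
    # Position of the return in a forward/right/down sensor frame.
    forward = ground_dist * math.cos(azimuth_rad)
    right = ground_dist * math.sin(azimuth_rad)
    down = height_agl
    # Pinhole projection: pixel offsets relative to the principal point.
    u = focal_px * right / forward   # horizontal offset
    v = focal_px * down / forward    # vertical offset (below the horizon)
    return u, v

# Example: return at 500 m slant range, straight ahead, aircraft 300 m AGL.
u, v = radar_return_to_image(500.0, 0.0, 300.0, 1000.0)  # u = 0.0, v = 750.0
```

Note how the vertical image coordinate depends entirely on the assumed ground plane: any real elevation difference shows up as a projection error, which is exactly the limitation described above.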
An important topic when integrating new visual sensors into existing cockpit environments is the question of how to visualize the acquired images and/or visual cues. An obvious method for showing this information is a simple overlay onto the head-up display (HUD). Due to its simplicity, this method has been applied in several projects in the past.
The Project ADVISE-PRO
Within the research project ADVISE-PRO (Advanced Visual System for Situation Awareness Enhancement – Prototype, 2003–2006), DLR has combined elements of Enhanced Vision and Synthetic Vision into one integrated system that allows all low-visibility operations independently of the infrastructure on the ground. The core element of this system is the adequate fusion of all information available on board. In principle, high-precision on-board navigation data in combination with an accurate airport database is sufficient to enable the pilot to land the aircraft (Synthetic Vision System). Under low-visibility conditions, however, the pilot cannot verify whether this information is correct, or whether there are errors in the navigation data or in the airport database. Up to now, sufficient integrity can be guaranteed neither for available airport databases nor for on-board navigation (the combination of DGPS with INS). Therefore, without additional means to verify the correctness of such information, approach and landing cannot be performed under these circumstances. In the ADVISE-PRO system, the necessary verification of navigation and airport database (integrity monitoring) is obtained by the additional use of weather-penetrating imaging sensors, in particular an MMW radar sensor and a long-wave infrared camera. Significant structures, such as the runway itself, are extracted automatically from the sensor data by means of “machine vision” and checked for consistency with the navigation data combined with the database information. Furthermore, the sensor images are analysed to detect obstacles on the runway.
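The integrity-monitoring principle described above can be sketched very simply: a runway position predicted from the navigation solution plus the airport database is compared with the position extracted from the sensor image, and an inconsistency flag is raised when the residual exceeds a tolerance. This is a deliberate simplification of the structure matching described in the text; the function name, frame conventions and tolerance value are assumptions, not the project's actual implementation.

```python
import math

def runway_integrity_check(predicted_xy, sensed_xy, tolerance_m=30.0):
    """Compare the database/navigation-predicted runway threshold position
    with the position extracted from the sensor image, both expressed in a
    common local frame (metres). Returns (consistent, residual_m).

    Illustrative sketch only; tolerance and frames are assumptions.
    """
    dx = sensed_xy[0] - predicted_xy[0]
    dy = sensed_xy[1] - predicted_xy[1]
    residual = math.hypot(dx, dy)
    return residual <= tolerance_m, residual

# Example: sensor-extracted threshold lies 10 m from the predicted position,
# which is within the assumed tolerance, so the data are judged consistent.
ok, res = runway_integrity_check((0.0, 0.0), (6.0, 8.0))  # ok = True, res = 10.0
```

In a real system the residual would of course be evaluated against the expected measurement uncertainty rather than a fixed constant, but the basic cross-check between independent information sources is the same.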
Aircraft equipped with such a system can perform board-autonomous all-weather approach and landing, but only at airports for which accurate airport databases are available. Therefore, one part of ADVISE-PRO was the development of a subsystem that calculates the position of the aircraft relative to the runway threshold based on the analysis of the sensor data alone (sensor-based navigation).
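The geometry behind such sensor-based navigation can be illustrated as follows: given the slant range and azimuth of the detected runway threshold, the height above ground from the radar altimeter, and the aircraft and runway headings, the aircraft's along-track and cross-track position relative to the threshold follows from plane geometry under a flat-earth assumption. This is a hand-made sketch of the idea, not the ADVISE-PRO subsystem; all names and sign conventions are assumptions.

```python
import math

def position_relative_to_threshold(slant_range, azimuth_rad,
                                   height_agl, heading_rad, runway_hdg_rad):
    """Compute the aircraft's (along_track, cross_track) position relative
    to the detected runway threshold, in runway coordinates (metres).

    Conventions (assumed): along_track < 0 means the aircraft is before
    the threshold; cross_track > 0 means right of the extended centreline.
    """
    # Ground distance to the threshold from slant range and height (flat earth).
    ground_dist = math.sqrt(slant_range**2 - height_agl**2)
    # Bearing from aircraft to threshold, rotated into the runway frame.
    rel = (heading_rad + azimuth_rad) - runway_hdg_rad
    # The aircraft sits at minus the threshold's offset seen from the aircraft.
    along_track = -ground_dist * math.cos(rel)
    cross_track = -ground_dist * math.sin(rel)
    return along_track, cross_track

# Example: threshold detected dead ahead at 500 m slant range, 300 m AGL,
# aircraft heading aligned with the runway: 400 m before the threshold,
# on the centreline.
at, ct = position_relative_to_threshold(500.0, 0.0, 300.0,
                                        math.radians(90), math.radians(90))
```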
The key modules of the ADVISE-PRO system are:
- machine-vision analysis of the imaging sensor data (MMW radar and IR camera) to extract significant structures such as the runway,
- fusion of the extracted features with navigation data, terrain data and airport database information,
- integrity monitoring of the navigation solution and the airport database,
- sensor-based navigation relative to the runway threshold, and
- the human–machine interface (HMI) presenting the results to the pilot.
The structure of the ADVISE-PRO landing system is depicted in Figure 1. The core part of the system is the fusion of different kinds of information (different with respect to source, content and reliability). For example, potential runway structures that have been extracted from different sensor images (IR or MMW) must be combined into runway hypotheses to enable the determination of the aircraft's position relative to the runway. Furthermore, such information has to be combined with information from other sources such as GPS, INS, terrain data or the radar altimeter. All this information results from the automatic analysis of sensor data; therefore, every piece of information carries an uncertainty that can vary from “completely wrong” to “highly precise”. Consequently, the fusion process has to cope with ambiguous, incomplete or even contradictory information to derive a consistent description of the current situation.
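One common way to handle such varying uncertainty, shown here purely as an illustration of the principle rather than as the ADVISE-PRO fusion algorithm, is inverse-variance weighting: each estimate contributes in proportion to its stated precision, so a nearly worthless measurement ("completely wrong", i.e. huge variance) has almost no influence on the fused result.

```python
def fuse_estimates(estimates):
    """Fuse scalar estimates, each given as (value, variance), by
    inverse-variance weighting. Returns (fused_value, fused_variance).

    A generic textbook illustration of uncertainty-aware fusion,
    not the actual ADVISE-PRO algorithm.
    """
    weight_sum = sum(1.0 / var for _, var in estimates)
    fused = sum(val / var for val, var in estimates) / weight_sum
    return fused, 1.0 / weight_sum

# Example: two equally precise cross-track estimates (metres) average out ...
val, var = fuse_estimates([(0.0, 1.0), (2.0, 1.0)])  # val = 1.0, var = 0.5
# ... and adding a third, almost useless estimate barely shifts the result.
val2, _ = fuse_estimates([(0.0, 1.0), (2.0, 1.0), (50.0, 1e6)])
```

The fused variance is smaller than either input variance, reflecting that two consistent, independent sources together support a more confident situation description, which is exactly the motivation for fusing sensor, navigation and database information.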
Another important part of the ADVISE-PRO project is the investigation of an adequate HMI for an EVS-based landing system. The objective is to determine which kind of information has to be displayed, when, how, and on which kind of display, so that the pilot is able to perform approach and landing autonomously under low-visibility conditions. Head-down, head-up, and head-mounted displays are being investigated. Figure 2 shows an HMI example developed in ADVISE-PRO.
In ADVISE-PRO, several simulation runs supporting and validating the HMI developments have been performed in the cockpit simulator (Figure 3). Furthermore, a large number of flight trials have been conducted with the DLR test aircraft, a Dornier Do 228, equipped with the EADS HiVision 35 GHz MMW radar, a radar altimeter, a long-wave IR camera and state-of-the-art navigation sensors (INS, DGPS).