Augmenting Reality: Day, Night, All Weather Synthetic Vision Systems


Summary

Helicopters are routinely used by the Emergency Services, Military and other operators to tackle extremely difficult challenges where human life is at stake. For example, the Emergency Services use helicopters to gain access to remote and frequently hazardous areas that cannot be reached by any other means.

Airborne Test Environment Integrated into Army Lynx Helicopter

Time is also often critical in extracting an injured person and transporting them quickly to hospital for life-saving treatment. Operating scenarios for these helicopters range from very good visibility to extremely hazardous conditions of near-zero visibility, often at very low level (below tree-top height). Unfortunately, operating at these extremes places the lives of the helicopter crew at risk, and missions sometimes have to be aborted due to poor visibility.

The environment in which the helicopter is required to operate is full of hazards that aircrew need to be aware of, including telegraph wires, aerial masts, terrain and building structures. Not all of these hazards are known beforehand (e.g. stored in a database), so a sophisticated means of detecting them needs to be carried aboard the helicopter. The big challenge is how these features are presented to the aircrew while they are flying at speed, with sufficient warning for them to take evasive action. The different sensor technologies need to be optimally integrated so that the pilot has sufficient reaction time to fly the aircraft safely and avoid the obstacles.
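
The reaction-time requirement can be illustrated with a simple kinematic sketch (our illustration with hypothetical numbers, not project data): an obstacle cue must be raised before the aircraft closes to the distance it will cover during the pilot's reaction plus the avoidance manoeuvre.

```python
def minimum_warning_range(speed_mps, reaction_s, manoeuvre_s, margin_m=50.0):
    """Distance at which an obstacle cue must be raised so the pilot can
    react and complete an avoidance manoeuvre. Illustrative only; the
    timings and margin here are assumed, not taken from the trials."""
    return speed_mps * (reaction_s + manoeuvre_s) + margin_m

# A helicopter at 50 m/s (~97 kt), with 1.5 s pilot reaction and a 4 s
# manoeuvre, needs the cue at 50 * 5.5 + 50 = 325 m from the obstacle.
print(minimum_warning_range(50.0, 1.5, 4.0))  # 325.0
```

Even this crude sketch shows why sensor range and display latency dominate the design: at low level, a few hundred metres of detection range is the difference between a usable cue and none at all.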

The AVRRC were funded by BAE Systems to research new concepts for an augmented reality based synthetic vision system, comprising sensor processing and information presentation, prior to comprehensive airborne flight trials at Middle Wallop, UK.

Aims & objectives   

The DNAW (Day, Night, All Weather) project was part of a very comprehensive airborne flight trials programme designed to demonstrate the effective and safe use of synthetic vision for piloting military helicopters, especially on low-level missions. The AVRRC were tasked with undertaking laboratory-based investigations to de-risk design concepts before they were tested in the flight trials.

Research addressed the following areas:    

  • Creation of a comprehensive coupled simulator integrating real-time models of the atmosphere, sensor performance, system performance, helicopter flight dynamics, synthetic vision display presentation and human reaction time/behaviour
  • Integrity assessment addressing spatial awareness and range to interference in obscured visibility conditions
  • Estimation of the integrity of the synthetic information provided to the pilot in normal and failure conditions
  • A study of the implications and rationale of the 2000 m visibility limit dictated by air regulations
  • Creation of a mathematical model of spatial error in relation to the terrain profile

AVRRC research undertook an integrity assessment to:

  • Determine range to interference in obscured visibility conditions, without the use of night vision goggles (NVG)
  • Estimate the integrity of the synthetic information provided to the pilot in normal and failure conditions
  • Create a computational model of spatial error in relation to the terrain profile
  • Validate the complex coupled simulation model

Supporting activities included:

  • Sensor modelling and simulation (including atmospheric condition modelling)
  • Development of a range to interference model
  • Modelling of crew interaction and response
  • Analysis of multi-sensor integrated cues
  • Analysis of whole-system performance (architecture, timing, information presentation, crew reaction time etc.)
  • Support to safety case studies
  • Underpinning simulator studies
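
The "range to interference" idea can be sketched with the standard Koschmieder visibility relation (a textbook illustration of the principle, not the project's actual sensor model): meteorological visibility V implies an atmospheric extinction coefficient of roughly 3.912/V, and a target of inherent contrast C0 remains detectable only while its attenuated contrast C0·e^(−σR) stays above the observer's contrast threshold.

```python
import math

def detection_range(visibility_m, inherent_contrast=1.0, threshold=0.05):
    """Maximum range at which a target of given inherent contrast is
    detectable, via Koschmieder's law (sigma = 3.912 / visibility).
    An illustrative sketch only; real sensor models are far richer."""
    sigma = 3.912 / visibility_m            # extinction coefficient (1/m)
    # Solve C0 * exp(-sigma * R) = threshold for R
    return math.log(inherent_contrast / threshold) / sigma

# In 2000 m visibility a high-contrast obstacle fades below a 5% contrast
# threshold at roughly 1.5 km; halving visibility shrinks this sharply.
print(detection_range(2000.0))
print(detection_range(500.0))
```

This is one way to see why the 2000 m regulatory visibility figure matters: it bounds the unaided detection range against which any synthetic vision aid has to be judged.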

System overview

Visually Coupled Systems - Head Mounted AR Display

For commercial reasons it is not possible to detail the architecture of the synthetic vision system, other than to say that it comprised three forward-looking infra-red sensors, three low-light TV sensors, a scanning laser obstacle warning system, GPS and other navigational sensors.

After processing, information was presented to the pilot on a see-through head-mounted display (augmented reality) incorporating a head-tracking system to provide a visually coupled system.
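
The essence of a visually coupled display can be sketched in a few lines (a deliberately simplified illustration with assumed field-of-view numbers, not the system's actual rendering pipeline): the head tracker's pose is subtracted from the obstacle's world direction, and the result is mapped into display coordinates so that symbology stays registered as the pilot's head moves.

```python
def symbol_position(obstacle_az_deg, obstacle_el_deg,
                    head_az_deg, head_el_deg,
                    fov_h_deg=40.0, fov_v_deg=30.0):
    """Place an obstacle symbol on a head-mounted display: subtract the
    head-tracker pose from the obstacle's world direction and normalise
    to display coordinates in [-1, 1]. Hypothetical sketch; a real
    visually coupled system uses full 6-DOF pose and optics models."""
    rel_az = obstacle_az_deg - head_az_deg
    rel_el = obstacle_el_deg - head_el_deg
    if abs(rel_az) > fov_h_deg / 2 or abs(rel_el) > fov_v_deg / 2:
        return None                      # outside the display field of view
    return (rel_az / (fov_h_deg / 2), rel_el / (fov_v_deg / 2))

# Pilot looking 10 degrees right; an obstacle at 20 degrees azimuth
# appears halfway towards the right edge of the display.
print(symbol_position(20.0, 0.0, 10.0, 0.0))  # (0.5, 0.0)
```

Tracker latency and error feed directly into this subtraction, which is why the coupled simulator had to model the whole chain from sensor to display rather than each component in isolation.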

Results

Example Infra-red Imaging of Electricity Pylons

The AVRRC created a comprehensive coupled-model simulator representing the environment, the helicopter and its synthetic vision system, such that a pilot could fly the simulator as though it were the real helicopter. Figure 4 shows the AVRRC simulator that was built to support the trials.

The simulator's outside-world view was presented in a 4 m Vision Dome, and the helicopter simulator comprised representative controls and dynamics.

Synthetic vision information was presented either as an overlay on the outside-world display or on a head-mounted display, depending on the level of trial. Examples of sensor displays from the infra-red imaging system are shown in Figure 5.

Sensor information was processed and integrated with other sensor and on-board database information to create hazard/obstacle warning cues for the pilot. Coupling the simulator with models of the atmosphere allowed symbology design to be optimised for different environmental conditions before candidate symbology suites were proposed for use in the airborne trials. Unfortunately, we are not permitted to provide examples of the synthetic vision display symbology we developed for the project, since it is commercially protected.
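
The fusion step can be sketched very simply (our own minimal illustration of the idea, not the project's actual algorithm, whose details are protected): detections from the individual channels are compared, and a warning cue is raised when enough independent sources agree that an obstacle lies inside the warning range.

```python
def hazard_cue(detections, warning_range_m=325.0, min_sources=2):
    """Fuse obstacle detections, given as (source name, range in metres)
    pairs from e.g. the IR, low-light TV, laser and database channels:
    raise a cue when enough independent sources report an obstacle
    inside the warning range. Thresholds here are assumed values."""
    close_sources = {src for src, rng in detections if rng <= warning_range_m}
    return len(close_sources) >= min_sources

detections = [("laser", 300.0), ("ir", 310.0), ("database", 900.0)]
print(hazard_cue(detections))   # True: laser and IR agree within range
```

Requiring agreement between sources is one common way to trade false alarms against missed warnings; where that trade-off was set for the real system is exactly the kind of question the coupled simulator was built to answer.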

Conclusions

Model-based systems engineering techniques were adopted both to express the simulator architecture and to model the constituent systems. Our work in the e-Science RealityGrid project was adapted to help define the computational architecture for the synthetic vision simulator. In particular, the human factors research we undertook within the RealityGrid project was directly applicable to creating a representative visually coupled system. Regular meetings between the research team and the flight trials team ensured the AVRRC simulator tracked the trials programme very closely, so that risks could be identified before trials took place.

Impact   

BAE Systems embedded an experienced engineer into our research team. This proved to be a very effective approach to knowledge transfer and ensured our research was more easily integrated into the extensive flight trials programme. It also ensured our system models were as representative as possible of the airborne test environment.

The project was extremely successful: the AVRRC's laboratory-based risk reduction research saved millions of pounds on the flight trials programme. The simulation environment also proved to be an excellent platform in which to prototype and test novel synthetic vision system concepts prior to flight trials.