Multimodal pilot cueing for 360° situation awareness

dc.contributor.author Godfroy-Cooper, M.
dc.contributor.author Miller, J.D.
dc.contributor.author Szoboszlay, Z.
dc.contributor.author Hartnett, G.
dc.date.accessioned 2022-10-04T07:23:21Z
dc.date.available 2022-10-04T07:23:21Z
dc.date.issued 2019
dc.description.abstract The improved agility and flight control augmentation of Future Vertical Lift (FVL) aircraft will enable a variety of mission sets, extending the current helicopter reach to new operational terrain such as high-altitude desert plateaus and the urban canyons of megacities. Operations in megacities will require many of the same aviation capabilities of attack, reconnaissance, assault, and medical evacuation used in less dense terrain, but under considerable constraints. Megacities offer limited landing and pickup zones, and flying close to the ground to provide air support is made more difficult by power lines, antennas and satellite dishes, and narrow flight paths between buildings. In this context, it is crucial to develop integrated multimodal interfaces that extend the current operational envelope while enhancing flight safety and providing 360° situation awareness (SA) coverage. Visual displays have inherent limitations because they present only a partial representation of the threat space, due to their limited field of view (FOV) or their 2D exocentric perspective. Spatial auditory displays support a natural, ecologically valid, egocentric representation of space in which auditory objects behave realistically in terms of direction, distance, and motion. Tactile displays also support a partial representation of 3D space, although with lower resolution and typically limited to direction and motion. A study was conducted at the U.S. Army Aeromedical Research Laboratory (USAARL) to evaluate the effectiveness of a trimodal display suite consisting of the Integrated Cueing Environment-Collision Avoidance Symbology (ICECAS) blended with the Primary Flight Display (PFD) symbology, an Integrated Collision Avoidance Display (ICAD) overlaying a panel-mounted terrain display (PMD), an Augmented-Reality Spatial Auditory Display (ARSAD), and the Tactile Situational Awareness System (TSAS). Ten UH-60M Army evaluation pilots participated in a high-fidelity experiment in the USAARL full-motion UH-60 simulator. The results showed that deviations from Commanded Heading were lowest when the Spatial Auditory Display was used, an effect that was even more pronounced when the TSAS was activated. This suggests that the auditory warning gives the pilot more time to plan the avoidance trajectory. Overall, Exposure Time, the proportion of Time on Task during which at least one obstacle was present within the Threat Space (the Caution and Warning regions around the ownship), was lowest when a combination of Visual, Spatial Auditory, and Tactile Displays was used. Exposure Time to two obstacles (vs. one) was also lowest with the trimodal Visual-Auditory-Tactile display combination. When the Tactile Display was activated, Exposure Time in the Warning region was lower in the Visual-Auditory-Tactile condition than in the Visual-Tactile condition, indicating that the spatial auditory information led to a faster avoidance maneuver. These quantitative results validate the previously reported and new subjective data, and demonstrate the substantial advantage provided by multimodal displays for obstacle avoidance. The evolution of the multimodal display suite and its physical integration for in-flight demonstration are discussed in the context of pilot cueing synergies for the FVL multirole platform.
dc.identifier.other ERF2019 0175
dc.identifier.uri https://hdl.handle.net/20.500.11881/4056
dc.language.iso en
dc.title Multimodal pilot cueing for 360° situation awareness
Files
ERF2019 0175.pdf (2.7 MB, Adobe Portable Document Format)