Abstract

Breakdowns in human-automation coordination in data-rich, event-driven domains such as aviation can be explained, in part, by the mismatch between the high autonomy and low observability of modern technology. Low observability is, to some extent, the result of feedback design's increasing reliance on foveal vision, an approach that fails to support pilots in tracking system-induced changes and events in parallel with performing concurrent flight-related tasks. One possible solution to this problem is the distribution of tasks and information across sensory modalities and processing channels. This paper presents a simulator study comparing the effectiveness of current foveal feedback and two implementations of peripheral visual feedback for keeping pilots informed about uncommanded changes in the status of an automated cockpit system. Both peripheral visual displays yielded higher detection rates and faster response times, without interfering with the performance of concurrent visual tasks any more than currently available automation feedback. Potential applications are discussed.