Effects of Touch, Voice, and Multimodal Input, and Task Load on Multiple-UAV Monitoring Performance During Simulated Manned-Unmanned Teaming in a Military Helicopter

Citation metadata

Date: Dec. 2018
From: Human Factors (Vol. 60, Issue 8)
Publisher: Sage Publications, Inc.
Document Type: Brief article
Length: 262 words

Abstract:

Objective: We evaluated three interface input methods for a simulated manned-unmanned teaming (MUM-T) supervisory control system designed for Air Mission Commanders (AMCs) in Black Hawk helicopters.

Background: A key component of the U.S. Army's vision for unmanned aerial vehicles (UAVs) is to integrate UAVs into manned missions, called MUM-T (Department of Defense, 2010). One application of MUM-T is to give the AMC of a team of Black Hawk helicopters control of multiple UAVs, offering advanced reconnaissance and real-time intelligence on flight routes and landing zones.

Method: Participants supervised a simulated team of two helicopters and three UAVs while traveling toward a landing zone to deploy ground troops. Participants classified aerial photographs collected by the UAVs, monitored instrument warnings, and responded to radio communications. We manipulated interface input modality (touch, voice, multimodal) and task load (number of photographs).

Results: Compared with voice, touch and multimodal control resulted in better performance on all tasks, lower subjective workload, and greater subjective situation awareness.

Conclusion: Touchscreen and multimodal control were superior to voice control in a supervisory control task that involved monitoring visual displays and communicating on radio channels.

Application: Although voice control is often considered a more natural and less physically demanding input method, caution is needed when designing visual displays for users sharing common communication channels.

Keywords: supervisory control, uninhabited aerial vehicles, touchscreens, speech user interfaces, multimodal displays

Source Citation

Gale Document Number: GALE|A563358768