User Testing: Aircrew System Advisory Panel

Aircrew System Advisory Panels (ASAPs) were held twice a year, bringing in active-duty fleet personnel to provide feedback on the platform's capabilities and operator-machine interface (OMI) as it continued to develop. These sessions allowed the P-8A Human Engineering group to work directly with active-duty operators, gaining insight into design changes and obtaining both qualitative and quantitative data from user testing with realistic mission scenarios.

A total of 9 ASAP meetings and simulation events were held over a period of 4 years.

This is a summary of our methodology; the full final report is unavailable pursuant to Department of Defense distribution restrictions.

Objective

Obtain fleet input on the OMI design early and often. Analyze qualitative and quantitative metrics from user testing, and implement recommended design changes (RDCs) from participant feedback when deemed appropriate.

Design

Two active-duty crews (each consisting of 1 pilot, 1 TACCO, 1 Co-TACCO, 1 sensor operator, and 2 acoustic operators) were brought in for a 4-day ASAP event. Two days were spent in briefings, where P-8A Human Engineering presented changes and newly designed interfaces.

Half a day was set aside for operator "free play" inside the P-8A OMI Simulation Lab. The remaining 1.5 days were spent running mission scenario simulations with usability evaluations and workload measurements. Operator workstations replicated the display resolutions and positions of hardware elements as they would be installed on the actual operational monuments.

Method

Design briefings were held to obtain operator feedback in a focus-group environment.

Crews were given realistic mission scenarios to carry out inside the OMI Simulation Lab while P-8A Human Engineering collected qualitative and quantitative data (e.g., perceived workload and time to complete tasks). Crews were briefed on a "mission in progress" and given one or more objectives required to complete the mission. The scenarios were designed so that each operator had to perform multiple tasks to reach the objective.
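To give a sense of the quantitative side, the sketch below shows how per-task completion times might be summarized by crew position. It is a minimal illustration only: the crew positions, task times, and tooling shown are hypothetical, and none of the actual (distribution-restricted) data is reproduced.

```python
from statistics import mean, stdev

# Hypothetical task-completion times (seconds) from one scenario run,
# keyed by crew position. Names and values are illustrative only; the
# real data set falls under the distribution restrictions noted above.
completion_times = {
    "TACCO":    [142.0, 138.5, 151.2, 160.4],
    "Co-TACCO": [118.3, 125.9, 121.7, 130.2],
    "SENSO":    [171.5, 166.8, 182.0, 175.3],
}

# Summarize each position's times for comparison across ASAP events.
for position, times in completion_times.items():
    print(f"{position:>9}: mean={mean(times):6.1f}s  "
          f"sd={stdev(times):5.1f}s  n={len(times)}")
```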

Human Engineering personnel would intervene only in extreme circumstances where operators became lost. Operators would be asked to "think aloud" as they worked to overcome the roadblock. After each session the crew was given a questionnaire to evaluate their experience, including a modified Cooper-Harper scale to evaluate perceived workload.
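For context on the workload measure, a modified Cooper-Harper rating is reached by walking a short decision tree down to a band of the 10-point scale. The sketch below captures that general structure; the gate wording is paraphrased from the published scale (Wierwille & Casali, 1983), not from the P-8A questionnaire itself.

```python
def mch_band(can_accomplish: bool,
             tolerable_workload: bool,
             acceptable_workload: bool) -> range:
    """Walk the modified Cooper-Harper gates to a rating band.

    Gate wording is paraphrased from the published scale
    (Wierwille & Casali, 1983), not from the P-8A questionnaire.
    """
    if not can_accomplish:
        return range(10, 11)  # 10: task cannot be accomplished reliably
    if not tolerable_workload:
        return range(7, 10)   # 7-9: very high workload; redesign needed
    if not acceptable_workload:
        return range(4, 7)    # 4-6: deficiencies warrant improvement
    return range(1, 4)        # 1-3: workload acceptable

# Example walk: task accomplishable, workload tolerable but not acceptable.
band = mch_band(can_accomplish=True,
                tolerable_workload=True,
                acceptable_workload=False)
print(f"Operator chooses a rating in {band.start}-{band.stop - 1}")  # 4-6
```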

Results

RDCs, along with both qualitative and quantitative data from user testing, were analyzed and used to update the OMI during early development phases. This allowed rapid, iterative design changes to be made early, rather than later in the development process, when changes become more difficult.

This prototyping approach reduced the number of software changes needed later in development.