Research: Comparison of Three EO-IR Camera Control Methods Using the P-8A Operator-Machine Interface

A study was conducted in the P-8A Operator-Machine Interface (OMI) Simulation Lab to compare three methods of manually tracking targets with the EO-IR camera: a hardware thumb-actuated joystick (thumbstick), a touch-screen “virtual” joystick, and a trackball. Former P-3 operators with experience using aircraft EO-IR camera systems took part. Data were collected across a series of trials in which each participant manually tracked a variety of ships using the EO-IR camera simulator. Participants provided workload data after each trial. Performance was primarily assessed in terms of time-on-target.

A colleague and I split the participants between us, each acting as either the primary or the observing researcher for a given session. The final report was co-authored by me and the other researcher.

This is a summary of our methodology. The full final report is unavailable pursuant to Department of Defense distribution restrictions.

Design

All participants were run individually in the OMI Simulation Lab, using the latest available software and actual workstation hardware, across four days. Participants were initially briefed on the purpose of the study: an EO-IR manual tracking task comparing three control methods. Participants were then provided with instructions on the task, which was to keep the crosshair on top of the target as much as possible. Targets were surface ships that appeared in the EO-IR camera display.

Each session of the study was composed of three primary blocks, one for each controller type. Each primary block was divided into two sub-blocks, one for each of two field-of-view (FoV) settings, creating a total of six blocks. The purpose of the FoV independent variable was to create two difficulty conditions, one easier than the other by virtue of the FoV setting.

Method

All six possible unique orderings of the three control devices were used in the experiment, and the order of controller method was randomized across participants. Each participant went through six blocks of trials, with each block consisting of three trials. A trial consisted of 2 minutes and 50 seconds of manual tracking; participants were told when to begin each trial and when they were approaching its end. Participants began each trial with the target already on screen. The task was to use the control device to manually keep the target within the crosshairs as much as possible.
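
To make the resulting session structure concrete, the sketch below builds one participant's 18-trial schedule under the design described above. The controller and FoV labels, and the simple cyclic assignment of orderings to participants, are illustrative assumptions rather than details taken from the report.

```python
from itertools import permutations

# Labels below are placeholders; the actual condition names are not given
# in this summary.
CONTROLLERS = ["thumbstick", "virtual_joystick", "trackball"]
FOV_LEVELS = ["wide", "narrow"]        # the two difficulty conditions
TRIALS_PER_BLOCK = 3
TRIAL_DURATION_S = 2 * 60 + 50         # 2 minutes 50 seconds of tracking

# All six unique orderings of the three control devices.
CONTROLLER_ORDERS = list(permutations(CONTROLLERS))

def session_schedule(participant_index: int) -> list[dict]:
    """Build one participant's schedule: 3 controllers x 2 FoV sub-blocks
    x 3 trials = 18 scored trials (the practice trial is not included)."""
    order = CONTROLLER_ORDERS[participant_index % len(CONTROLLER_ORDERS)]
    schedule = []
    for controller in order:
        for fov in FOV_LEVELS:
            for trial in range(1, TRIALS_PER_BLOCK + 1):
                schedule.append({
                    "controller": controller,
                    "fov": fov,
                    "trial": trial,
                    "duration_s": TRIAL_DURATION_S,
                })
    return schedule

assert len(session_schedule(0)) == 18
```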

After each trial, participants completed a NASA TLX questionnaire. This index is a multi-dimensional rating procedure that provides an overall workload score based on six sub-scales: Mental Demands, Physical Demands, Temporal Demands, Own Performance, Effort, and Frustration Level. NASA TLX was chosen over other workload measures (e.g., the Bedford scale) because it specifically addresses fatigue, which was expected to be a key contributor to overall workload during manual target tracking. Each participant began with a practice trial; each session then consisted of 18 real trials and lasted about one hour and thirty minutes. The dependent variables measured in this study were:

  • Time on target (ToT): Duration of on-target tracking time per trial. ToT data were collected at two levels. The first level was whether any part of the target was within an invisible 1.25 inch by 1.25 inch square surrounding the EO-IR aimpoint crosshair. The second level was whether any part of the target was anywhere on the screen at all. (A scoring sketch follows this list.)
  • NASA TLX: Subjective workload assessment provided after every trial.
  • Rank order preference: Participants were asked to rank the three control devices in their order of preference at the end of the study.
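
For concreteness, the sketch below shows one way the first two dependent variables could be scored. It assumes the simulator logs a per-sample boolean on-target flag at a fixed rate and that an unweighted ("raw") TLX composite is used; neither assumption is stated in the report, and the function and variable names are illustrative only.

```python
from statistics import mean

def time_on_target(hit_flags: list[bool], sample_rate_hz: float) -> float:
    """Seconds on target for one trial, given per-sample flags indicating
    whether any part of the target fell inside the scoring region (the
    invisible 1.25 in x 1.25 in box around the crosshair, or, for the
    second level, anywhere on the screen)."""
    return sum(hit_flags) / sample_rate_hz

def raw_tlx(ratings: dict[str, float]) -> float:
    """Unweighted ("raw") NASA TLX composite: the mean of the six sub-scale
    ratings (each 0-100). The study may instead have used the weighted,
    pairwise-comparison variant; that detail is not given in this summary."""
    subscales = ["mental", "physical", "temporal", "performance",
                 "effort", "frustration"]
    return mean(ratings[s] for s in subscales)
```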

Results

The pattern of results from the analysis of the rank-order, subjective-workload, and tracking-performance data supports both primary hypotheses regarding the touch-screen versus thumbstick control devices. The results suggest that, for manual tracking tasks similar to the one used in this study, performance with the touch-screen virtual joystick can be expected to be equivalent to performance with the physical thumbstick. Interestingly, the trackball fared best on all measures: rank-order preference, subjective workload, and tracking performance. This was due in part to the fact that the trackball did not require constant pressure to keep the camera slewing at a constant rate.

Figure: Example tracking performance.