Decision Aiding and Human-Computer Interface Design
Overview

CTI uses a framework for adaptive decision aid design called Personalized and Prescriptive Aiding. This approach applies cognitive task analysis and cognitive theory to design aids that are compatible with user knowledge structures and processing strategies. At the same time, it draws on normative theory to provide effective solutions and to guard against potential errors associated with those strategies.

This framework has been applied and tested by CTI staff in a wide range of application domains:

  1. Submarine command staff (ONR).

  2. Air wing ground planning (RADC).

  3. Commercial airline pilot diversion decisions (NASA).

  4. Enroute replanning by Air Force pilots (WPAFB).

  5. Army air defense (ARI).

  6. A general-purpose aid for selection from a large database of options (ONR/NUSC).

  7. An aid for inferential retrieval of documents from a large database (NSF).

  8. Cognitive aspects of display design for automated target recognition devices (HRED/ARL).

See also:


Decision Support for Critical Thinking
(Naval Air Warfare Center / Training Systems Division, Orlando, FL)

CTI is developing and testing design concepts for a decision support system that helps Naval officers evaluate air and surface contacts. The aid supports knowledge structures and critical thinking strategies based on the Recognition / Metacognition model. Features of the aid include a causally meaningful layout of information regarding each contact, support for matching data to prestored patterns, and support for interactive construction and testing of plausible stories in situations where no pattern perfectly fits the data.
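
To make the pattern-matching idea concrete, the following is a minimal, hypothetical sketch (not taken from CTI's aid): observed cues on a contact are scored against prestored patterns, and any cues the best-fitting pattern fails to explain are flagged as material for interactive story construction. All names, cues, and patterns are illustrative assumptions.

```python
# Hypothetical sketch of matching contact cues to prestored patterns and
# flagging cues that the best match leaves unexplained.
from dataclasses import dataclass

@dataclass
class ContactPattern:
    name: str
    expected_cues: frozenset  # cues the pattern predicts

def match_contact(observed_cues: set, patterns: list[ContactPattern]):
    """Return the best-fitting pattern, its fit score, and unexplained cues."""
    def score(p: ContactPattern) -> float:
        overlap = len(observed_cues & p.expected_cues)
        return overlap / max(len(observed_cues | p.expected_cues), 1)

    best = max(patterns, key=score)
    unexplained = observed_cues - best.expected_cues
    return best, score(best), unexplained

# Illustrative usage: an imperfect match leaves cues to be explained by a story.
patterns = [
    ContactPattern("commercial air route",
                   frozenset({"steady course", "transponder on", "airway altitude"})),
    ContactPattern("hostile ingress",
                   frozenset({"low altitude", "high speed", "closing course"})),
]
best, fit, leftovers = match_contact({"steady course", "low altitude", "transponder on"}, patterns)
print(best.name, round(fit, 2), leftovers)  # leftover cues invite critical thinking
```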


Automated Target Recognition Design
(Army Research Laboratory / Human Research & Engineering Directorate)

Based on experimental research with active-duty helicopter pilots, CTI has developed a design methodology for tailoring displays and inputs to users' perceptual and cognitive structures. The methodology was applied to derive design recommendations for automated target recognition devices, including perceptual transformations of target images that heighten the salience of key features, an appropriate level of detail for labeling the device's conclusions, and methods for facilitating effective user-computer collaboration in handling uncertain cases.
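
As one way to picture the collaboration recommendation, here is a minimal, hypothetical sketch: recognitions above a confidence threshold are labeled automatically, borderline cases are queued for the pilot's judgment, and very weak detections are suppressed. The thresholds, labels, and data fields are assumptions for illustration, not part of the ARL/CTI methodology.

```python
# Hypothetical confidence-based handoff of uncertain ATR detections to the user.
def triage_detections(detections, auto_label_threshold=0.85, review_threshold=0.5):
    """Split ATR detections into auto-labeled, pilot-review, and suppressed sets."""
    auto, review, suppressed = [], [], []
    for det in detections:  # each det: {"id": ..., "label": ..., "confidence": ...}
        if det["confidence"] >= auto_label_threshold:
            auto.append(det)
        elif det["confidence"] >= review_threshold:
            review.append(det)      # collaboration case: show the cues, ask the pilot
        else:
            suppressed.append(det)  # too weak to present as a conclusion
    return auto, review, suppressed

auto, review, _ = triage_detections([
    {"id": 1, "label": "T-72", "confidence": 0.93},
    {"id": 2, "label": "ZSU-23-4", "confidence": 0.61},
])
print(len(auto), "auto-labeled;", len(review), "for pilot review")
```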



Trust in Decision Aids
(AATD, Fort Eustis, VA)

Users of a decision aid must make a series of decisions that determine their level of dependence on the aid. First, they may or may not choose to use the aid at all. Second, if they use it, they may be able to set the mode in which the aid operates, including, possibly, the degree to which a task is turned over to the aid or still requires human intervention. During the course of an operation, users must decide whether, or how often, to monitor the aid, and whether to revisit decisions about the aid's mode of operation. After the aid has produced a recommendation, the user must decide whether to accept it (with or without modifications) or reject it; or, in the case of some aids, whether to modify the aid's parameters and wait for another recommendation.
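
The following minimal sketch simply enumerates the reliance decisions named above as they unfold over time; the names are illustrative, not CTI's terminology.

```python
# Illustrative enumeration of the reliance decisions a decision aid user faces.
from enum import Enum, auto

class RelianceDecision(Enum):
    USE_AID_AT_ALL = auto()      # adopt or ignore the aid
    SET_OPERATING_MODE = auto()  # how much of the task is turned over to it
    MONITOR_AID = auto()         # whether, and how often, to check on it
    REVISIT_MODE = auto()        # reconsider the operating mode mid-operation
    ACCEPT_OR_REJECT = auto()    # take, modify, or reject a recommendation
    ADJUST_AND_RERUN = auto()    # (some aids) change parameters, await a new one

# The decisions are roughly ordered in time, which a training sequence can follow.
for stage in RelianceDecision:
    print(stage.name)
```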

All of these decisions depend on the user's degree of trust in the aid under varying conditions and over varying spans of time. Yet training of decision aid users has seldom touched on this critical issue. A key to user acceptance and effective use of a decision aid is an understanding of both its strengths and its weaknesses.

In research for the Army's Rotorcraft Pilot's Associate (RPA) program, CTI developed a systematic framework for understanding trust in decision aids, its parameters, and how it evolves. The framework also describes the relationship between trust at any point in time and user decisions regarding reliance on the aid. Finally, the framework was applied in the development of training for users of the RPA.
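
As a purely illustrative sketch, and not a description of CTI's framework, the example below shows one simple way trust could be modeled as evolving with observed aid performance and then mapped onto a reliance decision. The exponentially weighted update rule and the acceptance threshold are assumptions made only for this example.

```python
# Illustrative model: trust drifts toward observed aid performance, and a
# reliance decision is thresholded on the current trust level.
def update_trust(trust: float, aid_was_correct: bool, learning_rate: float = 0.2) -> float:
    """Move trust toward 1.0 after a correct recommendation, toward 0.0 after an error."""
    outcome = 1.0 if aid_was_correct else 0.0
    return (1 - learning_rate) * trust + learning_rate * outcome

def rely_on_aid(trust: float, threshold: float = 0.7) -> bool:
    """Accept the aid's recommendation outright only when trust is high enough."""
    return trust >= threshold

trust = 0.5
for outcome in [True, True, False, True]:  # hypothetical run of aid performance
    trust = update_trust(trust, outcome)
print(round(trust, 3), "-> rely:", rely_on_aid(trust))
```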



