
Table 2 Design features of non-clinical decision support applications

From: Advancing clinical decision support using lessons from outside of healthcare: an interdisciplinary systematic review

Design feature and problems addressed

Key lessons from the literature

Feature: Broad, system-level views of the big picture

Provide a broad overview, so that the decision-maker can see the entire environment, what is known, and what is not[22]. Develop a comprehensive view of operations and interconnected systems by identifying key nodes within each system (persons, places, or things), establishing relationships among them, emphasizing baseline data for the current situation and how it relates to known solutions, and categorizing information objectively[17]. Provide multiple levels of detail (i.e., the broad view with zooms)[45].

Problems addressed: Tunnel vision; cognitive biases that prevent consideration of the full range of options (e.g., the representativeness heuristic, anchoring and adjustment)

Filter out unnecessary clutter to increase the leader’s situational awareness and allow them to focus on key tasks[22], but permit system drill-down with increasing granularity to educate decision-makers on the task[33].
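
As a concrete illustration of the “broad view with zooms,” here is a minimal Python sketch of a hypothetical node/relationship model with a controllable level of detail; the SystemNode structure, overview function, and example data are invented for illustration, not taken from the reviewed systems:

```python
from dataclasses import dataclass, field

@dataclass
class SystemNode:
    """A key node (person, place, or thing) with baseline data."""
    name: str
    category: str                                  # objective categorization
    baseline: dict = field(default_factory=dict)   # current-situation data
    children: list = field(default_factory=list)   # interconnected subsystems

def overview(node: SystemNode, depth: int = 0, max_depth: int = 1) -> None:
    """Show the big picture by default; raise max_depth to drill down."""
    print("  " * depth + f"{node.name} [{node.category}] {node.baseline}")
    if depth < max_depth:
        for child in node.children:
            overview(child, depth + 1, max_depth)

root = SystemNode("theater", "operations", {"readiness": 0.8},
                  [SystemNode("supply_depot", "logistics", {"stock_days": 4})])
overview(root)               # broad view: clutter filtered out
overview(root, max_depth=3)  # drill-down: increasing granularity
```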

Parallels in CDS literature: None

Frame problems with all the relevant factors and friendly/opposing viewpoints, posing questions throughout the process that prompt users to search for the root of the problem and think about what is not known[17]. Continue the problem formulation process until an opposing view is considered[37, 42].

Feature: Customized to address specified problems and user needs

Development should balance the virtues of careful initial design and rapid prototyping[47]. Tools that are simplified and customized for niche uses may, in some instances, be developed rapidly and avoid unnecessary complexity. Expanding a niche system to other user groups then requires a significant jump, and should be done only after the processes, data formats, and data availability have been evaluated[27].

Problems addressed: Generic systems with too much complexity, which are not user-friendly and do not handle any single problem well

Commercial, off-the-shelf systems may work, but they need to be adapted appropriately to the targeted users[25, 32]. In some situations, fully customized systems are required[22]. Either way, systems should be part of an integrated information system, follow standard software development processes (development, testing, and maintenance), and use standards-consistent hardware and software platforms for acceptability, reliability, and maintainability[35].

Parallels in CDS literature: Addressed somewhat by Bates[9]

Different situations may demand different tools. A defense operation, for example, involves many phases (planning, deployment, execution, recovery, and post-operation work), and different tools are needed at each phase[19].

Feature: Involving users in system design

Partner with end users in problem discovery and design[26, 27]. User participation in the development phase can improve the success of adoption, in terms of user satisfaction, intent to use systems, and actual use of systems[30].

Problems addressed: Poor adoption of system, user trust, ease of use

Parallels in CDS literature: Addressed in many studies; see Kawamoto[7]

Feature: Transparency that documents the underlying methodologies and decision processes

Ensure that users can apply their own judgment and explore trade-offs by using interactive tools and visuals to show likely/unlikely possibilities, short- and long-term trends, etc. – give “better answers, not just the answer,” including supporting evidence and key drivers of outcomes. Show how trade-offs between competing objectives affect outcomes[27], and provide the right level of granularity to back up recommendations[23]. “Build insight, not black boxes”[27].
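
To make the “better answers, not just the answer” idea concrete, here is a minimal Python sketch of a hypothetical multi-criteria scorer that returns every option’s score together with its key drivers, so users can see how trade-offs between competing objectives affect outcomes; the options, criteria, and weights are invented for illustration:

```python
def score_options(options, weights):
    """Score every option and report the criteria driving each score,
    rather than returning a single opaque recommendation."""
    results = []
    for name, criteria in options.items():
        contributions = {c: weights[c] * v for c, v in criteria.items()}
        total = sum(contributions.values())
        # Key drivers: the criteria contributing most to this outcome.
        drivers = sorted(contributions, key=contributions.get, reverse=True)[:2]
        results.append((name, round(total, 2), drivers))
    return sorted(results, key=lambda r: r[1], reverse=True)

options = {"plan_a": {"cost": 0.9, "speed": 0.4, "risk": 0.7},
           "plan_b": {"cost": 0.5, "speed": 0.9, "risk": 0.6}}
weights = {"cost": 0.5, "speed": 0.3, "risk": 0.2}
for name, score, drivers in score_options(options, weights):
    print(name, score, "drivers:", drivers)  # all answers, not just the answer
```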

Problems addressed: User acceptance and over- or under-trust of system recommendations, “satisficing” behavior, ethical biases

Collect metadata – data that describes the nature of the data, such as user actions and date/time stamps. Build in system capabilities that show what actions are recommended, when they were taken, and what criteria were satisfied to justify those actions. This facilitates tracking how the decision was made, and can be used to improve decisions or provide liability protection[19, 22].
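
One way such metadata might be captured is sketched below, assuming a hypothetical audit-record schema; the field names and example values are illustrative, not from the reviewed systems:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Optional

@dataclass
class DecisionAuditRecord:
    """Metadata documenting how a recommendation was made and acted upon."""
    user: str
    recommended_action: str
    criteria_satisfied: list              # which criteria justified the action
    action_taken: Optional[str] = None    # what the user actually did
    timestamp: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat())

audit_log = []
audit_log.append(DecisionAuditRecord(
    user="analyst_1",
    recommended_action="escalate_review",
    criteria_satisfied=["threshold_exceeded", "rising_trend"]))
# Replaying audit_log later shows how each decision was made, supporting
# decision improvement and liability protection.
```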

Parallels in CDS literature: Partial: Kawamoto addressed “justification of decision support via provision of reasoning and evidence”[7]

Elicit the decision-making structure[39]. Provide information about the reliability of the decision aid, and about the reliability of human judgment, to encourage appropriate use of systems – e.g., avoiding blind adherence (overuse) and distrust (underuse)[24]. Restate issues and build flow diagrams that challenge the user to consider how each piece of evidence supports their decision[37].

Feature: Effective organization and presentation of data

Use presentation methods such as summary dashboards, graphics and visuals, interactive simulations and models, storyboards, matrices, spreadsheets, qualitative data fields, and customized interfaces[25, 26, 33, 37, 39, 42]. The most effective presentation format depends on the situation, and research does not consistently indicate which format works best in which situation[39].

Problems addressed: Cognitive limits on processing large volumes of data, meaningful application of naturalistic-intuitive decision-making within rational-analytic DS systems

Use displays that exploit patterns humans recognize more readily than computers when showing a trend, and avoid requiring users to extract extra information from unformatted text[22].

Provide well-conceived default formats with easy restoration of defaults, but allow users to control and customize displays using scatter diagrams, bar charts, dashboards, statistical analyses, reports, etc.[32]. Organize data using filtering and retrieval functions that let users change the aggregation level from highly detailed to overall summaries, but add alerts in case users filter out important information[22, 32] – i.e., allow users to “pull” extra information as desired.
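
A minimal Python sketch of this pull pattern, assuming hypothetical records with a "critical" flag; the point illustrated is that user-controlled filtering and aggregation never silently hide important information:

```python
def filter_view(records, level, predicate):
    """Let users move between detail and summary views of the same data,
    alerting them when their filter would hide critical information."""
    visible = [r for r in records if predicate(r)]
    hidden_critical = [r for r in records if not predicate(r) and r["critical"]]
    if hidden_critical:
        print(f"ALERT: filter hides {len(hidden_critical)} critical record(s)")
    if level == "summary":
        return {"count": len(visible),
                "critical_count": sum(r["critical"] for r in visible)}
    return visible  # "detailed": users pull extra information as desired

records = [{"id": 1, "region": "north", "critical": True},
           {"id": 2, "region": "south", "critical": False}]
print(filter_view(records, "summary", lambda r: r["region"] == "south"))
```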

Parallels in CDS literature: Partial: Topic of “relevant data display” in Wright[6]

“Push” key information and updates to users – deliver prompts when critical new pieces of information arrive, tailored to the action requirements of specific users, and develop pre-programmed sets of plans that can be applied in response to new information[21]. Good DS design pushes out only the key information that facilitates the task, rather than overwhelming the user.
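
The push pattern might look like the following Python sketch, in which a hypothetical dispatcher routes critical updates only to users whose roles require action and attaches a pre-programmed response plan; the roles, topics, and plans are invented for illustration:

```python
# Pre-programmed plans that can be applied in response to new information.
RESPONSE_PLANS = {"supply_shortfall": "activate alternate-supplier plan B"}

# Subscriptions tailored to the action requirements of specific users.
SUBSCRIPTIONS = {"logistics_officer": {"supply_shortfall"},
                 "commander": {"supply_shortfall", "casualty_report"}}

def push_update(event_type, detail):
    """Push only key, task-relevant information to the users who act on it."""
    for user, topics in SUBSCRIPTIONS.items():
        if event_type in topics:
            plan = RESPONSE_PLANS.get(event_type, "no pre-set plan")
            print(f"to {user}: {event_type} ({detail}); suggested: {plan}")

push_update("supply_shortfall", "fuel below 2-day reserve")
```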

Use consistent standards and terminology so that words, situations, and actions are clear, and to increase user friendliness[21].

Feature: Multi-scenario, multi-option generation

Use multi-scenario generation, portfolio analysis, foresight methods, and branch-and-sequel methods to educate the decision-maker on the implications of uncertainty and ways to hedge, including with planned adaptation[18]. Use rational-analytic structures to assure the presence of alternative choices (and possibly to apply probabilities and weights), but avoid making a single recommendation about the final choice – instead, show how changes in variables or criteria affect assessments[18, 39].

Problems addressed: Coexistence of rational-analytic and naturalistic-intuitive decision-making; unreliable nature of optimization-based models

Allow the user to explore various outcomes by generating a distribution of all plausible outcomes, accounting for both desired and undesired effects[20]. Simplify by grouping assumptions (including those about values and disagreements), so that users can more readily see how choice depends on “perspective”[45] (see footnote 1).
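
A minimal Python sketch of this idea, in which each hypothetical “perspective” bundles parameter assumptions and sampling within each bundle yields a distribution of plausible outcomes rather than a single answer; the perspectives, parameter ranges, and outcome model are invented for illustration:

```python
import random

# Each "perspective" groups assumptions, reducing the dimensionality of
# the uncertainty analysis (see footnote 1).
PERSPECTIVES = {
    "maximize_length_of_life": {"benefit": (0.6, 0.9), "burden": (0.3, 0.7)},
    "maximize_quality_of_life": {"benefit": (0.4, 0.7), "burden": (0.1, 0.3)},
}

def outcome_distribution(perspective, n=1000):
    """Sample plausible net outcomes (desired effect minus undesired effect)."""
    ranges = PERSPECTIVES[perspective]
    return sorted(random.uniform(*ranges["benefit"]) -
                  random.uniform(*ranges["burden"]) for _ in range(n))

for p in PERSPECTIVES:
    dist = outcome_distribution(p)
    # Show the spread of outcomes, not a single recommendation.
    print(f"{p}: 5th={dist[50]:.2f} median={dist[500]:.2f} 95th={dist[950]:.2f}")
```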

Parallels in CDS literature: None

Work backward from the observed outcome: map out the possible chains of events that could have led to it[28]. Alternatively, identify the potential outcomes, then examine all the branches that could lead to those outcomes. Use a hierarchical/nested design to show the DS rules that lead to different results[29]. Functionally, the point is to show what one would have to believe to get different results.
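
This backward mapping might be sketched as follows in Python, using a hypothetical nested rule set in which each outcome lists the condition sets that can produce it; the rules are invented for illustration:

```python
# Nested DS rules: each outcome maps to the condition sets that lead to it.
RULES = {
    "treatment_failure": [["wrong_diagnosis"],
                          ["correct_diagnosis", "non_adherence"]],
    "wrong_diagnosis": [["atypical_presentation"], ["incomplete_workup"]],
}

def chains_to(outcome):
    """Enumerate every chain of beliefs that could yield `outcome`,
    i.e., what one would have to believe to get this result."""
    if outcome not in RULES:              # base belief: no deeper causes modeled
        return [[outcome]]
    chains = []
    for condition_set in RULES[outcome]:
        partial = [[]]
        for cond in condition_set:        # expand each condition recursively
            partial = [p + c for p in partial for c in chains_to(cond)]
        chains.extend(partial)
    return chains

for chain in chains_to("treatment_failure"):
    print(" + ".join(chain))  # each line: one belief set yielding the outcome
```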

Feature: Collaborative, group, and web-based systems

Leverage the Internet and email to support collaborative decisions that draw upon a range of expertise[36, 40]. Share information on a central website that includes access to analytic tools, databases, and links to further information[21].

Problems addressed: De-centralized information sources, team collaboration in decision-making, interoperability of systems, need for broad range and depth of expertise from individuals in disparate locations

At the same time, recognize that expert opinion is often much less reliable than assumed; its reliability depends heavily on the details of knowledge elicitation[54].

Assure a user-friendly design that requires little training and presents a clear picture of the important features of the situation[22]. With collaborative tools, facilitate rapid communication[1, 21].

Parallels in CDS literature: None

For “wicked problems” with unclear solutions, use cognitive, dialogue, and process mapping methods to encourage brainstorming and organize a group’s ideas[34].

1. A “perspective” represents a way of looking at issues. In military analyses, perspectives might reflect different values or judgments about the real-world feasibility of competing military strategies (e.g., counterinsurgency strategies based on U.S.-intensive efforts versus efforts to leverage indigenous forces of a supported government). In health care, by analogy, different perspectives might include maximizing length of life versus quality of life, or might reflect different assumptions about how well a patient could and would follow treatment recommendations. A number of assumptions would go naturally with each of these perspectives, reducing the dimensionality of the uncertainty analysis. For publicly available decision-support software incorporating the perspectives methodology, see Davis PK and Dreyer P, RAND’s Portfolio Analysis Tool, Santa Monica, CA: RAND, 2009.