Identifying inefficient strategies in automation-aided signal detection.
Lana Tikhomirov, Megan L Bartlett, Jackson Duncan-Reid, Jason S McCarley
Journal of Experimental Psychology: Applied, pp. 869-886 (Epub 2023-07-20; print 2023-12-01). DOI: 10.1037/xap0000484
Abstract
Automated diagnostic aids can assist human operators in signal detection tasks, providing alarms, warnings, or diagnoses. Operators often use decision aids poorly, however, falling short of the best possible performance levels. Previous research has suggested that operators interact with binary signal detection aids using a sluggish contingent cutoff (CC) strategy (Robinson & Sorkin, 1985), shifting their response criterion in the direction stipulated by the aid's diagnosis on each trial but making adjustments that are smaller than optimal. The present study tested this model by examining the efficiency of automation-aided signal detection under different levels of task difficulty. In a pair of experiments, participants performed a numeric decision-making task requiring them to make signal or noise judgments on the basis of probabilistic readings. The mean reading values of the signal and noise states differed between groups of participants, producing two levels of task difficulty. Data were fit with the CC model and two alternative accounts of automation-aided strategy: a discrete deference (DD) model, which assumed that participants defer to the aid on a subset of trials, and a mixture model, which assumed that participants choose randomly between the CC and DD strategies on every trial. Model fits favored the mixture model. The results indicate multiple forms of inefficiency in operators' strategies for using signal detection aids. (PsycInfo Database Record (c) 2023 APA, all rights reserved).
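To make the three competing strategies concrete, the sketch below simulates a simplified version of an aided detection task. It is a minimal illustration under assumed parameters, not the paper's fitted model: the equal-variance Gaussian evidence distributions, the aid's criterion, the criterion shift, the deference probability, and the mixture weight are all hypothetical values chosen for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical task parameters (not the paper's values): equal-variance
# Gaussian evidence with signal prevalence .5.
MU_NOISE, MU_SIGNAL, SIGMA = 0.0, 1.5, 1.0
N_TRIALS = 10_000

def simulate(strategy, shift=0.3, p_defer=0.5, w_cc=0.5):
    """Simulate one observer using a given automation-aided strategy."""
    is_signal = rng.random(N_TRIALS) < 0.5
    x = rng.normal(np.where(is_signal, MU_SIGNAL, MU_NOISE), SIGMA)
    # The aid is modeled as an independent detector with its own noisy
    # evidence sample and an unbiased criterion.
    x_aid = rng.normal(np.where(is_signal, MU_SIGNAL, MU_NOISE), SIGMA)
    c = (MU_SIGNAL + MU_NOISE) / 2
    aid_says_signal = x_aid > c

    resp = np.empty(N_TRIALS, dtype=bool)
    for t in range(N_TRIALS):
        use_cc = (strategy == "CC"
                  or (strategy == "mixture" and rng.random() < w_cc))
        if use_cc:
            # Contingent cutoff: shift the criterion toward the aid's
            # diagnosis; a "sluggish" observer uses a shift smaller than
            # the optimal log-likelihood-ratio adjustment.
            crit = c - shift if aid_says_signal[t] else c + shift
            resp[t] = x[t] > crit
        else:
            # Discrete deference: on a random subset of trials, copy the
            # aid's diagnosis outright; otherwise judge unaided.
            resp[t] = aid_says_signal[t] if rng.random() < p_defer else x[t] > c

    hits = np.mean(resp[is_signal])
    false_alarms = np.mean(resp[~is_signal])
    return hits, false_alarms

for s in ("CC", "DD", "mixture"):
    h, f = simulate(s)
    print(f"{s:8s} hit rate = {h:.3f}, false-alarm rate = {f:.3f}")
```

In this framing, the mixture model nests the two pure strategies as special cases (w_cc = 1 recovers CC, w_cc = 0 recovers DD), which is what allows the model comparison reported in the abstract.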
Journal Introduction:
The mission of the Journal of Experimental Psychology: Applied® is to publish original empirical investigations in experimental psychology that bridge practically oriented problems and psychological theory. The journal also publishes research aimed at developing and testing models of cognitive processing or behavior in applied situations, including laboratory and field settings. Occasionally, review articles are considered for publication if they contribute significantly to important topics within applied experimental psychology. Areas of interest include applications of perception, attention, memory, decision making, reasoning, information processing, problem solving, learning, and skill acquisition.