Date of Award
Summer 2024
Document Type
Dissertation
Degree Name
Doctor of Philosophy (PhD)
Department
Psychology
Program/Concentration
Human Factors Psychology
Committee Director
Yusuke Yamani
Committee Member
Xiao Yang
Committee Member
Sampath Jayarathna
Committee Member
Eric Chancey
Abstract
Future Advanced Air Mobility (AAM) operations will likely involve autonomous systems that exceed the capabilities of typical automation. However, human operators could use such systems counterproductively, either by misusing unreliable systems or by disusing reliable systems. One determinant of inappropriate use of autonomous systems is trust. Human factors theorists have proposed numerous ways to characterize trust, such as the tripartite model of trust, which describes the bases of trust in automation (i.e., performance, process, and purpose; Lee & See, 2004). Previous work has indicated that participants rated lower performance- and process-based trust in the automation when the tracking task in the Multi-Attribute Task Battery (MATB-II) paradigm required more frequent input (i.e., a high task load condition). Yet, it is unclear how trust in automation and trust in autonomy develop over time in attention-demanding environments. To address this question, my dissertation employed a 4 (agent characteristics) × 2 (task load) × 3 (time epoch) split-plot design. Participants completed three experimental trials that required them to perform the tracking task and the system monitoring task concurrently. The system monitoring task was supported by a 70% reliable signaling system. Task load was manipulated between groups by altering the difficulty of the tracking task. Agent characteristics were manipulated by administering one of four vignettes describing the aid prior to the experimental session. Results demonstrated a temporal effect whereby participants allocated fewer attentional resources to the system monitoring task and reported high trust over time. Furthermore, the trajectory of trust development was inconsistent with previous findings, in which trust developed from dependability to faith. Finally, an exploratory analysis indicated that the progression of trust development varied across agent characteristics. These findings offer insights into the dynamic nature of trust and can inform interventions for mitigating counterproductive use of autonomous systems.
Rights
In Copyright. URI: http://rightsstatements.org/vocab/InC/1.0/ This Item is protected by copyright and/or related rights. You are free to use this Item in any way that is permitted by the copyright and related rights legislation that applies to your use. For other uses you need to obtain permission from the rights-holder(s).
DOI
10.25777/7eb4-ay45
ISBN
9798384444350
Recommended Citation
Sato, Tetsuya. "Tracing the Development of Trust in Automation/Autonomy in a Multitasking Environment" (2024). Doctor of Philosophy (PhD), Dissertation, Psychology, Old Dominion University, DOI: 10.25777/7eb4-ay45. https://digitalcommons.odu.edu/psychology_etds/440
ORCID
0009-0008-0571-8253