Trust in Online Technology: Towards Practical Guidelines Based on Experimentally Verified Theory
A large amount of research attempts to define trust, yet relatively little research attempts to experimentally verify what makes trust necessary in interactions with humans and technology. In this paper we identify the underlying elements of trust-requiring situations: (a) goals that involve dependence on another, (b) a perceived lack of control over the other, (c) uncertainty regarding the ability of the other, and (d) uncertainty regarding the benevolence of the other. We then propose a model of the interaction of these elements and argue that it can explain why certain situations require trust. To test the applicability of the proposed model to an instance of human-technology interaction, we constructed a website that required subjects to depend on an intelligent software agent to accomplish a task. A strong correlation was found between subjects' level of trust in the software and the ability they perceived the software as having. Strong negative correlations were found between perceived risk and perceived ability, and between perceived risk and trust.
Christian Detweiler and Joost Broekens, "Trust in Online Technology: Towards Practical Guidelines Based on Experimentally Verified Theory." In: Human-Computer Interaction, Part III, HCII 2009, LNCS 5612, 2009, pp. 605-614.