Conversational Assistants: Investigating Privacy Concerns, Trust, and Self-Disclosure

Kambiz Saffarizadeh, Maheshwar Boodraj, Tawfiq M. Alashoor

Research output: Chapter in Book/Report/Conference proceeding › Chapter


Abstract

By the end of 2017, more than 33 million voice-based devices will be in circulation, many of which will include conversational assistants such as Amazon’s Alexa and Apple’s Siri. These devices require a significant amount of personal information from users to learn their preferences and provide personalized responses. This creates an interesting and important tension: the more information users disclose, the greater the value they receive from these devices; yet concerns about the privacy of personal information lead users to disclose less. In this study, we examine the role of reciprocal self-disclosure and trust within the novel and emerging context of conversational assistants. Specifically, we investigate the effect of conversational assistants’ self-disclosure on the relationship between users’ privacy concerns and their own self-disclosure. Further, we explore the mechanism through which self-disclosure by conversational assistants influences this relationship, namely the roles of cognitive trust and emotional trust.

Original language: American English
Title of host publication: Conversational Assistants: Investigating Privacy Concerns, Trust, and Self-Disclosure
State: Published - 10 Dec 2017
Externally published: Yes
