Investigating Speaker Gaze and Pointing Behaviour in Human-Computer Interaction with the Mint.Tools Collection

Spyros Kousidis, Casey Kennington, David Schlangen

Research output: Chapter in Book/Report/Conference proceeding › Chapter

Abstract

Can speaker gaze and speaker arm movements be used as a practical information source for naturalistic conversational human–computer interfaces? To investigate this question, we recorded (with eye tracking and motion capture) a corpus of interactions with a (wizarded) system. In this paper, we describe the recording and analysis infrastructure that we built for such studies, and the analyses we performed on these data. We find that, with some initial calibration, a "minimally invasive", stationary camera-based setting provides data of sufficient quality to support interaction.
Original language: American English
Title of host publication: Proceedings of the SIGDIAL 2013 Conference
State: Published - 2013
Externally published: Yes

EGS Disciplines

  • Computer Sciences

