Investigating Speaker Gaze and Pointing Behaviour in Human-Computer Interaction with the Mint.Tools Collection

Spyros Kousidis, Casey Kennington, David Schlangen

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

9 Scopus citations

Abstract

Can speaker gaze and speaker arm movements serve as a practical information source for naturalistic conversational human-computer interfaces? To investigate this question, we recorded (with eye tracking and motion capture) a corpus of interactions with a (wizarded) system. In this paper, we describe the recording setup, the analysis infrastructure that we built for such studies, and the analyses we performed on these data. We find that, with some initial calibration, a "minimally invasive", stationary camera-based setting provides data of sufficient quality to support interaction.

Original language: American English
Title of host publication: SIGDIAL 2013 - 14th Annual Meeting of the Special Interest Group on Discourse and Dialogue, Proceedings of the Conference
Place of Publication: Metz, France
Pages: 319-323
Number of pages: 5
ISBN (Electronic): 9781937284954
State: Published - 2013
Event: 14th Annual Meeting of the Special Interest Group on Discourse and Dialogue, SIGDIAL 2013 - Metz, France
Duration: 22 Aug 2013 - 24 Aug 2013

Publication series

Name: SIGDIAL 2013 - 14th Annual Meeting of the Special Interest Group on Discourse and Dialogue, Proceedings of the Conference

Conference

Conference: 14th Annual Meeting of the Special Interest Group on Discourse and Dialogue, SIGDIAL 2013
Country/Territory: France
City: Metz
Period: 22/08/13 - 24/08/13

EGS Disciplines

  • Computer Sciences

