Synchronized scene views in mixed virtual reality for guided viewing

I. Vazquez, S. Cutchin

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

Virtual reality devices are available with different resolutions and fields of view. Users can simultaneously interact within environments on head-mounted displays, cell phones, tablets, and PowerWalls. Sharing scenes across devices requires solutions that smoothly synchronize shared navigation, minimize jitter, and avoid visual confusion. In this paper we present a system that allows a single user to guide many remote users within a virtual reality environment. A variety of mixed-device environments are supported so that different users can connect to the system. Techniques are implemented to minimize jitter, synchronize views, and handle differing fields of view.
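The abstract names its techniques only at a high level; the paper itself should be consulted for the actual algorithms. As a purely illustrative sketch, not taken from the paper, the Python snippet below shows one common way to damp jitter when a follower's view tracks a guide's broadcast camera pose: frame-rate-independent exponential smoothing toward the most recently received pose. All names here (Pose, follow_guide, the smoothing rate) are hypothetical.

import math
from dataclasses import dataclass

@dataclass
class Pose:
    # World-space position plus yaw/pitch orientation in radians.
    x: float
    y: float
    z: float
    yaw: float
    pitch: float

def lerp(a: float, b: float, t: float) -> float:
    return a + (b - a) * t

def lerp_angle(a: float, b: float, t: float) -> float:
    # Interpolate along the shortest arc so a view never spins the long way around.
    d = (b - a + math.pi) % (2.0 * math.pi) - math.pi
    return a + d * t

def follow_guide(local: Pose, guide: Pose, dt: float, rate: float = 8.0) -> Pose:
    # Frame-rate-independent exponential smoothing: each frame, the follower's
    # view moves a fraction of the remaining distance toward the guide's last
    # known pose, damping network jitter instead of snapping on every update.
    t = 1.0 - math.exp(-rate * dt)
    return Pose(
        lerp(local.x, guide.x, t),
        lerp(local.y, guide.y, t),
        lerp(local.z, guide.z, t),
        lerp_angle(local.yaw, guide.yaw, t),
        lerp_angle(local.pitch, guide.pitch, t),
    )

Called once per rendered frame with the follower's current pose and the guide's latest pose, this keeps views loosely synchronized while smoothing over irregular network updates; a larger rate tracks the guide more tightly at the cost of passing more jitter through.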

Original language: English
Title of host publication: International Conference on Artificial Reality and Telexistence and Eurographics Symposium on Virtual Environments, ICAT-EGVE 2016
Editors: Dirk Reiners, Daisuke Iwai, Frank Steinicke
Pages: 77-84
Number of pages: 8
ISBN (Electronic): 9783038680123
DOIs
State: Published - 2016
Event: 26th International Conference on Artificial Reality and Telexistence, ICAT 2016 and the 21st Eurographics Symposium on Virtual Environments, EGVE 2016 - Little Rock, United States
Duration: 7 Dec 2016 - 9 Dec 2016

Publication series

Name: International Conference on Artificial Reality and Telexistence and Eurographics Symposium on Virtual Environments, ICAT-EGVE 2016

Conference

Conference: 26th International Conference on Artificial Reality and Telexistence, ICAT 2016 and the 21st Eurographics Symposium on Virtual Environments, EGVE 2016
Country/Territory: United States
City: Little Rock
Period: 7/12/16 - 9/12/16
