Synchronized Scene Views in Mixed Virtual Reality for Guided Viewing

Iker Vazquez, Steve Cutchin

Research output: Contribution to journal › Article › peer-review

Abstract

Virtual Reality devices are available with differing resolutions and fields of view. Users can simultaneously interact within environments on head-mounted displays, cell phones, tablets, and PowerWalls. Sharing scenes across such devices requires solutions that smoothly synchronize shared navigation, minimize jitter, and avoid visual confusion. In this paper we present a system that allows a single guide to remotely lead many users through a virtual reality environment. A variety of mixed device environments are supported so that different classes of user device can connect to the system. Techniques are implemented to minimize jitter, synchronize views, and handle differing fields of view.
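The abstract mentions techniques to minimize jitter and synchronize views across devices with differing fields of view, but does not detail them. As a minimal sketch of one plausible approach, the follower below blends incoming guide poses with exponential smoothing and interpolates yaw along the shortest arc; all class and parameter names here are illustrative assumptions, not taken from the paper.

```python
class GuidedViewer:
    """Follower that receives the guide's camera pose and smooths it
    to reduce jitter. Hypothetical sketch: the paper's actual
    filtering method is not specified in the abstract."""

    def __init__(self, fov_deg, smoothing=0.5):
        self.fov_deg = fov_deg      # this device's horizontal field of view
        self.smoothing = smoothing  # blend factor: lower = smoother, laggier
        self.position = (0.0, 0.0, 0.0)
        self.yaw_deg = 0.0

    def on_guide_update(self, position, yaw_deg):
        """Blend the incoming guide pose into the local pose (EMA)."""
        a = self.smoothing
        self.position = tuple(
            (1 - a) * p + a * g for p, g in zip(self.position, position)
        )
        # Interpolate yaw along the shortest arc so a 359->0 degree
        # update does not cause a near-full spin.
        delta = (yaw_deg - self.yaw_deg + 180.0) % 360.0 - 180.0
        self.yaw_deg = (self.yaw_deg + a * delta) % 360.0

    def visible_span_deg(self, guide_fov_deg):
        """Portion of the guide's view this device can show at once;
        a narrower device must letterbox, zoom, or pan the remainder."""
        return min(self.fov_deg, guide_fov_deg)
```

A server broadcasting the guide's pose at a fixed rate to many such followers would let each device converge on the shared view at its own smoothing rate while clamping to its own field of view.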

Original language: American English
Journal: Computer Science Faculty Publications and Presentations
State: Published - 1 Jan 2016

Keywords

  • computer graphics
  • graphics systems—remote systems
  • vision and scene understanding—motion

EGS Disciplines

  • Computer Sciences
