Synchronized Shared Scene Viewing in Mixed VR Devices in Support of Group Collaboration

Steve Cutchin, Iker Vazquez

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

Virtual Reality devices are available with different resolutions and fields of view. Users can simultaneously interact within environments on head-mounted displays, cell phones, tablets, and PowerWalls. Sharing scenes across such devices requires solutions that smoothly synchronize shared navigation, minimize jitter, and avoid visual confusion. In this paper we present a system that allows a single guide to remotely lead many users through a virtual environment. A variety of mixed-device environments are supported, and techniques are implemented to minimize jitter and synchronize views.
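The abstract does not detail the paper's synchronization techniques. As an illustrative sketch only (all class and parameter names here are hypothetical, not the authors' implementation), a follower client might reduce jitter by exponentially smoothing the guide's broadcast camera positions instead of snapping to each network update:

```python
# Illustrative sketch (not the paper's actual method): a follower client
# blends each broadcast guide position into its local camera with an
# exponential moving average, so sparse or noisy updates do not cause
# visible jitter on the follower's display.

class FollowerCamera:
    def __init__(self, alpha=0.5):
        # alpha controls smoothing: 0 = camera never moves, 1 = snap to every update.
        self.alpha = alpha
        self.position = None  # (x, y, z) of the follower's rendered camera

    def on_guide_update(self, guide_position):
        """Blend a newly received guide position into the local camera pose."""
        if self.position is None:
            self.position = guide_position  # first update: snap directly
            return
        self.position = tuple(
            (1 - self.alpha) * local + self.alpha * remote
            for local, remote in zip(self.position, guide_position)
        )

cam = FollowerCamera(alpha=0.5)
cam.on_guide_update((0.0, 0.0, 0.0))
cam.on_guide_update((2.0, 0.0, 0.0))  # camera moves halfway toward the guide
print(cam.position)                   # (1.0, 0.0, 0.0)
```

A lower `alpha` gives smoother but laggier following; real systems would also smooth orientation (e.g., quaternion slerp) and compensate for network latency.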

Original language: American English
Title of host publication: Cooperative Design, Visualization, and Engineering
Subtitle of host publication: 13th International Conference, CDVE 2016, Proceedings
Editors: Yuhua Luo
Publisher: Springer Verlag
Pages: 348-352
Number of pages: 5
ISBN (Electronic): 9783319467719
ISBN (Print): 9783319467702
DOIs
State: Published - 2016
Event: 13th International Conference on Cooperative Design, Visualization, and Engineering, CDVE 2016 - Sydney, Australia
Duration: 24 Oct 2016 – 27 Oct 2016

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 9929 LNCS
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Conference

Conference: 13th International Conference on Cooperative Design, Visualization, and Engineering, CDVE 2016
Country/Territory: Australia
City: Sydney
Period: 24/10/16 – 27/10/16

Keywords

  • VR
  • remote VR
  • scene sharing

EGS Disciplines

  • Computer Sciences
