Extravehicular Intelligence Solution for Lunar Exploration and Research: ARSIS 5.0

Megan Laing, Caleb Cram, Marc Frances, Akiah Tullis, Digno Jr Teogalbo, Karen Doty

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

Augmented Reality Space Informatics System (ARSIS) 5.0 is a prototype system designed to assist astronauts on Extravehicular Activity (EVA), developed for the 2022 NASA (National Aeronautics and Space Administration) SUITS (Spacesuit User Interface Technologies for Students) Challenge. ARSIS comprises a Ground Station and a HoloLens 2 application that work together to improve the autonomy, efficiency, and efficacy of communication between Mission Control and an astronaut on the Moon. The core of the ARSIS system is its interactive menu navigation. These menus are built on the Mixed Reality Toolkit and offer redundant interaction methods employing hand tracking, eye tracking, and voice commands. Examples of the interactive menus within ARSIS include procedures, biometrics, a geology sampling tool, and field notes. The Mini Map is a persistent, dismissible panel that displays real-time environmental information as well as waypoints and beacons set by Mission Control. This overlay can be expanded into an interactive panel (the Mega Map) that allows the user to resize, zoom, set waypoints, and ultimately guide an RC assistant, all via hand tracking. Using the Mega Map, the HoloLens 2 user can select destinations in the environment for an RC car, which drives itself to the selected destination. The user receives visual feedback in the HMD via static images and a video feed. Information overload is avoided by using Arm-Retained Menus (ARMs), virtual informational overlays rendered over the user's forearms for easy visibility and access. ARMs provide additional access to commonly used features and functions such as Navigation, Emergency, Tools, and System Access. The Navigation ARM is located on the back of the left hand and provides quick access to the Mega Map as well as functionality related to the Map Navigation Beacons.
The Emergency ARM is located on the back of the user's right hand and provides quick access to Biometrics and LunaSAR system functionality. The Tools ARM is located on the left palm and provides access to various tools, including the Record Path tool, which lets the user record their physical path of movement through an environment, visualized as an AR annotation, and the measurement tool, which lets the user place points to measure distance. Mission Control is equipped with the Ground Station, which includes virtual reality (VR) and desktop software portals affording Mission Control three major functionalities. The HoloLens 2 HMD records topological data about the user's surroundings and transmits it to the Ground Station, where point cloud matching is used to render a low-resolution version of the environment. This gives Ground Station users a better understanding of the astronaut's immediate environment. Future plans include implementing cloud anchors to increase the accuracy of this simulated environment and combat drift over time. This function supports the primary purpose of the Ground Station: providing Telestration to the HoloLens 2 user. The Ground Station user can create icons or paths, which are placed in the HoloLens 2 field of view as augmented reality (AR) annotations. Annotations are placed in accordance with the topology received by the Ground Station, ensuring they are properly rendered in the HoloLens 2 user's real-world environment. The Ground Station can also add or remove Navigational Beacons as well as procedures at run time, allowing for flexible problem solving and communication. Topology reconstruction, Telestration, and Mission Updates can all be performed in real time with minimal latency, improving effectiveness and efficiency between Mission Control and an astronaut on EVA.
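The abstract describes placing Ground Station annotations so they render correctly in the astronaut's real-world surroundings. As a minimal illustrative sketch (not the ARSIS implementation), the core of such placement is a coordinate transform from the shared map frame into the user's local frame; the function name, the 2D simplification, and the heading convention below are assumptions for illustration only.

```python
import math

def world_to_local(waypoint, user_pos, user_heading_rad):
    """Transform a 2D waypoint from a shared map frame into the
    user's local frame (x = right, y = forward), given the user's
    position and heading (radians, counterclockwise) in the map frame.
    Illustrative sketch only; real HMD anchoring works in 3D with
    full poses and drift correction (e.g. cloud anchors)."""
    # Translate so the user is at the origin of the map frame.
    dx = waypoint[0] - user_pos[0]
    dy = waypoint[1] - user_pos[1]
    # Rotate by the negative heading to express the offset
    # in the user's local axes.
    cos_h = math.cos(-user_heading_rad)
    sin_h = math.sin(-user_heading_rad)
    return (dx * cos_h - dy * sin_h, dx * sin_h + dy * cos_h)
```

For example, a waypoint one meter "north" in the map frame appears directly to the user's right once the user has turned 90 degrees counterclockwise, which is what keeps an AR annotation pinned to its real-world location as the wearer moves.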

Original language: English
Title of host publication: HCI International 2022 - Late Breaking Papers. Design, User Experience and Interaction - 24th International Conference on Human-Computer Interaction, HCII 2022, Proceedings
Editors: Masaaki Kurosu, Sakae Yamamoto, Hirohiko Mori, Marcelo M. Soares, Elizabeth Rosenzweig, Aaron Marcus, Pei-Luen Patrick Rau, Don Harris, Wen-Chin Li
Publisher: Springer Science and Business Media Deutschland GmbH
Pages: 563-580
Number of pages: 18
ISBN (Print): 9783031176142
DOIs
State: Published - 2022
Event: 24th International Conference on Human-Computer Interaction, HCII 2022 - Virtual, Online
Duration: 26 Jun 2022 – 1 Jul 2022

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 13516 LNCS
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Conference

Conference: 24th International Conference on Human-Computer Interaction, HCII 2022
City: Virtual, Online
Period: 26/06/22 – 1/07/22

Keywords

  • Augmented reality
  • Biometrics
  • Extended reality
  • Telestration
  • Virtual reality
