TY - GEN
T1 - Extravehicular Intelligence Solution for Lunar Exploration and Research
T2 - 24th International Conference on Human-Computer Interaction, HCII 2022
AU - Laing, Megan
AU - Cram, Caleb
AU - Frances, Marc
AU - Tullis, Akiah
AU - Teogalbo, Digno Jr
AU - Doty, Karen
N1 - Publisher Copyright:
© 2022, The Author(s), under exclusive license to Springer Nature Switzerland AG.
PY - 2022
Y1 - 2022
N2 - Augmented Reality Space Informatics System (ARSIS) 5.0 is a prototype system designed to assist astronauts on Extravehicular Activity (EVA), developed for the 2022 NASA (National Aeronautics and Space Administration) SUITS (Spacesuit User Interface Technologies for Students) Challenge. ARSIS comprises a Ground Station and a HoloLens 2 application that work in concert to improve the autonomy, efficiency, and efficacy of communication between Mission Control and an astronaut on the Moon. The core of the ARSIS system is its interactive menu navigation. These menus are implemented with the Mixed Reality Toolkit and offer redundant interaction control methods employing hand tracking, eye tracking, and voice commands. Examples of the interactive menus within the ARSIS system include procedures, biometrics, a geology sampling tool, and field notes. The Mini Map is a persistent, dismissible panel that displays real-time environmental information as well as waypoints and beacons set by Mission Control. This overlay can be expanded into an interactive panel (the Mega Map) that allows the user to resize, zoom, set waypoints, and ultimately guide an RC assistant, all via hand tracking. Using the Mega Map, the HoloLens 2 user can select destinations in the environment for an RC car, which drives itself to the selected destination. The user receives visual feedback in the HMD via static images and a video feed. Information overload is avoided through Arm-Retained Menus (ARMs), virtual informational overlays rendered over the user’s forearms for easy visibility and access. ARMs provide additional access to commonly used features and functions such as Navigation, Emergency, Tools, and System Access. The Navigation ARM is located on the back of the left hand and provides quick access to the Mega Map as well as functionality related to the Map Navigation Beacons.
The Emergency ARM is located on the back of the user’s right hand and provides quick access to Biometrics and LunaSAR system functionality. The Tools ARM is located on the left palm and provides access to various tools, including the Record Path tool, which allows the user to record their physical path of movement through an environment, visualized as an AR annotation, and the measurement tool, which allows the user to place points to measure distance. Mission Control is equipped with the Ground Station, which includes virtual reality (VR) and desktop software portals affording Mission Control three major functionalities. The HoloLens 2 HMD records topological data about the user’s surroundings and transmits that data to the Ground Station; using point cloud matching, a low-resolution version of the environment is rendered, giving Ground Station users a better understanding of the astronaut’s immediate environment. Future plans include implementing cloud anchors to increase the accuracy of this simulated environment and combat drift over time. This function supports the primary purpose of the Ground Station: providing Telestration to the HoloLens 2 user. The Ground Station user can create icons or paths, which are placed in the HoloLens 2 field of view as augmented reality (AR) annotations. Annotations are placed in accordance with the topology received by the Ground Station, ensuring they are properly rendered in the HoloLens 2 user’s real-world environment. The Ground Station can also add or remove Navigational Beacons, as well as procedures, at run time to allow for flexible problem solving and communication. Topology reconstruction, Telestration, and Mission Updates can all be performed in real time with minimal latency, improving effectiveness and efficiency between Mission Control and an astronaut on EVA.
AB - Augmented Reality Space Informatics System (ARSIS) 5.0 is a prototype system designed to assist astronauts on Extravehicular Activity (EVA), developed for the 2022 NASA (National Aeronautics and Space Administration) SUITS (Spacesuit User Interface Technologies for Students) Challenge. ARSIS comprises a Ground Station and a HoloLens 2 application that work in concert to improve the autonomy, efficiency, and efficacy of communication between Mission Control and an astronaut on the Moon. The core of the ARSIS system is its interactive menu navigation. These menus are implemented with the Mixed Reality Toolkit and offer redundant interaction control methods employing hand tracking, eye tracking, and voice commands. Examples of the interactive menus within the ARSIS system include procedures, biometrics, a geology sampling tool, and field notes. The Mini Map is a persistent, dismissible panel that displays real-time environmental information as well as waypoints and beacons set by Mission Control. This overlay can be expanded into an interactive panel (the Mega Map) that allows the user to resize, zoom, set waypoints, and ultimately guide an RC assistant, all via hand tracking. Using the Mega Map, the HoloLens 2 user can select destinations in the environment for an RC car, which drives itself to the selected destination. The user receives visual feedback in the HMD via static images and a video feed. Information overload is avoided through Arm-Retained Menus (ARMs), virtual informational overlays rendered over the user’s forearms for easy visibility and access. ARMs provide additional access to commonly used features and functions such as Navigation, Emergency, Tools, and System Access. The Navigation ARM is located on the back of the left hand and provides quick access to the Mega Map as well as functionality related to the Map Navigation Beacons.
The Emergency ARM is located on the back of the user’s right hand and provides quick access to Biometrics and LunaSAR system functionality. The Tools ARM is located on the left palm and provides access to various tools, including the Record Path tool, which allows the user to record their physical path of movement through an environment, visualized as an AR annotation, and the measurement tool, which allows the user to place points to measure distance. Mission Control is equipped with the Ground Station, which includes virtual reality (VR) and desktop software portals affording Mission Control three major functionalities. The HoloLens 2 HMD records topological data about the user’s surroundings and transmits that data to the Ground Station; using point cloud matching, a low-resolution version of the environment is rendered, giving Ground Station users a better understanding of the astronaut’s immediate environment. Future plans include implementing cloud anchors to increase the accuracy of this simulated environment and combat drift over time. This function supports the primary purpose of the Ground Station: providing Telestration to the HoloLens 2 user. The Ground Station user can create icons or paths, which are placed in the HoloLens 2 field of view as augmented reality (AR) annotations. Annotations are placed in accordance with the topology received by the Ground Station, ensuring they are properly rendered in the HoloLens 2 user’s real-world environment. The Ground Station can also add or remove Navigational Beacons, as well as procedures, at run time to allow for flexible problem solving and communication. Topology reconstruction, Telestration, and Mission Updates can all be performed in real time with minimal latency, improving effectiveness and efficiency between Mission Control and an astronaut on EVA.
KW - Augmented reality
KW - Biometrics
KW - Extended reality
KW - Telestration
KW - Virtual reality
UR - https://www.scopus.com/pages/publications/85140759443
U2 - 10.1007/978-3-031-17615-9_40
DO - 10.1007/978-3-031-17615-9_40
M3 - Conference contribution
AN - SCOPUS:85140759443
SN - 9783031176142
T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
SP - 563
EP - 580
BT - HCI International 2022 - Late Breaking Papers. Design, User Experience and Interaction - 24th International Conference on Human-Computer Interaction, HCII 2022, Proceedings
A2 - Kurosu, Masaaki
A2 - Yamamoto, Sakae
A2 - Mori, Hirohiko
A2 - Soares, Marcelo M.
A2 - Rosenzweig, Elizabeth
A2 - Marcus, Aaron
A2 - Rau, Pei-Luen Patrick
A2 - Harris, Don
A2 - Li, Wen-Chin
PB - Springer Science and Business Media Deutschland GmbH
Y2 - 26 June 2022 through 1 July 2022
ER -