NASA SUITS (Spacesuit User Interface Technologies for Students)↗️ gives students an authentic engineering design experience supporting NASA’s Artemis mission—landing American astronauts safely on the Moon! This activity challenges students to design and create spacesuit information displays within augmented reality environments.
CLAWS (Collaborative Lab for Advanced Working in Space)↗️ is a multi-disciplinary group of student designers, researchers, and engineers at the University of Michigan. The purpose of the project is to design a spacesuit information display that enables astronauts to conduct various tasks. We will present and test the project at Johnson Space Center (JSC) with astronauts and EVA specialists in May 2023.
NASA SUITS (Spacesuit User Interface Technologies for Students)
Sep 2022 ~ May 2023
UX Designer
9 UX/UI Designers,
13 AR Developers,
11 Web Developers,
9 Hardware Engineers
We are creating an AR assistant system for astronauts (yes, real astronauts!) to conduct various tasks in space. As NASA launches the Artemis program for a sustained human presence on the lunar surface and, ultimately, Mars, engineers are considering what technology will best help astronauts safely and successfully complete science and exploration missions.
→ Our AR assistant system, N.O.V.A, is designed to support astronauts on NASA's future Artemis missions.
This is the final pitch video for last year's product (HOSHI). Please watch it as a reference only.
We are creating a new product this year (N.O.V.A), but the video is a good reference for understanding our product as a whole.
We conducted interviews with 3 former astronauts, 1 researcher, and 1 engineer at NASA to better understand the external factors of the space environment and learn about unfamiliar procedures such as UIA egress, EVA, geological sampling, etc.
We created 5 different user interview protocol scripts based on each interviewee's role and experience. After we obtained the contact information from NASA, we researched each participant's background and knowledge, since each participant has had different responsibilities and experiences at NASA.
We identified 4 different user scenarios based on our key findings and the NASA SUITS Challenge full scenario scripts.
Within a lunar EVA (Extravehicular Activity), there are 4 scenarios, as you can see below. We wrote a user scenario script for each scenario.
We understand that designing for AR is different from designing for 2D mobile or web, so the design principles should be different as well. We set the following design tenets for the AR environment:
AR design should be an immersive experience, but the safety of that experience comes first. Blocking the user's sight with augmented UIs or distracting their attention with AR pop-ups could be dangerous.
We set the field of view before diving into low-fi prototypes. When wearing the HoloLens, the user sees the middle view by default. Once eye-gaze tracking is set up, the user can look at the left or right view and the system follows their gaze.
Voice & Eye-Gazing
We implemented the voice assistant "VEGA," a personal voice command system that provides the information astronauts need during EVA. Eye-gazing acts as a "cursor" to navigate the interface.
All actions that can be done by voice can also be performed with hand gestures on the interface.
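As a rough illustration of how such a voice layer can be wired up in Unity for HoloLens, the sketch below maps spoken phrases to the same handlers a gesture would call. It uses Unity's built-in KeywordRecognizer; the phrases and handlers are placeholders, not VEGA's actual command set.

```csharp
using System;
using System.Collections.Generic;
using System.Linq;
using UnityEngine;
using UnityEngine.Windows.Speech;

// Hypothetical sketch of a keyword-based voice command layer for HoloLens.
public class VoiceCommandRouter : MonoBehaviour
{
    private KeywordRecognizer recognizer;
    private readonly Dictionary<string, Action> commands = new Dictionary<string, Action>();

    void Start()
    {
        // Each voice command maps to the same handler a hand gesture would trigger.
        commands["show vitals"] = () => Debug.Log("Expand the vitals panel");
        commands["hide vitals"] = () => Debug.Log("Collapse the vitals panel");
        commands["open map"]    = () => Debug.Log("Open the mini-map");

        recognizer = new KeywordRecognizer(commands.Keys.ToArray());
        recognizer.OnPhraseRecognized += OnPhraseRecognized;
        recognizer.Start();
    }

    private void OnPhraseRecognized(PhraseRecognizedEventArgs args)
    {
        if (commands.TryGetValue(args.text, out var action))
        {
            action();
        }
    }

    void OnDestroy()
    {
        if (recognizer != null)
        {
            recognizer.Dispose();
        }
    }
}
```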
It is important to give users feedback while they use the system, so we provide astronauts with audio and haptic feedback as well as visual feedback.
Microsoft's Mixed Reality Toolkit (MRTK) is a cross-platform toolkit that accelerates mixed reality app development for Virtual Reality (VR) and Augmented Reality (AR). We used MRTK3 as the design guide and UI component library for our product; it works as an API on top of Unity.
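To show the "gaze as a cursor" idea in isolation, here is a minimal stand-in that uses a plain Unity raycast from the camera as a head-gaze approximation. MRTK3 provides its own gaze interactors, so this is only an illustration of the concept, not how MRTK implements it; the reticle object and distance are assumed.

```csharp
using UnityEngine;

// Illustrative sketch: treat the user's gaze direction as a cursor by raycasting
// from the camera and snapping a reticle onto whatever the ray hits.
public class GazeCursor : MonoBehaviour
{
    [SerializeField] private Transform reticle;        // small cursor object placed on the gazed surface
    [SerializeField] private float maxGazeDistance = 10f;

    void Update()
    {
        var cam = Camera.main.transform;
        var ray = new Ray(cam.position, cam.forward);

        if (Physics.Raycast(ray, out RaycastHit hit, maxGazeDistance))
        {
            // Snap the reticle to the panel or object the user is looking at.
            reticle.position = hit.point;
            reticle.rotation = Quaternion.LookRotation(hit.normal);
        }
        else
        {
            // No target: float the reticle at a fixed distance along the gaze ray.
            reticle.position = cam.position + cam.forward * maxGazeDistance;
            reticle.rotation = Quaternion.LookRotation(cam.forward);
        }
    }
}
```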
When the user issues a voice command or gazes at the vitals panel, the expanded vitals panel opens, so they can check each vital at a glance and in detail. If a vital enters a danger state, the affected panel turns red.
← If a vital is in danger, the user receives an alert no matter which state they are in: a warning message pops up just below the menu at the top.
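A minimal sketch of this "panel turns red and raises a warning" behavior is below. The field names, thresholds, and UI references are placeholders for illustration, not the actual N.O.V.A implementation.

```csharp
using UnityEngine;
using UnityEngine.UI;

// Hypothetical sketch: color a vital's panel by its safety range and toggle a warning banner.
public class VitalPanel : MonoBehaviour
{
    [SerializeField] private Image panelBackground;    // background of this vital's panel
    [SerializeField] private GameObject warningBanner; // warning message shown below the top menu
    [SerializeField] private float safeMin = 0f;
    [SerializeField] private float safeMax = 100f;

    private static readonly Color SafeColor   = Color.white;
    private static readonly Color DangerColor = Color.red;

    // Called whenever new telemetry arrives for this vital.
    public void UpdateVital(float value)
    {
        bool inDanger = value < safeMin || value > safeMax;

        // Panels with problems turn red; the warning appears regardless of the current state.
        panelBackground.color = inDanger ? DangerColor : SafeColor;
        warningBanner.SetActive(inDanger);
    }
}
```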
Eye-gazing
Main State
Main State - Menu Expanded
Mini-map direction tracking
N.O.V.A is still in progress at the moment. I'll keep posting updates here as we move forward. Please contact me if you have any further questions!