Project Overview

NASA SUITS (Spacesuit User Interface Technologies for Students) gives students an authentic engineering design experience supporting NASA's Artemis mission: landing American astronauts safely on the Moon! This activity challenges students to design and create spacesuit information displays within augmented reality environments.

CLAWS (Collaborative Lab for Advanced Working in Space) is a multi-disciplinary group of student designers, researchers, and engineers at the University of Michigan. The purpose of our project is to design a spacesuit information display that enables astronauts to conduct various tasks; we will present and test the project at Johnson Space Center (JSC) with astronauts and EVA specialists in May 2023.

Project Info: NASA SUITS (Spacesuit User Interface Technologies for Students)
Duration: Sep 2022 ~ May 2023
My Role: UX Designer
Team: 9 UX/UI Designers, 13 AR Developers, 11 Web Developers, 9 Hardware Engineers

CONTEXT

Before diving into the design process, let me clarify what we are creating!

We are creating an AR assistant system for astronauts (yes, for real astronauts!) to conduct various tasks in space. As NASA launches the Artemis program for a sustained human presence on the lunar surface and, ultimately, Mars, engineers are considering what technology will best help astronauts safely and successfully complete science and exploration missions.

→ Our AR assistant system, N.O.V.A., is designed to support astronauts on NASA's future Artemis missions in space.

Take a look at last year's product ↓

This is the final pitch video for last year's product (HOSHI); please watch it as a reference only.
We are creating a new product this year (N.O.V.A.), but the video is a good reference for understanding our product as a whole.

RESEARCH

User Interviews with Space Experts

We conducted interviews with 3 former astronauts, 1 researcher, and 1 engineer at NASA to better understand the external factors of the space environment and to learn about unfamiliar procedures such as UIA egress, EVA, and geological sampling.

We created 5 different user interview protocol scripts based on each interviewee's role and experience. After obtaining contact information from NASA, we researched each participant's background, since each had held different responsibilities and gained different experience at NASA.

Research Goals
View Full Interview Protocol →

Key Findings & Insights

01  Astronauts' safety comes first
  • Consumables (batteries, water, oxygen, etc.) are limited, so vital information must reach astronauts accurately and quickly at all times.
  • The system should ensure astronauts' safety as well as increase their working efficiency on missions.
02  Minimizing cognitive load is important
  • While on a mission in space, a lot is going on around the astronaut.
  • A cluttered screen view distracts the astronaut's attention.
  • Essential data should be conveyed while keeping distraction to a minimum.
03  Reduced mobility due to the spacesuit, especially for hand gestures
  • Mobility and speed are greatly reduced in the spacesuit.
  • Hand gestures are imprecise in the heavy, bulky spacesuit.
  • Voice commands are needed to enable the astronaut to carry out tasks.

Design Objectives

Wait, But Why AR?

ANALYSIS

Four Main User Scenarios in Space

We identified 4 user scenarios based on our key findings as well as the NASA SUITS Challenge full scenario scripts.
Within EVA (Lunar Extravehicular Activity), there are 4 scenarios, as you can see below. We wrote a user scenario script for each.

View Full User Scenario Scripts →

DESIGN

Designing for AR? How Is It Different From Designing for a 2D Environment?

We understand that designing for AR is different from designing for 2D mobile or web, and that it calls for different design principles. We set our design tenets for the AR environment as follows:

01 | Safety First
02 | Ease of Onboarding
03 | AR as a Tool

AR design should be an immersive experience, but the safety of that experience comes first. Blocking the user's sight with augmented UIs or distracting their attention with AR popups could be dangerous.

Although astronauts practice in the AR environment before conducting tasks in space, the product should have a gentle learning curve. Designing familiar gestures and interactions is important.
AR is not just a fascinating, fun technology. It should be a tool that adds value and solves a real human problem; here, it is a tool that helps astronauts conduct various tasks.

Before Designing the Low-fi Prototype - Setting the Field of View

We set the field of view before diving into the low-fi prototype. When wearing the HoloLens, the user sees the middle screen/view. Once the eye-gazing setup is done, the user can look toward the left or right view and the system tracks their gaze, as sketched below.

Forward View
Eye Gazing - Left View
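
As a rough sketch of how this gaze tracking can drive the interface on a HoloLens, the Unity script below casts a ray from the user's viewpoint and notifies whichever panel it hits. The class name, the OnGazeFocus message, and the head-forward ray are illustrative assumptions; a real eye-tracking provider (such as MRTK's gaze provider) would supply a finer eye-gaze ray.

```csharp
// Illustrative gaze "cursor": raycast from the user's viewpoint and give
// focus to whatever UI panel the ray hits. Names are hypothetical.
using UnityEngine;

public class GazeCursor : MonoBehaviour
{
    [SerializeField] private float maxDistance = 5f; // meters

    private void Update()
    {
        // On HoloLens, Camera.main follows the user's head; an eye-tracking
        // provider would replace this with a true eye-gaze ray.
        var cam = Camera.main.transform;
        var ray = new Ray(cam.position, cam.forward);

        if (Physics.Raycast(ray, out RaycastHit hit, maxDistance))
        {
            // Let the gazed object highlight itself; dwell time or a voice
            // command would then confirm the selection.
            hit.collider.SendMessage("OnGazeFocus",
                                     SendMessageOptions.DontRequireReceiver);
        }
    }
}
```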

How Can Users Interact With the System?

Primary: Voice & Eye-Gazing
Secondary: Hand Gestures
Feedback: Audio & Haptics

We implemented the voice assistant "VEGA," a personal voice command system that provides the information astronauts need during EVA. Eye-gazing acts as a "cursor" to navigate the interface.

Every action that can be performed by voice can also be performed with hand gestures on the interface.

It is important to give the user feedback while they use the system. We give astronauts audio and haptic feedback as well as visual feedback.
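
To make this concrete, here is a minimal Unity sketch of how such multimodal commands can be routed: voice keywords map to the same actions that a gaze-dwell or hand-gesture event would invoke, using Unity's built-in KeywordRecognizer. The class name, keyword list, and actions are hypothetical, not our actual VEGA command set.

```csharp
// Hypothetical command router: voice keywords trigger the same actions
// that gaze-dwell or hand-gesture events call, so every action has both
// a primary (voice/eye-gaze) and secondary (hand gesture) path.
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.Windows.Speech; // Unity's built-in keyword recognition (Windows/UWP)

public class VegaCommandRouter : MonoBehaviour
{
    private KeywordRecognizer recognizer;
    private readonly Dictionary<string, System.Action> actions =
        new Dictionary<string, System.Action>();

    private void Start()
    {
        // Illustrative commands only, not the real VEGA vocabulary.
        actions["open vitals"]  = () => Debug.Log("Expand vitals panel");
        actions["close vitals"] = () => Debug.Log("Collapse vitals panel");
        actions["navigation"]   = () => Debug.Log("Open navigation");

        recognizer = new KeywordRecognizer(new List<string>(actions.Keys).ToArray());
        recognizer.OnPhraseRecognized += OnPhraseRecognized;
        recognizer.Start();
    }

    private void OnPhraseRecognized(PhraseRecognizedEventArgs args)
    {
        // Gaze-dwell and hand-gesture handlers would call the same actions.
        if (actions.TryGetValue(args.text, out var action))
            action();
    }

    private void OnDestroy()
    {
        if (recognizer != null) recognizer.Dispose();
    }
}
```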

Design Guide - Microsoft’s Mixed Reality Toolkit (MRTK)

Microsoft’s Mixed Reality Toolkit (MRTK) is a cross-platform toolkit that accelerates mixed reality (MR) app development for Virtual Reality (VR) and Augmented Reality (AR). We used MRTK3 as the design guide and UI library for our product; it provides the building blocks and APIs we interface with in Unity.

Low-fidelity Prototype - Overview

Highlight key feature - 01 Main state

Why Expandable Menu?

Highlight key feature - 02 Vital check

When the user selects the vitals panel by voice command or eye-gazing, it opens into the expanded vitals panel. There they can check each vital both at a glance and in detail. If a vital is in danger, the affected panels turn red.

← If a vital is in danger, the user receives an alert in any state: a warning message pops up just below the menu at the top.
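
As a minimal sketch of this danger-state logic, assuming each vital arrives as a normalized 0-1 "fraction remaining" and a hypothetical 20% threshold (both assumptions, not our real telemetry format):

```csharp
// Hypothetical vital panel: when a consumable (oxygen, battery, water)
// falls below its safe threshold, tint the panel red and show the
// warning banner below the top menu.
using UnityEngine;
using UnityEngine.UI;

public class VitalPanel : MonoBehaviour
{
    [SerializeField] private Image background;         // panel background image
    [SerializeField] private GameObject warningBanner; // banner under the top menu
    [SerializeField] private float dangerThreshold = 0.2f; // illustrative 20% limit

    private static readonly Color SafeColor = Color.white;
    private static readonly Color DangerColor = new Color(0.9f, 0.2f, 0.2f);

    // Called whenever telemetry updates this vital (0..1 = fraction remaining).
    public void OnVitalUpdated(float fractionRemaining)
    {
        bool inDanger = fractionRemaining <= dangerThreshold;
        background.color = inDanger ? DangerColor : SafeColor;
        warningBanner.SetActive(inDanger); // alert shows in any state
    }
}
```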

Highlight key feature - 03 Navigation

Highlight key feature - 04 Messaging

AR Development Progress: Menu, Vitals, Mini-map


Eye-gazing

Main State

Main State - Menu Expanded

Mini-map direction tracking

What's Next?

N.O.V.A. is still in progress at the moment. I'll keep posting updates here as we move forward. Please contact me if you have any further questions!