
Head-mounted procedures (in space!)

Team: Stephen Trahan, Jenn Tran, Kristina Lustig, Gordon Liu

Role: Technical lead

Completed: Aug 2013

Summary

Astronauts on the International Space Station (ISS) follow procedures for everything. Those procedures, and the mechanism for viewing them, aren't great. As a graduate project in partnership with the NASA Ames Research Center, we built a head-mounted procedure viewer prototype to address some of the issues holding back crew member efficiency.

Keep reading for more details on our process.

The challenge

Crew members on the ISS spend most of their time carrying out scientific research. For each research task, they must follow a rigid, complex procedure. The procedures, and the way they are used, considerably reduce efficiency: they are static text documents, viewable only on wall-mounted laptops scattered throughout the station.

Notice the distance between the procedure (laptop on the left) and where he's working. Image courtesy of NASA

Every project needs a mission statement, right? Right.

We set out to explore operator workflow in isolated environments in order to optimize procedure execution.

Building domain knowledge

A lot of the time, designers have some background knowledge of the domain they are designing for. That's not really the case when it comes to working on a space station. So to better inform our approach, we started with a literature review on topics like the psychology of spaceflight, the structure of the ISS, and the cognitive processing of tasks.

We paired the literature review with a competitive analysis of other procedure-like tools, such as medical procedure viewers and websites like Instructables.com, as well as the viewer currently in use on the ISS. Here, we focused on portability, rich media, customization, contextual awareness, and tool retrieval. Our analysis led us to a few recommendations.

Unreachable users

Designing tools for astronauts comes with a unique constraint: you will never be able to do contextual research with your end users, because sending designers to space is prohibitively expensive. So we had to make do by talking to anyone we could get our hands on.

We interviewed NASA experts within organizations like Operations Planning, Payload Science, and Wearable Computing to build up even more domain knowledge. We were also extremely fortunate to interview two astronauts with mission experience: one retired and one less than six months removed from commanding the ISS1.

If they looked at the procedure and saw that it was 20 pages long, literally, they'd be like, 'It's going to take me longer to read the procedure than to just do what I need to do.'

- Procedure writer

Since we couldn't research our end users in context, we went on a nationwide search to perform contextual research with people whose work resembled the ISS crew's in some way. We met with commercial pilots, deep sea divers, lab technicians, paramedics, construction managers, and pit crews, chosen for the similarities their tasks shared with payload science and maintenance work.

We asked commercial pilots to visualize their workflow through a card mapping exercise.

Extracting insights and visioning

Naturally, we synthesized our research through affinity diagrams (one domain-specific and one for analogous domains) and various other models (cultural, sequence, and artifact). From this, we extracted eight key issues to address.

So many notes. Our visioning sessions resulted in over 400 individual ideas.

After several visioning sessions and activities, we narrowed our vision down to one built around a head-mounted display2. The HMD would allow for hands-free operation through voice commands and leverage augmented reality (AR) to present information in context within a crew member's work environment.

One of our rejected visions had the crew using a wrist-worn device along with a tablet.

Prototyping an HMD

Low-fidelity testing, rapid iteration, readily available hardware: these are all imperative to prototyping. They are also all things that HMDs utterly fail at. Basically, if you want to test an application on an HMD, you have to build it. That is dumb. So we had to find a way to iterate quickly and get feedback without building out functional software each time.

We tested by having users run through a pretend procedure in our mockup space station.

Gordon and I ended up building a prototyping tool that would let us use the actual HMD in tests but would display our wireframes instead of working software3. This allowed us to test six iterations within a few weeks.
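To give a sense of how the tool worked, here is a minimal sketch of that Wizard-of-Oz setup: the HMD's browser repeatedly fetches a single URL, and a moderator listens to the tester's spoken commands and queues up the matching wireframe image. This is an illustration only; the endpoint, filenames, and use of Flask are assumptions, not the actual implementation (which is linked at the bottom of this page).

```python
# Illustrative Wizard-of-Oz wireframe server, not the project's actual code.
# The HMD's browser keeps reloading /screen; the moderator hears the tester's
# spoken command and types the name of the wireframe to show next.
import threading
from flask import Flask, send_from_directory

app = Flask(__name__)
current_screen = "step_01.png"   # wireframe currently shown on the HMD
lock = threading.Lock()

@app.route("/screen")
def screen():
    # Serve whatever wireframe the moderator has queued up.
    with lock:
        return send_from_directory("wireframes", current_screen)

def moderator_loop():
    # Moderator console: type a wireframe filename after hearing a command.
    global current_screen
    while True:
        name = input("wireframe> ").strip()
        if name:
            with lock:
                current_screen = name

if __name__ == "__main__":
    threading.Thread(target=moderator_loop, daemon=True).start()
    app.run(host="0.0.0.0", port=8000)
```

Because every "interaction" is just the moderator swapping a static image, updating a test iteration meant exporting new wireframes rather than rewriting software.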

The final version was designed around a simple, linear structure (while letting users jump to any point if desired). It splits steps into individual screens and provides in-context information and AR overlays to address the verbosity of current procedures.

A map of the prototype UI. Procedures are viewed linearly, but users can access other features through a menu.

A view of the prototype's interface from a user's perspective.
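As a rough illustration of that structure, a procedure can be thought of as an ordered list of steps, each carrying its own instruction, in-context notes, and optional AR overlay; linear viewing and jump-to-anywhere then become the same lookup with different indices. The field names below are illustrative assumptions, not the project's actual schema.

```python
# A minimal sketch of the one-step-per-screen structure described above.
# Field names are illustrative, not the project's actual data model.
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Step:
    number: str                          # e.g. "2.1"
    instruction: str                     # the single action shown on this screen
    context_notes: List[str] = field(default_factory=list)  # in-context details
    ar_overlay: Optional[str] = None     # id of an AR overlay to render, if any

@dataclass
class Procedure:
    title: str
    steps: List[Step]

    def step_at(self, index: int) -> Step:
        # Linear viewing just moves the index forward or back; "jump to
        # anywhere" is the same lookup with an index chosen from the menu.
        return self.steps[index]
```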

Hardware limitations

For hardware, we chose the Epson Moverio because it was one of the few HMDs actually available and was affordable compared to the rest of the market. The downside is that the Moverio doesn't have any external sensors, so once again we had to find a way to get what we needed. We ended up pairing the Moverio with a tablet that processed all voice commands and handled the AR, then relayed what to display to the HMD. While neither pretty nor ideal, it got the job done.

This is what happens when every tech demo is either vaporware or still in production.
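To make that division of labor concrete, here is a sketch of the kind of message the tablet might relay once it has processed a voice command and updated the AR tracking. The address, port, and message fields are placeholders; the real implementation is in the GitHub repository linked below.

```python
# Illustrative tablet-to-HMD relay; addresses and message format are assumptions.
import json
import socket

HMD_ADDR = ("192.168.0.42", 9000)   # placeholder address for the Moverio

def relay_display_command(screen_id, overlay=None):
    """Tell the HMD which screen to show after the tablet has handled
    voice recognition and AR tracking."""
    message = json.dumps({"screen": screen_id, "overlay": overlay})
    with socket.create_connection(HMD_ADDR) as conn:
        conn.sendall(message.encode("utf-8") + b"\n")

# e.g. after recognizing the spoken command "next step":
# relay_display_command("step_07", overlay="valve_highlight")
```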

What we learned

The final prototype was well received, earning high marks in our final readiness test. I believe we also succeeded in exploring whether an HMD is worth pursuing in the future. I think it is, but the hardware just isn't ready yet4.

If you want to know more about the project, you can find additional information on our project website. The code for the final prototype, as well as our prototyping tools, is available on GitHub.

1. Seriously, astronaut time is so valuable that being able to snag an hour was a huge win.

2. Beyond addressing many of the issues we found, this direction gave us an opportunity to explore the feasibility of HMDs in a 0G environment.

3. A moderator would sit at a computer during a test and serve static images to the HMD based on commands spoken by the tester.

4. Most are unavailable, prohibitively expensive, or not advanced enough to be useful.