
Digitizing complex test setup

Team: NASA Ames HCI Group

Role: Design lead

Completed: Apr 2014

NOTE: Due to the nature of this work, I can't publish much detail about our research findings or the final product. Consequently, there will be some obvious omissions, and I'll primarily use abstractions to give a general idea of the process and solution.

Summary

The NASA Ames Arc Jet Complex tests vehicle heat shielding by simulating flight and re-entry conditions (i.e., it lights stuff on fire). They configure the equipment, and verify that configuration, for each test based on steps described in a task worksheet. Their old, paper-based process needed some improvement, so we created an adaptive web application for authoring worksheets on desktop computers and executing them on mobile devices.

Keep reading for more details on our process.

The challenge

The Arc Jet is an extremely complicated piece of equipment that can simulate flight and re-entry conditions in a variety of atmospheres. To configure the equipment properly for each test, engineers create task worksheets describing the required equipment and its configuration. The steps on a worksheet are executed by the facility technicians and may have follow-up quality assurance actions; some steps require additional data entry, and every step must be signed by the individuals responsible for it.

A model being tested in the AHF. Image courtesy of NASA.

These worksheets were paper-based, with the trade-offs you'd expect: paper is portable and easy to write on, but it gets cluttered with revision markings, requires physical storage, isn't searchable, and so on.

We were asked to look into ways of making this paper-based approach more efficient by reducing errors and increasing worksheet completion.

Our process

As with most projects, we started with a contextual inquiry to produce flow and sequence models and identify breakdowns. We performed an artifact analysis of hundreds of existing worksheets to lay out the information being captured (both planned and unplanned). We also did a quick competitive analysis of applications that structure discrete tasks, and met with another organization at Ames that had encountered a similar problem.

We modeled the worksheet structure using, more or less, a context-free grammar (CFG).
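
To give a rough sense of what that means, here is a minimal sketch of a worksheet grammar expressed as TypeScript types. The names and fields are illustrative assumptions based on the description above, not the actual data model.

```typescript
// Production rules, roughly:
//   Worksheet -> Section+
//   Section   -> Step+
//   Step      -> Action QAAction? DataEntry* Signature+
// All names here are illustrative, not the real schema.

interface Worksheet {
  title: string;
  sections: Section[];        // one or more sections
}

interface Section {
  heading: string;
  steps: Step[];              // one or more steps
}

interface Step {
  action: string;             // the instruction the technician performs
  qaAction?: string;          // optional follow-up quality assurance action
  dataEntries: DataEntry[];   // zero or more required data-entry fields
  signatures: Signature[];    // every step must be signed off
}

interface DataEntry {
  label: string;
  value?: string;
}

interface Signature {
  role: 'technician' | 'qa';
  signedBy?: string;
  signedAt?: Date;
}
```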

We wanted to focus on a few areas that surfaced during our research.

Overall, we wanted our design to give users the authoring flexibility of a word processor while still providing the structure needed for things like searching, linking, and sign-off.

Our solution

There are effectively two phases of the worksheet life-cycle — authoring and execution — and the system is designed to take advantage of the user's situation in each of these phases.

In the authoring phase, the system is designed for the engineers creating documents in a standard desktop web browser. Our artifact analysis showed that a worksheet is made up of a standard set of components that can be tailored to each test's needs, so we treat these components as building blocks that authors add and configure in whatever order they need.

Engineers author a worksheet by adding and configuring standard elements.
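
As a rough illustration of that building-block idea, here is a hedged sketch of an authoring operation. The block kinds and the addBlock helper are hypothetical, not the real API.

```typescript
// Hypothetical sketch: a worksheet draft is an ordered list of standard,
// configurable components that authors can add at any position.
type Block =
  | { kind: 'step'; action: string; requiresQA: boolean }
  | { kind: 'dataEntry'; label: string; units?: string }
  | { kind: 'note'; text: string };

// Insert a configured block at the requested position, or append by default.
function addBlock(draft: Block[], block: Block, at?: number): Block[] {
  const index = at ?? draft.length;
  return [...draft.slice(0, index), block, ...draft.slice(index)];
}

// Example: an engineer adds a step and then a data-entry field.
let draft: Block[] = [];
draft = addBlock(draft, { kind: 'step', action: 'Install test article', requiresQA: true });
draft = addBlock(draft, { kind: 'dataEntry', label: 'Chamber pressure', units: 'kPa' });
```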

In the execution phase, the system is designed for the technicians and QA representatives executing tasks on the move. To mark a step complete, a technician must sign for it, and, as you might expect, NASA culture is a bit of a stickler for authentication. To speed up that process, each technician signs with a unique 4-digit PIN when marking a step complete.

Users sign off on steps by entering their unique PIN.
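
For a sense of what that sign-off might look like under the hood, here is a minimal sketch assuming PINs are stored as salted hashes. The types, hashing scheme, and function names are assumptions for illustration, not the actual implementation.

```typescript
import { createHash } from 'crypto';

// Hypothetical sketch of signing off a step with a 4-digit PIN.
interface SignOff { signedBy: string; signedAt: Date; }
interface ExecutableStep { action: string; signOffs: SignOff[]; }
interface Technician { id: string; pinSalt: string; pinHash: string; }

// Assumed scheme: PINs are stored as salted SHA-256 hashes, never in plain text.
function hashPin(pin: string, salt: string): string {
  return createHash('sha256').update(salt + pin).digest('hex');
}

function signOffStep(step: ExecutableStep, tech: Technician, pin: string): boolean {
  if (!/^\d{4}$/.test(pin)) return false;                        // PINs are exactly four digits
  if (hashPin(pin, tech.pinSalt) !== tech.pinHash) return false; // wrong PIN, no signature recorded
  step.signOffs.push({ signedBy: tech.id, signedAt: new Date() });
  return true;
}
```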

Some of these tasks are time-sensitive; if they take too long, the work needs to be repeated and the entire test can be thrown off schedule. To help users keep track of these thresholds, the system automatically starts a timer when a time-sensitive step begins.

An active timer remains in view at all times and alerts the user when the time runs out.
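
Here is a minimal sketch of that behavior, assuming each time-sensitive step carries a time limit. The field and callback names are invented for the example.

```typescript
// Hypothetical sketch: start a countdown when a time-sensitive step begins,
// tick once a second, and alert the user when the limit elapses.
interface TimedStep { action: string; limitSeconds: number; }

function startStepTimer(
  step: TimedStep,
  onTick: (remainingSeconds: number) => void,  // e.g. update the always-visible countdown
  onExpired: () => void                        // e.g. raise a persistent alert
): () => void {
  const startedAt = Date.now();
  const id = setInterval(() => {
    const elapsed = Math.floor((Date.now() - startedAt) / 1000);
    const remaining = step.limitSeconds - elapsed;
    if (remaining <= 0) {
      clearInterval(id);
      onExpired();
    } else {
      onTick(remaining);
    }
  }, 1000);
  return () => clearInterval(id);              // cancel if the step is completed in time
}
```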

Test articles are referenced throughout a worksheet by their 16-character serial numbers, which creates plenty of opportunities for error when referencing an item or swapping one out at the last minute. Additionally, entire worksheets are often copied for subsequent tests with only the serial numbers changed. To reduce the possibility of error, authors can create variables to reference throughout the worksheet.

If an author updates a variable, then all references are updated along with it.
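
As an illustration, here is a hedged sketch of how those variable references might be resolved. The {{name}} syntax and the serial number are invented for the example.

```typescript
// Hypothetical sketch: worksheet text references test articles via named
// variables instead of repeating 16-character serial numbers.
type Variables = Record<string, string>;

function resolveReferences(text: string, vars: Variables): string {
  // Assume references are written as {{name}}; unknown names are left untouched.
  return text.replace(/\{\{(\w+)\}\}/g, (match, name) => vars[name] ?? match);
}

// Example: swapping a test article only requires updating the variable once.
const vars: Variables = { heatShield: 'ABCD1234EFGH5678' };
const step = 'Install test article {{heatShield}} in the test chamber.';
console.log(resolveReferences(step, vars));
// -> "Install test article ABCD1234EFGH5678 in the test chamber."
```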