


Why ask Why 
in a Usability Evaluation?

Larry Wood

If you've ever kept company with a 5-year-old, you have experienced how their incessant desire to understand life results in an unending stream of "whys?" Over the years I've found myself becoming impatient with my children and grandchildren. So I was surprised to find that "asking why" in a usability evaluation had a more profound effect on the outcome than I had anticipated.

In this evaluation of a university web site, users were asked to attempt tasks and, at each step, to indicate not only what action they intended to perform, but why that particular action seemed appropriate. The surprising result was that even when users performed the appropriate actions, the reasons they gave did not correspond with the designers' intentions.

I was asked to perform the evaluation by a member of the Brigham Young University (BYU) library staff who had been hired to oversee the library web site, but who had not been part of its original design. He explained that the site had been designed "by librarians for librarians," and that his attempts to convince his colleagues that most students could not use the site had been ignored. The librarians insisted that any reasonably intelligent person should be able to perform the tasks without training.

New students, slated to receive training in basic library research strategy, were used for the study.  The students were asked to complete a series of typical research tasks:

  1. Find terms related to a chosen topic based on Library of Congress Subject Headings,
  2. Find an article related to the topic in a general encyclopedia,
  3. Find a specialized encyclopedia related to the topic,
  4. Find a book on the topic, and
  5. Find an article on the topic in a professional journal. 

Each task is composed of three to five steps, and all of the tasks can be started from the library home page, which has the menu structure shown below:

  How to Use the Library
  BYU Library Catalog
  Databases & Periodical Indexes
  Electronic Journals
  Course Reserve Catalog
  Digital Media Library
  Subject Research Guides
  MORE…(ILL, Other Libraries, Ask A Librarian)
  General Information



Participants were given three opportunities to choose the appropriate action on each step of each task. If they failed, they were told the appropriate action and moved to the next step.

The participants' attempts to perform the first task:

"Find terms related to a chosen topic based
on Library of Congress Subject Headings"

are summarized in the table below. 

Approximately a third (7/22) chose the appropriate response on their first attempt, which suggests that a third of the students understood how to use the interface. However, analysis of the reasons given for those choices tells a different story: there is little difference between the rationales offered by the successful and the unsuccessful students.

This suggests that many of the correct responses were guesses rather than informed choices, and that the usability of the site is actually lower than the numbers alone would indicate.

Responses                               Reasons
---------------------------------------------------------------------------
Chose the BYU Library Catalog (7)       (2) Can browse and see what comes up
                                        (2) Seemed to be the broadest category
                                        (1) First on the list
                                        (1) No other choice made sense

Chose Subject Research Guides (8)       (8) Searching for a subject

Chose Databases and Periodicals (7)     (4) Contains listing of topics
                                        (2) Library of Congress is a database
                                        (1) Uses databases for other research

Chose MORE… (2)                         (2) Looking for link to Library of Congress

The results on the other tasks were similar.  

Based on this test, the library has committed to redesigning the site using a user-centered approach.

This study convinced me of the need to interpret results in the context of the cognitions that drive users' actions. Asking "why" is a simple but effective way to increase the yield of a usability test.


For a more complete report on this evaluation contact Larry Wood at WoodL@byu.edu