
Resources: UPA 2006 Idea Markets

Analyzing usability study results: Is it magic behind the curtain?


Activator: Emma J. Rose, University of Washington

What happens when the usability study ends and data analysis begins? What strategies do usability professionals employ? How do we approach the subjective nature of qualitative analysis? This dialogue can highlight areas of consensus and disagreement in the techniques usability professionals use to understand the data they collect.

Thought Starter Questions

  1. What formal or informal approaches do you use when analyzing data from usability studies? Is your approach to analysis holistic or process-based?
  2. What has informed the way you analyze data? Is it based on theory? Does your organization have a set way? Have you developed your own method?
  3. How did you learn to analyze data? Was it in a formal training or classroom setting? Was it from a mentor or colleague? Did you develop your own techniques?
  4. What is your level of confidence in the way you analyze usability data? Are you satisfied with the way you approach your data? Are you looking for new ways or new ideas?
  5. Do you have a particular starting point when looking at the data you have collected?
  6. Is there a technique you have used in the past but now eschew due to experience?
  7. How do you report or defend your analysis techniques when reporting usability results?


Executive Summary

The discussion about analyzing data from usability studies attracted a wide range of opinions and comments. Attendees represented a broad range of usability professionals, including usability engineers and designers working in-house, external consultants working in the public and private sectors, and academics. The discussion highlighted the diversity of methods and approaches for analyzing usability study data and produced three main themes:

  • Formal vs. informal approaches to data analysis
  • Factors that impact data analysis
  • Reliability and uncertainty

When discussing how they analyze the data generated by usability studies, attendees described a variety of methods and approaches. These approaches differed among the individual attendees who contributed to the discussion, but also depended on several factors, including the type of study, organizational support, audience, and deliverable. In addition, strategies for analyzing user data varied with the experience of the individual researcher conducting the study. Finally, attendees painted the act of data analysis as both a science and an art. They rely on evidence from a study to create findings and recommendations, but there can be a level of uncertainty when it comes to trying to understand the data. They acknowledged that a large part of data analysis draws on previous experience: sensing what participants are feeling during a study and developing the skill to see larger patterns in the data.

Note: Quotes from the discussion below were paraphrased for this report.

Formal vs. informal approaches to data analysis

The attendees of the discussion generated the following list of approaches to data analysis. The list is arranged as a continuum from most formal at the top to least formal at the bottom. Attendees used a combination of methods, but the majority of responses fell in the middle of the continuum.

  • Statistical analysis: A small portion of attendees mentioned using a formal statistical approach to analyzing data. Others reported some knowledge of these methods, but as one person stated: “I need to understand statistics well enough in order to estimate it.” Approaches included capturing frequencies and ranges, and analyzing Think Aloud Protocols in MS Word's Outline view to do a “rough cluster analysis.”
  • Calculating metrics: Often the first step in data analysis, attendees mentioned that they calculate success and failure rates and summarize questionnaire data such as Likert-scale ratings (see the sketch after this list).
  • Analyzing notes for patterns: Attendees look for trends in their notes to identify problems, using techniques such as flagging problems with different colored highlighters or analyzing notes within a spreadsheet. This analysis grounds the findings in what participants actually did or said. One attendee stated: “I rely on my own notes of what a participant says during a study, when it matches what they do.” Researchers look for patterns by examining how all users perform on each task and then looking for trends across tasks and across different types of users.
  • Physical observations: Some mentioned observing the physical reactions of a participant during a study to identify problem areas. These observations include facial and bodily expressions, especially signs of frustration or confusion. One attendee said, “It's like you can ‘feel’ what your participant is going through.”
  • Analysis “on-the-go”: A small portion of attendees mentioned the most informal approach to data analysis. This approach involves taking informal notes during a study to capture which aspects of a design should be changed. No formal analysis is done post-study; instead, changes to the design are made immediately.
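
The quantitative steps near the formal end of this continuum are straightforward to script. Below is a minimal sketch in Python of the kind of calculations attendees described: task success rates, Likert-scale summaries, and a rough frequency count of coded notes. All of the data values, task names, and issue codes here are hypothetical, invented purely for illustration; they do not come from the discussion.

```python
from statistics import mean, stdev
from collections import Counter

# Hypothetical results for five participants on two tasks:
# 1 = task completed successfully, 0 = failure.
task_results = {
    "find_product": [1, 1, 0, 1, 0],
    "checkout":     [1, 0, 0, 1, 1],
}

# Hypothetical post-task Likert ratings (1 = very difficult, 5 = very easy).
likert_ratings = {
    "find_product": [4, 5, 2, 4, 3],
    "checkout":     [3, 2, 1, 4, 4],
}

for task, results in task_results.items():
    success_rate = sum(results) / len(results)
    ratings = likert_ratings[task]
    print(f"{task}: {success_rate:.0%} success, "
          f"Likert mean {mean(ratings):.1f} (sd {stdev(ratings):.1f})")

# Hypothetical issue codes tagged while reviewing session notes; counting
# them is one informal way to surface recurring problems across sessions.
note_codes = ["nav_confusion", "label_unclear", "nav_confusion",
              "slow_load", "nav_confusion", "label_unclear"]
print(Counter(note_codes).most_common())
```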

Factors that impact data analysis

Given the diversity of approaches to analysis, it was helpful to discuss the factors that impact data analysis. The attendees identified the following factors:

Depends on the type of study

The type of study that researchers are conducting can impact which methods they choose to use to analyze the resulting data.

  • Summative studies: Studies conducted to generate metrics or measure improvement, such as benchmark studies, require more formal data analysis methods because the data are collected specifically to produce those metrics.
  • Formative studies: When the focus of a usability study is to diagnose problems and offer solutions, less formal data analysis techniques are typically used.

Depends on organizational support for UCD

An organization’s support for user-centered design can impact how study data is analyzed.

  • High level of support: If the organization has prior experience and success with user-centered design and believes in the philosophy of designing products this way, data analysis techniques can be more informal. In addition, one attendee pointed out that if the researchers are external consultants with a high level of credibility, the analysis can be more informal because the focus is on producing the answer to a research question, not necessarily on explaining the thinking or evidence that produced the answer.
  • Lower level of support: If user-centered design is a new process or has less organizational support, data analysis of study results needs to be more formalized and the methodology behind the analysis needs to be transparent and accessible to key members of the organization.

Depends on the audience

Related to the concept of organizational support is that of audience. The audience for the results of the study is an important consideration for choosing how to analyze data.

  • External: If the results of the usability study are being presented to an external audience such as stakeholders, management, or a client, the data analysis approach is often more formal in order to generate a more formal report of results, which could include both qualitative and quantitative data. Even with this focus on formal results, one attendee stated, “Clients like hard data, but the recommendations come from more informal analysis and conversations.”
  • Internal: If the audience for the results is a project team, or if the person who conducted the study is a member of the team or a designer on it, the analysis technique can be more informal.

Depends on the deliverable

There appears to be a correlation between the type of deliverable expected from the usability study and the level of formality used during data analysis. The more formal the deliverable, such as a long written report, the more formal the analysis. Conversely, if the deliverable is informal, like a memo or simply a list of changes, then the data analysis is informal as well. As one attendee stated, “The report is the analysis, [it] is the filter for your thinking.”

Depends on the experience of the researcher

Several attendees discussed how their analysis techniques had changed over time as they became more experienced researchers.

  • New to the field: Usability researchers who are starting out in the field may tend to do more formal analysis to create credibility for findings and recommendations. As one attendee stated, “In the beginning [when you are new at doing studies], you are much more focused on yourself as a researcher.”
  • More experienced: As usability researchers gain experience, run more studies, and build expertise, they tend to do less formal data analysis. Several attendees stated that it was not necessary to separate out different types of evidence, such as hard and soft data. One said: “It gets more integrated as you get more experienced; you figure out what is important and what is not, and you get a sense of what you should be paying attention to.” In addition, several attendees mentioned that running numerous studies helped them spot trends and patterns more quickly. One stated: “We [usability experts] add to findings based on what we know [our expertise].”


Reliability and uncertainty

A final theme to emerge from the discussion was the level of confidence researchers had in their data analysis techniques. Several acknowledged that the act of planning a study introduces a level of influence on the outcome, since, as one stated: “I write the questions.” Several other quotes from attendees highlight this uncertainty in analyzing results:

  • You can't really say anything ‘for sure’
  • It's a “Wild Wild West” atmosphere, you have all these pieces of data and you use them to build an argument
  • I think there is a little arrogance in analysis because I interpret the data as how people think.

Attendees offered several suggestions on how to ensure reliability. One strategy was to retain the same methodology and approach across studies, which is especially important when comparing results from different versions or iterations of a product. In addition, it is helpful to weigh participants' verbalizations and actions over less certain sources of evidence, such as researcher observations or the researcher's own assumptions.
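
For the quantitative portion of a study, one standard way to make this uncertainty concrete is to report an interval rather than a single number. The sketch below illustrates that general idea; it is not a technique raised in the discussion, and the scenario (4 of 5 participants completing a task) is hypothetical. It computes a Wilson score interval for the task success rate; with samples this small, the interval is very wide, which is the statistical counterpart of not being able to say anything “for sure.”

```python
from math import sqrt

def wilson_interval(successes, n, z=1.96):
    """95% Wilson score interval for a binomial proportion."""
    p = successes / n
    denom = 1 + z ** 2 / n
    center = (p + z ** 2 / (2 * n)) / denom
    margin = z * sqrt(p * (1 - p) / n + z ** 2 / (4 * n ** 2)) / denom
    return center - margin, center + margin

# Hypothetical scenario: 4 of 5 participants completed the task.
low, high = wilson_interval(4, 5)
print(f"Observed 80% success; 95% interval roughly {low:.0%} to {high:.0%}")
```

Running this yields an interval of roughly 38% to 96% around the observed 80% success rate: a reminder of how loosely a five-participant study pins down any single metric.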

Conclusions

It is helpful to acknowledge that there is no “one size fits all” approach for data analysis and that many factors influence how researchers analyze results. This discussion helped to highlight the different methods and strategies that have emerged to inform our practice.

Based on this discussion, we can conclude:

  • It is helpful to document your approaches to data analysis for each study, reflect on these practices, and note how they change over time.
  • Considering the criteria presented in the section on “Factors that impact data analysis” before conducting a study can help set the direction of the subsequent data analysis.
  • Thinking about how to formalize or communicate data analysis approaches could be helpful when mentoring new usability professionals or when teaching or training usability techniques in the classroom.