JUS - Journal of Usability Studies
An international peer-reviewed journal

Use of Card Sorting for Online Course Site Organization Within an Integrated Science Curriculum

Alison Doubleday

Journal of Usability Studies, Volume 8, Issue 2, February 2013, pp. 41 - 54


Results

The following sections provide results for the card sorts for both Cohort A and Cohort B.

Cohort A, Stage 1: Open Card Sort

Results of the card sorts formed a distance table and a cluster tree within xSort. The distance table (Figure 3) indicates the normalized distance between all pairs of cards: cards always grouped together receive a value of 0, and cards never grouped together receive a value of 1. The cluster tree (Figure 4), also called a dendrogram, depicts the most common groupings: cards clustered more closely to each other were, on average, grouped together more often by participants than cards spaced further apart on the tree.


Figure 3. The distance table created in xSort (figure use and adaptation permission from xSort). Cards always grouped together receive a value of 0, and cards never grouped together receive a value of 1.


Figure 4. Cluster tree created in xSort (figure use and adaptation permission from xSort). Cards clustered more closely to each other are grouped together more often by participants than are cards spaced further apart. As you move from left to right in the dendrogram, the groupings are weaker but include a larger number of cards within each group.
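The distance table and cluster tree described above can be sketched in a few lines of code. The following is a minimal illustration of the kind of computation a tool such as xSort performs, using SciPy's hierarchical clustering; the card names and sorts are hypothetical examples, not the study's data.

```python
"""Sketch of a card-sort distance table and cluster tree (dendrogram).

Illustrative only: card names and participant sorts are invented,
not taken from the study.
"""
from itertools import combinations

import numpy as np
from scipy.cluster.hierarchy import dendrogram, linkage
from scipy.spatial.distance import squareform

# Each participant's sort is a list of groups of card labels.
sorts = [
    [{"Syllabus", "Schedule"}, {"Lecture Notes", "PowerPoints"}, {"Grades"}],
    [{"Syllabus"}, {"Schedule", "Grades"}, {"Lecture Notes", "PowerPoints"}],
    [{"Syllabus", "Schedule"}, {"Lecture Notes"}, {"PowerPoints", "Grades"}],
]

cards = sorted({c for sort in sorts for group in sort for c in group})
index = {c: i for i, c in enumerate(cards)}
n = len(cards)

# Count how often each pair of cards lands in the same group.
together = np.zeros((n, n))
for sort in sorts:
    for group in sort:
        for a, b in combinations(group, 2):
            i, j = index[a], index[b]
            together[i, j] += 1
            together[j, i] += 1

# Normalized distance: 0 = always grouped together, 1 = never.
distance = 1.0 - together / len(sorts)
np.fill_diagonal(distance, 0.0)

# Average-linkage clustering over the condensed distance matrix
# yields the cluster tree (dendrogram).
tree = linkage(squareform(distance), method="average")
dendrogram(tree, labels=cards, no_plot=True)  # plot with matplotlib if desired
```

Moving from left to right in the resulting dendrogram corresponds to cutting the tree at larger distances: the groupings grow weaker but cover more cards, as in Figure 4.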

After the open card sort, the author used xSort to determine which cards were grouped together most often by participants. Results of Cohort A’s open card sort revealed that participants preferred to keep the biomedical sciences materials separate from clinical materials. In fact, Cohort A preferred to have two completely separate course sites for these different parts of the first-year curriculum, and clinically related cards were often left unsorted. When clinically related cards were sorted, participants noted during the think-aloud process that these clinical groupings should appear on a separate site.

Cohort A participants consistently ranked content that had to be accessed frequently throughout the semester as more important than content that would be accessed only occasionally. Interestingly, in the think-aloud protocol participants tended to make clear distinctions between required and optional resources. If a resource or material was indicated as required, most participants wanted it grouped with other required material and organized by the date on which that topic would be covered in class. If a resource or material was indicated as optional or supplemental, participants ranked it as lower in importance and grouped it with other optional materials. These supplemental resources were often arranged in subgroups by content topic rather than by date covered, and several participants questioned the value of including content that was not required.

Cohort A, Stage 2: Site Development and Usability Test

The author used these data to develop 13 group names that reflected the primary theme of each group (Table 2). For example, if discussion boards and blogs were consistently placed within the same group by the majority of study participants but some participants named this group “blogs and boards” and others named it “conversations,” a consensus group name might be “communication.” In some cases, all participants used the same category name (this was true for “announcements” and “grades”). For other categories, participants may have grouped similar items together but arrived at very different category names.

Table 2. List of Categories Created After Open Card Sort With Cohort A


Much of the interpretation and identification of the themes used to construct consensus categories came out of the think-aloud protocol. Because faculty members took notes during these sessions and screen-captured participant movements, the author was able to review participants’ explanations of the names they gave each category, their motivations for choosing a specific name, and their reasons for grouping certain items together. For example, the consensus category “educational resources” was called “supplemental material,” “enrichment,” “extra resources,” “extra educational materials,” and “articles and recordings” by different participants. The author decided upon the title “educational resources” because all participants viewed these resources as clearly distinct from the essential resources required for their sessions with faculty. Required materials were placed in the category “Daily Materials,” which was called “class materials,” “lecture stuff,” and “course materials” by different participants. The author decided upon the name “Daily Materials” to avoid the term “lecture,” which is limited in the new curriculum, and to support participants’ emphasis on the importance of frequency of access.

The consensus categories formed the basis for the Blackboard course site navigation menu. In some cases, it was necessary to deviate from card sort results alone and rely on faculty experience and knowledge about the intended structure of the new curriculum. Some examples include the following: (a) although most participants preferred the elimination of “Blackboard support,” it had to be retained as a default category in Blackboard; and (b) because of additional laboratory activities in the new curriculum, the term “Lab Documents,” used by numerous participants, would have been too broad and inclusive to be useful, so materials were subdivided into specific lab categories such as Dissection Lab and Microanatomy Lab.

Using the categories and information architecture gleaned from Cohort A’s open card sort, the author, with assistance from the Office of Dental Education at UIC, constructed a template course site in Blackboard (Figure 5).


Figure 5. The initial integrated course site developed using results of the Cohort A (traditional curriculum) card sort (figure use and adaptation permission from Blackboard Inc.)

Within the course site navigation menu bar, the consensus category names became the menu tabs, and the cards within each category translated to folders that appeared when clicking on a given tab. This organization reflected the hierarchy revealed through the card sort, wherein the category cards (named by the participants) were located higher in the hierarchy than the individual cards. Because Cohort A also placed great importance on frequency of access, required date of access, relevant discipline, and medium (as revealed through the think-aloud protocol), some of the menu tabs (based on the consensus categories) were further subdivided.
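The two-level hierarchy described above can be pictured as a simple mapping from tabs to folders. The following sketch uses hypothetical category and folder names (not the study’s full Table 2) to show how consensus categories become menu tabs and the cards sorted under them become folders.

```python
# Illustrative two-level navigation hierarchy: consensus categories
# (menu tabs) map to the folders shown when a tab is clicked.
# Names are hypothetical examples, not the study's actual site menu.
site_menu = {
    "Announcements": [],
    "Daily Materials": ["Week 1", "Week 2"],            # subdivided by date
    "Educational Resources": ["Videos", "Animations"],  # subdivided by medium
    "Dissection Lab": ["Lab Guides", "Lab Schedules"],
    "Grades": [],
}

def render_menu(menu):
    """Return a text outline of tabs and their folders."""
    lines = []
    for tab, folders in menu.items():
        lines.append(tab)
        lines.extend(f"  - {folder}" for folder in folders)
    return "\n".join(lines)

print(render_menu(site_menu))
```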

Usability testing

The five Cohort A participants recruited for the scenario-based usability test were able to successfully complete all of the assigned task scenarios. Cohort A participants were able to navigate through the template site and complete each task scenario without assistance from the faculty observer within 1 minute per task scenario.

Cohort B, Stage 4: Semi-Closed Card Sort

The second card sort was necessary to determine whether the site organization continued to logically and consistently represent course activities after a year of use. Additionally, comments from both students and faculty indicated that there were unanticipated issues with the initial site design. For example, Cohort A’s open card sort suggested that students preferred to have content grouped by discipline-related activities (anatomy, physiology, biochemistry) and then by medium (PowerPoints, videos, notes, animations). After a year in the new curriculum, Cohort B students disclosed that the volume of available resources on the course sites “often overwhelmed them” and that organization by medium within a given activity did not allow them to quickly find potentially helpful materials. Cohort B students also reported that they would have preferred materials organized into groups based on the corresponding case or scenario to which each resource related, rather than by how frequently the material was accessed, a factor that had been very important to Cohort A students based on the think-aloud part of the open card sort and the subsequent usability test of the initial site.

Results of the semi-closed card sort also produced a cluster tree (dendrogram). Results of Cohort B’s semi-closed card sort revealed that, even after Cohort A’s open card sort data had informed the template site organization, the course site structure still required modification to better align with the organization expected by student users in the integrated curriculum. The think-aloud protocol used during both card sorts also suggested that Cohort B preferred a more complex architecture wherein the resources on the site, whether initially organized by activity or by date, maintained a connection in name to the case scenarios that formed the basis of the small group learning sessions in the new curriculum.

Consistent with the Cohort A students, the majority of Cohort B participants believed that clinical materials should be housed on a site distinct from the biomedical sciences materials. Also consistent with Cohort A, Cohort B ranked content accessed most frequently as more important than content accessed infrequently. During the think-aloud part of the card sort, however, Cohort B participants did not specifically mention frequency of access as a factor in their sorting, while Cohort A participants did.

Results of the Cohort B card sort and usability test differed from those of Cohort A in a few key ways: Cohort B preferred content grouped by the corresponding case or scenario rather than by medium; Cohort B participants did not mention frequency of access during the think-aloud protocol, whereas Cohort A participants did; and Cohort B preferred a more complex architecture in which resource names maintained a connection to the case scenarios used in the small group learning sessions.

Cohort B: Site Modification and Usability Testing

Modifications based on these results included name changes for several groups, consolidation of some content, and the incorporation of additional information on the course home page. The modified course site is shown in Figure 6.


Figure 6. The modified course site developed using results of the Cohort B semi-closed card sort (figure use and adaptation permission from Blackboard Inc.)

Specifically, the “Educational Resources” category, containing supplemental or non-required videos, PowerPoints, recordings, and animations, was changed to “Small Group Learning (SGL) Resources” and further subdivided by case/scenario. “Daily Materials” was changed to “Plenary Sessions” and further subdivided by date. These changes reflected the emphasis Cohort B placed on grouping content by activity (plenary sessions, small group learning) rather than by discipline, yet maintained the priority both cohorts placed on being able to access content for specific sessions with faculty by date. The “Assignments” category was further subdivided by relevant case/scenario so that all assignments and quizzes were grouped with the specific case/scenario on which the students were working in their small group learning activities.

Usability testing

The five Cohort B participants recruited for the scenario-based usability test of the modified site were able to successfully complete all of the assigned task scenarios. Cohort B participants were able to navigate through the modified template site and complete each task scenario without assistance from the faculty observer within 1 minute per task scenario.
