Pilot User Study: Suffragists Speak

Rosalie Lack, Joanne Miller, Sally Thomas

SIMS 213, April 27, 1999



 

Introduction

The System

The system evaluated in this user study is the Suffragists Speak thematic archive, a website centered on twelve oral histories of women suffragists who were active between 1910 and 1920.
 

Purpose and rationale of the study

The purpose of this usability study was to obtain feedback from users about the design and content of the Suffragists Speak website. We wanted to test the site with potential “real” users; in this instance, we tested three graduate students in history.

Specifically, there are parts of the site that we thought might be confusing to users, based on the heuristic study, and we wanted to get feedback on our adapted designs.
 
 

Method

Participants

Three history graduate students, one man and two women, all in their sixth year (coincidentally).

Participant 1: Male, aged 31-35. Graduate student in History. Has used the Internet for three years, but rarely (approximately once per month) for scholarly research.

Participant 2: Female, aged 26-30. Graduate student in European History. Has used the Internet for five years, sometimes (approximately five times per month) for scholarly research.

Participant 3: Female, aged 31-35. Graduate student in History. Has used the Internet for at least two years, but rarely (approximately once per month) for scholarly purposes.
 

Apparatus

For all of the tests, we used South Hall computers with large (21-inch) monitors, fast processors, and fast Internet connections. Due to technical difficulties, the third tester was unable to hear the sound files.

We videotaped the computer screen during each of the tests so that we could refer to the footage later if necessary.
 

Tasks

Scenario 1: You are a history graduate student writing a dissertation on women reformers in the American progressive era, with a special focus on Jane Addams, the founder of Hull House, a settlement house in Chicago. You go to the site to try to find out more about Jane Addams.

Scenario 2: You are a teacher interested in international connections to the American woman’s suffrage movement. You recollect that there is a mother-daughter British “team” but cannot remember their names. You go to the site to try to find out.

Scenario 3: You are interested in finding out general information about Alice Paul. In addition, you want to learn specifically about her activities in 1917.

We adapted our scenarios so that they would be more straightforward and simpler to describe to the participants. For each scenario, we were mainly concerned with how the users approached the task rather than with how long it took. Would they use the Search page? Would they use the “Browse Terms” option? Would they remember where to find relevant information in other parts of the site?
 

Procedure

1. Greeting and introduction

We started the test session by explaining a bit about the website. Each participant signed an informed consent agreement and filled out a brief demographic survey. We told the participants to think out loud as they worked and to remember that we were testing the system, not them. We warned the users about bugs in the system and the incompleteness of some parts of the site.

2. Browsing

Instead of giving the participants a demonstration of the site, we allowed them to browse the site on their own for 15-20 minutes. We answered any questions they had and encouraged them not to spend too much time reading the content (in the interest of time). This was not always easy, because these students are interested in the content!

3. Scenarios

After about twenty minutes we stopped the participants and moved on to the scenarios. We read each scenario aloud and timed the response. We answered questions and helped when users got stuck.
 

4. Post-test Discussion and Questionnaire

After they finished the scenarios, the facilitator asked each participant about their overall impressions of the site. She also asked about specific things that came up in the course of the test, such as why the participants made the choices they did. Each participant then filled out a post-test questionnaire. As a token of our appreciation, we gave the users certificates for a frozen yogurt at Yogurt Park.
 

Test Measures


Although we tried to be quantitative by timing the users as they worked on the scenarios, we really wanted a qualitative assessment of the design and content of the site. These qualitative elements were our dependent variables.
• Where users would get stuck
• Their reactions to the presentation of the information
• What they were expecting when they clicked on a link
• What they thought might be missing

We also wanted to gather feedback on specific elements of the site. These elements that we have control over were our independent variables.
• Introductory materials on the Home page (vs. always visible in the left frame)
• The timeline (hide/show, and the combination and quantity of information presented)
• The usability of the Search page and the usefulness of the “Browse terms” option
 

Results

We took notes throughout the tests. The summaries of each test are presented below, while the complete user incident logs can be found here. Summaries of the participants’ observations serve as our “response variables” since response time and “errors” were not our concerns. Rather, we were trying to gather information from the users on the design and content of the site. Their observations are helpful in directing our next steps (see the discussion section below).

The scenario completion times (in the following table) are limited in their usefulness. Most of the users went straight to the search page to begin their scenarios. A search that takes too long will only frustrate users and will probably cause them to give up and leave the site unsatisfied. Therefore, an overly long completion time will indicate a flaw in the system. However, users learn from one scenario to the next, so the shorter scenario times in scenarios two or three may reflect information learned in scenario one.

Scenario Completion Times:
 
                Scenario 1    Scenario 2    Scenario 3
Participant 1   11 minutes    5 minutes     11 minutes
Participant 2   15 minutes    13 minutes    Did not perform (had done this scenario previously with the low-fi prototype)
Participant 3   9 minutes     7 minutes     5 minutes

 

Participant 1, Comments

• Link names within the Meet the Suffragists page [and throughout the site] are unclear
• Inconsistent order of lists (Meet the Suffragists page and the Oral History page contain lists of the suffragists but in one list the women are listed in the order of importance to the movement and in another in alphabetical order)
• Blue triangles icons (in Dynaweb) are confusing
• In search results, doesn’t see the arrows (surrounding the results in the text) that link to the next hit
• Need complete book citation
• Might want to break down Bibliography by subject
 

Participant 2, Comments

• Questions “hide timelines” option (since she was using a large screen it might not have occurred to her that on a smaller screen you might not want the timeline to always be visible)
• Wants annotation of contents: How much is here? Where does it come from?
• Likes division of Primary and Secondary sources
• In search results, doesn’t like being brought to the middle of a primary document
• Need to add complete citation to the Secondary Source book pages
• Add a note to the periodicals page that explains that these are “selected” periodicals, so undergraduates don’t think that the periodicals shown are the only ones relevant to the woman’s suffrage movement
 

Participant 3, Comments

• From the Home page she doesn’t know whether to click on “About this Site” or “Guide to the Site” (in fact they are the same thing)
• A “scope” statement might be worthwhile to include, to remind users of the site’s limited scope
• Interview (audio): include date of interview and interviewer’s name; include source and that it’s a piece of a larger oral history
• Introductory material to the suffragist at the beginning of the oral history (or a link to the biography on another page of the site)
• Questions the (non-alphabetical) order of the Meet the Suffragists page
• Periodicals: include editorial note about “selected” articles/journals
• Search is confusing, overwhelming at times
• Consider adding links to related sites (women’s and suffragist)
 

Discussion

What We Learned

We gained a great deal of insight about the Suffragists Speak website from the user test. We learned that people think that the contents are useful and worthwhile and could be used for both research and teaching. In the post-test interview and questionnaire, users told us that they liked the variety and amount of material on the site. They liked the presentation of material in the introductory pages (Meet the Suffragists and Introduction to the Era). They appreciated the inclusion of sources that are difficult to find such as oral histories, diaries, ephemera, periodical articles, correspondence, and audio. The availability of these resources was a recurring theme in the users’ comments.

Criticism about the website centered around navigation, especially during the search process, and lack of precise information about the sources.

In terms of design changes, we learned that there are some things that we can easily change or add that will facilitate use of the site. These include:
• More links within the site
• More links to other sites
• Increased information about the origin and scope of the subject matter
• Numerous changes to the search results page (and the way that users navigate from there)
 

Changes for the Real Experiment

For the “real” experiment with more participants, the main thing we would change is to make sure that the testers reflect the wider demographics of our intended users: undergraduates, graduate students, faculty, and a few high school students.

In terms of changes to the test, we would attempt to create a scenario that prompted the user to go to the multimedia materials.
 

Changes to the Interface

We would change the search results screens, and how the user links from the list of hits to the actual results content. The search function was clearly the most confusing and frustrating part of the site for all of the users. Even though one user liked that she could search from a frame in the bottom of the screen without linking back to the search page, we would probably remove this option in favor of control (over the system) and design simplicity.
 

Other changes include:

• Provide a consistent order for lists of items (names of suffragists on Meet the Suffragists page and Oral History page), probably alphabetical
• The blue triangle icons in Dynaweb were unclear; in an HTML mock-up of the Dynaweb pages we would include an image of the icon with a text description of its function.
• Change the label “Bibliography” under Secondary Sources to something that better reflects what is there; testers didn’t realize that they would find links to web pages there. Maybe change it to Resources.
• The Browse terms link on the Search page is still not clear, even with the search tips. Make it larger and add more explanatory text directly below it.
• Timeline label did not indicate that the timeline would contain detailed information about the suffragist movement. We need to find a new label that better conveys this.
• Add more context to the search results page – one user did not like being brought right into the middle of a document. The left-hand frame of the Dynaweb interface should contain more contextual information about where the user is with regard to the document as a whole. We could do an HTML mock-up of a better implementation.
• Add a note on the Periodicals page that explains that the periodicals on the site are a selection of periodicals. This was pointed out as being important for undergraduates because they might think that we have provided all relevant periodicals.
• On the Home page we have an “About the Site” link and a “Guide to Site” link; the wording differs but they go to the same place. We need to make the two consistent.
• The content on the timeline seemed random to many of the testers; they suggested that even for the world events materials we should only include information that had some direct effect on the suffragist movement.
• The “sort by” feature on the Correspondence, Periodicals, and Books pages was not seen by two of the three testers. We need to find a way to make it more prominent.
• The Search page still needs work. The choice of keywords vs. exact phrase is confusing, and the format for entering a name search is not given (e.g., “enter last name, first name”). None of the participants read the search tips – only after being prompted did they look at them. One suggestion was to make the options look more like Melvyl, which many students (even non-UC students) use and are familiar with.
• Add source information for the Oral Histories and the interviews; include when they were conducted and information about the interviewer.
 

Formal Experiment Design (Hypothetical)


In a formal experiment, we would want to compare the Dynaweb search results interface to an alternative that we have designed in HTML in response to user feedback. (We cannot actually do this because our HTML pages lose the functionality needed for searching full text in the documents on the site.)
 

Hypothesis

Users will feel more comfortable and be able to more easily interpret search results from our revised search results pages.
 

Factors

• Elimination of the left-hand frame on the first page of search results (it contains extraneous information that has nothing to do with the suffragists)
• Expansion of search results right away (providing more of an outline structure)
• Buttons on the bottom of the page that easily enable users to navigate back to search results and back to the search page

We were not able to define meaningful levels for our factors because each one is binary: we would either eliminate the left frame or not, either fully expand the search results or not, and either include the two navigation buttons or not. If, for example, we were experimenting with how many choices to put on a pull-down menu, then we would clearly have levels. In our case, however, defining levels beyond presence or absence does not seem relevant.
 

Response variables

• User satisfaction with search results (and if there are too many or too few how do they respond?)
• Ability to navigate around the results
• Frustration level
 

Testing

To test this hypothesis, we would use a user test similar to the one described in this assignment. We would have the participants think aloud as they performed searches. We would ask them where they thought they were in the context of the search results, and we would observe whether they could get back to their original search results and to the search page. We would gauge their level of frustration or satisfaction by listening to their thoughts and asking probing questions as necessary.