SIMS 213 Assignment 8 – Communications Spectrum
Pilot Usability Test
May 1, 2003

 

Contents

1. Introduction

2. Method

   a. Participants
   b. Task Scenarios
   c. Procedure

3. Test Measures

4. Results

5. Discussion

6. Formal Experiment Design

7. Appendices

8. Work Distribution


1. Introduction
This pilot usability study was conducted to test the second interactive prototype of the Communications Spectrum website. We returned to our target users with an interactive version of the product that incorporates the changes suggested by the low-fi prototype study and the heuristic evaluation.

Our test consisted of four users, one of whom participated in the low-fi usability study. The main focus of the test was to determine how users interact with the Analyze Spectrum Usage section of the site. Our hypothesis was that users would find the new interface more intuitive and easier to learn than the previously implemented interface.

 
2. Method
 
2a. Participants

Participant 1 is a college graduate and a first-year master's student in the School of Journalism at UC Berkeley. Participant 1 is a male between the ages of 26 and 35. He is a daily user of the web as an information source and has some familiarity with communications spectrum issues.

Participant 2 is a college graduate and a first-year master's student in the School of Journalism at UC Berkeley. Participant 2 is a female between the ages of 26 and 35. She is a daily user of the web as an information source and has no familiarity with communications spectrum issues.

Participant 3 is a college graduate and a second-year joint master's student in Journalism and Public Policy at UC Berkeley. Participant 3 is a female between the ages of 26 and 35. She is a daily user of the web as an information source and has no familiarity with communications spectrum issues.

Participant 4 is a college graduate and holds a master's degree. Participant 4 is a female between the ages of 26 and 35. She is a daily user of the web as an information source and has some familiarity with communications spectrum issues.

 
2b. Task Scenarios

We chose to modify our task scenarios for the pilot usability study. Our initial scenarios reflected the partial functionality of our low-fi prototype: we intentionally chose very specific user tasks that mapped to existing results in the low-fi prototype, to provide a realistic user experience. Our second interactive prototype provides mocked-up results for all possible selections, so we generalized the testing tasks to allow testers to use the site in a more natural manner. We also added another scenario to test the expanded functionality.

Task One: Browse the site and familiarize yourself with the available information about the communications spectrum.

This scenario replaced the specific 'take the tour' scenario of the first user test. We felt it would be interesting to see how many of the subjects actually chose to take the tour when left to their own devices.

Task Two: You would like to investigate how an industry uses the spectrum in two regions. Use the system to create this comparison.

Task Three: You are interested in determining spectrum usage for two different industries in a single region. Use the system to create this comparison.

Tasks two and three originally named specific industries and regions that mapped to available data.

Task Four: You would like to view spectrum usage for a single industry in a single region. Use the system to create this view.

We added task four because of the expanded functionality of the interactive prototype.

 
2c. Procedure
Our testing procedure involved three testers and two laptops. We loaded a local copy of our project on one laptop and an event logging application on the other. One tester led the session, introducing the participant to the project, obtaining user consent, and leading them through the task scenarios. The tester with the logging application recorded each navigational choice that the participant made. The third tester took notes and recorded comments that the participant made. After each test, the test leader presented the participant with the post-test survey.
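
As a rough illustration of what the logging application captured, here is a minimal sketch of a navigational-event logger. The record structure and names are our own assumptions for illustration, not the actual tool we used (see Appendix D).

```ts
// Hypothetical sketch of a navigational-event logger; the record
// structure and names are assumptions, not the actual tool we used.
interface NavEvent {
  elapsedMs: number;   // milliseconds since the test session started
  participant: string; // e.g. "P1"
  task: number;        // task scenario number (1-4)
  target: string;      // link or control the participant selected
}

const events: NavEvent[] = [];
const sessionStart = Date.now();

function recordNavEvent(participant: string, task: number, target: string): void {
  events.push({
    elapsedMs: Date.now() - sessionStart,
    participant,
    task,
    target,
  });
}

// Example: participant P1 clicks "Analyze Spectrum Usage" during task two.
recordNavEvent("P1", 2, "Analyze Spectrum Usage");
console.log(JSON.stringify(events, null, 2));
```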
 
3. Test Measures
Scenario 1 was designed to:
  • Determine whether the site is intuitive. We wanted to see if users would be drawn to the tour area, and once there, whether they would take the tour sequentially or use the navigational aids to jump through it.
 
Scenarios 2 and 3 were designed to:
  • Determine if users found the data analysis interface easy to use and intuitive.
  • Determine how users interact with the data analysis functionality.
  • Determine if users easily understood the difference between the result set possibilities.
  • Determine if the two data comparisons return information that is valuable to users.
 
Scenario 4 was designed to:
  • Test new functionality that was not present in the first prototype.
 
All four scenarios were designed to:
  • Test the navigability of the site.
  • Determine if users find the site terminology comprehensible.
  • Determine if the new interface for accessing the spectrum data was successful and clear.
 
4. Results
We gave users post-test surveys to give us an indication of how well the second interactive prototype measured in the areas we were testing. We designed this survey based on a combination of Jakob Nielsen's heuristic evaluation factors and the test measures we developed above. Several of the question results are shown below (in all cases, 1 indicates the extreme negative answer and 5 the extreme positive answer). Please see Appendix B for the full survey.

Rate how well you understood the wording and terminology used throughout the site.

         1    2    3    4    5
User 1                  X
User 2        X
User 3                  X
User 4                  X


Rate how well you were able to maintain awareness of your location in the site at any given time.

         1    2    3    4    5
User 1                  X
User 2                  X
User 3                  X
User 4                  X


How well did the results of your selections in the Analyze Spectrum Usage section match your expectations?

         1    2    3    4    5
User 1                       X
User 2             X
User 3                  X
User 4                  X


Rate the consistency of the interface, specifically the terminology and navigation techniques used on the site.

         1    2    3    4    5
User 1                       X
User 2                       X
User 3                  X
User 4             X


Rate how easy you felt it was to navigate through the site.

         1    2    3    4    5
User 1                  X
User 2             X
User 3                  X
User 4                  X


Were you able to easily understand the navigation path necessary to complete the three tasks?

         1    2    3    4    5
User 1                  X
User 2                       X
User 3                  X
User 4                  X
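
To make the ratings easier to compare at a glance, the following small sketch tallies the mean score for each question, using the values transcribed from the tables above (the question keys are our own shorthand):

```ts
// Mean rating per survey question, from the tables above
// (1 = extreme negative, 5 = extreme positive). Each array holds
// [User 1, User 2, User 3, User 4]; keys are our own shorthand.
const ratings: Record<string, number[]> = {
  terminology:        [4, 2, 4, 4],
  locationAwareness:  [4, 4, 4, 4],
  resultExpectations: [5, 3, 4, 4],
  consistency:        [5, 5, 4, 3],
  easeOfNavigation:   [4, 3, 4, 4],
  navigationPath:     [4, 5, 4, 4],
};

const mean = (xs: number[]): number => xs.reduce((a, b) => a + b, 0) / xs.length;

for (const [question, scores] of Object.entries(ratings)) {
  console.log(`${question}: mean ${mean(scores).toFixed(2)}`);
}
```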

5. Discussion
 
The testing of our second interactive prototype revealed a number of problems that should be addressed in our third prototype.
 
Analyze Spectrum Usage Section
The most apparent user problems came up in task two, when users were first asked to analyze data by comparing one industry in two regions. One user had difficulty moving from the tour to the data analysis page. To resolve this issue, we decided to highlight the Tour and Analyze Spectrum Usage links both on the navigation menu and on the front page. In addition, we think that users would benefit from having direct links to the analysis section from the navigation elements within the tour.

Users also had problems with the step-by-step nature of the analyze data section. Users wanted to add additional regions during step one, rather than continuing through to step four of the process. We intend to solve this by revealing all of the steps in the process, as well as the user's current location, as sketched below. We think that by giving a sense of the overall data selection process, we will increase users' ability to navigate within the analysis area and reduce their confusion about where to select data.
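
A rough sketch of the kind of step indicator we have in mind follows; the step labels are illustrative, not the site's actual wording.

```ts
// Hypothetical step indicator: show every step of the Analyze Spectrum
// Usage process at once, with the user's current location highlighted.
// Step labels are illustrative; the real wording may differ.
const steps = [
  "1. Choose comparison",
  "2. Select industry",
  "3. Select region",
  "4. View results",
];

function renderStepIndicator(currentStep: number): string {
  return steps
    .map((label, i) => (i === currentStep ? `[ ${label} ]` : label))
    .join("  >  ");
}

// A user on the region-selection screen would see:
console.log(renderStepIndicator(2));
// 1. Choose comparison  >  2. Select industry  >  [ 3. Select region ]  >  4. View results
```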

The other problem that appeared throughout the analyze data process was confusion about some of the terminology. We noted which terminology caused problems and plan to change it.

Two of our users commented that when they clicked on a data selection, they would prefer to be sent directly to the next page instead of having to select 'Continue to step two.' They felt especially strongly about this on the page with the map. Users commented that in every map interface they had used, clicking on the map caused an action to take place, and that our two-click interface was confusing. We plan to discuss this as a group to decide exactly what the best change would be in order to fix this problem.
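
One candidate fix, sketched below, would collapse the interaction into a single click. The element class and helper functions here are our own assumptions, not the prototype's actual code.

```ts
// Illustrative sketch: a single click on a map region both records the
// selection and advances to the next step, removing the separate
// "Continue to step two" click. The .map-region class, page naming,
// and helpers are assumptions.
function selectRegion(regionId: string): void {
  console.log(`Region selected: ${regionId}`); // store the selection
}

function advanceToStep(step: number): void {
  window.location.href = `analyze-step${step}.html`; // hypothetical URL scheme
}

document.querySelectorAll<HTMLElement>(".map-region").forEach((region) => {
  region.addEventListener("click", () => {
    selectRegion(region.id);
    advanceToStep(2);
  });
});
```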

 
Tour
When asked to browse the site for a few minutes, three of the four users went through the entire tour. Users expressed some confusion about the definition of an industry and how comparing spectrum usage by industry was relevant. This feedback suggests that changing the introductory text and adding more content will make the purpose of the tour clearer. Incorporating more data about industry-specific spectrum usage, and content about how the spectrum is licensed and managed, will also help users understand the relationships between industries.
 
Proposed Changes:
 
Analyze Spectrum Usage Section
  • Make it easier for users to navigate between pages during the selection process.
  • Add more links on the data page that allow users to perform a new search.
  • Include more explanations during the data selection process, either at the beginning or through breadcrumbs.
  • Change the wording and terminology that users felt was confusing.
 
Tour
  • Provide users more information about industry spectrum use.
  • Draw specific comparisons between different industries' usage.
 
6. Formal Experiment Design

We propose a formal experiment that tests the usability of alternative interfaces to the Analyze Spectrum Usage section of the site.

In our pilot usability study we observed that users appreciated the very structured approach the first time they interacted with this part of the system, but on subsequent attempts to access spectrum usage data they found the step-through interface somewhat rigid and laborious. There is a distinct learning curve for first-time users, but they climb it quickly, so we clearly need to offer a more advanced, quicker interface to the spectrum usage data. The question is whether we should offer only the more streamlined interface, or continue to offer the introductory interface and supplement it with the advanced one.

As an aside, we have a number of alternative ideas for the advanced interface and will decide on one for the third interactive prototype. For the purposes of this formal experiment design, you can assume that the advanced interface will at least allow the user to make all variable selections in one screen rather than having to step through 3-4 screens to select industries and regions.
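
To make this concrete, here is a sketch of one way the single-screen selection might be modeled. The type, field names, and validity rule are assumptions for illustration, not the final design.

```ts
// Illustrative single-screen selection model: all variables chosen at
// once instead of stepping through 3-4 screens. Names and the validity
// rule are assumptions.
interface SpectrumQuery {
  industries: string[]; // one or two industries
  regions: string[];    // one or two regions
}

// At most one side of the comparison may vary: two industries in one
// region, one industry in two regions, or a single industry and region.
function isValidQuery(q: SpectrumQuery): boolean {
  return (
    q.industries.length >= 1 &&
    q.regions.length >= 1 &&
    q.industries.length * q.regions.length <= 2
  );
}

// Example (values are hypothetical): one industry across two regions.
const query: SpectrumQuery = {
  industries: ["Broadcast television"],
  regions: ["San Francisco Bay Area", "Los Angeles"],
};
console.log(isValidQuery(query)); // true
```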

Hypothesis

We believe that users who first use the introductory interface and then move on to the advanced interface will be able to accomplish tasks more quickly, and will be more satisfied, than users who are only presented with the advanced interface.

Factors and Levels

The independent variables are the alternative interfaces, as described above. We will also control the tasks that users are instructed to do: all users will do the same four tasks, which will be carefully designed to be of approximately equal difficulty. We will likewise control the computer literacy and Web familiarity of our testers. All of our testers should use the Web at least two hours per week, whether from work or home. We also want to standardize the subjects' familiarity with the subject matter: it is acceptable for them to have some acquaintance with communications spectrum usage and management issues, but we do not want any experts in the field.

The dependent variables that we will track will be:

  • the time to completion and errors for each task
  • the total time to completion for all tasks
  • user satisfaction.

By tracking the time to completion on each task, we should gain some understanding of the learning time required for the advanced and introductory interfaces. By measuring the total time to completion, we can determine whether the introductory-plus-advanced combination results in net gains in efficiency for users of the system. To gauge satisfaction we will ask users to give verbal feedback about their experience after they have completed the tasks (more on this below). For quantitative analysis of satisfaction, we will ask them to fill out a post-test survey with questions about their impressions of how easy, efficient, and enjoyable the system was to use.

To improve the quality of the time data, we will ask users to try to complete the tasks in sequence, on their own, without pausing to give feedback as they go. If they get stuck and cannot proceed without our help, we will mark it as an error and keep timing the task until they complete it. For the group that tests both interfaces, we will ask for verbal feedback regarding the introductory interface before they go on to use the advanced interface. We will ask both groups for general verbal comments about the advanced interface after they complete the tasks and before they fill out the post-test survey.
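
Once collected, the timing data could be compared across the two groups along the following lines. The numbers below are placeholders, not collected data.

```ts
// Illustrative analysis sketch: compare total completion times between
// the two groups using group means and a Welch t statistic. All numbers
// are placeholders, not collected data.
function mean(xs: number[]): number {
  return xs.reduce((a, b) => a + b, 0) / xs.length;
}

function sampleVariance(xs: number[]): number {
  const m = mean(xs);
  return xs.reduce((acc, x) => acc + (x - m) ** 2, 0) / (xs.length - 1);
}

// Welch's t statistic for two independent samples.
function welchT(a: number[], b: number[]): number {
  const se = Math.sqrt(sampleVariance(a) / a.length + sampleVariance(b) / b.length);
  return (mean(a) - mean(b)) / se;
}

// Placeholder total completion times in seconds, one entry per tester.
const advancedOnly = [410, 385, 442, 398];
const introThenAdvanced = [355, 340, 372, 360];
console.log(`advanced-only mean: ${mean(advancedOnly).toFixed(1)} s`);
console.log(`intro-then-advanced mean: ${mean(introThenAdvanced).toFixed(1)} s`);
console.log(`Welch t: ${welchT(advancedOnly, introThenAdvanced).toFixed(2)}`);
```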

Blocking and Repetitions

The study will be a between-groups study with two groups. One group of testers will use the advanced interface to accomplish all of the tasks. The other group will use the introductory interface for the first two tasks and the advanced interface for the other two tasks. The interfaces will be presented as separate versions of the site, not as alternative entry points to the data on the same site. Because users tend to climb the learning curve quickly on this site, we will have each user run through the four tasks only once. We will need a large number of testers, but only for a few minutes each.

 
7. Appendices
Appendix A: Informed Consent Form
Appendix B: Usability Test Script
Appendix C: Demographic and Post Test Questionnaire
Appendix D: Usability Test Logger (borrowed from Rashmi Sinha's IS271 Materials)
Appendix E: Usability Test Logs (link coming soon)