Usability Testing - Discussion

We found the usability testing very helpful. We were pleased that there did not seem to be any major problems with our interface, and that the users seemed to like it. Beyond that, it was useful to get feedback on various specific problems. We plan to make the following changes in our next iteration in response to the results of our usability testing:

  • Modify the "login" and "cancel" buttons on the login screen. Currently, neither has the focus, so a user must either tab or move the mouse to the button they wish to click. Also, the cancel button appears first, although the login button is likely to be selected more frequently. We will change the order of the buttons and give the "login" button focus so that users can hit the return key after entering their password and have the information submitted to the system. The buttons on other screens will also be re-ordered to provide a consistent location for the cancel function.
  • Modify the comment text box. The text box in which users type comments does not wrap the text. We will change the parameters of the text field so that the entered text wraps to a new line when the edge of the field is reached.
  • Modify each of the rating scales to include an explicit "no rating" option, which will be the default selected value. This will have three benefits: it will make it clearer that each rating is optional, it will provide an easy way to "de-select" a single rating (rather than having to reset the entire form), and it will be more consistent with the standard usage of radio buttons, in which exactly one option is always selected rather than none (a sketch of such a rating scale appears after this list).
  • Make the noun-verb form of rating a course more conspicuous. Only one of our testers used the noun-verb method of rating a course. We believe this may be because the link to rate a course on individual course pages is not very noticeable, so we will make that link more prominent.
  • Add email addresses to comments. Many of the testers indicated an interest in communicating with other users outside of the system. We will add a field to the database to store each user's email address; if an address is present, it will be displayed along with the user's comments (a sketch of this change appears after this list).
  • Add a course workload rating. We will add a rating for course workload, probably expressed in hours of work required per week, in addition to the existing course difficulty rating. Although the two will probably be correlated, the extra rating will give users another dimension of information about the courses they are interested in.
  • Clarify the rating scale for course difficulty. Although only one tester had a problem with this rating, it will be easy to add a short description of the rating scale next to the average course difficulty rating.
  • Make the entry page more informative. The entry page needs to contain more information about the system, such as who is allowed to use it, why the user must log in, and how the information will be used. We may even put the login form on the entry page.
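
To make the first change above concrete, here is a minimal sketch of how the re-ordered login form might be generated, assuming for illustration a Python CGI-style script. The form, field, and file names ("login_form", "username", "password", login.cgi) are hypothetical, not the prototype's actual code.

    def login_form_html():
        """Render the login form with "Login" before "Cancel"."""
        return """
    <form name="login_form" method="post" action="login.cgi">
      Username: <input type="text" name="username"><br>
      Password: <input type="password" name="password"><br>
      <!-- "Login" now comes first, so it is the form's default submit
           button and is triggered when the user presses return. -->
      <input type="submit" name="login" value="Login">
      <input type="submit" name="cancel" value="Cancel">
    </form>
    <script>document.login_form.login.focus();</script>
    """

    if __name__ == "__main__":
        print("Content-type: text/html\n")
        print(login_form_html())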
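
Similarly, each rating scale could be rendered with an explicit "no rating" option checked by default, along these lines. The field name "difficulty" and the 1-5 scale are assumptions for illustration.

    def rating_radio_html(name, low=1, high=5):
        # "No rating" is checked by default, so the group always has
        # exactly one selected option and a rating is clearly optional;
        # re-selecting it also "de-selects" a previously chosen rating.
        buttons = ['<input type="radio" name="%s" value="" checked> no rating'
                   % name]
        for value in range(low, high + 1):
            buttons.append('<input type="radio" name="%s" value="%d"> %d'
                           % (name, value, value))
        return "\n".join(buttons)

    print(rating_radio_html("difficulty"))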
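
For the email-address change, here is a minimal sketch of the schema update and display logic, using sqlite3 purely for illustration; the database file and the users/comments table layout are assumptions, not the prototype's actual schema.

    import sqlite3

    conn = sqlite3.connect("courses.db")  # hypothetical database file

    # One-time schema change: an optional email column on the users table.
    conn.execute("ALTER TABLE users ADD COLUMN email TEXT")

    # When rendering comments, show the address only when one is present.
    rows = conn.execute(
        "SELECT u.name, u.email, c.body "
        "FROM comments c JOIN users u ON u.id = c.user_id")
    for name, email, body in rows:
        byline = "%s <%s>" % (name, email) if email else name
        print("%s: %s" % (byline, body))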

The following items may not be feasible to change for the next prototype, although we hope to include some of them. All would be worth considering for implementation in the future:

  • Add edit/delete comment functionality. Feedback from our testers suggested that a function allowing users to edit or delete their own comments would be useful. An edited comment should indicate that it has been changed, perhaps by showing the date of modification.
  • Complete implementation of the "people who liked this course also recommend..." function. The current prototype has only a static set of links representing this function, although we have done preliminary work on the actual implementation.
  • Implement the registration function, including user preferences such as whether their email address should be shown. In the current prototype, registration is a completely off-line system administration function. We envision that the actual registration may require off-line verification (for instance, to confirm that the person requesting an account is a SIMS student); however, the site could have a form to submit a request for an account.
  • Improve the search mechanism. We would like to make sure that inexact searches return accurate results, especially for common items such as the course number, which may be entered in a variety of forms. Users should not conclude that no such course exists simply because their query was formatted slightly differently from the database entry (a sketch of this normalization appears after this list).
  • Improve categories. We have organized the SIMS courses into categories based on the categories used in the career section of the SIMS website. These categories should be improved so that users can find the courses they are looking for. Ideally, this should be consistent across all SIMS-related web sites.
  • Complete implementation of the multi-year features. The current prototype contains data only for the most recent year in which each course was taught, and does not include the envisioned commands for paging through multiple years, although the database is structured to support them (a sketch of such a schema appears after this list).
  • Expand the database to include all courses. The current prototype does not include the seminar courses (290 series).
  • Add a graphical display of the rating distribution. This is not the most important change on our list, but if we have time, we would like to try adding one. Like the extra workload rating, it would give users another piece of information they might find useful.
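
As a sketch of the search improvement above, both the query and the stored course numbers can be reduced to a canonical form before comparison. The specific input formats handled here ("IS 202", "is-202", a bare "202") are assumptions about how users might type course numbers, not an exhaustive treatment.

    import re

    def normalize_course_number(text):
        # Strip whitespace and punctuation and uppercase the department
        # code, so "is 202", "IS-202", and "Is202" all become "IS202".
        return re.sub(r"[\s.\-]", "", text).upper()

    def matches(query, course_number):
        q = normalize_course_number(query)
        c = normalize_course_number(course_number)
        # Also accept a bare course number ("202") for the full form.
        return q == c or (q.isdigit() and c.endswith(q))

    assert matches("is 202", "IS 202")
    assert matches("202", "IS-202")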
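
For the multi-year item, one plausible structure keeps one row per course per year taught, so the interface can page through offerings year by year. The table and column names below are illustrative, not the prototype's actual schema.

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.executescript("""
    CREATE TABLE courses (
        course_id  INTEGER PRIMARY KEY,
        number     TEXT,    -- e.g. 'IS 202'
        title      TEXT
    );
    -- One row per course per year, so ratings and comments can be
    -- attached to a specific year's offering.
    CREATE TABLE offerings (
        course_id  INTEGER REFERENCES courses(course_id),
        year       INTEGER,
        instructor TEXT,
        PRIMARY KEY (course_id, year)
    );
    """)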

Based on the results of the pilot test that we performed, we would make the following changes to our testing methods in a more comprehensive test:

  • Create forms for collecting standardized testing data. For example, create a form with spaces for recording all of the test times.
  • Collect standardized data on the following observations:
    • whether the user prefers browsing or searching
    • whether the user prefers the noun-verb or verb-noun method of rating a course
    • whether the user prefers the direct submit or preview option when adding a comment
  • Ask users more specific questions in the written questionnaire. For example, ask users to check off, from a list of possible functions, those they would like added to the system.
  • Ask users to compare the ratings for two courses to see if they understand the scale used.

Overall, we found the usability testing pilot study quite useful. Despite its limited sample size, the study provided valuable data suggesting quite a few changes to the interface.


Last Modified: Apr-26-2001

Copyright 2001: Linda Duffy, Jean-Anne Fitzpatrick, Sonia Klemperer-Johnson, James Reffell