

Usability Testing - Log of Critical Incidents


Tester 1
Tester 2
Tester 3
Tester 4
General comments about all testers


Tester 1

Task 1

Actions:

    1. Clicks on "login" link on left of page.
    2. Fills in fields, uses mouse to click "submit" button.
    3. Types is250 in search field; hits return; search unsuccessful.
    4. Types IS250 in search field; hits return; search unsuccessful.
    5. Gives up searching; chooses "browse by course number" instead.
    6. Finds IS 250 and clicks on link.
    7. Scans ratings/comments.

Comments:

    1. Wanted to hit return when submitting username and password, but submit button did not have focus.
    2. Liked that "submit" button for searching did have focus.
    3. Didn't understand difficulty ratings. Is a high number more or less difficult?
    4. Liked the details of the comments, how they expressed students' feelings.

Task 2

Actions:

  1. Enters "user interface" into search field and submits.
  2. Gets three results; clicks on IS 213.
  3. Sees no ratings/comments posted so goes to IS 213 homepage on SIMS website.
  4. After looking at current year's syllabus, hits back button to return to CoCoFo.
  5. Sees IS 245 is recommended by students who recommended IS 213 so clicks there to see comments.
  6. Since IS 245 is not a similar topic, user returns to search results page by hitting back button.
  7. Clicks on each of the two other courses returned in search results, looking at ratings and course homepages.

Comments:

Task 3

Actions:

  1. Finds and clicks on "rate a course…" button.
  2. Chooses IS 206 from the pull down menu.
  3. Inputs ratings and comments. One of the radio buttons does not register the rating.
  4. Clicks "preview".
  5. Decides to edit comment. Clicks "edit".
  6. Edits comment and clicks "submit".

Comments:

  1. Wanted to be able to type in the name of a course when choosing a course to rate, rather than being restricted to pull down menu.
  2. Liked the large text box for adding a comment.
  3. Noticed that text doesn't wrap when entering a comment.
  4. Noticed that, nevertheless, comment text is wrapped when comment is submitted.
  5. Wondered if she needed to log out of system.
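The wrap behavior noted in items 3 and 4 above, where text does not wrap while being typed but appears wrapped once the comment is submitted, is consistent with the server re-flowing comment text before display. A minimal sketch of such server-side re-wrapping, assuming a fixed display width; the function name and width are illustrative, not CoCoFo's actual code:

```python
import textwrap

def normalize_comment(raw: str, width: int = 72) -> str:
    """Re-flow a submitted comment to a fixed display width.

    Hypothetical sketch: paragraphs (separated by blank lines) are
    preserved, but line breaks and run-on lines within a paragraph
    are collapsed and re-wrapped at `width` characters.
    """
    paragraphs = raw.split("\n\n")
    wrapped = [
        textwrap.fill(" ".join(p.split()), width=width)
        for p in paragraphs
    ]
    return "\n\n".join(wrapped)
```

With this kind of re-flow in place, a comment typed as one long unwrapped line in the text box would still display as neatly wrapped lines, which matches what the tester observed.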

Post test interview

General comments:

  1. Likes color of the interface. It's comfortable and doesn't shout out.
  2. Would like a way to email students who have posted comments.
  3. Would like to see graphical distribution of ratings, not just average of ratings.
  4. Would not generally respond to individual comments unless had extremely strong feelings about the comment.
  5. Would like a one-line explanation of the difficulty ratings; found the other ratings self-explanatory.
  6. Did not understand why the search failed, or whether the search was case-sensitive.
  7. Would find the system useful if implemented.
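The failed searches in Task 1 (both "is250" and "IS250" missed the course listed as "IS 250") suggest the mismatch was internal whitespace at least as much as case. One fix is to normalize both the query and the stored course codes before matching. A minimal sketch; the function names and course list are illustrative, not CoCoFo's actual code:

```python
import re

def normalize_query(q: str) -> str:
    """Strip all whitespace and upcase: 'is250', 'IS250', 'IS 250' -> 'IS250'."""
    return re.sub(r"\s+", "", q).upper()

def find_courses(query, courses):
    """Return courses whose normalized code contains the normalized query."""
    nq = normalize_query(query)
    return [c for c in courses if nq in normalize_query(c)]
```

Under this scheme all three forms the tester might have typed resolve to the same key, so step 3 of Task 1 would have succeeded on the first try.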

General observations of tester:

  1. User had problems searching.
  2. User didn't notice a missed click on instructor ratings.
  3. User used a lot of the links to course home pages.
  4. User did not seem to notice "add a comment" link on course description pages.

 

Tester 2

Task 1

Actions:

  1. Looks at search function but does not use it.
  2. Clicks on "browse by course number" link and finds IS 250; clicks on IS 250.
  3. Logs in to read comments.
  4. Reads comments for IS 250.

Comments:

  1. Assumed logging in would show comments.
  2. Noticed and approved of recommended course feature.

Task 2

Actions:

  1. Chooses not to search for course.
  2. Clicks on "browse by subject" link.
  3. Doesn't see "user interface" subject so scans all courses and finds IS 213 listed.
  4. Clicks on IS 213 to read comments.
  5. Decides to conduct a search; enters "user interface design" and submits search.
  6. Clicks on CS UI course.

Comments:

  1. Only found course by scanning, not by searching.

Task 3

Actions:

  1. Finds and clicks on "rate a course…" link.
  2. Selects IS 206 from drop down menu.
  3. Scans the ratings form and fills in the fields.
  4. Previews rating and submits.

Comments:

  1. Noticed that the text box doesn't wrap and wondered if carriage returns were needed.

Post test interview

General comments:

  1. Liked the CoCoFo system.
  2. Would like to edit, append, or delete own comments. Users might regret what they have written on the spur of the moment.
  3. Wondered about registration: whether users would be assigned user names or could choose their own.
  4. Wondered how site would be advertised/marketed.
  5. Discussed the pros and cons of being able to post anonymously. Anonymity might engender stupid comments. Non-anonymity might limit what is said, but in a good way, limiting to constructive criticism.
  6. Would like to have the ability to respond to individual comments so that conversations could develop. If conversations became irrelevant, they could be taken outside of system.
  7. There would be times when it would be suitable to reply to a comment, and other times when it would be more suitable to reply in a personal email.
  8. Vaguely understood the ratings system. Would rely more on comments than on the ratings, and therefore wasn't too concerned with the ratings system. Another reason why non-anonymity would be useful - so that you would know what type of personality was posting what.
  9. Thought the ratings system might have gone up to 10, not just 5.

General observations of tester:

  1. User first browsed for a course and then searched for a wider selection.

 

Tester 3

Task 1

Actions:

  1. Attempts to log in but is unsuccessful because of invalid username (our fault).
  2. When valid username used, logs in and reads information on entry page.
  3. Clicks on "browse by course number".
  4. Clicks on IS 250 and reads comments.

Comments:

  1. Thinks system is very nice.

Task 2

Actions:

  1. Does not search.
  2. Clicks on "browse by subject" link.
  3. Scrolls to bottom of page and back up again before seeing IS 213 link.
  4. Clicks on link.
  5. Reads comments.

Comments:

Task 3

Actions:

  1. Clicks on "browse by course number" and finds IS 206; clicks on link.
  2. Clicks on "add a comment to this course" link on the IS 206 page.
  3. Fills in ratings and comment.
  4. Clicks on "submit" without previewing comment first.
  5. Scrolls down to see all of the IS 206 comments.

Comments:

Post test interview

General comments:

  1. Was uncertain about when login was required.
  2. There was no indication that login was necessary on the entry page.
  3. Didn't find course subject categories very useful. Didn't know where IS 213 would fall, so scanned courses rather than categories.
  4. Thought there should have been a User Interface category.
  5. Would like to see more information about each user who had posted a comment, such as their background, education, and degree status.
  6. Would not be interested in responding directly to a comment.
  7. Would like to have seen email address of commenters and possibly whether they would be interested in accepting emails from other students.
  8. Would like to see a rating of the course load, e.g. how much time per week was spent on the course.
  9. Found the rating system intuitive. Liked the 1-5 scale; felt a 1-10 scale would be too large.
  10. Liked aesthetic of CoCoFo; nice and minimal, not much clutter.

 

Tester 4

Task 1

Actions:

  1. Clicks on "login" link on left, inputs information and submits.
  2. Clicks on "browse by course number".
  3. Finds IS 250 and clicks on it.
  4. Notices links to course home page and professor home page.
  5. Scans comments and ratings.

Comments:

  1. Thought login was a way to browse (because of location under "browse by") until gap between "browse by" links and "login" link was noticed.
  2. Wanted to be able to hit "enter" to submit login information.
  3. Likes the protection of the system that needs a username and password to enter, but would like to know who the user base is, i.e. who will see comments and ratings.
  4. Found the "browse by course number" listings well organized.
  5. Understood ratings system.

Task 2

Actions:

  1. Clicks on "browse by subject" rather than search.
  2. Finds Human Computer Interaction courses after scrolling to bottom of page and back up.
  3. Looks further for individual User Interface group of courses, but returns to HCI courses when none is found.
  4. Does browser search for "interface" and finds a CS UI course.
  5. Clicks on IS 213 and reads comments.

Comments:

  1. Wondered why there wasn't a specific User Interface subject heading.

Task 3

Actions:

  1. Clicks "rate a course…" link.
  2. Clicks on drop down menu and chooses IS 206.
  3. Fills in ratings and comment fields.
  4. Notices that the text does not wrap and adds carriage returns to force wrapping.
  5. Clicks "submit" without first previewing.
  6. Scans through comments to find just-submitted one; goes through to bottom and back up to top before seeing it.

Comments:

  1. Wondered if non-wrapping in text box was a problem.
  2. Found the drop down menu well organized.

Post test interview

General comments:

  1. Finds that the system offers many useful features, meets a real need, and is very "robust".
  2. Finds it somewhat controversial, which is a good thing; anything that requires a login should be somewhat controversial. But because logging in demands a compelling reason, users should know what material lies beyond the login.
  3. Would like to know more about the system - e.g. who has access, for whom is it meant - before seeing it.
  4. Would like to see a "rate the rater" or "rate the comment" function.
  5. Likes that it is not anonymous; would have less trust in comments by an anonymous user.
  6. If anonymous, or if threaded, would be afraid of flaming wars.
  7. Would possibly like email links to commenters, but would make it an "opt in" system.
  8. Would like to edit or delete own comments, but if edited or deleted, would like to see a flag indicating such changes.
  9. Thought workload rating would be useful.

General observations of the tester:

  1. User spent a lot more time than the other testers looking around and browsing before focusing on given tasks.

 

General observations of all the testers

  1. No one tried a different search term unless the search returned zero items. This might be because the users were satisfied with the results, or because all of the users already have a general knowledge of the courses that could be returned: they know that a search for "user interface" should bring up IS 213, and once it does, they feel no need to search any further.
  2. Three of the four users used the verb-noun interaction style for adding a comment. Only one of the users even seemed to notice, let alone use, the noun-verb interaction style.
  3. Users had a strong preference for browsing rather than searching, even when given tasks that directed them to search rather than browse.

 

 


Last Modified: Apr-23-2001

Copyright 2001: Linda Duffy, Jean-Anne Fitzpatrick, Sonia Klemperer-Johnson, James Reffell