Results

Overall, we received positive feedback from our participants. They liked the overall look and feel of the tool, and all of them found the prototype to be an improvement over the previous metadata tagging tool. Two of the three participants said they would seriously consider using a tool implemented from this prototype as soon as it becomes available. They also provided many suggestions for improving the prototype. The following tables show the detailed results from the pilot usability study.

Table 1: time taken by our participants to complete each task.

Task | Participant 1 | Participant 2 | Participant 3
Scenario A
Task 1 | 90 sec | 80 sec | 20 sec
Task 2 | 70 sec | 70 sec | 10 sec
Scenario B
Task 1 | 20 sec | 70 sec | 5 sec
Task 2 | 25 sec | 165 sec | 10 sec
Scenario C
Task 1 | 130 sec | 40 sec | 5 sec
Task 2 | 90 sec | 155 sec | 15 sec

Table 2: the number of errors and the list of errors for each scenario.

Scenario A
  • Participant 1: 0 errors
  • Participant 2: 2 errors (e1: did not notice the pull-down for activity type, so could not find the Quiz activity type; e2: did not realize he needed to click the Go button to view the quiz template)
  • Participant 3: 1 error (e1: missed the Go button)
  • Common errors: two of the participants missed clicking the Go button
Scenario B
  • Participant 1: 0 errors
  • Participant 2: 0 errors
  • Participant 3: 1 error (e1: missed the Update button)
  • Common errors: one participant missed the Update button
Scenario C
  • Participant 1: 3 errors (e1: thought the metadata "indicate what to modify" was the same as the debugging metadata; he went to the bookkeeping section to look for the debugging tag but could not find it; e2: clicked on learning goals in the bookkeeping section; e3: looked for this metadata under the testing metadata of the cognitive learning section)
  • Participant 2: 0 errors
  • Participant 3: 1 error (e1: missed the Update button again)
  • Common errors: no common error patterns


Table 3: the detailed user feedback from the usability study.

Legend: (+) marks a positive comment, (-) a negative comment, and (n) a neutral comment or suggestion; (3) and (2/3) mark overlapping comments made by all three or by two of the three participants.
Scenario A
  • (-) thinks there are too many clicks; requested that we remove the Update and Go buttons
  • (n) not sure why the bookkeeping section is the first section in the process train
  • (n) wants to be able to edit the list of bookkeeping metadata
  • (n) wants the metadata: learning goals, description, and date modified to show up in the default window because they are the most useful metadata in his opinion
  • (n) strongly believes that the metadata that can be auto-generated should be auto-generated
  • (+) thinks the interface is straightforward
  • (n) assumes that many metadata will be auto-generated
  • (n) believes the interface needs to handle situations where the instructor needs to change the activity type (e.g., from Quiz to Brainstorm) without losing any metadata
  • (n) thinks that some metadata are not useful (e.g., publisher, identifier, relation), so they should be removed from the bookkeeping section
  • (n) the metadata: coverage should be in the topics section
  • (n) the metadata: learning goals should be in the cognitive learning section
  • (n) some metadata (e.g., course name, course number, date created, semester in which the activity was used) need to be added to this section
  • (-) (3) thinks the Update and Go buttons are unnecessary and easy to miss
  • (n) (2/3) wants many metadata to be auto-generated
Scenario B
  • (+) finds the instructor notes section easy to use
  • (-) confused about the relationship between personal notes on the right and the list of instructor notes metadata on the left
  • (+) finds the instructor notes section intuitive
  • (n) suggests there needs to be a place in the instructor notes section for attaching notes for TAs and lab assistants
  • (n) believes that date information needs to be attached to each instructor note
  • (+) the prototype matches his assumption that the personal note types correspond to all the instructor notes checked for that activity
  • (-) the tool should not lose text that he has already entered (this was a bug in the demo)
  • (+) (3) finds the instructor notes section easy to use or intuitive
Scenario C
  • (-) confused about where he should go for Task 1; he thought that the phrase "indicate what to modify" meant the same thing as a debugging activity
  • (n) suggested showing the instructor notes and the cognitive learning metadata in a list format in the Preview section
  • (-) had some confusion about the difference between the Topics section and the Cognitive Learning metadata section
  • (+) once he recognized what the cognitive learning section is about, he found the tool straightforward to use.
  • (n) an activity tagged with application also needs to be linked to the activity that it is an application of
  • (-) when clicking on an arrow for a metadata, the check in the checkbox next to that metadata went away.
  • (n) we need to decide whether only one metadata item should expand at a time or whether more than one can be expanded at once
  • (-) (2/3) confused about what kind of metadata goes under the cognitive learning section
General Feedback
  • (+) likes the overall arrangement of the interface
  • (+) likes the Preview section
  • (+) likes the cognitive learning area of the metadata tagging. Thinks it will be useful.
  • (-) the cognitive learning area seems to have too many categories for the metadata
  • (-) does not like the process train because he thinks the different metadata tagging tasks do not form a process, so it was difficult for him to find what he was looking for (a mismatch with his mental model)
  • (+) likes the overall look
  • (+) likes the Preview bar
  • (n) requested a search or browsing tool for looking up metadata
  • (+) likes the UI prototype much more than the old interface
  • (n) the color scheme of the prototype doesn’t bother him
  • (n) not sure how he would feel about this tool if he needs to use it to tag 50 activities in a week
  • (n) wants access control so that TAs and lab assistants can only access the Instructor Notes part of the metadata tagging tool; only the instructor should be able to tag the activity with cognitive learning and topics metadata
  • (n) the tool needs to be able to handle cases where the user switches context in the middle of a task
  • (+) the tool is painless; it is easy to use and a big improvement over the previous tool
  • (+) likes the Preview Bar
  • (+) really likes the Cognitive Learning section
  • (+) likes the progress train
  • (n) would be nice if the tool allowed keyboard shortcuts
  • (n) would be nice to have a drop-down help feature for the progress train to assist first-time users
  • (-) a user who is pressed for time would not fill out the personal notes section, so it is good that the personal notes design does not interfere with such users
  • (+) (3) likes the overall look and feel of the tool
  • (+) (3) likes the preview bar
  • (+) (2/3) likes the progress train


Table 4: the user rating results collected from the pilot study Follow-Up Questionnaire


Questionnaire rating scale: 1 = Disagree, 2 = Somewhat Disagree, 3 = Neutral, 4 = Somewhat Agree, 5 = Agree

* means the participant answered with "I don't know"



    Question Participant 1 Participant 2 Participant 3
    1. I found the Metadata Tagging Tool intuitive and easy to use. 3 4 4.5(great in general, but I hardly ever noticed the update button)
    2. I think the Metadata Tagging Tool would be valuable for both instructors and TAs. 5 5 4.5
    3. I think that the Metadata Tagging Tool would be useful for future (new) users of UCWISE. 5 5 5
    4. I think that the Metadata Tagging Tool would be easy and intuitive for future (new) users of UCWISE. 3 4 4.5
    5. I think that the Metadata Tagging Tool is an improvement over the current form of managing and tagging metadat in UCWISE. 5 5 5
    6. I do not think I can use the Metadata Tagging Tool because there are key features missing. 3 1 1
    7. I think the Bookkeeping Metadata section is a helpful feature. 2 3 5
    8. I think the Bookkeeping Metadata section contains all the features I need to accomplish my task. * 5 4
    9. I think the Cognitive learning section is a helpful feature. 4 5 5
    10. I think the Cognitive Learning section contains all the features I would need to accomplish the task of tagging my activities. 4 4 4.5
    11. I think the Personal notes are a helpful feature. 5 5 5
    12. I found the Preview section helpful. 5 4 4
    13. I found the Navigation Bar to be intuitive. 1 5 5
    14. I found the Navigation bar a useful feature to help me navigate/move between the various pages of the Metadata tagging tool. 1 5 5
    15. I liked the method in which the metadata tagging tool helped me select and tag my metadata. 5 3 5
    16. Overall I did not find the Metadata tagging tool overwhelming or cumbersome to use. 5 4 5