SIMians
Course Comment Forum


Design Evolution

  


Initial Conception

Through our initial focus group and questionnaires, SIMS students expressed to us a number of needs that could be addressed by a web-based bulletin board or discussion forum. Course comments and ratings, announcements, and housing postings were named as possible subject areas for the general interest forum we had been considering. Our initial sketches and designs reflected our attempts to organize a general interest discussion forum in a way that would be useful for all of the expressed needs. Some aspects of the initial design, like the fixed navigation elements, have been retained throughout its evolution.


From Initial Sketches to Paper Prototype

Before creating our paper prototype, we made an essential change to our conception of the interface. We decided to focus on designing an interface for the user need that had been the most strongly conveyed to us: course ratings and comments. Redefining the scope of our project allowed us to tailor our design to address that need.

Changes: Our paper prototype retained the left-side navigation elements (which now included the ill-defined special function of a course 'shopping cart') and basic layout, added a search function at the top of the screen, and provided a few different ways of browsing for specific courses. The course comment pages included a graphical rating scheme and a "people who liked this course also liked" feature. Users could add a comment using a link on the left-side navigation bar, but only if they were logged in and already viewing a specific course.
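The "people who liked this course also liked" feature can be illustrated with a simple co-occurrence approach: rank other courses by how many of the current course's fans also liked them. This is only a sketch of the idea; the function name, course numbers, and user data below are our own illustrations, not part of the actual prototype.

```python
from collections import Counter

# Hypothetical sample data: course -> set of users who rated it highly.
liked_by = {
    "IS 202": {"alice", "bob", "carol"},
    "IS 213": {"alice", "bob"},
    "IS 250": {"carol", "dave"},
}

def also_liked(course, liked_by, top_n=2):
    """Rank other courses by overlap with this course's fans."""
    fans = liked_by[course]
    counts = Counter()
    for other, users in liked_by.items():
        if other != course:
            counts[other] = len(fans & users)
    # Keep only courses that at least one fan also liked.
    return [c for c, n in counts.most_common(top_n) if n > 0]

print(also_liked("IS 202", liked_by))  # → ['IS 213', 'IS 250']
```

Even this crude overlap count is enough to surface plausible suggestions once a handful of ratings exist for each course.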


From Paper Prototype to 1st Interactive Prototype

Testing the paper prototype gave us valuable feedback on the flow of the interface. We realized that users wanted both noun-verb and verb-noun access to rating a course. The number, placement, and terminology of the constant elements caused confusion. Users wanted an introduction to the site that would give them both context and starting points for using the site. Finally, users felt a need for explicit "exits" after task completion. We created the 1st interactive prototype in response to these observations, building it from static HTML pages linked with scripts and populated with canned search and comment results.

Changes: In our 1st Interactive Prototype, we added a start page with a brief explanation of the site and direct links to the most-used functions. We also enabled verb-noun interaction through a left-hand "add comment" button and noun-verb interaction through links on each course page. We simplified the constant elements by removing the extra search functions, the "Shopping Cart" function, some browse functions, and the "New User" function. We also added basic navigation links at the bottom of each course and course comment page.


From 1st to 2nd Interactive Prototype

The largest change from the 1st interactive prototype was behind the scenes: we converted the mostly static HTML pages to fully dynamic ones using ColdFusion and an Access database. This allowed us to test the system much more realistically, since users' comments and ratings were reflected in the system as they interacted with it.
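The point of the dynamic conversion is the round trip: a submitted rating is written to the database and is immediately visible to the next page request. A minimal sketch of that round trip, using Python and SQLite as stand-ins for the actual ColdFusion/Access stack (the table and column names here are our own invention, not the prototype's schema):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE comments (course TEXT, rating INTEGER, body TEXT)")

def add_comment(course, rating, body):
    # A newly submitted rating is stored immediately...
    conn.execute("INSERT INTO comments VALUES (?, ?, ?)", (course, rating, body))

def average_rating(course):
    # ...and is reflected in the very next query against the course page.
    (avg,) = conn.execute(
        "SELECT AVG(rating) FROM comments WHERE course = ?", (course,)
    ).fetchone()
    return avg

add_comment("IS 213", 5, "Great project course.")
add_comment("IS 213", 3, "Heavy workload.")
print(average_rating("IS 213"))  # → 4.0
```

With canned HTML pages, by contrast, a tester's new comment would simply vanish, which is why the dynamic version made usability testing so much more realistic.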

Changes: For our 2nd interactive prototype, we made a number of terminology and labeling changes based on feedback from the heuristic evaluation, such as the change from "Add a Comment" to "Rate a Course," which more clearly reflected the users' expectations of the site. Other significant changes based on the heuristic evaluation included:

  • Removing the forced preview restriction.
  • Disallowing completely blank comments.
  • Switching to a numeric (1-5) rating system.
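The last two rules above amount to simple server-side input validation: a comment is rejected if its rating falls outside 1-5 or if its text is entirely blank. A sketch of that check (the function name and error strings are illustrative, not taken from the prototype):

```python
def validate_comment(rating, body):
    """Return a list of validation errors; an empty list means the comment is accepted."""
    errors = []
    if rating not in range(1, 6):   # numeric 1-5 rating system
        errors.append("rating must be between 1 and 5")
    if not body.strip():            # completely blank comments are disallowed
        errors.append("comment text may not be blank")
    return errors

print(validate_comment(4, "Solid course."))  # → []
```

Returning all errors at once, rather than failing on the first, lets the form report every problem to the user in a single round trip.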


From 2nd to Final Interactive Prototype

The pilot usability test confirmed that the flow of our interface was essentially on target and that users liked the system and found it usable. It also revealed the places where the flow was impeded and where the interface was still somewhat confusing. During the test users also noted some possible additional features.

Changes: New features in the Final Interactive Prototype include a course workload rating and the display of each commenter's email address (if included in their registration information). Several changes were made to the site's forms, such as reordering the buttons (Login or Submit first, Cancel last), adding an explicit "no rating" selection to each rating scale, and ensuring that entered text wraps in the comment text box. In addition, we made various text clarifications and cosmetic improvements.


Future Plans

We would like to see the CoCoFo system running and available to the student body. To do so, we anticipate needing to rebuild the database and ColdFusion pages to integrate the latest interface changes more effectively. We would also have to implement a registration function, possibly in conjunction with an existing SIMS system, populate the database with all of the needed courses, and find an appropriate home for the system. We would also like to add a graphical display of course ratings and the ability to edit and delete old comments. We think the system has the potential to be a useful and popular tool for SIMS.


Thoughts on Evaluation Methods

We found all of the major evaluation methods to be essential to the successful evolution of our design.

  • Low-fi Prototype Test: The low-fi test gave us valuable insight into the flow our users expected and required as they progressed through the tasks. The paper prototype was especially useful here, as users did not get distracted by surface issues -- and when we forgot a "minor detail" that could have held up the whole test (specifically, there was no "Submit" button on the login form), we were able to add it on the fly!
  • Heuristic Evaluation: The heuristic evaluation showed us that users needed more context in order to understand and use our system. Heuristic evaluation was particularly useful for this, since an outside group without our preconceived notions of the design but with equivalent expertise was better able to see what was needed.
  • Pilot Usability Test: The pilot usability test helped us validate our design. It was also very useful in identifying numerous improvements and added features that could help us approach a mature design.

In addition to the immediate usefulness of the pilot usability test results, it will also provide a useful benchmark: if we proceed to build a course rating and comment system for SIMS based on our design, we can perform this test on a larger scale to assess the usability of the functional system. In addition, the usability test framework can be used to assess the effect of adding more features to the system. Each new feature can be tested in the context of the whole design, allowing us to manage the balance between a rich feature set and usability issues caused by feature bloat.

At this point in the design process, we feel that continued iterations of usability testing and heuristic evaluation would be an ideal combination to continue refining the design.

 


Last Modified: May-08-2001

Copyright 2001: Linda Duffy, Jean-Anne Fitzpatrick, Sonia Klemperer-Johnson, James Reffell