The current prototype of SFMOMA-DAM emphasizes getting images
into the system (the creation and management of object and
image records in the DAM database) and search (discovering
what images are cataloged in the system). This is the core
functionality upon which the rest of the system will be built.
We conducted a pilot usability study in an attempt to assess
system flow and functionality in relation to new task scenarios.
The current prototype is the third iteration of the interactive
prototype. It is inspired by design recommendations resulting
from a formal heuristic evaluation conducted by peer review,
and a reassessment of user needs.
We had hoped that the third prototype would make the core
functionality of Quick Search and adding object and image
records dynamic, with a set of ASP scripts connecting to the
DAM database. These scripts were not ready for the usability
testing, however, so the tested version remained hard-coded.
Unfortunately, given the push to create dynamic pages, we did
not leave ourselves enough time to fully hard-code all of the
pages either. We believe that the in-between state of the interface
contributed significantly to the problems test participants had
completing the scenarios. Ideally, usability testing would
have been postponed until after the interactivity was in place.
The purpose of this evaluation is to identify usability problems
in the SFMOMA-DAM third interactive prototype. This report
includes the following:
Method
Participants
The pilot usability study included four participants from
SFMOMA. Volunteers were solicited from a group of Museum staff
involved in the system development. All four participants
reside in the Collections Department. The following table
summarizes participant demographics:
Participant | Age | Gender | Highest Education Achieved | Years Using Computers | Comfortable with Computers | Experience with Databases | Experience Cataloging | Experience Cataloging Digital Assets | Experience with Web Interfaces
P1 | 53 | F | graduate school | 4 | yes | 2 years | 1 year | 1 year | no
P2 | 26 | M | BA | 14 | yes | 8 years | no | no | 5 years
P3 | 42 | F | MA | 15 | yes | 11 years | not specified | yes | 5 years
P4 | 46 | M | MFA | 15 | yes | 12 years | 10 years | 3 years | 7 years
Apparatus
The test required web access.
Tasks
The task scenarios have changed in recent iterations. Previous
tasks emphasized searching for objects, requesting images,
and assigning metadata to images. A new priority emerged
in the process of developing the system database, and this change
in priority affects our task analysis. The new emphasis is on
differentiating the creation of object and image records, and on
searching for images associated with objects. The tasks given to
test participants included the following:
1a. Search for all objects with "blue" in the title.
1b. Locate all digital image versions of the object titled
"football blue".
1c. How many objects are related to the "Drawer 3 of Blue
Chest of Drawers" object? Which one is the root object?
2a. Add a new object record for a new acquisition.
2b. Catalog an image file for the new acquisition.
Procedure
Participants were scheduled for user observations on site
at SFMOMA. Four participants were tested in three sessions.
Participant 2 and Participant 3 worked collaboratively on
tasks and were observed simultaneously. Testing took place
in participants' workspace at the Museum. One group member
served as facilitator and two served as note-takers and observers.
Roles remained consistent throughout testing.
While all test materials were prepared and organized at the
start of each test, the on-site location, use of participants'
workspaces, and consecutive scheduling limited the group's
preparation of the testing space as recommended by Nielsen.
The facilitator greeted each participant and oriented them to
the testing process. Each participant consented to the process
by signing a consent form. Participants were then asked to complete
a brief form describing demographic information and related
work experience.
Test Measures
- Navigation: measure system flow in support of tasks.
- Clarity of terminology: assist the user with system visibility
and match the system to its organizational context.
- Ease of adding new records: assess the new design.
- Search: assess the new design.
- Learnability: assess ease of use.
- Overall satisfaction: assess the total user experience of
the new system.
Results
The pilot usability study yielded significant feedback on
the third iteration of the interactive prototype, as well
as on the process of the usability testing.
Task completion: All participants completed all tasks.
Time to complete tasks:

Participant | Task 1a | Task 1b | Task 1c | Task 2a | Task 2b
1 | 5 minutes | 5 minutes | 6 minutes | 6 minutes | 5 minutes
2 & 3 | 1 minute | 4 minutes | 5 minutes | 8 minutes | 9 minutes
4 | 2 minutes | 1 minute | 5 minutes | 8 minutes | 5 minutes
Errors: A total of two errors occurred during testing. Both
were Internet Explorer errors resulting from missing data in
input fields.
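Both errors stemmed from missing data in required input fields. A minimal sketch of the kind of server-side required-field check that could prevent such errors (the production system uses ASP scripts; the field names here are hypothetical, and Python is used purely for illustration):

```python
# Hypothetical required-field check for an "add object record" form.
# The field names are illustrative; the real DAM forms may differ.
REQUIRED_FIELDS = ["accession_number", "title", "artist"]

def validate_record(form_data):
    """Return a list of missing required fields (empty list = valid)."""
    return [f for f in REQUIRED_FIELDS if not form_data.get(f, "").strip()]

# 'title' is blank and 'artist' is absent, so both are reported missing.
missing = validate_record({"accession_number": "99.123", "title": ""})
```

Rejecting the submission with a message listing `missing` would replace the raw browser error with actionable feedback.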
Reported Satisfaction: Participants rated the system satisfactory
on each test measure.
Reported Satisfaction (Scale: 1 = Very Satisfied, 5 = Very Dissatisfied)

Participant | Navigation | Clarity of Terminology | Ease of adding new object record | Ease of adding new image record | Search | Learnability | Overall Satisfaction
P1 | 2 | 2 | 2 | 1 | 1 | 2 | 2
P2 | 1 | 3 | 2 | 2 | 1 | 1 | 1.5
P3 | 3 | 3 | 3 | 3 | 3 | 3 | 3
P4 | 3 | 3 | 2 | 2 | 3 | 3 | 3
AVG | 2.25 | 2.75 | 2.25 | 2 | 2 | 2.25 | 2.4
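As a quick consistency check, the AVG row can be recomputed from the per-participant ratings (a minimal Python sketch of the arithmetic; the overall average of 2.375 is reported rounded as 2.4):

```python
# Per-participant ratings, in column order: Navigation, Clarity,
# Add object, Add image, Search, Learnability, Overall Satisfaction
# (1 = Very Satisfied, 5 = Very Dissatisfied).
ratings = {
    "P1": [2, 2, 2, 1, 1, 2, 2],
    "P2": [1, 3, 2, 2, 1, 1, 1.5],
    "P3": [3, 3, 3, 3, 3, 3, 3],
    "P4": [3, 3, 2, 2, 3, 3, 3],
}

# Transpose rows into columns and average each measure.
averages = [sum(col) / len(col) for col in zip(*ratings.values())]
print(averages)  # [2.25, 2.75, 2.25, 2.0, 2.0, 2.25, 2.375]
```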
Written Feedback:

P1
- Features easiest to use: initial search; viewing images; retrieving metadata; entering data.
- Features most difficult to use: (none given)
- Overall comments: "An introduction to database before using would be helpful and more typical to its use, though you probably want a cold approach."

P2
- Features easiest to use: "Search through viewing images--I found all of the components related to existing objects"; images very easy to use and intuitive.
- Features most difficult to use: "Creating new information was more difficult, but part of that difficulty is not about DAM, but about learning a new system."
- Overall comments: clear and thorough; navigating is intuitive.

P3
- No written feedback.

P4
- Features most difficult to use: setting up relationships of new images.
- Overall comments: "Excellent progress to prototype 3. Need to see version populated with data."
Discussion
The user observations inspired significant changes and rethinking
in the design of the user interface.
Navigation: Users followed the system flow as designed
to complete the assigned tasks. Home, Search, and Search Results
appear easy to navigate. Problems arose, however, due to the lack
of system visibility on some pages. Pages that caused confusion
and proved challenging included view image versions and add
object and image records. Users resorted to the browser's BACK
button to navigate between these screens. An internal "back"
button that returns the user to the previous page could provide
assistance within the interface. In addition, we would like to
re-implement breadcrumbs on main navigation pages to assist with
system visibility. Additional documentation, including descriptive
titles and instructions, will also improve system visibility and
assist users with navigation.
Clarity of terminology: We continue to struggle with the
terminology in use at the Museum. Problems arise in describing
art objects as objects in an image management system and in
differentiating records of images of art objects. The Museum has
a history of using terms such as "item" and "main" to represent
objects in a catalog. In addition, while there seems to be a
positive response to the delineation of relationships between
objects, and between objects and images, there appears to be
confusion over the descriptive terms. For instance, the third
prototype uses "related objects" to identify all objects within
a family of art objects. Users identified a mismatch with their
own experience, which favors "family" or whole-and-parts
terminology. Other troublesome terminology includes the subtle
distinctions between root/parent/child for objects and
view/versions for images.
Ease of adding new records: These screens are meant
to be used only by the few users who will have permission
to enter records into the database, which suggests that learning
the terminology may not be a problem as long as it is logical
and consistently applied. Nevertheless, better matching the
users' existing vocabulary would be a good idea, if such
a vocabulary existed. We need to improve feedback so that
users always know where they are in the add record/add image
process. This might be cleared up with additional informational
text on the screens and with more consistent labeling. There
also seemed to be a general resistance to the idea of dealing
with objects first, when the real goal is to catalog (in this
case) images.
Search: The newest iteration of Quick Search appeared
unnecessarily complicated to users. Our recommendation is
to return to a previous iteration and set a default criterion
consistent with other Museum search interfaces, which use
"starts with".
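The difference between the two criteria can be sketched with SQL LIKE patterns (a minimal illustration; the table and column names are hypothetical, and an in-memory SQLite database stands in for the DAM database):

```python
import sqlite3

# Hypothetical stand-in for the DAM objects table; the real schema may differ.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE objects (title TEXT)")
db.executemany("INSERT INTO objects VALUES (?)",
               [("football blue",), ("Blue Chest of Drawers",), ("Red Study",)])

# "Starts with" (the recommended default) vs. "contains":
starts_with = db.execute(
    "SELECT title FROM objects WHERE title LIKE ? || '%'", ("blue",)).fetchall()
contains = db.execute(
    "SELECT title FROM objects WHERE title LIKE '%' || ? || '%'", ("blue",)).fetchall()

print(starts_with)  # [('Blue Chest of Drawers',)]
print(contains)     # [('football blue',), ('Blue Chest of Drawers',)]
```

Note that SQLite's LIKE is case-insensitive for ASCII characters, which matches the forgiving behavior users would likely expect from a museum search box.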
Additional concerns:
- Redesign pages where content "above the fold" does not
match user expectations.
- Confusion about the "object record" should diminish when
the missing "three images" are more readily visible.
- All pop-up help windows should be reviewed and possibly
redesigned to highlight key points first, and then show
more complex details.
- Provide additional labeling and visual feedback to assist
system visibility.
Formal Experiment Design

The user interface for the SFMOMA Digital Asset Management
System has evolved to support two broad classes of users: 1)
those who want to find, view, and use digital images, and 2)
administrators and digital imaging specialists who want to catalog
images and manage the digital asset collection. Accordingly,
formal usability testing would involve designing two experiments.
Here we describe only one experiment, which targets administrators
and digital imaging specialists. The goals are 1) to measure
how long it takes to enter new record sets (one object plus one
set of images) into the system; 2) to count how many data entry
errors are made entering the record sets; and 3) to determine
whether the rate of record creation improves as more records are
entered. The interface will vary on two dimensions: 1) the order
of record creation and 2) the display of parent/child relationships.
The study uses a within-groups design (all participants evaluate
the same set of interfaces).
Participants

The study design requires a minimum of 18 participants. Participants
should be evenly split between imaging center administrators
(those responsible for coordinating image creation and cataloging)
and digital imaging specialists (those responsible for actually
creating the images). Participants should also have moderate
to advanced computer skills. Given these criteria, it is likely
that participants would have to come from multiple institutions.
Tasks

- Add 15 new record sets, each consisting of one object and one
set of images.
- Ten of the new records are for simple objects (no children).
- Three are child records.
- Two are grandchild records.
- The order of objects for which record sets must be created
is the same for each participant.
- The amount of information entered for each record set
is approximately the same for all objects.
Hypotheses

- Allowing users to choose the order of record creation will
result in faster record creation.
- Requiring users to add object records first, then image
records, will result in fewer data entry errors.
- Hiding parent/child relationships will result in faster
record creation.
- Always displaying parent/child relationships will result
in fewer data entry errors.
- The time it takes to add a single record set (object and
images) will be slowest for records 1-5,
will decrease dramatically for records 5-10, and will level
off between records 10-15.
Response Variables (dependent variables)

- Time: time to add a new record set (object and images) *
- Errors: number of data entry errors made while entering
the new records **

* Record set = the information that needs to be entered into
the database to adequately describe one object plus one set
of images related to that object.
** A data entry error is defined as 1) entering incorrect
or partial data into a field, 2) attempting to proceed
without entering any data into a required field, or 3)
not adding both components of a record set, that is,
both object and image records for a given object.
Factors / Levels (independent variables)

- Order of record creation
  - Object records first
  - Image records first
  - Either object or image: user choice
- Display of parent/child relationships
  - Always displayed
  - Hidden: user chooses to display
Blocking and Repetitions

Three trials per block, 18 total participants. Record set
order (1-15) is the same for each block.

Displayed | Hidden
Obj       | Img
Img       | Ch
Ch        | Obj

Displayed / Hidden: whether parent/child relationship information
is always displayed or hidden until the user requests it.
Obj: enter object records first.
Img: enter image records first.
Ch: user's choice.
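The two factors above yield a 2 x 3 factorial layout of six conditions. A minimal sketch of enumerating the cells and balancing 18 participants across starting conditions (the rotation scheme here is an assumption, not something the report specifies):

```python
from itertools import product
from collections import Counter

# Factors and levels from the design above.
display = ["Displayed", "Hidden"]   # parent/child relationship visibility
order = ["Obj", "Img", "Ch"]        # record-creation order (Ch = user choice)

conditions = list(product(display, order))  # 2 x 3 = 6 cells

# An assumed counterbalancing scheme: rotate the starting condition so
# each cell receives an equal share of first exposures across the 18
# participants in the within-groups design.
starts = [conditions[p % 6] for p in range(18)]
print(Counter(starts))  # each of the six conditions appears 3 times
```

A fuller design would also rotate the order in which each participant works through the remaining five conditions, to spread practice effects evenly.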
Appendices
Data:
Materials: