Site visits and surveys

[Image: "you know those days..." by gato-gato-gato]

Site visits

Back in the middle of 2012, Kenji and I got out from behind our desks and went to visit some Libraries Australia users at theirs. This was an opportunity for us to see people using Libraries Australia Search for real tasks and to see the problems they encounter. This type of site visit is usually run as a master / apprentice arrangement. The analyst or designer acts as the apprentice, learning from and observing the everyday tasks of the user, who takes the role of the "master". The apprentice has the opportunity to ask questions and gain insights not possible from a distance. While only a small number of users can participate (our handful of site visits took weeks of travelling, documenting and planning), the depth of understanding gained is much greater than through most other research methods.

Here's an example of what we learned on the topic of comparing records:

  • Users often need to compare several records, looking for the one that best suits their needs ("the right record").
  • Users aren't just comparing records in the NBD, they may also need to compare to a record in WorldCat, for example.
  • When deciding which record is better, users initially check the length and the number of fields (there's a toy sketch of this check after the list).
  • Sometimes more than two records need to be compared.
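
Purely as illustration, here's a minimal sketch of that "length and number of fields" check expressed as code. The dictionary record structure, the function name and the example data are all made up for this sketch; they say nothing about how Libraries Australia Search actually stores or compares records.

```python
# Purely illustrative sketch (not part of Libraries Australia Search):
# a rough stand-in for the "length and number of fields" check that
# users described applying by eye when comparing candidate records.

def quick_richness_score(record: dict) -> tuple[int, int]:
    """Return (number of non-empty fields, total length of values),
    assuming a record is a simple field-name -> value dictionary."""
    values = [v for v in record.values() if v]
    return (len(values), sum(len(str(v)) for v in values))

# Two hypothetical candidate records for the same work.
record_a = {"title": "Voss", "author": "White, Patrick",
            "isbn": "9780143180234"}
record_b = {"title": "Voss / Patrick White", "author": "White, Patrick, 1912-1990",
            "publisher": "Penguin", "isbn": "9780143180234",
            "notes": "Includes introduction."}

# Tuples compare field count first, then total length.
best = max([record_a, record_b], key=quick_richness_score)
print(best)  # record_b looks "richer" by this crude measure
```

Of course, this is only a first pass; as the site visits showed, users go on to weigh which record is actually "right" for their purpose, not just which is longest.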

We also noticed that the screens used most often are the search forms, the search results, and the full record view. Although we knew this already, it's something that the site visits really hammered home.

Knowing much more about how people use the interface, why, and what they sometimes have trouble with is now helping us make design decisions. Having this information clear in our minds will also help us choose which changes to pursue down the track, when we have only a limited amount of time left and many possible enhancements to choose from.

Surveys

Around the same time we ran a survey asking about problems with, or ideas for improvements to, Libraries Australia Search. The survey was easy to create and run. It gave everyone a chance to have their say and provided us with lots of useful information and ideas. For example, we were told that printing takes too many clicks, reporting duplicates isn't easy and title searches return some irrelevant results. But a survey also has its limits. Here's an example.

One person told us "I currently find it difficult to readily identify holdings - it used to be much simpler." This is an intriguing comment, and many questions immediately spring to mind. When was it simpler? Why are holdings now harder to identify? What screen are you on when you have this problem - brief results or full view? What holdings are you trying to identify and why? Would this <insert specific idea> help? A survey is very useful, but it can raise more questions than it answers! If the respondent has given us their contact details, and we have the time, we can ask these questions. Sometimes the respondent doesn't have the time to answer, or finds it difficult to explain without being there in person, or finds it frustrating that we are even asking these questions - can't we just understand?!

Unfortunately, if we aren't sure we understand the problem, we're much less likely to be confident enough to make a change, since we know our interpretation may not be correct.

In a nutshell

While both surveys and site visits are very useful, an in-person experience on a site visit is much less ambiguous than a comment in a survey like "search often brings up a lot of irrelevant matches" or "should be like Google search". On the other hand, site visits have a limited reach while surveys give everyone a chance to give feedback. Together they complement each other as two quite different ways of learning about a system and its users.
