Posts tagged ‘usability’

Why we test

I came across this great quote that pretty much sums up the case for usability testing, in an article that isn’t actually about usability per se:

‘All character is action’ goes the old Hollywood cliché — that is, we learn far more about people by how they behave than we do from what they tell us about themselves. – Mark Earls [via Conversation Agent]

The article itself is a pretty interesting analysis of the divergent reactions to Michael Jackson’s death and the role of social media in the response – worth a read.


July 6, 2009 at 6:00 pm

Privacy, usability and the path of least resistance

Going back to the issue of privacy – I saw this post at ReadWriteWeb and it confirmed something I’ve suspected for a while: that although many people claim to be concerned about their privacy online, many of them never use the options available to them for managing their various profiles. In fact, nearly 60% of the people in the study cited didn’t know if their profiles were public or not. This is a perfect example of what Kevin Kelly calls “triumph of the default.” The vast majority of users are stymied by the sheer number of choices available, so they don’t choose (or they choose by not choosing).

I’m exploring this in my work right now with usability testing. We have software and services that are pretty robust and have a lot of features, but most of those features never get used. We’re trying to determine which options are the most useful to most of our customers so we can set intelligent defaults. We don’t want people to have to be super-users to get good results from our site, but it can be difficult to find the balance between ease of use and quality of results. This is one of the reasons we do iterative testing. Test, tweak, test again, tweak some more…the hard part is knowing when to stop.

July 1, 2009 at 8:00 am

Excellent usability test idea

I added some links to my resource page because I read Aaron Schmidt’s post a few weeks ago and got inspired to try some different usability methods. 5-second Tests seem like a great way to test variations of a design. I’ve been looking for quick and easy ways to test small content blocks on our site, and this may be the best option. There’s even a site to help you do it.

The great thing about the fivesecondtest site is that you can participate in tests of other interfaces and you will come away with useful ideas to apply to your own design. NYPL’s Infomaki does the same thing. They’ve created their own usability test site, and I spent a fair amount of time answering questions just to see the types of things that they were asking – it’s sort of addictive!

May 22, 2009 at 5:38 am

Putting Evidence-based Practice to Work

Frank Cervone and Amanda Hollister

The problems of website design:
web development/HCI is an intricate mix of technology and design

  • majority of librarians haven’t been trained in HCI
  • gaps in understanding the significant differences between the online and in-person experiences

the more an org. depends on its public to achieve its mission, the more it should build dialogic features into its website

Evidence-based practice (Andrew Booth’s definition)

  • data provides the primary evidence for making decisions, not anecdotal stories or “common sense”
  • evaluation occurs early in the process

what happens now:

  • decisions are made based on beliefs about what is needed (often biased): assumptions, anecdotal evidence and preferences
  • evaluation, if it occurs, happens afterward (too late!)

derived from evidence-based model of medicine
fundamental precepts:

  • study phenomena
  • contrast results to other studies of same or related phenomena
  • combine results

define problem > find evidence > evaluate evidence > apply results of evaluation > evaluate change > redefine problem and go back through cycle

Setting: context – where is this being used?
Population: who are users?
Intervention: what is being done?
Comparison: what are alternatives?
Evaluation: what does success mean?

Levels of evidence

first usability test in 2001
focused on Electronic resources/home-grown resource finder
2nd test
how are people using catalog?
Data mining
tried not to make assumptions
dispelled some myths

Looked at areas of site with highest reported difficulty or frustration
Restructured web development process
Web advisory group – reps from all areas of library
Induction process:
required reading list (usability research, etc.)
required training:

  • in usability
  • conducting a usability test
  • other soft skills


Results:

  • site usability has improved – proven by stats
  • debates about how to proceed are less rancorous; can always go back to the data
  • easier to develop strategies for incremental improvements over time – not locked into a tight academic schedule

metasearch, e-journals, virtual reference, electronic resources
why should I go here?

Anecdotal evidence is good for identifying problems to look at, but it usually comes from a skewed user group – studies will show how representative they are.

Breadcrumbs: putting users on the right path

users are often “lost”
tool-based website design
dynamic, page-based crumbs – code found on Google
temporary session cookie
code needed “minor” tweaking – endless path of crumbs
consultant hired
each XML crumb file traps:

  • pages visited, IP, page timestamp
  • does not collect off-site pages
  • does not collect browser navigation
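A minimal Python sketch of what such a crumb logger might look like – not the presenters’ actual code; the file layout, element names, and SITE_HOST are my assumptions:

```python
import time
import xml.etree.ElementTree as ET

SITE_HOST = "www.example.edu"  # assumed site host; off-site pages are not logged


def log_crumb(session_id, url, ip, crumb_dir="."):
    """Append one page visit to the session's XML crumb file."""
    if SITE_HOST not in url:
        return  # skip pages that lead off the site
    path = f"{crumb_dir}/{session_id}.xml"
    try:
        tree = ET.parse(path)
        root = tree.getroot()
    except (FileNotFoundError, ET.ParseError):
        # first visit in this session: start a new crumb file
        root = ET.Element("session", id=session_id, ip=ip)
        tree = ET.ElementTree(root)
    visit = ET.SubElement(root, "page")
    visit.set("url", url)
    visit.set("timestamp", str(int(time.time())))
    tree.write(path)
```

Since the crumb file only traps on-site pages, sessions that wander off the website simply stop accumulating entries – which matches the limitation noted below.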

Data analysis
pick a page to analyze
put data into magic box; out comes user paths that end on selected page
can see # of clicks to the page
ideal paths and less than ideal paths
how can you nudge users back to where they meant to go?
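The “magic box” wasn’t shared, but the analysis step described above could be sketched like this (Python; the function name and data shapes are my own assumptions):

```python
from collections import Counter


def paths_ending_at(sessions, target):
    """From a list of session paths (each a list of page URLs),
    keep the paths that end on `target`, and tally how many
    clicks (pages visited minus one) each path took."""
    hits = [p for p in sessions if p and p[-1] == target]
    clicks = Counter(len(p) - 1 for p in hits)
    return hits, clicks
```

For example, with three logged sessions:

```python
sessions = [
    ["/", "/catalog", "/ejournals"],  # 2 clicks, ideal-ish path
    ["/", "/ejournals"],              # 1 click, ideal path
    ["/", "/catalog"],                # never reached the target
]
hits, clicks = paths_ending_at(sessions, "/ejournals")
# hits contains 2 paths; clicks == Counter({2: 1, 1: 1})
```

Comparing the click counts against the ideal path is what surfaces the less-than-ideal routes worth nudging users away from.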

Advantages of trapping XML path data:

  • huge amount of data
  • real-time usability testing
  • flexible
  • no observer effect


Disadvantages:

  • doesn’t track sessions that leave the website
  • can’t see user response (frustration)

Future directions:

  • implement predictive track analysis
  • implement timestamp analysis

code for breadcrumbs:
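The code itself isn’t reproduced here, but a minimal sketch of a session-based dynamic crumb trail might look like this (Python; all names and the MAX_CRUMBS cap are hypothetical – the truncate-on-revisit step is one way to avoid the “endless path of crumbs” problem mentioned earlier):

```python
MAX_CRUMBS = 5  # assumed cap on trail length


def update_trail(trail, page):
    """Add the current page to the session's crumb trail.
    Revisiting a page truncates the trail back to that page before
    re-appending it, so looping through the site doesn't produce
    an endlessly growing trail."""
    if page in trail:
        trail = trail[: trail.index(page)]
    trail.append(page)
    return trail[-MAX_CRUMBS:]


def render_trail(trail, titles):
    """Render the trail as a simple ' > ' separated string,
    falling back to the raw URL when no title is known."""
    return " > ".join(titles.get(p, p) for p in trail)
```

The trail itself would live in the temporary session cookie (or server-side session keyed by it), with each page view calling `update_trail` before rendering.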

October 30, 2007 at 10:26 am

"To live a creative life, we must lose our fear of being wrong." - Pearce