
Putting Evidence-based Practice to Work

Frank Cervone and Amanda Hollister

The problems of website design:
web development/HCI is an intricate mix of technology and design

  • the majority of librarians haven’t been trained in HCI
  • gaps in understanding the significant differences between the online and in-person experiences

the more an org. depends on its public for achieving its mission, the more it should incorporate dialogic features into its website

Evidence-based practice (Andrew Booth’s definition)

  • data provides the primary evidence for making decisions, not anecdotal stories or “common sense”
  • evaluation occurs early in the process

what happens now:

  • decisions are made based on beliefs about what is needed (often biased): assumptions, anecdotal evidence, and preferences
  • evaluation, if it occurs, happens afterward (too late!)

derived from the evidence-based model of medicine
fundamental precepts:

  • study phenomena
  • contrast results to other studies of same or related phenomena
  • combine results

define problem → find evidence → evaluate evidence → apply results of evaluation → evaluate change → redefine problem and repeat the cycle

Setting: context – where is this being used?
Population: who are the users?
Intervention: what is being done?
Comparison: what are the alternatives?
Evaluation: what does success mean?

Levels of evidence

first usability test in 2001
focused on Electronic resources/home-grown resource finder
second test:
how are people using the catalog?
Data mining
tried not to make assumptions
dispelled some myths

Looked at areas of the site with the highest reported difficulty or frustration
Restructured web development process
Web advisory group – reps from all areas of library
Induction process:
required reading list (usability research, etc.)
required training:

  • in usability
  • conducting a usability test
  • other soft skills


Outcomes:

  • site usability has improved – shown by the stats
  • debates about how to proceed are less rancorous; can always go back to the data
  • easier to develop strategies for incremental improvements over time – not locked into a tight academic schedule

metasearch, e-journals, virtual reference, electronic resources
why should I go here?

Anecdotal evidence is good for identifying problems to look at, but it usually comes from a skewed user group – studies will show how representative they are.

Breadcrumbs: putting users on the right path

users are often “lost”
tool-based website design
dynamic, page-based crumbs – code found on Google
temporary session cookie
code needed “minor” tweaking – it produced an endless path of crumbs
consultant hired; each XML crumb file traps:

  • pages visited, IP address, page timestamp
  • does not collect off-site pages
  • does not collect browser navigation
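The logging described above can be sketched in a few lines of JavaScript (the names `logCrumb` and `SITE_HOST` are hypothetical, and records are kept in memory here rather than in XML files for brevity):

```javascript
// Sketch of a per-session path logger, assuming the behavior described in
// the notes: record page, IP, and timestamp; skip off-site pages.
const SITE_HOST = "library.example.edu"; // assumed site hostname

// sessionId -> array of crumb records
const sessions = new Map();

function logCrumb(sessionId, url, ip, timestamp) {
  const parsed = new URL(url);
  if (parsed.hostname !== SITE_HOST) return false; // do not collect off-site pages
  const records = sessions.get(sessionId) || [];
  records.push({ page: parsed.pathname, ip, timestamp });
  sessions.set(sessionId, records);
  return true;
}
```

Because only page loads reach a logger like this, browser back/forward navigation is naturally not collected, which matches the point above.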

Data analysis
pick a page to analyze
put the data into the magic box; out come the user paths that end on the selected page
can see the # of clicks to the page
ideal paths and less than ideal paths
how can you nudge users back to where they meant to go?
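The “magic box” step above can be sketched as a simple filter over the logged per-session paths (`pathsEndingAt` is a hypothetical name; each session is just an ordered list of pages):

```javascript
// From logged sessions, pull out every user path that ends on a chosen
// page, together with the number of clicks it took to get there.
function pathsEndingAt(sessionPaths, targetPage) {
  return sessionPaths
    .filter(path => path.length > 0 && path[path.length - 1] === targetPage)
    .map(path => ({ path, clicks: path.length - 1 })); // clicks after the entry page
}
```

Comparing each path’s click count against the shortest (ideal) path is one way to separate the ideal paths from the less-than-ideal ones worth nudging.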

Advantages of trapping xml path data:

  • huge amount of data
  • real-time usability testing
  • flexible
  • no observer effect


Disadvantages:

  • doesn’t track sessions that leave the website
  • can’t see user response (frustration)

Future directions:

  • implement predictive track analysis
  • implement timestamp analysis

code for breadcrumbs:
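The actual code isn’t preserved in these notes; the sketch below only illustrates the idea of dynamic, page-based crumbs held in a temporary session cookie. `updateTrail`, `renderTrail`, and the cap of six crumbs are all assumptions; truncating the trail when a page is revisited is one way to avoid the “endless path of crumbs” problem mentioned above.

```javascript
const MAX_CRUMBS = 6; // assumed cap on trail length

function updateTrail(trail, page) {
  const i = trail.indexOf(page);
  // revisiting a page truncates the trail back to that page
  // instead of appending it again forever
  const next = i >= 0 ? trail.slice(0, i + 1) : [...trail, page];
  return next.slice(-MAX_CRUMBS);
}

function renderTrail(trail) {
  return trail.join(" > ");
}
```

In a browser, the trail would be re-read from and written back to a session cookie on each page load, e.g. `document.cookie = "crumbs=" + encodeURIComponent(JSON.stringify(trail))`.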


October 30, 2007 at 10:26 am
