The Dublin Core 2003 Conference is going on in Seattle this week. A couple of the attendees and I will be sharing our notes (and photos) once we've recovered. Until then, enjoy the conference proceedings online.
Peter Merholz muses on users seeking products and comes up with some interesting thoughts about hypertext patterns. Rather than getting the "lay of the land" first, users move to an actual product and then start to compare.
He ends with a foray into decision making and the search for useful resources. While I have more thoughts on the matter, I think it boils down to Return on Experience: everyone has an intrinsic level of effort they'll invest to achieve some expected value.
The CIO article "Sleuthing Out Data" by Fred Hapgood features a couple of examples of how automatic and semi-automatic categorization helps businesses and reduces costs. A list of companies is included if you're interested in this arena.
On SIGIA, Dick Hill points out this journal. Edited by Ben Shneiderman, the Winter issue of IT & Society was dedicated to Web navigation and contains articles ranging from user frustration to PDAs to browser design.
My sysadmin and I have been playing with graphviz today. I was playing with it on Mac OS X and he used Randal Schwartz's Perl script from Web Techniques Column 58 (Feb 2001). He was able to quickly produce a diagram that shows user flow based on Apache referrer logs. The script feeds your log files to graphviz's dot program and outputs a GIF file.
We were both surprised that we didn't find more people writing about using graphviz to analyze patterns of information use. Graphviz seems so easy. I know James has been doing a lot of work on generating diagrams from referrer logs using OmniGraffle and AppleScript.
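To give a flavor of the approach, here's a minimal sketch of the same idea in Python. I haven't seen Randal's script, so nothing here comes from it: the log lines, the example.com host, and the function names are all my own invented illustrations, assuming Apache's combined log format.

```python
import re
from collections import Counter

# Simplified pattern for Apache's combined log format: capture the
# requested path and the quoted referrer field. Not a full parser.
LINE_RE = re.compile(r'"(?:GET|POST) (\S+) [^"]*" \d+ \S+ "([^"]*)"')

def edges_from_log(lines, host="example.com"):
    """Yield (referrer_path, request_path) pairs for internal referrals."""
    for line in lines:
        m = LINE_RE.search(line)
        if not m:
            continue
        path, referrer = m.group(1), m.group(2)
        if host in referrer:
            # Strip scheme and host so nodes are site-relative paths.
            ref_path = "/" + referrer.split(host, 1)[1].lstrip("/")
            yield ref_path, path

def to_dot(edge_counts):
    """Render weighted edges as a Graphviz dot digraph."""
    out = ["digraph userflow {"]
    for (src, dst), n in edge_counts.most_common():
        out.append('  "%s" -> "%s" [label="%d"];' % (src, dst, n))
    out.append("}")
    return "\n".join(out)

# Two invented log lines standing in for a real access log.
log = [
    '1.2.3.4 - - [t] "GET /products HTTP/1.0" 200 123 "http://example.com/" "UA"',
    '1.2.3.4 - - [t] "GET /products/widget HTTP/1.0" 200 99 "http://example.com/products" "UA"',
]
counts = Counter(edges_from_log(log))
print(to_dot(counts))
```

Piping the printed digraph into dot (e.g. `dot -Tgif -o flow.gif`) would give the same kind of user-flow picture.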
Tanya Rabourn discusses information foraging, a theory that attempts to explain human information-seeking behavior based on the food foraging theory from biology and anthropology. According to Pirolli and Card, "Information foraging theory analyzes trade-offs in the value of information gained against the costs of performing activity in human-computer interaction tasks." The advantage of using this theory as the basis for modeling information-seeking behavior comes in the form of understanding users' cognitive mapping of knowledge and knowledge relationships, and understanding attributes of information navigation such as scent. Tanya discusses three new tools that would benefit this area of study: 1) ACT-R, which uses network modeling of knowledge to model interaction, 2) analyzing user paths from web server log data and creating user profiles from that analysis, and 3) collaborative filtering, or foraging for information in groups.
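The trade-off Pirolli and Card describe can be sketched numerically with the rate-of-gain idea that foraging theory borrows from biology: value gained divided by the time spent getting to a patch plus the time spent in it. The two "patches" and all the numbers below are invented for illustration.

```python
# Toy rate-of-gain calculation in the spirit of optimal foraging
# theory: gain G divided by between-patch time T_B plus within-patch
# time T_W (arbitrary value units over arbitrary time units).
def rate_of_gain(gain, between_time, within_time):
    return gain / (between_time + within_time)

# Hypothetical information patches: a search results page that is
# cheap to reach but thin, vs. a curated guide that costs more
# clicks to find but yields more useful documents.
search_results = rate_of_gain(gain=3, between_time=5, within_time=10)   # 0.2
curated_guide = rate_of_gain(gain=8, between_time=20, within_time=10)   # ~0.267

# A forager maximizing rate of gain prefers the curated guide here,
# despite its higher up-front cost.
print(search_results, curated_guide)
```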
Tanya's essay gives a concise summary of the literature and discusses some new methods for applying the theory. My eureka moment came last night when I saw James demonstrate his latest OmniGraffle experiments, which use web server logs to create what he calls self-organizing site maps: diagrams that show paths traveled between nodes/pages on a site to reveal real users' information-seeking behavior. In a sense, the relationships that emerge reveal the collective user base's cognitive map. It can be used to show where information scent was weak or strong and where content structure doesn't map to user perceptions.
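One rough way to turn such a map's edge weights into a weak-scent signal (my own sketch, not James's method, with invented counts): normalize each page's outgoing transition counts into probabilities, then flag the links users rarely follow relative to their source page's traffic.

```python
from collections import defaultdict

# Toy transition counts between site pages, as might be tallied from
# referrer logs. All numbers are invented for illustration.
edge_counts = {
    ("/", "/products"): 90,
    ("/", "/about"): 10,
    ("/products", "/products/widget"): 5,
    ("/products", "/"): 45,
}

def transition_probabilities(counts):
    """Normalize outgoing counts per source page into probabilities."""
    totals = defaultdict(int)
    for (src, _), n in counts.items():
        totals[src] += n
    return {(src, dst): n / totals[src] for (src, dst), n in counts.items()}

probs = transition_probabilities(edge_counts)

# Links followed by under 20% of a page's outbound traffic are
# candidates for weak information scent (threshold is arbitrary).
weak = [edge for edge, p in probs.items() if p < 0.2]
print(weak)  # [('/', '/about'), ('/products', '/products/widget')]
```

Low-probability edges aren't automatically broken, of course; they're just the places worth inspecting first.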
I've been wanting a better way to test the information architecture of sites based on actual information use, and it wasn't until I read Tanya's essay and saw the visualization James came up with that my brain was able to churn on this concept. It's nice to know smart and creative people.