Jakob's best Alertbox in a long time.

The advice on intranets and staff directories is useful in Jakob's latest piece Employee Directory Search: Resolving Conflicting Usability Guidelines. But that's not why I think it's the best Alertbox in recent memory. It's because it shows the complex and paradoxical issues that come with any significant design.

"It is very common to have conflicting usability guidelines. They are called "guidelines" rather than "specifications" for a reason: they are necessarily fuzzy because they relate to human behavior.
Interface design requires trade-offs. The challenge is in knowing how to balance the conflicting guidelines and in understanding what is most important in a given situation."

While he still suggests usability testing as the resolution to the guideline conflict (not always true), it's a refreshing dose of dogma-lite Nielsen.

Update: Christina's got an interesting take on why guidelines don't really help novices.


"Not always true"

When you say "While he still suggests usability testing as the resolution to the guideline conflict (not always true)," what other ways are there that you would suggest? Looking at server logs? Contextual inquiry? Saying "I'm the usability specialist, so what I say goes"?

Why usability testing is not always the answer

I just mean that usability testing can't solve all problems or answer all conflicts. This is particularly true for IA issues dealing with large amounts of content. In this case, regular usability testing won't scale to tens of thousands of pages.

A usability test *would* be useful in Jakob's example of a screen-level conflict over a very specific feature. But research from Rolf Molich, Jared Spool and others has shown that discount testing in complex systems reveals only a fraction of the usability problems.

I don't think we understand enough yet about HCI, usability, and IA to make user experience pronouncements by fiat (I AM the usability SPECIALIST!). But I hope that we can accumulate enough wisdom in the user experience space to actually apply professional judgement to design problems without resorting to user research. I think this happens on a small scale now, and hope it happens more in the future.

In the meantime, I think a deep *understanding* of business goals, user goals, the offering in question, and the delivery channel provides the best foundation for good design. (I even have a mnemonic ;)

U nderstand
S olve
E valuate
R efine

So I think Contextual Inquiry/Rapid Ethnography + Personas forms a better basis for making design decisions in most projects than constant reliance on user testing (particularly since I often don't have the budget or time to schedule a lot of testing, and field research and personas are more flexible than testing, i.e. they apply to most situations, while testing only applies to the tasks tested).


Jess McMullin

Take 1 or 2 boxes and call me in the morning

Reading over the article I was more confused than anything, and I wasn't willing to pay the $200+ to see the report. Even after reviewing search query logs, I am not completely convinced that end users need or want to make the distinction. When someone searches for a name, I'm guessing they want some combination of all web pages with that person's name, any documents they created, and their directory entry as a bonus. I see no reason not to use one box if it can search both the search index and the directory, and if the presentation of results is useful and distinguishes between the types of content (web/directory/etc.).

Interesting example of design solution to this problem

Keith Robinson has posted a couple screen snippets where he addresses this same challenge.

Jess McMullin

sounds familiar

Assuming that you have the techies to make it happen, didn't we decide that this sort of thing calls for a federated search? Instead of placing the burden on the user to determine which DB to query you can have one box and then a few results from each DB queried, plus a few best bets, of course.
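The pattern described above (one box that fans out to several back ends, returns a few results from each, and leads with best bets) can be sketched roughly as follows. This is a minimal illustration, not anyone's actual implementation; the source names, sample data, and `federated_search` function are all hypothetical.

```python
# Hypothetical federated search sketch: one query fans out to several
# back-end sources, a few top results are taken from each, and curated
# "best bets" are placed first. All names and data here are illustrative.

def federated_search(query, sources, best_bets, per_source=3):
    """Return best-bet matches first, then top hits from each source."""
    results = list(best_bets.get(query.lower(), []))
    for name, search_fn in sources.items():
        hits = search_fn(query)[:per_source]  # cap each source's share of the page
        results.extend((name, hit) for hit in hits)
    return results

# Toy stand-ins for a web index and a staff directory.
web_index = ["Intranet home", "Search tips", "Jess's project page"]
directory = ["Jess McMullin (Design)", "Keith Robinson (Web)"]

sources = {
    "web": lambda q: [p for p in web_index if q.lower() in p.lower()],
    "directory": lambda q: [p for p in directory if q.lower() in p.lower()],
}
best_bets = {"jess": ["Staff directory entry: Jess McMullin"]}

print(federated_search("Jess", sources, best_bets))
```

The point of the sketch is that the user types one query and the system, not the user, decides which databases to hit; labeling each hit with its source preserves the web/directory distinction in the results rather than in the search form.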

"we shape our buildings ...

and thereafter they shape us" -- Winston Churchill.

Not all questions need to be answered through user testing; some can be answered by *management* fiat. Although it is true that users can be stubborn as mules with regard to change, sometimes they are quite malleable. It's all a matter of careful management.

I recall a pertinent example from Steven Johnson's 'Emergence' ... a school building had two sets of stairs, and every class break it was a free-for-all as to which exit people used. One class, as an experiment, started using one stair for coming to class and the other for leaving. Before long the rest of the corridor's users started doing the same, and traffic flow improved.

Would user testing have produced that result, or just a recommendation for bigger corridors and stairs?

If the question was for a public site then there would be additional ethical/marketing questions, but for an intranet site surely management should actually do some "managing"?