Web-Based Usability Testing Has Lower Task Completion Rate

Microsoft researchers comparing remote web-based usability testing with lab testing observed that for web-based tests "task completion rates suffered, with 20% of specified tasks abandoned before completion."

Which makes me wonder: anecdotally, the hundred or so participants I've worked with over the past year all tried harder than I expected while I was observing. Maybe lab-based testing inflates completion rates compared to actual use.

Admittedly, the study lacked decent controls, so these results are very preliminary. More details are in the short article posted on Usability News.


Of course!

I'm not surprised. I've run a number of tests where the users say, "Well, I would have given up on this a while ago if I was doing this on my own." My guess is that task completion rates are lowest in actual daily use, higher in web-based tests, higher still in informal tests, and highest in formal lab tests.

In a few cases while testing an intranet, users said they wouldn't even try to complete the task online — they would have picked up the phone, asked a coworker, or found the information on a shared drive they were familiar with. For these, I noted their response but asked them to show me what they would do if the alternative source weren't available and they had to complete the task online (and knew it was possible to do so).

The funny part is that these tasks often had the lowest completion rates and/or the longest completion times, which further reinforced the users' instinct not to use the intranet for those particular tasks.