Adventures in remote usability, Part 3: Basic ClickTale

h4. Why ClickTale is awesome (and totally worth the money)

ClickTale’s site / session recordings are basically like watching tons of usability studies. I found that these recorded sessions — where you watch the mouse tracking across the screen, watch what the user is typing, etc. — strongly track the same issues we see in usability studies. In qualitative usability we get to hear the motivations and the “why”; ClickTale gives us a sense of the frequency of these problems.

ClickTale is a harsh mistress. You get to see every wart on your product unmediated by the soothing balm of a study participant trying to be nice and make a special effort because it’s your site and they don’t want to hurt your feelings, and anyway they’re getting paid to figure it out. You see exactly where people bail and where they do things that make you cringe and want to drink a martini.

(I just spent about 2 hours tonight watching ClickTale videos, and I’m about to pour myself a Hendrick’s Gibson to numb the pain. Sweet Moses do I love cocktail onions.)

To that last point: I was surprised to realize recently that our founders didn’t have a very high opinion of the data that we got from “the GoToMeeting qualitative studies,” i.e. 5-7 participants going through proctored [remote] qualitative usability studies. Our CEO and I got into an argument that basically went like this:

CEO: It’s really frustrating that as soon as someone in usability has a problem, you spring to action and want to fix it immediately! But when Rama [other cofounder] told you how his girlfriend was confused by the site, you totally ignored it!

Me: I did NOT ignore it! The problem is that’s a totally dysfunctional way to manage UI changes — it’s just as bad as Design By CEO to drop everything and fix a problem JUST because someone’s friend or girlfriend or mom has an issue with the site!

In other words, the founders saw qualitative usability study data as merely anecdotal, on a par with customer support feedback or feedback from various friends and interested parties. That does NOT mean feedback from these sources is invalid: if the founder’s girlfriend was confused, then she was confused, and that’s absolutely worth looking at. But just because an investor or the CEO or whoever has a particular problem doesn’t mean that problem should be solved first, and that’s exactly the bias at work when the CEO complains that she doesn’t get something about the site.

Anyway, I realized from this discussion that unless you have a long track record of watching qualitative studies, you probably don’t learn that they are reasonably trustworthy at identifying the possible range of problems that users run into. Our founders didn’t really buy into the notion that qualitative usability is worth all that much: it seems pretty labor intensive for returns that you can often get from, say, asking your mom or girlfriend.

I have to admit, they have a point. The main win you get is a relatively unbiased sample set — people who pretty much don’t care about your site.

Anyway: ClickTale is really cool and powerful because it removes the mediated aspect from the observation, and you also get a sense of how frequent the problems actually are. You start getting enough data points that you can actually say “oh crap, that’s happened to about 18 of 20 people, and it’s making them leave the site.”

h4. Why ClickTale sucks

The mouse tracking feature is not useful, and I suspect it’s neither particularly valid nor interesting. It’s purported to simulate eyetracking-type data, but I see little reason to believe that a user’s mouse trail is an indicator of their gaze pattern. So I just ignore that part of their functionality. Anyway, we can tell from our Analytics and server logs where people are clicking and what pages they’re visiting.

It doesn’t record DHTML / AJAX interactions unless you install one of the special server-side modules.

It doesn’t record any password-protected (i.e. signed-in) pages unless you install a special server-side module. They have a number of these written, but unfortunately not for us: our system is in Python, and strangely there’s no Python plugin for ClickTale, so we’re hosed until someone gets around to building one.

It’s priced for corporate use. Some might say “expensive” but that’s only if you wanted it for your personal blog; for a company, it’s perfectly reasonable. Basically it’s $300 / month for the minimally useful package. It is 100% worth the money if you’re a company with funding — this is mad hella cheaper than running the equivalent studies, and you can drink a martini while you’re watching.

h4. Using ClickTale for prototype testing

We wanted to test federated login on ChoiceVendor (i.e. signing in with a Facebook, Google, Twitter, or LinkedIn account). We were dithering back and forth about the UI for this: Google published a bunch of UI research on federated logins a couple of years ago, and I wanted to try that approach, but we didn’t have consensus. So I threw together a prototype, added Analytics and ClickTale tracking, and sent out a mailer to our current users. I followed it up with a short survey asking the testers what they thought.

Afterward, I was able to reconnect the ClickTale prototype recordings to each of the survey responses and watch what each person did. This worked nicely for a small (3-page) clickthrough type of study with a very focused task.
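One simple way to make that kind of reconnection possible is to hand each tester a per-person token that lands in both places: appended to the prototype URL (so it shows up in the page URL of the recorded session) and prefilled into a hidden survey field. The sketch below shows the idea; it uses no ClickTale API at all, and the function names, the `tid` parameter, and the example URLs are all hypothetical, not anything from ClickTale or our actual mailer.

```typescript
// Sketch: give each mailer recipient a matched pair of links that share
// a tester ID. The recording tool captures the page URL (including the
// tid query param), and the survey stores the same tid, so the two data
// sets can be joined afterward. All names here are hypothetical.

function makeTesterId(): string {
  // Short random token; plenty to disambiguate a small test batch.
  return Math.random().toString(36).slice(2, 10);
}

function withTesterId(base: string, testerId: string): string {
  // Append ?tid=... so the ID appears in the recorded page URL
  // (or in the survey tool's prefill field, respectively).
  const url = new URL(base);
  url.searchParams.set("tid", testerId);
  return url.toString();
}

// Per recipient:
const tid = makeTesterId();
const prototypeLink = withTesterId("https://example.com/login-prototype", tid);
const surveyLink = withTesterId("https://example.com/survey", tid);
```

Afterward you can group recordings by the `tid` in their page URLs and look up the matching survey response, which is roughly the join I did by hand for our 3-page study.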