How To Save Up To 70% On Your Usability Testing
This YouMoz entry was submitted by one of our community members. The author’s views are entirely their own (excluding an unlikely case of hypnosis) and may not reflect the views of Moz.
You likely recognize the value of usability testing, but the pricing of the most popular services is holding you back. However, there's a way to do such testing at a reasonable cost without compromising on quality.
For the sake of comparison, I'm going to look at two different usability testing tools in this post: UserTesting.com and TryMyUI.com. Both tools currently offer remote usability testing with screen and audio recording. During a test, the user is instructed to think out loud as they try to accomplish the tasks you assign them. There are other usability testing tools on the market, but they are priced for the enterprise-level customer, with more complex offerings that cost thousands (or tens of thousands) of dollars a year.
- UserTesting charges $49 per test. You can also order credits in bulk (20 at a time for $900) to bring the cost down to about $45 per test.
- TryMyUI charges $35 per test. You can also pay $300 per month for 10 testing credits, bringing the cost down to $30 per test. However, you would have to test and iterate two to three times per month to make this valuable, a scenario that usually only occurs in big organizations with incredible traffic.
Typically, you need three to five tests for each round of testing (enough to observe patterns in your site's biggest problems). Supposing you want to continuously improve your site's quality and conversion rates, you might do a round a month, or a round every two to three months. If you only order three tests at $35 each four times a year, that's $420 per year. At the other extreme, testing five users per month at the bulk rate, you're spending up to $2,700 per year (5 x $45 x 12 = $2,700).
What if you could perform usability testing at a fixed cost of $15 per year, plus $5 to $10 per test? You'd then spend perhaps $135 per year (3 testers x $10 x 4 rounds + $15 flat fee). At most, it would cost you $615 (5 x $10 x 12 months + $15).
Depending on the volume of your testing needs, that's an annual savings of $285 to $2,085.
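If you want to double-check those figures, here's the same arithmetic as a tiny Python snippet. The function names are mine; the prices and volumes are the ones quoted above.

```python
# Back-of-the-envelope check of the annual costs discussed above (USD).

def paid_tool_cost(testers_per_round, rounds_per_year, price_per_test):
    """Annual cost with an existing usability testing service."""
    return testers_per_round * rounds_per_year * price_per_test

def diy_cost(testers_per_round, rounds_per_year, price_per_test=10, recorder_fee=15):
    """Annual cost with MTurk testers plus a flat screen-recording subscription."""
    return testers_per_round * rounds_per_year * price_per_test + recorder_fee

light_paid = paid_tool_cost(3, 4, 35)    # 3 tests x 4 rounds at $35 -> $420
heavy_paid = paid_tool_cost(5, 12, 45)   # 5 tests x 12 rounds at $45 -> $2,700

light_diy = diy_cost(3, 4)               # $135
heavy_diy = diy_cost(5, 12)              # $615

print(light_paid - light_diy)   # 285
print(heavy_paid - heavy_diy)   # 2085
```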
Smart, affordable usability testing
Here's how to reduce your usability testing costs and justify to your boss why you should get a juicy, paid holiday.
First, to know how to reduce our costs, we have to know what we're getting with the above services. What are you paying for?
You get mainly two things from existing testing tools:
1) Screen and audio recording software, with the resulting usability testing videos hosted on the tool's server (at least temporarily)
2) Recruitment of users to do your tests
Until recently, the second point was an especially big hassle: recruiting people for your tests was difficult to arrange.
Enter Amazon's Mechanical Turk, or MTurk for short.
MTurk is a crowdsourcing marketplace that allows you to farm out various tasks to its large pool of workers, known as "Turkers." You make an account, deposit money, and then type out exactly what you want the Turkers to do. MTurk is the middleman that provides you with access to a humongous array of people. In exchange, MTurk takes a 10% cut on any work that's completed and approved.
MTurk makes recruiting dramatically easier
A large part of the savings comes from your being able to pay $10 per test. (And if your tests are shorter than the standard 10 to 15 minutes, you can offer even less compensation.)
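If you'd rather script the posting than type each task into the MTurk website, here's a minimal sketch using Amazon's boto3 SDK. The instructions, reward, and durations are placeholder values for illustration, and the crowd-form element is one way to let the worker submit their recording link back to MTurk; adapt all of it to your own test script.

```python
# Sketch: posting a $10 usability-test task ("HIT") on Mechanical Turk via boto3.
import boto3

mturk = boto3.client(
    "mturk",
    region_name="us-east-1",
    # Start with the sandbox endpoint to try things out without spending money;
    # drop the endpoint_url argument to post to the live marketplace.
    endpoint_url="https://mturk-requester-sandbox.us-east-1.amazonaws.com",
)

# The task itself: instructions plus a field where the worker pastes the link
# to their screen/audio recording.
task_html = """
<HTMLQuestion xmlns="http://mechanicalturk.amazonaws.com/AWSMechanicalTurkDataSchemas/2011-11-11/HTMLQuestion.xsd">
  <HTMLContent><![CDATA[
    <!DOCTYPE html>
    <html>
      <head><script src="https://assets.crowd.aws/crowd-html-elements.js"></script></head>
      <body>
        <crowd-form>
          <p>Start a screen and microphone recording (e.g., with Screencast-o-Matic),
             then complete the tasks below on our site while thinking out loud.</p>
          <crowd-input name="recording_url"
                       placeholder="Paste the link to your recording here" required>
          </crowd-input>
        </crowd-form>
      </body>
    </html>
  ]]></HTMLContent>
  <FrameHeight>450</FrameHeight>
</HTMLQuestion>
"""

hit = mturk.create_hit(
    Title="10-15 minute website usability test (screen + audio recording)",
    Description="Record your screen and voice while completing a few tasks on our website.",
    Keywords="usability, website, think aloud, recording",
    Reward="10.00",                     # the $10-per-test figure discussed above
    MaxAssignments=3,                   # 3-5 testers per round
    AssignmentDurationInSeconds=3600,   # time a worker has to finish once they accept
    LifetimeInSeconds=3 * 24 * 3600,    # how long the task stays visible to Turkers
    Question=task_html,
)
print("Posted HIT:", hit["HIT"]["HITId"])
```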
What about recording the screens and audio?
Screencast-o-Matic (SOM) offers recording for $15 per year, or $30 for three years. Similar to existing remote usability testing tools, SOM will host your videos on their server and/or allow you to download them.
What about screening to get representative users for the test?
It's true that both UserTesting and TryMyUI offer to screen your testers. You want representative testers, don't you? So perhaps there's still a reason to pay them a premium for recruiting?
However, the screening process offered by those tools is just based on a simple form. You can get equivalent results by having Turkers fill in a demographics questionnaire, and assigning what's known as a "qualification" to those that meet your demographic criteria. Then you post your testing task with the qualification requirement.
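For those who'd rather script it, here's a rough sketch of that screening flow with boto3 (you can do exactly the same thing through the Requester web interface). The qualification name and worker IDs are placeholders; the worker IDs would come from the results of your demographics questionnaire HIT.

```python
# Sketch: granting a custom qualification to screened workers, then requiring it.
import boto3

mturk = boto3.client("mturk", region_name="us-east-1")

# 1) Create a custom qualification for workers who passed your screener.
qual = mturk.create_qualification_type(
    Name="Representative user - example.com usability tests",   # placeholder name
    Description="Granted to workers whose screener answers match our target audience.",
    QualificationTypeStatus="Active",
)
qual_id = qual["QualificationType"]["QualificationTypeId"]

# 2) Grant it to each worker whose questionnaire answers fit your criteria.
screened_workers = ["A1EXAMPLEWORKERID", "A2EXAMPLEWORKERID"]   # placeholder IDs
for worker_id in screened_workers:
    mturk.associate_qualification_with_worker(
        QualificationTypeId=qual_id,
        WorkerId=worker_id,
        IntegerValue=1,
        SendNotification=False,
    )

# 3) Require the qualification on the actual usability-test HIT so that only
#    pre-screened workers can see and accept it.
screened_only = [{
    "QualificationTypeId": qual_id,
    "Comparator": "EqualTo",
    "IntegerValues": [1],
    "ActionsGuarded": "DiscoverPreviewAndAccept",
}]
# ...then pass QualificationRequirements=screened_only to create_hit(),
# as in the earlier posting sketch.
```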
If you want to get more into the details of screening Turkers, I shared some tips on it in this Give It Up presentation to SMX Israel last year (where I first shared my ideas on getting cheaper usability testing):
Here are the slides I used, so you don't need to squint at them in the video—especially the ones with tips on screening Turkers.
[Note: I no longer work for Internet Marketing Ninjas.]
Going back to the question of representative users, usability pioneer Steve Krug says to "recruit loosely, and grade on a curve". Is domain knowledge relevant to every single task? Probably not—it's most certainly not necessary for uncovering usability problems with your site.
Other details
The advice above applies to website or desktop app testing.
If you want to do usability testing on mobile apps, I don't know whether Screencast-o-Matic can record app use on a phone or tablet when the app runs natively on the operating system rather than in the browser. (Incidentally, if this is what you need to do usability testing on, check out this post on mobile usability issues.)
TryMyUI, however, does have nifty features that work within Apple's restrictive terms on third-party apps collecting user data from other apps.
I haven't discussed post-testing surveys because they're kind of secondary. If you want these, include them in your MTurk task description for testers to do after using your site, and of course pay accordingly.
A neat feature that both sites offer in various formats is customer experience analytics. These allow you to gain insight into customer experience without watching all of the recorded footage. If you're testing three to five users per round, that will save you from 45 minutes to more than an hour of video watching time. An even better option is to hire two to three Turkers to watch the videos and find the patterns.
I've also done data analysis on Excel spreadsheets with Turkers, and it's been great. Plus, you're getting fresh eyes to look at the videos.
The testing described above is basically what you see described in Steve Krug's guide to usability testing, Rocket Surgery Made Easy.
If you'd like to download sample usability testing task descriptions to use on MTurk, you can get them at cro.seoroi.com/mturk-doc.php.