Testing the Accuracy of Visitor Data from Alexa, Compete, Google Trends, Doubleclick & Quantcast
The author's views are entirely their own (excluding the unlikely event of hypnosis) and may not always reflect the views of Moz.
SEOmoz.org had 13.8 million visits from 6.25 million unique visitors last year (2011). Those numbers are pretty exciting, but what's not exciting is the external perception created by third parties like Compete, Alexa, Quantcast, Doubleclick and Google Trends for Websites. These services report massively lower, incorrectly trending data - and SEOmoz isn't alone in this frustration. We're one of dozens of sites I've talked to that have received emails and comments lamenting our poor growth or crummy year, thanks to these horrifically inaccurate services.
Here's a screenshot of our actual traffic from Google Analytics:
Now let's look at the comparison to each of those services:
Above is Alexa's estimate of SEOmoz's web traffic for the past few months. It's hard to tell how accurate they are, because they're not showing any exact numbers, only "percent" of "reach." They do correctly note that traffic was down in December (the last two weeks of the year were very slow for us due to the holidays, which is a good thing - even SEOs deserve a break) :-)
Historically, Alexa showed a much longer timespan and far less accurate data, at one point estimating that our traffic had dropped year-over-year since 2009. I've had well-respected VC funds reach out and ask why we were struggling and whether we felt the SEO market was drying up because of those charts... Now, Alexa ranks us as the 472nd most popular site in the world, which is definitely way, way off.
Next up is Compete.com's estimate of SEOmoz's traffic. They're much more specific, but tragically, way off the mark. For a time, I'd hoped Compete would be a much better competitor to Alexa, but those hopes died a few years back. This chart isn't just wrong, it's directionally backward (we grew when they showed us shrinking, and shrank when they showed us spiking at year-end) and off by almost two full orders of magnitude (our daily traffic is about 2X what they estimate our monthly traffic to be).
How anyone can trust that data is beyond me, since you can easily compare the many sites that publish their traffic details (as we do) against Compete and see this discrepancy. To be fair, I've heard that for the top 1-2,000 most popular sites on the web, they're not bad, though I can't personally confirm this.
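To put "almost two full orders of magnitude" in concrete terms, here's a quick back-of-the-envelope sketch. The round figures are illustrative only, chosen to match the rough ratios described above (real daily traffic about 2X Compete's monthly estimate), not exact numbers from either source:

```python
import math

# Illustrative round numbers matching the ratios described above,
# not exact Compete or Google Analytics figures.
actual_daily_visits = 38_000        # roughly 13.8M annual visits / 365 days
compete_monthly_estimate = 19_000   # about half the actual *daily* figure

actual_monthly_visits = actual_daily_visits * 30  # simple 30-day month
factor = actual_monthly_visits / compete_monthly_estimate
print(f"Compete's monthly estimate is low by a factor of ~{factor:.0f} "
      f"(~{math.log10(factor):.1f} orders of magnitude)")
```

With these illustrative inputs, the estimate comes out low by a factor of ~60, or about 1.8 orders of magnitude - hence "almost two full orders of magnitude."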
Quantcast's estimate of traffic looks just as bad as Compete's. It's directionally wrong and off by multiple orders of magnitude as well. Quantcast's saving grace is its "Quantified" program, which shows actual, accurately measured numbers for sites that opt in. I wish they'd stick to that model exclusively rather than publishing these random guesses for sites that haven't joined the program, though. I'm also really struggling to understand how 17,671 unique people could create only 11,005 visits... That's a brain teaser.
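That brain teaser has a simple answer: it's impossible. Every unique visitor contributes at least one visit, so a site's visit count can never be lower than its unique-visitor count. A tiny sanity check makes the point (the helper name here is mine, not anything from Quantcast):

```python
def plausible_traffic(unique_visitors: int, visits: int) -> bool:
    """Sanity check: every unique visitor makes at least one visit,
    so visits can never be fewer than unique visitors."""
    return visits >= unique_visitors

# Quantcast's figures for SEOmoz, as quoted above:
print(plausible_traffic(17_671, 11_005))  # an impossible combination
```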
Google's my last, best hope, and since they capture such a large percentage of sites' traffic in Google Analytics, I'd expect them to have a pretty excellent data-modeling system to work from. Apparently, that belief is mistaken. Google's by no means as bad as Compete or Quantcast (and possibly better than Alexa), but it's still way off. The directional data is sort-of close, but the daily unique visitor count shows ~200K in December, while our analytics show ~47K daily, or 722K for that month.
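For a quick sense of scale, the gap between the two daily figures quoted above works out like this (a trivial calculation on the numbers from the post, nothing more):

```python
# Figures quoted above: Trends' estimate vs. SEOmoz's own analytics.
trends_daily_uniques = 200_000   # Google Trends for Websites, December
actual_daily_uniques = 47_000    # from Google Analytics

overstatement = trends_daily_uniques / actual_daily_uniques
print(f"Trends overstates daily uniques by ~{overstatement:.1f}x")
```

So even the best of these estimators overstates daily uniques by roughly 4x.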
Since Trends and Doubleclick both sit under Google's operating umbrella, you might be tempted to think they use the same data... In fact, Doubleclick Ad Planner's estimate of Moz's traffic and Google Trends for Websites show at least slightly different numbers (hard to tell for sure, given Trends' incomplete graphs). One thing I can tell for certain: neither is accurate, nor even directionally correct.
The over-time charts don't quite match each other (though they're close-ish); it looks like Doubleclick is showing higher traffic to SEOmoz generally than Trends for Websites. The closest data point is their estimated time on site, but I'm not sure I can give them credit for that. If you put on a blindfold and throw enough darts, one of them will probably get close to the board. It's hard not to feel that way about these numbers, too.
Now here's the rub:
Recently, Ani López wrote about Comparing Google Trends for Websites vs. Google Analytics Data and showed a few examples that suggested greater accuracy than what we see with SEOmoz (and OpenSiteExplorer, too, FYI). Thus, I'm asking for two favors from you to help get a better sense of the relative usefulness of these tools.
The first is to take the quick survey linked-to below:
please take me! (opens in a new window)
The second is, if possible, to take screenshots of your own analytics vs. Trends/DoubleClick/Compete/Quantcast/Alexa and share them in the comments below. For anyone who puts together a compelling side-by-side, I'll happily include links in this blog post to your site and to the images showing your traffic vs. what these third parties report. Hopefully, that incentive can help spur transparency from those of you willing and able to share some broad site stats.
Thanks as always for your help - looking forward to getting a broader view of these tools' performance. For now, I'd remain highly skeptical, but we might revisit the topic if we get very compelling data in the survey and in the comments (otherwise, I'll just update this post at the end of the week with the survey results, and since they're anonymous, provide full data).