Why Won't Google Penalize/Ban the Site I Spam Reported?
One of the most common complaints I've heard from webmasters and SEOs in the past 18 months has been around Google's inaction on spam reports. While the web spam team has been aggressive about asking webmasters to contribute via their spam report form (though they prefer the version in Webmaster Tools as this helps verify the identity of the reporter), they've (seemingly) been much more hands-off in penalizing sites that are engaging in these practices. Naturally, many SEOs feel that this validates the spam tactics, but there may be more to this story.
It's certainly true that 2-3 years ago, sites that were spam reported publicly in the SEO world - on prominent forums, blogs, and sites - would often find themselves the victims of swift punishment. The SEO community has noticed that trend decline dramatically and, at the same time, has seen (or, at least, felt) that Google's web spam team is no longer taking a significant quantity of actions directly against individual sites. A common webmaster complaint (and plenty of the Q+A we get here at SEOmoz) goes something like this:
My competitor has clearly been buying links from low quality sources in obvious ways. I've spam reported them for the 5th time in the last 6 months, but they're still ranking. I'm thinking I should just give up and buy those same links so at least I'm not behind them - it seems that Google doesn't care much anyway.
I've got more messages like this in my inbox than is healthy, and I suspect that while the web spam team may be taking some targeted action, they've chosen to go a different route in the last couple years. Why?
There are likely a number of factors at work, but I'll try to detail those I'm familiar with, those I've heard about directly from folks on the web spam team (and other Googlers / community members who interact with them), and some of my own speculation:
- Zapping Individual Spammers Isn't Scalable
It's certainly the case that hundreds, possibly thousands, of spam reports pour into Google each week. Each one would take a reviewer 20-30 minutes to examine the case, make the right call, determine whether to remove the links' value, penalize the acquirer, the provider, or both, and add comments to the case. Handling them all would take a literal army of reviewers, and there's no way that process can work with a team as small as web spam (which, to my understanding, has fewer than 500 people worldwide, possibly far fewer).
Far preferable in Google's eyes is to record interesting and new types of manipulation, classify a solid quantity of each type, and then work on coding algorithms that catch the worst stuff first, proceeding in descending order of "negative impact to the search results." This process takes time - sometimes even years - as Google needs to ensure that their changes don't have negative blowback on innocent parties. From what I've heard, they apply much the same principle as judicial theory and say they'd rather have 10 guilty spammers get away with it than penalize 1 innocent site. This obviously makes engineering these systems very hard.
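To make that prioritization a little more concrete, here's a minimal sketch of what a triage queue along those lines could look like. Everything in it - the pattern names, impact scores, false-positive rates, and thresholds - is invented for illustration; Google hasn't disclosed anything like this.

```python
# Hypothetical triage of classified spam patterns: catch the worst impact
# first, but only where the classifier is unlikely to hit innocent sites.
# All names, scores, and thresholds below are illustrative, not real data.

spam_patterns = [
    {"name": "paid links from link networks", "est_impact": 9.1, "false_positive_rate": 0.02},
    {"name": "hidden text / cloaking",        "est_impact": 7.4, "false_positive_rate": 0.01},
    {"name": "comment spam on blogs",         "est_impact": 3.2, "false_positive_rate": 0.08},
    {"name": "reciprocal link exchanges",     "est_impact": 2.0, "false_positive_rate": 0.15},
]

MAX_FALSE_POSITIVE_RATE = 0.05  # "better 10 guilty escape than 1 innocent penalized"

def triage(patterns):
    """Keep only patterns that can be caught safely, ordered worst impact first."""
    safe = [p for p in patterns if p["false_positive_rate"] <= MAX_FALSE_POSITIVE_RATE]
    return sorted(safe, key=lambda p: p["est_impact"], reverse=True)

if __name__ == "__main__":
    for p in triage(spam_patterns):
        print(f"target algorithmically: {p['name']} (impact {p['est_impact']})")
```

In this toy version, the low-impact or error-prone patterns simply wait in the queue - which is exactly how an individual spam report can sit unaddressed even when the team is working hard.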
- Removing the Value Certain Links Pass May Be a Better Option
Instead of penalizing or banning the site(s) responsible, Google may, in many cases, prefer to simply remove the value the links pass. This is virtually undetectable by webmasters, particularly if some links lose their value but the reported competitor continues to gain other links (legitimate or not). It could very well be that much of what webmasters perceive as inaction is actually already being addressed and it simply isn't "those links" that are making the site rank as well as it does.
At SEOmoz, we certainly want to help with this, but our link analysis tools have nothing like the sophistication of Google's webspam team. Comparing mozRank and toolbar PageRank may be of value in some cases, and looking at mozTrust on a page or domain may be helpful as well. However, the sad truth is that there aren't any foolproof ways to determine whether a competitor is gaining value from a manipulative link short of engaging in this (relatively intensive) process.
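For those who want to attempt that comparison anyway, here's a rough sketch of the kind of heuristic described above: flag pages whose mozRank (which counts links regardless of whether Google trusts them) sits well above their toolbar PageRank. Both metrics are coarse, roughly logarithmic 0-10 scales, toolbar PageRank updates infrequently, and the gap threshold here is arbitrary - treat this as illustrative, not a dependable detector.

```python
# Heuristic sketch: a large, persistent gap between mozRank and toolbar
# PageRank *might* hint that some of a page's links have been devalued.
# Example values only; the threshold is arbitrary and false positives abound.

pages = [
    # (url, mozRank, toolbar PageRank) -- hypothetical numbers
    ("http://competitor.example/landing-page", 6.2, 3),
    ("http://competitor.example/blog-post",    4.1, 4),
    ("http://competitor.example/home",         7.0, 6),
]

GAP_THRESHOLD = 2.0  # arbitrary; tune to taste

def suspicious_pages(pages, threshold=GAP_THRESHOLD):
    """Return pages where mozRank exceeds toolbar PageRank by the threshold."""
    return [(url, mr, tbpr) for url, mr, tbpr in pages if mr - tbpr >= threshold]

for url, mr, tbpr in suspicious_pages(pages):
    print(f"{url}: mozRank {mr} vs toolbar PR {tbpr} -- links may be devalued")
```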
- Webmaster Spam Reporting Volume Has Dramatically Increased
It's certainly possible that webspam is struggling under an unexpectedly high load, one so large that it may not be ROI-positive for Google to put massive talent against the problem. Webspam engineers are handpicked from other parts of the search quality and web search teams, and they need to be 100% trustworthy, loyal, and committed (as well as incredibly smart and talented). Google won't engage in speculative hiring of a few hundred or thousand extra hands to help police spam, only to potentially release those individuals out into the wild. As SEOs who've attempted to hire former Google webspam team members know (Dave being one of the more vocal of these), there's close to zero opportunity there.
If it is indeed due to high volume, then it could be that Google is taking just as much action as (or maybe more than) in years prior, but the sheer quantity makes it so that many webmasters feel their requests are being ignored. Frustrating? Absolutely, but I have strong convictions that it won't be a problem the company ignores for long. Bing is a credible threat, and while they're relatively poor at fighting spam today in comparison to Google, it's an area that could, if left exposed, cause Google real pain.
- A Shifting Mindset About Spam & Link Buying at Google
A possibility that I personally think is remote, but that many more cynical webmasters argue for, is the idea that Google has decided in many cases not to take action when the results are still good from a user's perspective. If CNN spams their way to the top for popular news queries or Amazon buys links to boost their books' rankings (please note that I'm not suggesting either of these is happening in any way), Google simply might not care that much. There are so many things the webspam team can worry about - international spam has certainly been something they've been vocal about, and it's still a massive issue, so that could certainly be consuming bandwidth.
I personally find this explanation somewhat antithetical to Google's mindset and, more specifically, to how the webspam team feels about their job. However, it's certainly not impossible that a milder version of this exists, in which webspam has simply prioritized spam complaints and worries less about (and thus addresses less often) those results and spammers whose sites do provide a positive, relevant user experience (which has generally been where I see/hear complaints).
- Big Changes Are in Process at Google Web Spam
If you've heard Google's webspam chief talk about paid links in the recent past, he's certainly been hinting that they're cooking up something big for release in the "near future." It could be that Google is quite aware the problem has been getting worse, but has concentrated its efforts on a scalable, big release and has thus been ignoring or putting off many of the individual requests it feels will be addressed by whatever it is they're launching.
This is pure speculation, but I wouldn't be at all surprised to see a system emerge that resembles (or at least takes cues from) Google MapMaker, which leverages a kind of social PageRank to allow users themselves to contribute to the construction of maps in areas where Google doesn't yet have them. I doubt users will have the ability to actually remove/penalize sites or pages, but a feedback army of spam reporters building up reputation and making suggestions that can then feed into Google definitely seems like a cultural and intellectual fit (it also might be a reason they've started a marketing presence with a Twitter account).
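Continuing that speculation, a reputation-weighted report queue might look something like the sketch below: reporters whose past reports were confirmed carry more weight, and URLs that accumulate enough weighted reports get surfaced to human reviewers rather than being penalized automatically. None of this reflects anything Google has announced; the reporters, accuracy scores, and threshold are all made up.

```python
# Speculative sketch of a reputation-weighted spam-report queue, in the spirit
# of the MapMaker-style system imagined above. Invented data throughout.
from collections import defaultdict

# reporter -> fraction of past reports confirmed by reviewers (hypothetical)
reporter_accuracy = {"alice": 0.9, "bob": 0.4, "carol": 0.75}

# (reporter, reported_url) pairs -- example data
reports = [
    ("alice", "http://spammy.example/"),
    ("bob",   "http://spammy.example/"),
    ("carol", "http://spammy.example/"),
    ("bob",   "http://innocent.example/"),
]

REVIEW_THRESHOLD = 1.5  # arbitrary

def review_queue(reports, accuracy, threshold=REVIEW_THRESHOLD):
    """Sum reporter reputations per URL and surface those above the threshold."""
    scores = defaultdict(float)
    for reporter, url in reports:
        scores[url] += accuracy.get(reporter, 0.1)  # unknown reporters count little
    return sorted((u for u, s in scores.items() if s >= threshold),
                  key=lambda u: scores[u], reverse=True)

print(review_queue(reports, reporter_accuracy))
# -> ['http://spammy.example/']
```

The key property of a design like this is that human reviewers stay in the loop - the crowd only prioritizes the queue, which fits Google's apparent reluctance to let anyone outside the company directly penalize a site.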
I'd love to hear more from those of you who feel impacted by this issue. Does this trend match your experience? Have you seen Google take some actions in the recent past? How do you deal with those who spam and seem to benefit? And which of these scenarios, if any, do you find likely?
p.s. For those who may not be aware, SEOmoz strongly recommends against link manipulation and buying links (not on ethical grounds, but for practical, business reasons).