
Bad Information from SEO-News

Rand Fishkin

The author's views are entirely their own (excluding the unlikely event of hypnosis) and may not always reflect the views of Moz.


Following Nick's lovely recommendations for community building, I'll be doing a bit of link baiting in this thread. Actually, that's a lie; I'm just ticked off about this article from SEO News' Rob Sullivan about a conference call he had with some Google employees. I'll refute it piece by piece, since several items are seriously problematic:

Is PageRank Still Important?

The short answer is yes - PageRank has always been important to Google. Naturally they couldn't go into detail, but it is as I suspected. Google still uses the algorithm to help determine rankings... My feeling however is that they've simply moved where the PageRank value is applied in the grand scheme of things.

Real Answer - No. PageRank (what we view in the toolbar and the Google Directory) is virtually useless; it's old, inaccurate data that's only really useful to link sellers who think they can pull the wool over unsuspecting buyers' eyes (that's not to say all link sellers do this, but there certainly are quite a few). What MAY still be useful is global link popularity (the total value and importance of all the links pointing to your site/page). PageRank is supposedly a measure of this, but there's a big distinction and Rob should have made that clear. He does make that distinction, a little, in another article he links to, but even that one has plenty of other items I find objectionable - like, "guess which factor determines which top results are returned? You guessed it – PageRank." I'll have to deal with that another day.
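For readers who want the distinction spelled out: the PageRank described in Brin and Page's original paper is a recursive measure of incoming-link popularity, something like the formula below, where d is a damping factor (conventionally around 0.85). Whatever Google actually runs today is surely more elaborate, so treat this as a sketch of the concept, not the live algorithm; the toolbar value is just a stale, coarsely rounded snapshot of a calculation like this.

```latex
% Classic PageRank formulation (Brin & Page), shown only to illustrate that
% the metric is a function of incoming links, not of anything on the page.
% T_1 ... T_n are the pages linking to A; C(T_i) is the number of outbound
% links on page T_i.
PR(A) = (1 - d) + d \left( \frac{PR(T_1)}{C(T_1)} + \cdots + \frac{PR(T_n)}{C(T_n)} \right)
```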

Are Dynamic URLs Bad?

Google says that a dynamic URL with 2 parameters "should" get indexed. When we pressed a bit on the issue we also found that URLs themselves don't contribute too much to the overall ranking algorithms. In other words, a page named Page1.asp will likely perform as well as Keyword.asp.... The difference however is that in almost all cases I've seen the static URLs outrank the dynamic URLs especially in highly competitive or even moderately competitive keyword spaces.

This started out well. I give my clients the same advice - more than 1 dynamic parameter may still get indexed, but I don't recommend it. URLs themselves don't contribute to rankings - that also fits with my experience and knowledge. But I can't get over how poorly the last line is explained. As MSN noted in my interview with them (and as competent webmasters around the web know), the reason that static URLs typically rank better is not due to anything in the algorithm specifically, but because they are more likely to be linked to and more likely to be used in important places (like homepages, top-level category pages, or big articles). Again, providing full information goes a long way towards not confusing readers.
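To make the "2 parameters" guideline concrete, here's a minimal Python sketch - the function name and the example URLs are mine, purely illustrative, and the threshold is just the rule of thumb quoted above, not anything Google documents:

```python
from urllib.parse import urlparse, parse_qs

def count_url_parameters(url: str) -> int:
    """Return the number of distinct query-string parameters in a URL."""
    return len(parse_qs(urlparse(url).query))

# Static-looking URL: no parameters at all.
print(count_url_parameters("http://example.com/widgets/blue-widget.html"))           # 0

# Two parameters: per the advice above, this "should" still get indexed.
print(count_url_parameters("http://example.com/page.asp?cat=widgets&id=42"))         # 2

# Three parameters (including a session ID): the territory I tell clients to avoid.
print(count_url_parameters("http://example.com/page.asp?cat=widgets&id=42&sid=x9"))  # 3
```

The counting is the only point of the sketch; whether a URL actually ranks well has far more to do with who links to it, per the MSN point above.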

Does Clean Code Make That Much of a Difference?

Again, the answer is yes. By externalizing any code you can and cleaning up things like tables, you can greatly improve your site. First, externalizing JavaScript and CSS helps reduce code bloat which makes the visible text more important. Your keyword density goes up which makes the page more authoritative.

Oh, brother. Not only is keyword density a nonsensical myth, but the idea that search engines aren't sophisticated enough to handle the linearization and stemming techniques described in IR books since the '70s is ridiculous. Just to put icing on the cake, what is it exactly about keyword density (or any type of keyword use or relevance) that makes a page "authoritative"? Authoritative means that it is a reference standard in the industry or niche - using more "keywords" or having more keyword importance can't possibly affect this. Links can, references can, mentions can, even popularity among visitors could arguably be considered, but keyword density? I'm not impressed.
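For anyone who hasn't seen "keyword density" defined, it's nothing more than the share of a page's words that match the target term - a ratio you can compute in a few lines. This sketch is mine, and the sample text is made up:

```python
import re

def keyword_density(text: str, keyword: str) -> float:
    """Fraction of the words in `text` that exactly match `keyword` (case-insensitive)."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    return words.count(keyword.lower()) / len(words) if words else 0.0

page = "Blue widgets are great. Our widgets ship fast, and widgets make nice gifts."
print(f"{keyword_density(page, 'widgets'):.1%}")  # ~23.1% - and it tells you nothing about authority
```

Yes, the number goes up when you strip boilerplate code from a page; that's true and trivial. Nothing about it makes a page a reference that others cite.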

In a very, very convoluted way, there could be legitimacy to this argument, but Rob doesn't use it. You could argue that cleaning up code makes people more likely to link to you and prevents errors that could cause spidering problems. It's a stretch, sure. But at least it's not flat-out wrong.

Oh, and this was nice, too:

Any reproduction of this article needs to have an html link pointing to http://www.textlinkbrokers.com.

Why TextLinkBrokers? The article is hosted at SEO-News.com... I'm a little worried, as I've used and recommended this firm in the past - and generally been happy. I'm not blaming them, but it sure is confusing.

Let's hope that Rob's a big enough guy to admit his mistakes and make amends. It's something I've done many times (most recently with the ranking factors article, which took a lot of guff until I could get some top folks to help me out with it).
