URL Rewriting: Increase Organic Traffic By Using Dynamic URLs That Look Static
This YouMoz entry was submitted by one of our community members. The author’s views are entirely their own (excluding an unlikely case of hypnosis) and may not reflect the views of Moz.
If you haven't noticed URL rewriting, you haven't been thinking like a search engine optimizer. More and more sites are using this technique to create human-readable (and SEO-friendly) URLs. If your site is one of them, good for you. Read on and you might find some reassuring data to back up your excellent decision. If you're not familiar with how URL rewriting can improve organic search ranking and traffic to your site, I don't blame you. It's not a hard concept to grasp, but it's rarely covered well in SEO publications. Relatively few sites use it, and that creates an opportunity for you to get ahead in the rankings. If you have any search competition, you should already be thinking about how you present your data to the engines.
What is URL Rewriting?
Let's take a look at a typical web activity:
- User clicks a link
- Browser looks up the address for the requested domain
- Browser sends a request to server using a query string with parameters
- Server finds the requested resource and sends a response
- Browser renders the response
The page sent back to the browser could be a static HTML file sitting in a directory, or it could be dynamically generated by combining data from multiple databases. In the case of a dynamic page, the server needs those query string parameters to look up and assemble data.
Let's look at an example of two URLs with query string parameters:
- http://www.facebook.com/profile.php?id=500608523&ref=profile
- http://www.facebook.com/people/Aaron-Bronow/500608523
Both of these URLs point to the same dynamic page. The difference is obvious: one is relevant only to the server, while the other is relevant to search engines, the server, and most English-speaking humans. This effect is accomplished using a rewrite engine. The server interprets the second URL and converts it to the standard query string format: name/value pairs beginning with a question mark (?) and separated by ampersands (&). Rewriting my profile URL gives the impression of an organized directory structure where there really is none. Can you imagine a /people/ directory filled with subdirectories for all 175 million active users?
Why Should You Be URL Rewriting?
Check out the trends for these search terms:
- "emergency neil strauss"
- "neil strauss emergency"
- "emergency book"
People organize terms in different ways. To reach the widest range of prospective customers, you should try to capture as many variants as possible. Google Trends shows the first two terms doing nearly the same numbers. The third, "emergency book," has been performing consistently for a few years, but seems to be climbing since the book's release. Think about the "compound interest" of having a page that matches all three of these. And it's not just about trends.
As another example, let’s look at the results for those terms:
emergency neil strauss
- Google returns Amazon at the top with this URL: http://www.amazon.com/Emergency-This-Book-Will-Save/dp/0060898771
- Barnes & Noble comes back lower on the first page with this URL: http://search.barnesandnoble.com/Emergency/Neil-Strauss/e/9780060898779
- Half.com is nowhere to be seen. I searched their site and came up with this URL for the same book: http://product.half.ebay.com/_W0QQprZ65617842QQcpidZ1383455015
neil strauss emergency
- Amazon is still dominating the results and adds this second result: http://www.amazon.com/Neil-Strauss-Emergency-book-list/lm/R31W8C826U23L3
- Wikipedia has shown up on the first page with a partial match for the author's name: http://en.wikipedia.org/wiki/Neil_Strauss
- And Barnes & Noble has fallen off the list.
emergency book
- Amazon just won't die. That "book list" page captures this term: http://www.amazon.com/Neil-Strauss-Emergency-book-list/lm/R31W8C826U23L3
The rest of the results on the first page are not relevant to my query. Google can't be faulted for this because the terms are very vague. Amazon has clearly thought of a way to grab far more search-term variations than the competition. By creating a set of meaningful and relevant URLs, they've captured 300% more top-result positions.
Organic search ranking is not the only gain from using friendly URLs. There's a whole world of un-indexed searching going on in desktop applications like browsers, chat clients, and email, or anything people use to share links. If I were to copy one of these human-readable URLs into a chat box or email and send it to you, you would see an instantly recognizable series of words. Wouldn't that make you more likely to click on it? Consider how Firefox 3's new "awesome bar" has improved bookmark recall. It's a tiny search engine for your history and bookmarks. Title and date used to be the only ways to find that page you wanted to save. Now a more relevant URL places history and bookmarks higher than ever.
Are You Ready for URL Rewriting?
If your site is running on Apache, you have no excuse for not implementing URL rewriting immediately. The mod_rewrite module is free and easy to enable. Use it to write RewriteRules in your server config (httpd.conf) or on a per-directory basis (.htaccess). You'll need some basic regular expressions for string matching, but with a bit of trial and error, you should have a robust rewrite engine running in no time.
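As a minimal sketch of what this looks like, here is a hypothetical .htaccess rule modeled on the Facebook-style /people/ URL from earlier. The script name profile.php and the URL pattern are assumptions for illustration, not a real Facebook configuration:

```apache
# Hypothetical .htaccess sketch: map a friendly /people/ URL back to
# the query-string form the server actually needs.
RewriteEngine On

# /people/Aaron-Bronow/500608523  ->  /profile.php?id=500608523
# [L]   stop processing further rules after a match
# [QSA] append any query string already on the request (e.g. &ref=profile)
RewriteRule ^people/[^/]+/([0-9]+)/?$ profile.php?id=$1 [L,QSA]
```

Note that the human-readable name segment is matched but thrown away; only the numeric ID in the capture group is passed through, which is exactly why the friendly URL costs the server nothing.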
If you’re on IIS 6, it’s a little trickier but still worth the time tinkering. There are a few IIS modules for sale that will run on any IIS directory, but they all do basically the same thing you can write yourself. First, you’ll need to host your pages within an ASP.NET application. You can convert a regular directory to an application with one click in the properties panel. This enables the .NET framework to intercept all requests within that directory and process them using a Global.asax file. This file contains event handlers that trigger whenever a new request is made. The steps for inspecting the incoming query string and using Context.RewritePath are straightforward, but beyond the scope of this article. If you have trouble, please ask me in the comments and I will provide links for your specific setup.
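To make the Global.asax approach concrete, here is a minimal C# sketch. Application_BeginRequest and Context.RewritePath are the standard ASP.NET pieces; the /people/{name}/{id} pattern and the profile.aspx target are hypothetical, so adapt the regular expression to your own URL scheme:

```csharp
// Global.asax.cs -- a minimal URL-rewriting sketch for ASP.NET on IIS 6.
using System;
using System.Text.RegularExpressions;
using System.Web;

public class Global : HttpApplication
{
    // Match friendly URLs like /people/Aaron-Bronow/500608523
    static readonly Regex PeopleUrl =
        new Regex(@"^/people/[^/]+/(\d+)/?$", RegexOptions.IgnoreCase);

    protected void Application_BeginRequest(object sender, EventArgs e)
    {
        Match m = PeopleUrl.Match(Request.Path);
        if (m.Success)
        {
            // Hand the request to the real page without redirecting,
            // so the friendly URL stays in the visitor's address bar.
            Context.RewritePath("/profile.aspx?id=" + m.Groups[1].Value);
        }
    }
}
```

Because RewritePath happens server-side, search engines and visitors only ever see the friendly URL.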
Microsoft has built a URL Rewrite Module for IIS 7. You should have no trouble finding it and setting it up on your IIS 7 server. The iis.net site has good documentation.
"But I have 10,000 pages of content and dozens of string parameters!"
If this sounds like you (or your site administrator), the process of restructuring all your URLs might feel like translating the complete works of William Shakespeare into lolcat captions. My advice to you is to take baby steps. You don’t need to come up with an SEO term for every parameter. If you have a product database, for example, consider using a rewrite rule that puts the product category in the first position followed by the product name. You can leave your original machine-readable query string at the end. It might look something like this:
http://www.wivesofwindsor.com/products/figurines/William-Shakespeare?ad=123456789
Obviously, this new URL is shorter than the old parameter-laden one, yet it contains more information: search engines will pick up the product category (figurines) and product name (William-Shakespeare), while your data engine still sees "ad" in the query string. Win-win, right?
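On Apache, the baby-step rule described above might look like the following sketch. The product.php script name is an assumption; the point is that the category and name land in named query parameters while the trailing ?ad=123456789 survives untouched:

```apache
# Hypothetical rule: expose /products/{category}/{product-name} while
# the underlying script still receives ordinary query parameters.
#
# /products/figurines/William-Shakespeare?ad=123456789
#   -> /product.php?category=figurines&name=William-Shakespeare&ad=123456789
RewriteEngine On
RewriteRule ^products/([^/]+)/([^/?]+)/?$ product.php?category=$1&name=$2 [L,QSA]
```

The [QSA] flag is what carries the original machine-readable query string (the "ad" parameter) through to your data engine.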
What Are You Waiting For?
Now you know what URL rewriting is and how it's being used (or not used) by some of the top sites. You could be missing out on essential organic search ranking just because you have ugly URLs. Implementing a rewrite engine to restructure your URLs could be crucial to your success. On the other hand, you might run a data-driven site that just has no real competition. In that case, maybe URL rewriting is not for you. Perhaps you can depend on decent page titles and header text to place you in organic searches.

You don't necessarily need to break the bank to implement URL rewriting. The wivesofwindsor.com site is written in C# and took me about an hour to build and deploy. Say you have 5,000 products manually entered into your site. You already have a database and an engine to parse your product IDs and categories. If you drop the product ID and start using a formatted product name in every link, you'll need some kind of look-up method to match the product name from the query string to the product ID. This might be a few hours of work for a decent web developer.

The worst-case scenario is hiring an intern to convert each product name to a query-string-formatted value (like "William Shakespeare Figurine" to "William-Shakespeare"). This allows your database engine to select from the product table using an exact match and reduces the chance of duplicate results. Either way, it shouldn't take more than a few days of planning and development to get your site competitive and highly usable.
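The look-up step mentioned above can be tiny. Here is a hedged C# sketch; the Products table, Slug column, and connection string are all hypothetical, not a real schema:

```csharp
// Hypothetical look-up sketch: resolve a URL slug like
// "William-Shakespeare" back to its product ID.
using System.Data.SqlClient;

public static class ProductLookup
{
    public static int? FindProductId(string slug, string connectionString)
    {
        using (var conn = new SqlConnection(connectionString))
        using (var cmd = new SqlCommand(
            "SELECT ProductId FROM Products WHERE Slug = @slug", conn))
        {
            // Parameterize the query so slugs pulled from the URL
            // can't inject SQL.
            cmd.Parameters.AddWithValue("@slug", slug);
            conn.Open();
            object result = cmd.ExecuteScalar();
            return result == null ? (int?)null : (int)result;
        }
    }
}
```

Selecting on a pre-computed slug column is what makes the exact-match approach cheap: no fuzzy matching at request time, and duplicates are caught once, when the slug is generated.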
One last search example:
emergency
- Google says IMDb has a great title match but a terrible URL: http://www.imdb.com/title/tt0068067/
I'd like to take this opportunity to challenge you, dear reader, to build a movie database site with a clean, sexy layout, decent page-level SEO, and URL rewriting. Let me know how long it takes you to rank higher than IMDb.