

Duplicate Content in a Post-Panda World
In a land ravaged by pandas, one man will teach you everything you need to know about duplicate content. Learn how to spot duplicates in the wild and stop them in their tracks.
Traditionally, the phrase "Technical SEO" refers to optimizing your site for crawling and indexing, but it can also include any technical process meant to improve search visibility.
Technical SEO is a broad and exciting field, covering sitemaps, meta tags, JavaScript indexing, internal linking, keyword research, and more.
If you’re new to SEO, we recommend starting with the chapter on Technical SEO in our Beginner’s Guide. Below are the latest posts on technical SEO, and we’ve included a few top articles here.
On-Site SEO: What are the technical on-page factors that influence your rankings? Our free learning center will get you started in the right direction.
The Web Developer's SEO Cheat Sheet: This handy—and printable—cheat sheet is invaluable for anyone building websites. It contains several useful references covering a ton of technical SEO best practices.
MozBar: This free Chrome extension is an advanced SEO toolbar that helps you examine and diagnose several technical SEO issues.
The Technical SEO Renaissance: Is it true that technical SEO isn't necessary because Google is smart enough to figure your website out? Mike King puts this rumor to rest and shows you what to focus on.
Technical SEO: The One-Hour Guide to SEO: Want a quick introduction to the basics of technical SEO? Our guru Rand has you covered—all in about 10 minutes.
Howdy, Mozzers. This is Russ Jones from Virante, Inc. I recently spoke at the Search Exchange conference in Charlotte, NC on the topic of programmatic, automated SEO solutions and realized that it could probably be more valuable in front of a larger audience. Of course, the attendees have a head start, so you better get to work.
In my own experience working on SEO projects over the last four years, there have been numerous instances where a website undergoes a major revamp, or you take over an ongoing SEO project and discover content indexation issues. The case I am specifically referring to is when you have a large number of old pages that were never 301 redirected or removed with a 404 (not ideal), or just plain old content that lingers on in Google's index because it was merely delinked from the website's internal linking scheme (we've all been there, right?).
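Cleaning up those lingering pages usually starts with permanent redirects. As a rough sketch, here's what that might look like in an nginx config — all paths and the domain are hypothetical, and Apache .htaccess rules would work just as well:

```nginx
# Hypothetical example: permanently redirect retired URLs left over
# from a site revamp so they drop out of the index in favor of the
# new pages.
server {
    listen 80;
    server_name www.example.com;

    # One-to-one mapping for an old page that still ranks or earns links.
    location = /old-services.html {
        return 301 /services/;
    }

    # A whole retired directory folded into its replacement section.
    location ^~ /archive/ {
        rewrite ^/archive/(.*)$ /blog/$1 permanent;
    }
}
```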
Subdomains have often been the bane of many SEO-conscious organizations, but an easy solution might be right under your nose. By using subfolders in place of subdomains, you can unite your content under one domain. While this may seem difficult to do when two sites exist on two different servers, a reverse proxy can make the technical implementation quite simple.
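As a rough illustration of that reverse-proxy approach, an nginx server block can map a subdomain's content into a subfolder even when it lives on a different server. All hostnames below are hypothetical:

```nginx
# Hypothetical example: serve the old blog subdomain's content under
# a /blog/ subfolder on the main domain via a reverse proxy.
server {
    listen 80;
    server_name www.example.com;

    location /blog/ {
        # Forward /blog/* requests to the server that still hosts the
        # blog; the trailing slash on proxy_pass strips the /blog/ prefix.
        proxy_pass http://blog-backend.internal/;
        proxy_set_header Host blog.example.com;
        proxy_set_header X-Forwarded-For $remote_addr;
    }
}
```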
On Monday 8/1, I was searching Google for 'mets tickets' and saw that we had slipped from page 1. Worse, we weren't even on page 2. I tried a few more queries that I knew we should be on page 1 for, and still nothing. My heart was racing. Had we been Panda'd? It didn't make sense, but I was panicked. Then it hit me. I opened up our New York Mets page, but, just like Mike McD, I knew before I even clicked view source...content="noindex" on all of our product pages.
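A stray noindex like that is easy to scan for once you know to look. A minimal sketch — the function name and regexes are illustrative, not from any particular tool:

```javascript
// Minimal sketch: detect a meta robots "noindex" in a page's HTML.
// Checks every <meta> tag so attribute order doesn't matter.
function hasNoindex(html) {
  const metaTags = html.match(/<meta\b[^>]*>/gi) || [];
  return metaTags.some((tag) => {
    // Is this the robots meta tag, and does its content include noindex?
    const isRobots = /name\s*=\s*["']?robots["']?/i.test(tag);
    const noindex = /content\s*=\s*["'][^"']*noindex[^"']*["']/i.test(tag);
    return isRobots && noindex;
  });
}
```

Run it over a crawl of your key pages (product pages, category pages) after every release, and a repeat of this story gets caught before Google notices.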
You may have noticed bulleted snippets popping up in Google SERPs. Here are 5 examples in the wild and some tips on how to get them.
In an effort to get their point across, webmasters will sometimes implement more than one robot control technique to keep the search engines away from a page. Unfortunately, these techniques can sometimes contradict each other: one technique hides the instructions of the other, or link juice is lost.
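One classic contradiction: blocking a page in robots.txt while also giving it a meta noindex. The crawler never fetches a disallowed URL, so it never sees the noindex, and the URL can still end up indexed. A naive sketch of a check for this — real robots.txt parsing also handles Allow rules, wildcards, and per-user-agent groups:

```javascript
// Naive prefix matching against Disallow rules; illustrative only.
function isDisallowed(robotsTxt, path) {
  return robotsTxt
    .split('\n')
    .map((line) => line.trim())
    .filter((line) => /^disallow:/i.test(line))
    .map((line) => line.replace(/^disallow:\s*/i, ''))
    .some((rule) => rule !== '' && path.startsWith(rule));
}

// Flag pages that are both crawl-blocked and tagged noindex: the
// crawler can't see the noindex, so the two directives conflict.
function findContradiction(robotsTxt, path, pageHasNoindex) {
  if (isDisallowed(robotsTxt, path) && pageHasNoindex) {
    return 'noindex is invisible: robots.txt blocks the crawler first';
  }
  return null;
}
```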
Being a Chrome junkie and a keen productivity evangelist, I'm predictably a huge fan of JavaScript bookmarklets. I use them all day long, and over time I've built up a few of my own that I thought I'd share today. What is a JavaScript bookmarklet? A JavaScript bookmarklet is a small piece of JavaScript code that you can execute in your browser by bookmarking a l...
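To make the idea concrete, here's a hypothetical SEO bookmarklet (not one of the author's) that outlines every rel="nofollow" link on the current page, plus a small helper that wraps raw source into the javascript: URL you'd save as a bookmark:

```javascript
// Wrap a snippet of source in an IIFE and URL-encode it so it can be
// saved as a javascript: bookmark.
function toBookmarklet(source) {
  return 'javascript:' + encodeURIComponent('(function(){' + source + '})();');
}

// Example payload: outline nofollow links in red (a common SEO check).
const highlightNofollow =
  'document.querySelectorAll(\'a[rel~="nofollow"]\')' +
  '.forEach(function(a){a.style.outline="2px solid red";});';

const bookmarkletUrl = toBookmarklet(highlightNofollow);
```

Save `bookmarkletUrl` as a bookmark's address, click it on any page, and the browser runs the payload against that page's DOM.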
Recently Firefox automatically updated to version 5, and with that update came a nightmarish scenario: virtually every Firefox SEO add-on suddenly ceased to function. By now many of these add-ons have been repaired, but at the time I was rescued by a side project of mine – a portable SEO Browser designed to run from a thumbdrive, complete...
Developers and technical SEOs have heard the search engine mouthpieces say it over and over: "Make pages primarily for users, not for search engines". If you ask me, there's one big reason why "primarily" sneaks itself into that statement: Faceted Navigation. Let's discuss how to provide a great user experience AND a search engine friendly faceted navigation.
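What "search engine friendly" means varies by site, but one common faceted-navigation pattern is to allow indexing only for shallow facet combinations and noindex the deep ones. A hypothetical sketch — the facet names and the one-facet threshold are assumptions, not a universal rule:

```javascript
// Hypothetical rule: index category pages with at most one facet
// applied; noindex deeper combinations like ?color=red&size=9&brand=x,
// which multiply into near-duplicate crawl traps.
function isIndexableFacetUrl(url) {
  const facets = ['color', 'size', 'brand'];
  const params = new URL(url).searchParams;
  const applied = facets.filter((f) => params.has(f));
  return applied.length <= 1;
}
```

A template would consult a check like this when deciding whether to emit a meta robots noindex (or a canonical back to the unfaceted category) on each faceted page.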
In a follow-up to my catastrophic canonicalization experiment, I take on 6 new, extreme challenges. Some work, some don't, and we all learn along the way (except me).
Dear SEO friends, Recently I considered implementing the Facebook Comments Box in several sites. I tried to consider the pros and cons. The obvious pros were (there are more): By default comments will be posted to the user's wall (though they can disable this)...
When Larry Page and Sergey Brin invented the PageRank algorithm way back in the late 90s, they struck gold and were smart enough to found Google to capitalize on its very real potential to change search. For many years Google even told us the PageRank number for each web page...
Every now and again, search engines love to throw our merry band of SEO types the occasional curveball and keep us on our toes with new toys and updates. Yesterday was one such day for the world of structured data in web page design.