Technical SEO

Traditionally, the phrase Technical SEO refers to optimizing your site for crawling and indexing, but can also include any technical process meant to improve search visibility.

Technical SEO is a broad and exciting field, covering everything from sitemaps and meta tags to JavaScript indexing, linking, keyword research, and more.

If you’re new to SEO, we recommend starting with the chapter on Technical SEO in our Beginner’s Guide. Below you’ll find our latest posts on technical SEO, along with a few of our top articles to get you started.

On-Site SEO: What are the technical on-page factors that influence your rankings? Our free learning center will get you started in the right direction.

The Web Developer's SEO Cheat Sheet: This handy—and printable—cheat sheet is invaluable for anyone building websites. It contains several useful references that cover a ton of technical SEO best practices.

MozBar: This free Chrome extension is an advanced SEO toolbar that helps you examine and diagnose several technical SEO issues.

The Technical SEO Renaissance: Is it true that technical SEO isn't necessary because Google is smart enough to figure your website out? Mike King puts this rumor to rest and shows you what to focus on.

Technical SEO: The One Hour Guide to SEO: Want a quick introduction to the basics of technical SEO? Our guru Rand has you covered—all in about 10 minutes.

Most Recent Articles on Technical SEO

The World Series Spidering Problem
Will Critchlow

The issue I want to talk about is geo-delivery, i.e. delivering different content to different visitors depending on their geographic location. When you don't have more information about the visitor (from sign-up information, a cookie, etc.), the only way of doing this is by determining their location from their IP address. Whenever you start talking about selectively delivering content based on IP address, the topic of cloaking inevitably comes up.
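To make the mechanics of geo-delivery concrete, here is a minimal sketch (ours, not from the original post); the lookup table is a hypothetical stand-in for a real IP-to-country geolocation database, and the content strings are purely illustrative:

```python
import ipaddress

# Hypothetical stand-in for a real geolocation database; a production
# site would query a maintained IP-to-country dataset instead.
GEO_TABLE = {
    ipaddress.ip_network("203.0.113.0/24"): "US",
    ipaddress.ip_network("198.51.100.0/24"): "GB",
}

CONTENT_BY_COUNTRY = {
    "US": "World Series coverage",
    "GB": "Football coverage",
}

def content_for(ip: str, default: str = "Global coverage") -> str:
    """Return region-specific content based on the visitor's IP address."""
    addr = ipaddress.ip_address(ip)
    for network, country in GEO_TABLE.items():
        if addr in network:
            return CONTENT_BY_COUNTRY.get(country, default)
    return default
```

The cloaking question arises precisely because this function gives a search engine spider (whose IP falls in some region) a different page than other visitors see.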

SEO-Friendly FLEX Websites
V

When building a FLEX (Flash) application that must be available to users via the Internet, the question always comes up: will the website containing the application be SEO friendly? The answer is no. Google and other search engines cannot see inside your FLEX (Flash) website/application and index your pages. If SEO is important to you, and it should be important, you...

Removing ?PHPSESSID from a URL
T

You’ve worked hard to prevent any duplicate content on your website: no copy-paste in your copy, no two URLs returning the exact same webpage due to incorrect usage of mod_rewrite, and so on. But then, a couple of days after the big launch, Google starts getting cluttered with dozens of references to your website, which harms your rankings. All due to an incorrect and insecure alternat...
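The excerpt is cut off, but the usual cure for PHPSESSID-cluttered URLs is a pair of standard PHP session settings that keep the session ID in a cookie and out of the URL. A minimal sketch (the original post's exact fix may differ):

```ini
; php.ini (or via ini_set() before session_start())
; Store the session ID in a cookie only, never in the URL:
session.use_only_cookies = 1
; Disable transparent SID rewriting, which appends ?PHPSESSID= to links:
session.use_trans_sid = 0
```

With these in place, cookieless visitors (including search spiders) simply get a fresh session per request instead of session-ID-laden URLs that search engines index as duplicates.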

Controlling Search Engine Access with Cookies & Session IDs
Rand Fishkin

We've talked plenty in the past about methods to control search engine spiders' access to documents on your website, and we've discussed cloaking in depth several times as well. But I feel that an under-utilized and extremely powerful methodology for serving unique content in different ways to visitors and search engines, based on the different experiences sought by the two, is critical to advanced ...
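The excerpt is truncated, but one widely discussed application of cookie-based access control is serving full, indexable content to cookieless requests (search spiders don't accept or return cookies) while cookied visitors get the gated or personalized view. A minimal sketch, with names that are illustrative rather than taken from the post:

```python
def select_view(cookies: dict) -> str:
    """Choose which version of a page to serve based on cookie presence.

    Search engine spiders don't retain cookies, so a request arriving
    without a session cookie receives the full, indexable document;
    a cookied (returning) visitor gets the gated member experience.
    """
    if "session_id" not in cookies:
        return "full indexable article"
    return "gated member view"
```

The appeal of this approach over user-agent or IP detection is that it keys off visitor behavior rather than trying to enumerate spiders.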

How to Make a Spider Crawl Backwards
Brian Brown

First off, we need to begin with the usual disclaimers and warnings -- not that they are necessarily necessary -- but just in case: your mileage may vary, no claims made, use at your own risk, don't try this at home, performed on a closed track with a professional driver, and so on. And please note that these ideas are strictly conceptual, at least as far as I'm aware, and have not been te...

Robots Exclusion Protocol 101
Sebastian

The Robots Exclusion Protocol (REP) is a conglomerate of standards that regulate Web robot behavior and search engine indexing. Despite the "Exclusion" in its name, the REP covers mechanisms for inclusion too. The REP consists of: the original REP from 1994 (http://www.robotstxt.org/orig.html), extended ...
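To make the exclusion and inclusion mechanisms concrete, here is a minimal robots.txt sketch; the paths are purely illustrative, and note that Allow and Sitemap are later extensions rather than part of the 1994 original:

```
User-agent: *
Disallow: /admin/
Allow: /admin/public/
Sitemap: https://www.example.com/sitemap.xml
```

Disallow excludes a path from crawling, Allow carves out an exception beneath it, and the Sitemap line points crawlers at URLs you want included.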

A Comprehensive Guide to Hidden Text & Search Engines
Eric Enge

Hidden text is one of the challenges faced by webmasters and search engines. Spammers continue to use hidden text to stuff keywords into their pages for the purpose of artificially boosting their rankings. Search engines seek to figure out when spammers are doing this, and then take appropriate action. For the average everyday webmaster, one challenge is that there are many ways to c...
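As illustration (these examples are ours, not from Eric's guide), text can end up hidden in many ways, some plainly spammy and some with legitimate uses:

```html
<!-- Classic spam: text no visitor ever sees -->
<div style="display:none">cheap widgets best widgets buy widgets</div>

<!-- Foreground color matching the background -->
<p style="color:#fff; background:#fff">more stuffed keywords</p>

<!-- A legitimate-looking case: CSS image replacement of a heading -->
<h1 style="text-indent:-9999px; background:url(logo.png)">Acme Widgets</h1>
```

The last pattern is the tricky one: the same technique serves both accessible image replacement and keyword stuffing, which is exactly why intent matters to search engines.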

Me vs. Web Developers: How Do We Get on the Same Page?

Since I started working at "place company name here", specialising in search marketing, I have been in a constant battle with the web developers. (Still a newbie, I am mostly a web designer needing to learn about SEO, so I got a job in online marketing.) For two years the website has been in development, and since I joined the company 5 months ago, we are still unable to do th...

How to Deal with Pagination & Duplicate Content Issues
Rand Fishkin

This post has been a long time coming - there's a definite need to address internal duplicate content issues in an intelligent, simple, illustrated manner. I'm hoping that you'll be able to show this to the boss (or, more likely, the engineering team) and help them achieve that critical "AHA!" moment required for all true progressive changes. Fundamentally at issue is the crea...
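The excerpt cuts off, but one widely used remedy for near-duplicate paginated, sorted, or filtered URLs is the rel="canonical" link element; a minimal sketch with illustrative URLs (not necessarily the approach Rand describes in the post):

```html
<!-- On a sorted/filtered variant of a category page, point search
     engines at the preferred URL to consolidate duplicate signals -->
<link rel="canonical" href="https://www.example.com/widgets/">
```

Each duplicate variant declares the preferred URL in its head, so ranking signals accrue to one page instead of being split across dozens of parameterized copies.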