![Logic, Meet Google - Crawling to De-index](https://moz.rankious.com/_moz/images/blog/banners/search-engines-5511dd3_2021-03-12-080918.png?w=1180&h=400&auto=compress%2Cformat&fit=crop&dm=1615536558&s=30b5b791c5675c219d763dfa25de29f7)
![Dr. Peter J. Meyers](https://moz.rankious.com/_moz/images/user/photo/22897-1429572425_2021-03-30-201914.jpg?w=160&h=160&auto=compress%2Cformat&fit=crop&dm=1617135554&s=28f43e08964b58ad5e5856feffbec91a)
Logic, Meet Google - Crawling to De-index
Since Panda, more and more people are trying to get a handle on their indexed pages. I discuss a common de-indexation mistake: blocking crawl paths to the pages you want out of the index.
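As an illustration of that mistake (not code from the post): if a URL is disallowed in robots.txt, Googlebot can't recrawl it, so it never sees an on-page noindex and the URL can linger in the index. The minimal Python sketch below uses the standard-library urllib.robotparser to flag that conflict; the robots.txt rules and example.com URLs are hypothetical.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt: the /old-campaign/ section is blocked from crawling.
ROBOTS_TXT = """\
User-agent: *
Disallow: /old-campaign/
"""

# Hypothetical URLs we want out of the index, each carrying a meta robots noindex.
NOINDEXED_URLS = [
    "https://www.example.com/old-campaign/landing-1",
    "https://www.example.com/old-campaign/landing-2",
]

rp = RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

for url in NOINDEXED_URLS:
    if not rp.can_fetch("Googlebot", url):
        # Blocked pages can't be recrawled, so Googlebot never sees the noindex
        # tag -- the URL can stay in the index anyway.
        print(f"Conflict: {url} is noindexed but blocked by robots.txt")
```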
Traditionally, the phrase Technical SEO refers to optimizing your site for crawling and indexing, but it can also include any technical process meant to improve search visibility.
Technical SEO is a broad and exciting field, covering everything from sitemaps and meta tags to JavaScript indexing, linking, and keyword research.
If you’re new to SEO, we recommend starting with the chapter on Technical SEO in our Beginner’s Guide. We’ve included a few top resources here, followed by the latest posts on technical SEO.
On-Site SEO: What are the technical on-page factors that influence your rankings? Our free learning center will get you started in the right direction.
The Web Developer's SEO Cheat Sheet: This handy—and printable—cheat sheet is invaluable for anyone building websites. Contains several useful references that cover a ton of technical SEO best practices.
MozBar: This free Chrome extension is an advanced SEO toolbar that helps you to examine and diagnose several technical SEO issues.
The Technical SEO Renaissance: Is it true that technical SEO isn't necessary, because Google is smart enough to figure your website out? Mike King puts this rumor to rest, and shows you what to focus on.
Technical SEO: The One Hour Guide to SEO: Want a quick introduction to the basics of technical SEO? Our guru Rand has you covered—all in about 10 minutes.
“It’s official, Google is broken and my career is over. Time to hide under my desk.” A bit extreme? Yes. But, if you saw what I saw a month ago, your reaction would’ve been exactly the same. Let me explain.
Rich snippets -- we see them everywhere in the SERPs, with some verticals having a higher abundance of them than others. For the average searcher, these rich snippets help show that what they're searching for is within reach on a particular site.
Domain migrations are one of those activities that can benefit an SEO process in the long term -- especially if the new domain is more relevant, already has high authority, or gives better geolocation signals with a ccTLD -- but they also represent a risk for SEO, because many tasks must be performed correctly to avoid non-trivial crawling and indexing problems and the consequent loss of rankings and organic traffic.
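As a hedged illustration of one such task (not taken from the post): after a migration, each old-domain URL should return a single 301 to its new-domain equivalent. The Python sketch below checks a small redirect map using only the standard library; the hostnames and paths are hypothetical placeholders.

```python
import http.client

# Hypothetical redirect map: old-domain paths and the new-domain URLs
# they should 301 to after the migration.
OLD_HOST = "www.old-example.com"
REDIRECT_MAP = {
    "/services/seo": "https://www.new-example.co.uk/services/seo",
    "/blog/old-post": "https://www.new-example.co.uk/blog/old-post",
}

for path, expected in REDIRECT_MAP.items():
    conn = http.client.HTTPSConnection(OLD_HOST, timeout=10)
    conn.request("HEAD", path)  # HEAD is enough; we only need the status and Location
    resp = conn.getresponse()
    status, location = resp.status, resp.getheader("Location", "")
    conn.close()
    if status == 301 and location == expected:
        print(f"OK   {path} -> {location}")
    else:
        print(f"FIX  {path}: got {status} -> {location or '(no Location header)'}")
```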
Building websites using AJAX to load content can make them fast, responsive, and very user friendly. However, it hasn't always been possible to do this without introducing # or #! symbols into URLs - and breaking the way URLs are 'supposed' to work. The method outlined here will let you build fast AJAX-based websites that also work well for SEO.
It's Organic and It's Local: We're Not Talking Vegetables
One of the biggest challenges many of my clients face is putting the right SEO processes in place, so that any problems are quickly accounted for before they lead to bigger issues. Below are three things you should consider when trying to create a more streamlined process for making sure the technical foundation of the site is solid. Though none are considered "quick"...
I've deliberately put myself in some hot water to demonstrate how I would do a technical SEO site audit in 1 hour to look for quick fixes (and I've actually timed myself just to make it harder). For the pros out there, here's a look into a fellow SEO's workflow; for the aspiring, here's a base set of checks you can do quickly. I've got some lovely volu...
In this post I will explain how to handle cases of planned downtime. That is, a short period of time wherein you purposely make your website inaccessible. This can be due to significant changes to the site or because of server maintenance.
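The widely recommended way to signal planned downtime to search engines is an HTTP 503 (Service Unavailable) status with a Retry-After header, so crawlers treat the outage as temporary. The Python sketch below shows that general idea with the standard-library http.server; the one-hour Retry-After value and port are arbitrary placeholders, and this is not the post's own code.

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

MAINTENANCE_SECONDS = 3600  # placeholder: how long the downtime is expected to last

class MaintenanceHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # 503 tells crawlers the outage is temporary; Retry-After hints when it's
        # worth coming back (unlike a 404 or a "we're down" page served with a 200).
        self.send_response(503)
        self.send_header("Retry-After", str(MAINTENANCE_SECONDS))
        self.send_header("Content-Type", "text/html; charset=utf-8")
        self.end_headers()
        self.wfile.write(b"<h1>Down for planned maintenance</h1>")

    def do_HEAD(self):
        # Same status and headers, no body, for HEAD requests.
        self.send_response(503)
        self.send_header("Retry-After", str(MAINTENANCE_SECONDS))
        self.end_headers()

if __name__ == "__main__":
    HTTPServer(("", 8080), MaintenanceHandler).serve_forever()
```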
If you've experienced the joys of doing SEO on an exceedingly large site, you know that keeping your content in check isn't easy. Continued iterations of the Panda algorithm have made this fact brutally obvious for anyone who's responsible for more than a few hundred thousand pages.
A website's code is like a play that tells a story to the search engine. If you have ink blotches and pages ripped or missing from the script, it is hard for the search engine to understand the plot -- and if the search engine misses the plot, it cannot tell others about it.
Matt Cutts announced at Pubcon that Googlebot is "getting smarter." He also announced that Googlebot can crawl AJAX to retrieve Facebook comments, coincidentally only hours after I unveiled Joshua Giardino's research at SearchLove New York suggesting that Googlebot is actually a headless browser based on the Chromium codebase. I'm going to challenge Matt Cutts's statements: Googlebot hasn't just recently gotten smarter; it actually hasn't been a text-based crawler for some time now -- nor have BingBot or Slurp, for that matter. There is evidence that search robots are headless web browsers, and that the search engines have had this capability since 2004.
ASP.NET, generally speaking, is a web spider's worst nightmare. For SEO, it's pretty much the devil incarnate unless developers know how to leverage the framework to strip out the problems. So much so that it requires an entire article dedicated to its problems and potential solutions.
In a land ravaged by pandas, one man will teach you everything you need to know about duplicate content. Learn how to spot duplicates in the wild and stop them in their tracks.
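As a rough, hedged sketch of what "spotting duplicates" can look like in practice (not the post's own method): fingerprint each page's extracted text and group URLs whose fingerprints collide. The Python example below runs on hypothetical crawl data; real duplicate detection typically also needs near-duplicate checks (e.g. shingling or similarity scoring) on top of exact matches.

```python
import hashlib
from collections import defaultdict

# Hypothetical crawl output: URL -> extracted body text.
PAGES = {
    "/widgets/blue": "Our blue widget is the best widget for every budget.",
    "/widgets/blue?ref=nav": "Our blue widget is the best widget for every budget.",
    "/widgets/red": "Our red widget comes in three sizes and ships free.",
}

def fingerprint(text: str) -> str:
    """Normalize whitespace and case, then hash, so identical bodies collide."""
    normalized = " ".join(text.lower().split())
    return hashlib.md5(normalized.encode("utf-8")).hexdigest()

groups = defaultdict(list)
for url, body in PAGES.items():
    groups[fingerprint(body)].append(url)

for urls in groups.values():
    if len(urls) > 1:
        print("Duplicate content across:", ", ".join(urls))
```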