What tool do you use to check for URLs not indexed?
-
What is your favorite tool for getting a report of URLs that are not cached/indexed in Google and Bing for an entire site? Basically, I want a list of URLs not cached in Google and a separate list for Bing.
Thanks,
Mark
-
I've had good results using Google Search Console for checking which URLs are indexed. It's pretty straightforward and gives a clear overview of any indexing issues.
-
I can work on building this tool if there's enough interest.
-
This post from Distilled mentions that the SeoTools for Excel plugin has an "Indexation Checker":
https://www.distilled.net/blog/seo/awesome-examples-of-how-to-use-seotools-for-excel/
Alas, after downloading and installing, it appears this feature was removed...
-
Unless I'm missing something, there doesn't seem to be a way to get Google to show more than 100 results on a page. Our site has about 8,000 pages, and I don't relish the idea of manually exporting 80 SERPs.
-
Annie Cushing from Seer Interactive made an awesome list of all the must-have tools for SEO.
You can get it from her link which is http://bit.ly/tools-galore
In the list there is a tool called ScrapeBox, which is great for this. In fact, the software has many uses; it is also useful for sourcing potential link partners.
-
I would suggest using the Website Auditor from Advanced Web Ranking. It can parse 10,000 pages and will tell you a lot more than just whether a page is indexed by Google or not.
-
Hmm... I thought there was a way to pull those SERP URLs into Google Docs using a function of some sort?
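If memory serves, the trick was the ImportXML function in a Google Docs spreadsheet. Something like the line below, where example.com is a stand-in for your own domain and the XPath matches the result-link markup Google used at the time. This is only a sketch: Google changes its markup and often blocks this kind of automated request.

=IMPORTXML("https://www.google.com/search?q=site:example.com&num=100", "//h3/a/@href")

If it works, it pulls the result links into a column that you can then compare against your full URL list.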
-
I don't think you need any tool for this; you can go directly to google.com and search: site:www.YourWebsiteName.com or site:www.YourWebsiteName.com/directory. I think this is the best option to check whether your website has been crawled by Google or not.
-
I do something similar but use Advanced Web Ranking: use site:www.domain.com as your phrase, run it to retrieve 1,000 results, and generate a Top Site Report in Excel to get the indexed list.
Also remember that you can do it on sub-directories (or partial URL paths) as a way to get more than 1,000 pages from the site. In general I run it once with site:www.domain.com, then identify the most frequent sub-directories, add those as additional phrases to the project, and run it a second time, i.e.: site:www.domain.com, site:www.domain.com/dir1, site:www.domain.com/dir2, etc.
Still not definitive, but I think it does give an indication of where the value is.
-
David Kauzlaric has, in my opinion, the best answer. If Google hasn't indexed it and you've investigated your Google Webmaster account, then there isn't anything better out there as far as I'm concerned. It's by far the simplest, quickest and easiest way to check a SERP result.
re: David Kauzlaric
We built an internal tool to do it for us, but basically you can do this manually.
Go to Google and type in "site:YOURURLHERE" without the quotes. You can check a certain page, a site, a subdomain, etc. Of course, if you have thousands of URLs this method is not ideal, but it can be done.
Cheers!
-
I concur, Xenu is an extremely valuable tool that I use daily. Also, once you get a list of all the URLs on your site, you can compare the two lists in Excel (the two lists being the Xenu page list for your site and the list of pages that have been indexed by Google).
-
Nice solution Kieran!
I use the same method to compare the URL list from the Screaming Frog output with the URL Found column from my keyword ranking tool - of course it doesn't catch all pages that might be indexed.
The intention is not really to get a complete list, more to flush out pages that need work.
-
I agree, this is not automated, but so far, from what we know, it looks like a nice and clean option. Thanks.
-
Saw this and tried the following, which isn't automated but is one way of doing it.
- First install SEO Quake plugin
- Go to Google
- Turn off Google Instant (http://www.google.com/preferences)
- Go to Advanced Search and set the number of results you want displayed (estimate the number of pages on your site)
- Then run your site:www.example.com search query
- Export this to CSV
- Import to Excel
- Then do a Text to Columns conversion (under the Data tab) using ; as the delimiter (this is the CSV delimiter)
- This gives you a formatted list.
- Then import your sitemap.xml into another TAB in Excel
- Run a vlookup between the URL tabs to flag which URLs are on the sitemap, or vice versa (a scripted version of this comparison is sketched below).
Not exactly automated but does the job.
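If you'd rather script that last comparison step instead of doing the vlookup, here is a minimal Python sketch of the same idea. It assumes you saved the SEO Quake export as serp_export.csv with the URLs in the first column (semicolon-delimited, as noted above) and that sitemap.xml is a standard <urlset> file; both filenames are just placeholders.

import csv
import xml.etree.ElementTree as ET

# URLs Google returned for the site: query (first column of the export,
# which SEO Quake writes with ; as the delimiter)
with open("serp_export.csv", newline="") as f:
    indexed = {row[0].strip() for row in csv.reader(f, delimiter=";") if row}

# URLs you expect to be indexed, pulled from the sitemap
ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
expected = {
    loc.text.strip()
    for loc in ET.parse("sitemap.xml").findall(".//sm:loc", ns)
}

# Anything in the sitemap but missing from the SERP export needs a look
for url in sorted(expected - indexed):
    print("Not in SERP export:", url)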
-
Curious about this question also; it would be very useful to see a master list of all URLs on our site that are not indexed by Google, so that we can see what aspects of those pages are lacking and what we need to do to get them indexed.
-
I usually just use Xenu's Link Sleuth (if you have thousands of pages) to list out all the URLs you have, and I would then manually check them, but I haven't come across an automated tool yet. If anyone knows of any, I'd love to know as well.
-
Manual is a no-go for large sites. If someone knows a tool like this, it would be cool to know which one and where to find it. Or... this would make a cool SEOmoz Pro tool.
-
My bad - you are right that it doesn't display the actual URLs. So I guess the best thing you can do is site:examplesite.com and see what comes up.
-
That will tell you the number indexed, but it still doesn't tell you which of those URLs are or are not indexed. I think we all wish it would!
-
I would use Google Webmaster Tools as you can see how many URLs are indexed based on your sitemap. Once you have that, you can compare it to your total list. The same can be done with Bing.
-
Yeah, I do it manually now, so I was looking for something more efficient.
-
We built an internal tool to do it for us, but basically you can do this manually.
Go to Google and type in "site:YOURURLHERE" without the quotes. You can check a certain page, a site, a subdomain, etc. Of course, if you have thousands of URLs this method is not ideal, but it can be done.
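Since a few people asked, here is a very rough Python sketch of the kind of check an internal tool like ours does, scripted over a list of URLs. To be clear, this is an illustration, not our actual code: Google throttles and CAPTCHAs automated queries very quickly, so treat it as a starting point only. It assumes Google's HTML contains the phrase "did not match any documents" when a site: query returns nothing, which can change at any time.

import time
import urllib.parse
import requests

def is_indexed(url):
    # Run a site: query for the exact URL; an empty result set
    # is a strong hint the page is not in Google's index.
    query = urllib.parse.quote("site:" + url)
    resp = requests.get(
        "https://www.google.com/search?q=" + query,
        headers={"User-Agent": "Mozilla/5.0"},  # the default UA gets rejected
        timeout=10,
    )
    resp.raise_for_status()
    return "did not match any documents" not in resp.text

urls = ["http://www.example.com/", "http://www.example.com/some-page"]
for url in urls:
    print(url, "->", "indexed" if is_indexed(url) else "NOT indexed")
    time.sleep(5)  # go slowly, or Google blocks you even sooner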