Testing Googlebot Visit Frequency

This YouMoz entry was submitted by one of our community members. The author’s views are entirely their own (excluding an unlikely case of hypnosis) and may not reflect the views of Moz.

In theory, we know which factors influence Googlebot visit frequency: page link popularity, domain popularity (or trust), the frequency of changes on a given website, and others. Theory is good, but as usual, we wanted to test it in practice.

But first, a question: why is Googlebot visit frequency so important? If we already rank first on the SERPs for the phrases we targeted, why would we want our pages crawled any more often? The answer depends on the nature of the content. If the content is static and rarely refreshed or added to, we do not need more Googlebot visits. But frequency is extremely important for often-updated websites, such as news sites or classifieds, where content is constantly being created. In these cases, more Googlebot visits mean our content is fetched and indexed by Google faster, which brings more long-tail traffic.

We conducted the experiment on a Polish classifieds website (www.morusek.pl). A small upgrade to the application enabled us to log every Googlebot visit, along with the URI of the visited page and a timestamp. The experiment ran for 30 days.
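As an illustration only: the sketch below shows one way to collect such data by filtering a standard combined-format web server access log for Googlebot requests, rather than instrumenting the application itself. The log format and file path are assumptions, not details of the original setup (and note that the user-agent string can be spoofed; a stricter version would verify the crawler via reverse DNS).

```python
import re

# Apache/Nginx "combined" log format (an assumed format, not
# necessarily what morusek.pl actually used).
LOG_LINE = re.compile(
    r'\S+ \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<uri>\S+) [^"]*" '
    r'\d+ \S+ "[^"]*" "(?P<agent>[^"]*)"'
)

def googlebot_visits(log_path):
    """Yield a (timestamp, uri) pair for every request whose
    user agent claims to be Googlebot."""
    with open(log_path, encoding="utf-8", errors="replace") as log:
        for line in log:
            match = LOG_LINE.match(line)
            if match and "Googlebot" in match.group("agent"):
                yield match.group("time"), match.group("uri")
```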

Some general data first: we can see how Google's attention is spread among the pages. The most popular one, the home page, was visited about 20 times per day. The next two most popular pages were visited about 15 times per day, roughly the next 80 pages were visited at least once a day, and the remaining 370,000 pages got less than that (in the majority of cases, a single visit over the whole experiment).

Googlebot visit frequency for the top 150 pages
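For illustration, the per-page frequency behind a distribution like this can be derived from the logged (timestamp, URI) pairs with a simple aggregation. This reuses the hypothetical googlebot_visits() sketch above; only the 30-day experiment length comes from the post.

```python
from collections import Counter

EXPERIMENT_DAYS = 30  # length of the experiment, per the post

def daily_frequency(visits):
    """Map each URI to its average Googlebot visits per day.

    `visits` is any iterable of (timestamp, uri) pairs, e.g. the
    output of the googlebot_visits() sketch above.
    """
    counts = Counter(uri for _, uri in visits)
    return {uri: n / EXPERIMENT_DAYS for uri, n in counts.items()}

# Print the 150 most frequently crawled pages:
freq = daily_frequency(googlebot_visits("access.log"))
for uri, f in sorted(freq.items(), key=lambda kv: kv[1], reverse=True)[:150]:
    print(f"{f:6.2f} visits/day  {uri}")
```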

The main aim of the experiment was to find out which factors influence Googlebot visit frequency the most. We compared the frequency for the top 300 pages, excluding ad detail pages, because, being temporary by nature, they would not have accumulated sufficient data such as mozRank or internal link counts.

The frequency was compared against three factors: internal link count and external link count (according to Google Webmaster Tools), and mozRank. To evaluate their influence, we calculated the Pearson correlation coefficient of each factor against the frequency. A coefficient of 1 would mean perfect positive correlation, while 0 would mean no correlation at all (values down to -1 indicate negative correlation, which we did not expect here).
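For concreteness, here is a minimal sketch of that calculation; the numbers are made-up placeholders, not the experiment's data:

```python
from scipy.stats import pearsonr

# Placeholder per-page values (visits/day and internal link
# counts for the same pages) -- not the real experimental data.
frequency      = [20.0, 15.2, 14.8, 6.1, 3.3, 1.0]
internal_links = [5400, 3100, 2900, 1200, 800, 150]

r, p_value = pearsonr(internal_links, frequency)
print(f"Pearson r = {r:.2f} (p = {p_value:.3f})")
```

With plain NumPy, numpy.corrcoef(internal_links, frequency)[0, 1] yields the same coefficient.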

The most obvious and expected factor, mozRank, had a correlation coefficient of 0.6. It was measured on the top 20 URIs only, because less popular pages usually had insufficient data.

Googlebot visit frequency vs. mozRank

The coefficients were much higher for internal links (0.86) and external links (0.7), both measured on the top 300 pages.

Correlation coefficient of Googlebot visit frequency vs. different features

We can see that internal links have the biggest coefficient, with values decreasing slightly as more pages are taken into account. So we can conclude that the plain number of internal links, in other words the internal cross-linking, has the biggest influence on Googlebot visit frequency.

We also wanted to check whether the popularity of the linking page would be more effective at increasing visit frequency than sheer link numbers. We put links to less popular pages on the home page, but it brought almost no increase (from roughly 0 visits/day to 0.1 visits/day).

But that is not all. While analyzing the dependence of visit frequency on internal link count, we noticed two strange pages:

Googlebot visit frequency vs. internal links count

We can see two peaks in internal link count with relatively low visit frequency. These two pages are in fact the two first-level categories ("Dogs" and "Cats"), which have lots of external links and a great number of internal links (for example, from the breadcrumbs of every dog- or cat-related category and ad). These pages have far more links than their subcategories, yet some of those subcategories are visited more frequently (such as the second and third most popular pages: "Dogs -> Dogs for sale" and "Dogs -> Dogs for adoption").

My guess as to why these pages get far fewer Googlebot visits is that, being top categories, they contain many featured ads, which appear first on the listing pages; this in turn causes the page content as a whole to change less frequently (featured ads are added less often than normal ads and stay for 2-7 days, depending on the purchased option). In lower-level categories the number of featured ads decreases, as they are spread among the different subcategories.

But, going back to our correlation coefficients: if we exclude these two pages, the coefficient for internal link count rises to 0.92, which is very close to the ideal value of 1.

Correlation coefficient of Googlebot visit frequency vs. different features
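Continuing the hypothetical sketch, excluding the two outlier pages before recomputing is straightforward; both the URIs and the numbers below are stand-ins, not the measured data.

```python
from scipy.stats import pearsonr

# Hypothetical (uri, visits/day, internal links) triples; the
# URIs and all values are stand-ins, not the measured data.
page_data = [
    ("/dogs", 4.0, 9500), ("/cats", 3.5, 9100),
    ("/dogs/for-sale", 15.2, 3100), ("/dogs/for-adoption", 14.8, 2900),
    ("/cats/for-sale", 6.1, 1200), ("/rodents", 1.0, 150),
]
OUTLIERS = {"/dogs", "/cats"}  # the two first-level category pages

kept = [(f, n) for uri, f, n in page_data if uri not in OUTLIERS]
r, _ = pearsonr([n for _, n in kept], [f for f, _ in kept])
print(f"Pearson r without the outliers = {r:.2f}")
```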


Conclusions

Of course, to be more certain, we should run more experiments like this on different websites and different applications. Nevertheless, based on these results, we may conclude that, among the pages of a single website, the number of internal links pointing to a page has the biggest influence on its Googlebot visit frequency. So if we want to increase the number of Googlebot visits, we should strengthen the internal cross-linking. This conclusion may seem obvious, but it is still nice to have it checked empirically.

The second conclusion is definitely more revealing: promoting ads by placing them at the top of result pages may hinder indexation and cause other ads to be passed over by Google. The solution, of course, is not to remove the feature, but to find other ways to improve indexation (such as related-ads sections, or alternative listings that do not contain premium ads).
