
Using Twitter as a Sitemap


This YouMoz entry was submitted by one of our community members. The author’s views are entirely their own (excluding an unlikely case of hypnosis) and may not reflect the views of Moz.

Introduction

Almost since Twitter launched, SEO experts have been speculating about how it could help with their SEO projects. The most obvious use, direct link building, was dismissed immediately because nofollow attributes are added to all external links. What remained was indirect link building: growing a social network and link baiting.

However, there has been speculation recently that Google may treat nofollow differently depending on the website. This, together with the fact that Google now has access to Twitter's database, prompted us to test whether URLs posted on Twitter are really ignored by Google, as the nofollow attribute officially suggests.

Research and methodology

We ran our research on http://www.morusek.pl/, a Polish pet-related classifieds website with roughly 50 ads published every day, and its Twitter profile, http://twitter.com/morusek.

The profile is updated every 30 minutes with automatic posts containing the ad title, part of the description, and a bit.ly-shortened URL. For the purpose of the research we changed the published URLs so that they contained ?utm_source=twitter. This way we could track Googlebot visits to ad detail pages both with and without this suffix. If Googlebot visited a URL containing utm_source=twitter, it meant the URL had been found via Twitter, as such URLs were not displayed anywhere else, on or outside the website.
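For illustration, here is a minimal Python sketch of this kind of URL tagging and log filtering. It is not the code used in the study; the function names and the combined-log-format parsing are assumptions.

from typing import Optional
from urllib.parse import urlencode, urlparse, parse_qs

def tag_for_twitter(url: str) -> str:
    # Append utm_source=twitter so Twitter-originated crawls stand out in the logs
    separator = "&" if urlparse(url).query else "?"
    return url + separator + urlencode({"utm_source": "twitter"})

def classify_googlebot_hit(log_line: str) -> Optional[str]:
    # Rough classifier for one access-log line in combined format
    if "Googlebot" not in log_line:
        return None                                        # not a Googlebot request
    request_path = log_line.split('"')[1].split()[1]       # '... "GET /path HTTP/1.1" ...'
    params = parse_qs(urlparse(request_path).query)
    return "twitter" if params.get("utm_source") == ["twitter"] else "site"

print(tag_for_twitter("http://www.morusek.pl/ad/12345"))
# http://www.morusek.pl/ad/12345?utm_source=twitter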

[Figure: Morusek's profile on Twitter]

Results

After a few days we had data for roughly 190 ads: the date and time of submission, the first Googlebot visit via links published on the website (without utm_source...), and the first Googlebot visit via links published on Twitter.com (with utm_source...).

The first and most important conclusion is that there were visits from Twitter links. In fact, almost every ad received at least one visit to the URL containing utm_source=twitter. This means that Google takes URLs posted on Twitter into account, despite the fact that they carry nofollow, and sends Googlebot to fetch their content.

That is not all! After analyzing the data, it turned out that visits via Twitter links came roughly 186 minutes sooner than visits via the website's own links.

[Figure: Average time between ad submission and the first Googlebot visit, via Twitter links vs. regular site links]
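As a back-of-the-envelope illustration of how such averages are derived, the sketch below compares the two delays for a couple of made-up records; the timestamps are placeholders, not the study's raw data.

from datetime import datetime
from statistics import mean

# (ad submitted, first Googlebot visit via site link, first visit via Twitter link)
ads = [
    (datetime(2009, 11, 2, 10, 0), datetime(2009, 11, 2, 14, 5), datetime(2009, 11, 2, 11, 1)),
    (datetime(2009, 11, 2, 10, 30), datetime(2009, 11, 2, 13, 50), datetime(2009, 11, 2, 11, 20)),
]

def minutes(delta):
    return delta.total_seconds() / 60

site_delay = mean(minutes(site - submitted) for submitted, site, twitter in ads)
twitter_delay = mean(minutes(twitter - submitted) for submitted, site, twitter in ads)
print(f"site: {site_delay:.0f} min, twitter: {twitter_delay:.0f} min, "
      f"Twitter is {site_delay - twitter_delay:.0f} min faster")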

The results are easier to grasp in the following graphs. Please note that they were cropped to show only part of the data so that they are easier to follow.

[Figure: Googlebot visits via regular website links]

[Figure: Googlebot visits via Twitter links]

Another conclusion, visible in the graphs above, is that Googlebot visits via Twitter links are more regular than those via internal links, where delays can be as long as 14 hours.

Moreover, the average time between a link's publication on Twitter and the first Googlebot visit is in fact much shorter than the 57-minute average shown in the chart above. Note that the values measure the difference between ad submission and the first Googlebot visit, while the submission time is not the same as the time the link is published on Twitter. See the example below:

[Figure: Googlebot visits via Twitter URLs for four selected ads]

The chart shows the submission time and the first Googlebot visit via Twitter links for four selected ads. The application used on Morusek's Twitter profile uploads items every 30 minutes. We can see that the oldest ad was in fact published 16 minutes later than it could have been. If we exclude such extremes and take into account only the most recent ads in each bulk upload (see the sketch below), the average time between ad publication and the first Googlebot visit via Twitter links drops to 42 minutes.
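A hedged sketch of that adjustment: group the records by the half-hourly bulk upload that tweeted them, keep only the most recently submitted ad per upload, and recompute the average delay. The record layout is an assumption, not the study's actual data structure.

from collections import defaultdict
from statistics import mean

def adjusted_average_delay(records):
    # records: (submitted_at, tweeted_at, first_twitter_visit_at) datetime triples
    by_upload = defaultdict(list)
    for submitted, tweeted, visited in records:
        by_upload[tweeted].append((submitted, visited))      # one bucket per bulk upload
    # keep only the freshest ad of each bulk upload, dropping the long-waiting extremes
    freshest = [max(bucket, key=lambda pair: pair[0]) for bucket in by_upload.values()]
    return mean((visited - submitted).total_seconds() / 60 for submitted, visited in freshest)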

Googlebot visits vs. page indexation

We have already shown that links published on Twitter are taken into account by Google and trigger Googlebot visits. But we do not yet know whether such links lead to page indexation. It could be that Google visits these pages only for real-time search purposes, to evaluate the quality of Twitter profiles.

Well, it appears that such links really do lead to indexation. We have not tested this on a larger scale, but the two ads that were not found by Googlebot via internal links, and recorded only Twitter-related Googlebot visits, did appear in Google's SERPs after a couple of hours.

Problems

Everything so far looks great. We know that Twitter speeds up Googlebot visits to new pages and improves (or triggers) their indexation. There is, however, one problem. As you have probably noticed in the graph above (the one showing visits via Twitter), not every green dot is accompanied by a blue one. This means that not all links published on Twitter were noticed and led to Googlebot visits. Of the 189 ads that were evaluated, 50 (26%) were not visited via Twitter links at all.

This might be caused by the way ads are uploaded to Twitter: every half an hour, up to five at a time, along the lines of the sketch below. Some may not have been published at all (for example, if more than five ads were submitted between two uploads). Or it may simply come down to a lack of trust on Google's part, either towards Twitter as a whole or towards Morusek's profile in particular.
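A hypothetical sketch of such a posting job (fetch_new_ads and post_to_twitter stand in for whatever the real application uses; they are assumptions, not Morusek's actual code):

import time

MAX_PER_RUN = 5
INTERVAL_SECONDS = 30 * 60

def run_forever(fetch_new_ads, post_to_twitter):
    # fetch_new_ads() returns (title, excerpt, tagged_short_url) tuples for ads
    # submitted since the previous run; both callables are assumed helpers
    while True:
        pending = fetch_new_ads()
        for title, excerpt, tagged_short_url in pending[-MAX_PER_RUN:]:
            post_to_twitter(f"{title} - {excerpt} {tagged_short_url}")
        # ads beyond the five newest are never tweeted, so their URLs can only be
        # discovered through the website's own internal links
        time.sleep(INTERVAL_SECONDS)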

Conclusions

Nevertheless, it appears that Twitter may be a good way of improving website indexation, especially for large websites with lots of new content appearing every day. Thanks to profiles like Morusek's, we can speed up the discovery of new pages by Google and get pages indexed that cannot be found via internal links (for example, because the content changes too quickly).

Of course, the frequency and effectiveness of Googlebot visits probably depends on the profile's popularity and update frequency. I also assume that Google may take other factors into account, such as the quality of the posted updates and the quality or uniqueness of the linked pages. We can even imagine that Google applies quality-measurement algorithms similar to those it uses for XML sitemaps.
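For comparison, the conventional alternative would be to expose the same fresh URLs in a standard XML sitemap. A minimal sketch (placeholder URLs, not production code):

from xml.sax.saxutils import escape

def build_sitemap(urls_with_dates):
    # urls_with_dates: iterable of (url, "YYYY-MM-DD") pairs for newly published pages
    entries = "\n".join(
        f"  <url><loc>{escape(url)}</loc><lastmod>{day}</lastmod></url>"
        for url, day in urls_with_dates
    )
    return ('<?xml version="1.0" encoding="UTF-8"?>\n'
            '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
            + entries + "\n</urlset>")

print(build_sitemap([("http://www.morusek.pl/ad/12345", "2009-11-02")]))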

Another conclusion (and a topic for future discussion) is that profiles like Morusek's may be treated by some as spam, since Twitter's original purpose was very different. So it may happen that Twitter eventually blocks such "sitemaps", or that Google stops taking them into account.

Finally, if Google follows Twitter links, then perhaps other nofollowed links are taken into account as well. Not all of them, of course, but I would bet that Wikipedia, Facebook, and perhaps some other social media websites are treated similarly.
