The Twelve Days of Google Christmas
This YouMoz entry was submitted by one of our community members. The author’s views are entirely their own (excluding an unlikely case of hypnosis) and may not reflect the views of Moz.
Hello, Mozzers! Longtime reader, first-time writer.
All of the articles on Moz's blog are serious and highly informative for our community, but for the holiday season, I was inspired by the site's Google Algorithm Change History to contribute something a little more lighthearted as a year-end summary. So, here is The Twelve Days of Google Christmas, a review of all of the "gifts" that Google has given us marketers and webmasters over the past year (or so) in the form of important changes to the search engine's algorithm.
On the first day of Christmas, Google gave to me:
A hummingbird most recently-e (image: Wikimedia Commons)
Google Hummingbird, implemented in August 2013 and announced on September 26, could be termed the "one update to rule them all" because it incorporates many of the changes that I discuss below. Of course, that would not be entirely accurate, because Hummingbird is not an algorithm update but rather a new algorithm altogether.
Danny Sullivan summarizes:
Hummingbird is paying more attention to each word in a query, ensuring that the whole query — the whole sentence or conversation or meaning — is taken into account, rather than particular words. The goal is that pages matching the meaning do better, rather than pages matching just a few words.
Essentially, Google incorporates factors including the intent behind the search, the device on which the search was conducted, the location of the person, search history, semantic data, the Knowledge Graph, and more in an attempt to deliver better answers and results beyond just the keywords themselves.
While I was writing this essay, I remembered this tweet from Rand Fishkin with the SERPs for "warm places in the us to visit in december." As he tweeted, "Note how little KW matching vs. how much intent matching."
To understand how SEOs should respond to Google Hummingbird, I suggest reviewing all of the prior algorithm updates that I describe below.
On the second day of Christmas, Google gave to me:
Two authors writing (image: Social Grinder)
and a hummingbird most recently-e
LinkedIn limits the actions that companies can take on its network -- it prefers to let individuals do the talking, and for good reason. On Facebook, for example, a common social-media marketing practice has been for companies to act as pages and interact with people, other pages, and communities. But it always comes across as spam -- even under the guise of "participating in the conversation" in a "positive way." If I'm talking with friends and "Acme" company chimes in, I know -- and even people who are not marketers know -- that the user is ultimately selling something. And no one likes that.
So, Google is rightly following suit by placing great importance on individual branding. It comes across as more real and authentic -- and more valuable in search results. If a person sees this post in Google, the fact that my picture and Google+ information will likely appear lends more credibility because I as a person am saying these thoughts rather than Moz as a business.
For this reason, Google+ authorship will become more and more important -- not because the markup will necessarily lead to higher rankings (though it may lead to higher click-through rates; see below), but because Google seems to be placing more emphasis on brands rather than just on keywords, links, and other traditional factors. And both companies (or websites) and individuals now have brands in Google's eyes.
I have no evidence to support this assertion, but I would wager that, at some point, articles written by people whom Google perceives as authoritative will rank better in search results. So, as I've written elsewhere, it will be imperative for businesses to build their own brands, as well as those of their writers, through online public relations.
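To make this concrete, here is a minimal sketch of how authorship markup worked at the time of writing: the article links to the writer's Google+ profile with rel=author (the profile ID and name below are made-up placeholders), and the profile links back to the site in its "Contributor to" section.

```html
<!-- In the <head> of the article page: point Google at the author's
     Google+ profile. The profile ID here is a made-up placeholder. -->
<link rel="author" href="https://plus.google.com/112345678901234567890"/>

<!-- Or, equivalently, within the visible byline itself: -->
<p>By <a rel="author"
         href="https://plus.google.com/112345678901234567890">Jane Writer</a></p>
```

Either form works; the key is that the Google+ profile must reciprocate by listing the site under "Contributor to," or the author photo will not appear in the SERPs.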
On the third day of Christmas, Google gave to me:
Three in-depth articles (image: Moz)
Two authors writing
and a hummingbird most recently-e
In August 2013, Google implemented an algorithm change that includes (three!) snippets of lengthy, authoritative content from established brands in search results for broad, general queries. There are four main types of user intent behind keywords -- navigational, commercial, transactional, and informational -- and Google wants to improve the results for the last of these, since more and more content is, as Doug Kessler of Velocity Partners once bluntly put it, "crap."
Brands and publications that want to demonstrate their thought leadership on desired topics should incorporate the following in such pieces of content (a minimal markup sketch follows the list):
- At least 2,000 words
- Schema.org Article markup
- Google+ Publisher and Author markup
- rel=next and rel=prev for paginated articles
- A clearly defined logo, most often via a Google+ page
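For those who want to see it in markup, here is a minimal, hypothetical sketch of the middle of that list for one page of a paginated article (all URLs, IDs, and names are placeholders, not a definitive template):

```html
<!-- In the <head>: author, publisher, and pagination signals.
     The Google+ IDs and page URLs are invented for illustration. -->
<link rel="author" href="https://plus.google.com/112345678901234567890"/>
<link rel="publisher" href="https://plus.google.com/+ExampleBrand"/>
<link rel="prev" href="https://example.com/long-guide/page-1"/>
<link rel="next" href="https://example.com/long-guide/page-3"/>

<!-- In the <body>: schema.org Article markup via microdata. -->
<article itemscope itemtype="http://schema.org/Article">
  <h1 itemprop="headline">A 2,000-Word Guide to Your Topic</h1>
  <div itemprop="articleBody">
    <p>The long-form content itself goes here...</p>
  </div>
</article>
```

The publisher link points at the brand's Google+ page, which is also where Google most often picks up the clearly defined logo mentioned above.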
On the fourth day of Christmas, Google gave to me:
Four culling pandas (image: Wikimedia Commons)
Three in-depth articles
Two authors writing
and a hummingbird most recently-e
Google unveiled the first Panda update (in conjunction with the Farmer update) in February 2011 and the most recent one in July 2013. The rolling updates largely target poor content practices, including scraping, duplicating, and copying -- whether intentional or not. Common victims were sites that placed variations of the same content on different pages to target long-tail keywords, swapping out only a few of the words, as well as sites that created numerous pages to target countless keywords while having little original content on each of them.
Fishkin released his analysis of the winners and losers here:
- It seemed that sites whose pages had fewer and/or less intrusive blocks of advertisements on them tended to be in the winner bucket, while those with more and more intrusive advertising tended to be in the loser group.
- Likewise, sites whose UI/design would likely be described as more modern, high quality, thoughtful and "attractive" were winners vs. the "ugly" sites that tended to be in the loser bucket.
- When it came to user-generated-content (UGC) sites, those that tended to attract "thin" contributions (think EzineArticles, Hubpages or Buzzle) lost, while those with richer, often more authentic, non-paid, and not-intended to build SEO value or links (think Etsy, DailyMotion, LinkedIn, Facebook) won.
- In the "rich content" sector, pages with less usable/readable/easily-consumable content (think AllBusiness, FindArticles) tended to lose out to similarly content-rich sites that had made their work more usable (think LOC.gov, HuffingtonPost)
On the fifth day of Christmas, Google gave to me:
Five! Penguin updates! (image: Wikimedia Commons)
Four culling pandas
Three in-depth articles
Two authors writing
and a hummingbird most recently-e
Google released the original Penguin update in April 2012 and the latest (and fifth!) official one in October 2013. Penguin focuses on devaluing the rankings of websites that have built too many bad links -- maliciously or not -- via link farms, link buying, forum and blog spam, exact-match anchor text, directories, article marketing, widgets, site-wide links in footers or blogrolls, "suspicious" reciprocal links, and more. Jimmy Ellis has a case study here at Moz outlining why his site was hit and what he did in response.
The thing to understand post-Penguin is that the best practice is -- and, in fact, always has been -- to earn links rather than build them. As one example, please feel free to see a presentation I gave at SMX Milan in November 2013 on how to use public relations to build the best links. Essentially, it means using a collection of practices (blogger and e-mail outreach, and so on) that are merely PR by another name to convince authoritative sites to give you links rather than you having to build them. What is called "SEO" is fast becoming, as I've also written, a collection of best practices that includes PR.
On the sixth day of Christmas, Google gave to me:
Six graphs a-growing (image: Moz)
Five! Penguin updates!
Four culling pandas
Three in-depth articles
Two authors writing
and a hummingbird most recently-e
Google wants to move towards providing answers rather than lists of websites. If you Google "When is Christmas 2013?" you will likely see the answer right at the top of the page. So, the search engine unveiled the Knowledge Graph in May 2012 and has expanded it ever since. If you enter a general query about a topic -- usually a noun such as a brand, person, or movie -- you will increasingly see facts culled from authoritative websites displayed alongside the search results.
For more information on the background and technical aspects of the Knowledge Graph, I recommend reading Andrew Isidoro's post here on Moz. The gist: to get found in the Knowledge Graph, you or your brand must build an online presence that Google can recognize via Google+, schema.org markup, and more. If you want your data to be accepted into the Knowledge Graph, then you need (among other things) to establish enough trust that your answers are viewed on a level akin to Wikipedia.
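As one illustration, a brand's homepage might identify itself with basic schema.org Organization markup -- a hypothetical sketch in which every name and URL is fictional:

```html
<!-- Organization markup on a homepage; all values are made up. -->
<div itemscope itemtype="http://schema.org/Organization">
  <span itemprop="name">Acme Widgets</span>
  <link itemprop="url" href="http://www.acmewidgets.com/"/>
  <link itemprop="logo" href="http://www.acmewidgets.com/logo.png"/>
  <!-- sameAs ties the entity to its profiles elsewhere on the web. -->
  <a itemprop="sameAs" href="https://plus.google.com/+AcmeWidgets">Google+</a>
</div>
```

Markup like this does not guarantee a Knowledge Graph panel, but it gives Google an unambiguous, machine-readable statement of who the entity is.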
On the seventh day of Christmas, Google gave to me:
Seven SERPs a-SERPing (image: Moz)
Six graphs a-growing
Five! Penguin updates!
Four culling pandas
Three in-depth articles
Two authors writing
and a hummingbird most recently-e
Perhaps as a precursor to the future SERP-layout changes described in this post, Google started showing only seven results for roughly 18% of queries in August 2012. As Dr. Pete noted in this excellent overview, these SERPs pack more information into fewer results, including:
- Expanded sitelinks
- Local information
- Image "mega packs"
As Dr. Pete summarized:
Ultimately, query intent and complex associations are going to start to matter more, and your money keywords will be the ones where you can provide a strong match to intent. Pay attention not only to the 7-result SERPs in your own keyword mix, but to queries that trigger Knowledge Graph and other rich data – I expect many more changes in the coming year.
On the eighth day of Christmas, Google gave to me:
Eight plusses a-plussing (image: Moz)
Seven SERPs a-SERPing
Six graphs a-growing
Five! Penguin updates!
Four culling pandas
Three in-depth articles
Two authors writing
and a hummingbird most recently-e
From the time that Google+ was unveiled to today, SEOs have debated the effect of Google's answer to Facebook -- namely, how much will one's Google+ presence affect one's presence in organic search? The answer, as with most things in our field, has been hotly contested:
- Studies from Moz and Searchmetrics found strong correlations between the number of +1s and high rankings
- Matt Cutts responded by stating that +1s do not directly lead to high rankings
- Eric Enge saw that +1s do not affect the SERPs in non-personalized results but do affect personalized results
I'm not a statistician, so I will leave the issue to the experts. However, I would suggest that +1s will eventually affect search results (if they do not already) simply because Google+ is the only social network whose activity Google can completely understand and incorporate (since, well, the company owns it). Moreover, links shared on Google+ are do-follow.
For this reason, it is imperative that brands and websites become active on the network. Tomorrow will be too late.
On the ninth day of Christmas, Google gave to me:
Nine schemas scheming (image: Moz)
Eight plusses a-plussing
Seven SERPs a-SERPing
Six graphs a-growing
Five! Penguin updates!
Four culling pandas
Three in-depth articles
Two authors writing
and a hummingbird most recently-e
Schema.org, announced in June 2011, is a consortium of search engines -- Bing, Google, Yahoo!, and Yandex -- that helps websites include "structured data" in their pages so that additional information ("rich snippets") can appear in search results beyond text-based meta titles and descriptions.
The benefit to search engines is obvious: schema code allows them to process and deliver relevant information more easily so that searchers can see more valuable information and answers directly in search results in the form of prices (e-commerce), starred ratings (reviews), publication details (authors and publishers), and a lot more.
The benefit to websites and marketers is also important (but less obvious): while using schema.org code does not help search rankings (at least not yet), search results with structured data see higher click-through rates (CTRs) -- see the data compiled in posts at Search Engine Land, Stealth KCA, and SEO Chicks (among others). The reason is logical: even though a site may rank, say, third, more eyes may be drawn to its listing than to a first-ranked result that is nothing but boring text.
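As a rough sketch of what that looks like in practice, here is hypothetical product markup that could surface a price and starred rating in a snippet (the product, price, and ratings are all invented):

```html
<!-- A product listing with an offer and aggregate rating (microdata). -->
<div itemscope itemtype="http://schema.org/Product">
  <span itemprop="name">Ergonomic Office Chair</span>
  <span itemprop="offers" itemscope itemtype="http://schema.org/Offer">
    $<span itemprop="price">199.00</span>
    <meta itemprop="priceCurrency" content="USD"/>
  </span>
  <div itemprop="aggregateRating" itemscope
       itemtype="http://schema.org/AggregateRating">
    Rated <span itemprop="ratingValue">4.6</span>/5 based on
    <span itemprop="reviewCount">87</span> reviews
  </div>
</div>
```

It is worth checking markup like this with Google's structured-data testing tool before expecting anything to change in the SERPs.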
On the tenth day of Christmas, Google gave to me:
Ten paydays a-plunging (image: Moz)
Nine schemas scheming
Eight plusses a-plussing
Seven SERPs a-SERPing
Six graphs a-growing
Five! Penguin updates!
Four culling pandas
Three in-depth articles
Two authors writing
and a hummingbird most recently-e
I loved this 2013 update that, as discussed by Barry Schwartz, targeted sectors with high levels of spam. Following my journalism career in Boston, two of my first SEO jobs years ago were in the porn and forex industries. That was my introduction to Internet marketing -- and I had unknowingly used black-hat practices as instructed by my bosses. (Ugh -- please, don't ask.) Thankfully, I later "saw the SEO light."
The more competitive your industry (and I've worked with many competitive ones since my early days), the more Google will single it out and essentially (and rightly) force you to use only "white hat" methods -- or else. I'm talking about the factors that we all should know by now: technical SEO that is focused on user intent and experience without any keyword stuffing or over-optimization, natural links that are "earned" rather than "built," and the regular production of quality, authoritative content. Everything else is just details.
On the eleventh day of Christmas, Google gave to me:
Eleven brands building (image: Moz)
Ten paydays a-plunging
Nine schemas scheming
Eight plusses a-plussing
Seven SERPs a-SERPing
Six graphs a-growing
Five! Penguin updates!
Four culling pandas
Three in-depth articles
Two authors writing
and a hummingbird most recently-e
The 2012 Venice Update affected the way that brands appear in national versus local search. In short, Google became much better at determining a searcher's location and adjusting results accordingly. In a post here on Moz, Mike Ramsey highlighted a few ways that marketers need to adapt:
- Title tag and description: Use the location and keyword (without keyword stuffing, of course)
- Localized page content: Whenever possible, create original, quality content on a localized page that is relevant to people in that area
- Local link profile: Get quality, local links from authoritative sites in that location that point to the localized page
- Location-based reviews: Get numerous reviews from real, local residents on your site and Google+ page
- Schema: Incorporate schema.org and microformats.org markup for your address and other information (a sketch follows this list)
- KML file and GeoSitemap: Add both to your website
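To illustrate the schema item above, here is a hypothetical sketch of LocalBusiness markup for a contact page (all business details are fictional):

```html
<!-- LocalBusiness markup with a structured postal address. -->
<div itemscope itemtype="http://schema.org/LocalBusiness">
  <span itemprop="name">Acme Plumbing of Springfield</span>
  <div itemprop="address" itemscope itemtype="http://schema.org/PostalAddress">
    <span itemprop="streetAddress">123 Main Street</span>,
    <span itemprop="addressLocality">Springfield</span>,
    <span itemprop="addressRegion">IL</span>
    <span itemprop="postalCode">62701</span>
  </div>
  <span itemprop="telephone">(555) 555-0123</span>
</div>
```

Keeping the name, address, and phone number consistent between this markup, your Google+ page, and local directories reinforces the signal.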
If you have a location-based business, it is imperative to build your brand online nationally -- and locally among your customers themselves.
On the twelfth day of Christmas, Google gave to me:
Twelve domains dropping (image: Moz)
Eleven brands building
Ten paydays a-plunging
Nine schemas scheming
Eight plusses a-plussing
Seven SERPs a-SERPing
Six graphs a-growing
Five! Penguin updates!
Four culling pandas
Three in-depth articles
Two authors writing
and a hummingbird most recently-e
Certain types of domains as a whole have also been targeted -- rightly or wrongly, depending on one's opinion -- by a series of Google updates.
In September 2012, Google released the so-called EMD ("exact-match domain") update, which targeted sites that purportedly ranked highly merely because of the presence of keywords in the domain (such as teethwhitening.com). These sites seemingly had few other positive attributes, such as quality content or authoritative backlinks. Dr. Pete has another good post on "partial-match domains" here.
Todd Malicoat phrased the issue on Moz like this:
Think about the big brands Staples and Office Max; do they really DESERVE to rank better than a well built OfficeChairs.com or OfficeFurnitureOnline.com?
But I would ask this question: Why should a random site such as OfficeChairs.com rank more highly than trusted brands such as Staples and Office Max just because it has keywords in its domain name?
According to a recent article by Matt McGee on Search Engine Land: "70 percent of US consumers said they look for a 'Known retailer' when deciding what search results they click on." Consumers have a definite bias towards established brands, and Google has a vested interest in giving searchers what they want -- and that means being biased towards brands as well. For the long term (if not even today), I would not be surprised if the best practice becomes avoiding keywords in your brand name and domain -- whether exact- or partial-match -- altogether. For example, the top three results (non-personalized, U.S.-based search) for "SEO software" do not have either "SEO" or "software" in the domain name.
What Will Google Do in 2014?
These eleven changes from Google are best described not as individual "presents" but as a single present wrapped in a bow in the shape of a hummingbird. All of these updates are becoming more and more integrated into the new Hummingbird algorithm. It might not be the present that we want -- but it's what Google put under the tree this year.
What do you hope Google will give us next year? Happy Holidays, Mozzers!