Secrets of the 7-Result SERP
In August of 2012, Google launched 7-result SERPs, transforming page-one results. MozCast data initially showed that as many as 18% of the queries we tracked were affected. We’ve been collecting data on the phenomenon ever since, and putting some of the most common theories to the test. This is the story of the 7-result SERP as we understand it today (image created with PULP-O-MIZER).
I. 7-Result SERPs in The Wild
By now, you’ve probably seen a few 7-result SERPs in the “wild”, but I think it’s still useful to start at the beginning. Here are a few examples (with screenshots) of the various forms the 7-result SERP takes these days. I apologize in advance for the large images, but I think it's sometimes important to see the full-length SERP.
(1) The “Classic” 7-Result SERP
The classic 7-result SERP usually appears as a #1 listing with expanded site-links (more on that later), plus six more organic listings. Here’s a screenshot from a search for “someecards”, a navigational query:
(2) The 7 + 7 with Local Results
It’s also possible to see 7-result SERPs blended with other types of results, including local “pack” results. Here’s the result of a search with local intent – “williamsburg prime outlets”:
(3) The 6 + Image Mega-Pack
It’s not just organic results that can appear in the #1 spot of a 7-result SERP, though. There’s a rare exception when a “mega-pack” of images appears at the top of a SERP. Here’s a “7-result” SERP with one image pack and six organic listings – the search is “pictures of cats”:
II. Some 7-Result SERP Stats
Our original data set showed 7-result page-one SERPs across about 18% of the queries we tracked. That number has varied over time, dropping as low as 13%. Recently, we’ve been experimenting with a larger data set (10,000 keywords). Over the 10 days from 1/13-1/22 (the data for this post was collected around 1/23), that data set tracked 7-result SERPs in the range of 18.1% - 18.5%. While this isn’t necessarily representative of the entire internet, it does show that 7-result SERPs continue to be a significant presence on Google.
These percentages are calculated by unique queries. We can also look at query volume. Using Google’s “global” volume (exact-match), the percentage of queries by volume with 7-result SERPs for 1/22 was 19.5%, compared to 18.5% by unique queries. Factoring in volume, that’s almost a fifth of all queries we track.
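To make that volume weighting concrete, here’s a minimal sketch of the arithmetic in Python. The queries, volumes, and 7-result flags below are purely illustrative placeholders, not MozCast data:

```python
# Illustrative sketch: percentage of queries with 7-result SERPs,
# by unique query vs. weighted by exact-match search volume.
# The queries, volumes, and flags are placeholders, not MozCast data.

keywords = [
    # (query, exact-match volume, has a 7-result SERP?)
    ("query a", 12000, True),
    ("query b", 74000, False),
    ("query c", 33000, True),
    ("query d", 9900, False),
]

total_queries = len(keywords)
seven_queries = sum(1 for _, _, is_seven in keywords if is_seven)

total_volume = sum(volume for _, volume, _ in keywords)
seven_volume = sum(volume for _, volume, is_seven in keywords if is_seven)

print("By unique queries: %.1f%%" % (100.0 * seven_queries / total_queries))
print("By query volume:   %.1f%%" % (100.0 * seven_volume / total_volume))
```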
Here are the 7-result SERP percentages across 20 industry categories (500 queries per category) for 1/22:
CATEGORY | % OF QUERIES WITH 7-RESULT SERPS
Apparel | 23.6%
Arts & Entertainment | 16.8%
Beauty & Personal Care | 12.6%
Computers & Consumer Electronics | 16.8%
Dining & Nightlife | 27.2%
Family & Community | 13.2%
Finance | 19.2%
Food & Groceries | 13.4%
Health | 3.8%
Hobbies & Leisure | 11.0%
Home & Garden | 20.0%
Internet & Telecom | 12.6%
Jobs & Education | 21.4%
Law & Government | 16.2%
Occasions & Gifts | 7.8%
Real Estate | 13.2%
Retailers & General Merchandise | 29.6%
Sports & Fitness | 28.6%
Travel & Tourism | 36.2%
Vehicles | 26.0%
These categories were all borrowed from the Google AdWords keyword research tool. The most impacted vertical is “Travel & Tourism” at 36.2%, while “Health” is the least impacted at just 3.8%. At only 500 queries per category, it’s easy to over-interpret this data, but I think it’s interesting to see how much the impact varies.
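For reference, the per-category percentages above boil down to a simple group-and-count. Here’s a rough sketch, assuming a flat list of (category, query, 7-result flag) records; the records shown are placeholders, not our actual keyword set:

```python
from collections import defaultdict

# Illustrative records: (AdWords category, query, has a 7-result SERP?).
records = [
    ("Travel & Tourism", "query a", True),
    ("Travel & Tourism", "query b", False),
    ("Health", "query c", False),
    ("Health", "query d", False),
]

totals = defaultdict(int)
sevens = defaultdict(int)
for category, query, is_seven in records:
    totals[category] += 1
    if is_seven:
        sevens[category] += 1

for category in sorted(totals):
    pct = 100.0 * sevens[category] / totals[category]
    print("%-20s %5.1f%%" % (category, pct))
```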
III. The Site-Link Connection
Many people have hypothesized a link between expanded site-links and 7-result SERPs. We’ve seen a lot of anecdotal evidence, but I thought I’d put it to the test on a large scale, so we collected site-link data (presence and count) for the 10,000 keywords in this study.
Of the 1,846 queries (18.5%) in our data set that had 7-result SERPs on the morning of 1/22, 100% had expanded site-links for the #1 position. There were 45 queries that had expanded site-links but did not show a 7-result count; those were all anomalies based on how we count local results (we include blended local and packs in the MozCast count, whereas Google may not). There is a nearly perfect positive correlation between 7-result SERPs and expanded site-links. Whatever engine is driving one very likely drives the other.
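The cross-tab behind those numbers is straightforward. Here’s a minimal sketch, assuming per-query records with a result count and a site-link flag; the field names and example records are hypothetical:

```python
# Illustrative per-query records: page-one result count and whether the
# #1 listing showed expanded site-links that morning.
serps = [
    {"query": "query a", "results": 7,  "sitelinks": True},
    {"query": "query b", "results": 10, "sitelinks": False},
    {"query": "query c", "results": 7,  "sitelinks": True},
    {"query": "query d", "results": 10, "sitelinks": True},  # counting "anomaly"
]

seven = [s for s in serps if s["results"] == 7]
with_links = [s for s in seven if s["sitelinks"]]
anomalies = [s for s in serps if s["sitelinks"] and s["results"] != 7]

print("7-result SERPs: %d" % len(seven))
print("  ...with expanded site-links: %d (%.0f%%)"
      % (len(with_links), 100.0 * len(with_links) / len(seven)))
print("Expanded site-links without a 7-result count: %d" % len(anomalies))
```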
The only minor exception is the image blocks mentioned above. In those cases, the image “mega-pack” seems to be the equivalent of expanded site-links. Internally, we count those as 6-result SERPs, but I believe Google sees them as a 7-result variant.
While most 7-result SERPs (roughly 80%) have six expanded site-links, there doesn’t seem to be any rule about that. We’re tracking 7-result SERPs with anywhere from one to six expanded site-links, so it doesn’t take a full set of site-links to trigger a 7-result SERP. In some cases, the domain simply seems to have only a limited number of query-relevant pages.
IV. 7-Result Query Stability
Originally, I assumed that once a query was deemed “worthy” of site-links and a 7-result SERP, it would continue to have 7 results until Google made a major change to the algorithm. The data suggests that this is far from true – many queries have flipped back and forth between 7 and 10 results since the 7-result SERP roll-out.
While our MozCast Top-View Metrics track major changes to the average result count, the real story is a bit more complicated. On any given day, a fairly large number of keywords flip from 7s to 10s and 10s to 7s. From 1/21 to 1/22, for example, 61 (0.61%) went from 10 to 7 results and 56 (0.56%) went from 7 to 10 results. A total of 117 “flips” happened in a 24-hour period – that’s just over 1% of queries, and that seems to be typical.
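Counting flips is just a day-over-day comparison of result counts per query. Here’s a rough sketch of that comparison; the snapshots below are tiny, made-up stand-ins for the real 10,000-keyword data:

```python
# Illustrative daily snapshots: query -> page-one organic result count.
counts_day1 = {"query a": 10, "query b": 7, "query c": 7, "query d": 10}
counts_day2 = {"query a": 7,  "query b": 7, "query c": 10, "query d": 10}

ten_to_seven = [q for q in counts_day1
                if counts_day1[q] == 10 and counts_day2.get(q) == 7]
seven_to_ten = [q for q in counts_day1
                if counts_day1[q] == 7 and counts_day2.get(q) == 10]

flips = len(ten_to_seven) + len(seven_to_ten)
print("10 -> 7: %d, 7 -> 10: %d, total flips: %d (%.2f%% of %d queries)"
      % (len(ten_to_seven), len(seven_to_ten), flips,
         100.0 * flips / len(counts_day1), len(counts_day1)))
```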
Some keywords have flipped many times – for example, the query “pga national” has flipped from 7-to-10 and back 27 times (measured once/day) since the original roll-out of 7-result SERPs. This appears to be entirely algorithmic – some threshold (whether it’s authority, relevance, brand signals, etc.) determines if a #1 result deserves site-links, probably in real-time, and when that switch flips, you get a 7-result SERP.
V. The Diversity Connection
I also originally assumed that a 7-result SERP was just a 10-result SERP with site-links added and results #8-#10 removed. Over time, I developed a strong suspicion this was not the case, but tracking down solid evidence has been tricky. The simple problem is that, once we track a 7-result SERP, we can’t see what the SERP would’ve looked like with 10 results.
This is where query stability comes in – while it’s not a perfect solution (results naturally change over time), we can look at queries that flip and see how the 7-result SERP on one day compares to the 10-result SERP on the next. Let’s look at our flipper example, “pga national” – here are the sub-domains for a 7-result SERP recorded on 1/19:
- www.pgaresort.com
- www.pganational.com
- en.wikipedia.org
- www.jeffrealty.com
- www.tripadvisor.com
- www.pga.com
- www.pgamembersclub.com
The previous day (1/18), that same query recorded a 10-result SERP. Here are the sub-domains for those 10 results:
- www.pgaresort.com
- www.pgaresort.com
- www.pgaresort.com
- www.pgaresort.com
- www.pganational.com
- en.wikipedia.org
- www.tripadvisor.com
- www.pga.com
- www.jeffrealty.com
- www.bocaexecutiverealty.com
The 10-result SERP allows multiple listings for the top domain, whereas the 7-result SERP collapses the top domain to one listing plus expanded site-links. There is a relationship between listings #2-#4 in the 10-result SERP and the expanded site-links in the 7-result SERP, but it’s not one-to-one.
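One way to see the pattern is to collapse repeated sub-domains in the 10-result SERP and compare what’s left to the 7-result version. The sketch below does exactly that with the two “pga national” lists above; it’s an illustration of the diversity pattern, not a claim about how Google actually builds the SERP:

```python
# Sub-domains as recorded for "pga national" in the two lists above.
serp_10 = ["www.pgaresort.com"] * 4 + [
    "www.pganational.com", "en.wikipedia.org", "www.tripadvisor.com",
    "www.pga.com", "www.jeffrealty.com", "www.bocaexecutiverealty.com",
]
serp_7 = [
    "www.pgaresort.com", "www.pganational.com", "en.wikipedia.org",
    "www.jeffrealty.com", "www.tripadvisor.com", "www.pga.com",
    "www.pgamembersclub.com",
]

# Collapse repeated sub-domains in the 10-result SERP, preserving order.
collapsed = list(dict.fromkeys(serp_10))

print("10-result SERP collapses to %d unique sub-domains" % len(collapsed))
print("In both versions:     ", sorted(set(collapsed) & set(serp_7)))
print("Only in the 7-result: ", sorted(set(serp_7) - set(collapsed)))
print("Only in the 10-result:", sorted(set(collapsed) - set(serp_7)))
```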
Recently, I happened across another way to compare. Google provides results to a number of partner search engines, and one partner with fairly similar results is EarthLink. What’s interesting is that Google partners don’t show expanded site-links or 7-result SERPs – at least not in any case I’ve found (if you know of an exception, please let me know). Here’s a search for “pga national” on EarthLink on 1/25:
- www.pgaresort.com
- www.pgaresort.com
- www.pgaresort.com
- www.pganational.com
- en.wikipedia.org
- www.tripadvisor.com
- www.jeffrealty.com
- www.pga.com
- www.bocaexecutiverealty.com
- www.devonshirepga.com
Again, the #1 domain is repeated. Looking across multiple SERPs, the pattern varies a bit, and it’s tough to pin it down to just one rule for moving from 7 results to 10 results. In general, though, the diversity pattern holds. When a query shifts from a 10-result SERP to a 7-result SERP, the domain in the #1 spot gets site-links but can’t occupy spots #2-#7.
Unfortunately, the domain diversity pattern has been hard to detect at large scale. We track domain diversity (the percentage of unique sub-domains across the Top 10) in MozCast, but over the 2-3 days that 7-result SERPs rolled out, overall diversity only increased from 55.1% to 55.8%.
Part of the problem is that our broad diversity metric lumps all SERPs together, meaning that the lower diversity of the 10-result SERPs could swamp the 7-result SERPs. So, what if we separate them? Across the core MozCast data (1K queries), domain diversity on 1/22 was 53.4%. Looking at just the 7-result SERPs, though, domain diversity was 62.2% (vs. 54.2% for 10-result SERPs). That’s not a massive difference, but it’s certainly evidence to support the diversity connection.
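For clarity, the diversity metric here is just unique sub-domains divided by total page-one listings, averaged separately for 7-result and 10-result SERPs. A rough sketch, using made-up SERPs in place of the MozCast data:

```python
# Illustrative page-one SERPs: a list of sub-domains per query.
serps = {
    "query a": ["a.com"] * 4 + ["b.com", "c.org", "d.com",
                                "e.com", "f.com", "g.com"],   # 10 results
    "query b": ["h.com", "i.com", "j.org", "k.com",
                "l.com", "m.com", "n.com"],                   # 7 results
}

def diversity(subdomains):
    """Unique sub-domains as a fraction of all page-one listings."""
    return len(set(subdomains)) / float(len(subdomains))

for label, size in [("7-result", 7), ("10-result", 10)]:
    group = [s for s in serps.values() if len(s) == size]
    if group:
        avg = sum(diversity(s) for s in group) / len(group)
        print("%s SERPs: average diversity %.1f%%" % (label, 100 * avg))
```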
Of course, causality is tough to piece together. Just because 7-result SERPs are more diverse, that doesn’t mean that Google is using domain crowding as a signal to generate expanded site-links. It could simply mean that the same signals that cause a result to get expanded site-links also cause it to get multiple spots in a 10-result SERP.
VI. The Big Brand Connection
So, what drives 7-result SERPs? Many people have speculated that it’s a brand signal – at a glance, there are many branded (or at least navigational) queries in the mix. Many of these are relatively small brands, though, so it’s not a classic picture of big-brand dominance. There are also some 7-result queries that don’t seem branded at all, such as:
- “tracking santa”
- “cool math games for kids”
- “unemployment claim weeks”
- “cell signaling”
- “irs transcript”
Granted, these are exceptions to the rule, and some of them are brand-like, for lack of a better phrase. The query “irs transcript” does pull up the IRS website in the top spot – the full phrase may not signal a brand, but there’s a clear dominant match for the search. Likewise, “tracking santa” is clearly NORAD’s territory, even if they don’t have a domain or brand called “tracking santa”, and even if they’re actually matching on “tracks santa”.
In some cases, there does seem to be a brand (or entity) bias. Take a search for “reef”, which pulls up Reef.com in the #1 spot with four site-links:
Not to pick on Reef.com, but I don’t think of them as a household name. Are they a more relevant match to “reef” than any particular reef (like the Great Barrier Reef) or the concept of a reef in general? It could be a question of authority (DA = 66) or of the Exact-Match Domain in play – unfortunately, we throw around the term “brand” a lot, but we don’t often dig into how that translates into practical ranking signals.
I pulled authority metrics (DA and PA) for a subset of these queries, and there seems to be virtually no correlation between authority (as we measure it) and the presence of site-links. An interesting example is Wikipedia. It occupies over 11% of the #1 results (yeah, it’s not your imagination), but only seven of those 1,119 queries have 7-result SERPs. This is a site with a Domain Authority of 100 (out of 100).
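For the curious, that check amounts to a point-biserial correlation between Domain Authority and a binary has-site-links flag. A minimal sketch; the (DA, site-link) pairs below are placeholders, not the actual subset I pulled:

```python
import math

# Illustrative (Domain Authority, has expanded site-links?) pairs.
data = [(66, True), (100, False), (54, True), (72, False),
        (88, True), (40, False), (95, True), (61, False)]

x = [float(da) for da, _ in data]               # continuous: DA
y = [1.0 if flag else 0.0 for _, flag in data]  # binary: site-links
n = len(data)

mean_x = sum(x) / n
mean_y = sum(y) / n
cov = sum((a - mean_x) * (b - mean_y) for a, b in zip(x, y)) / n
std_x = math.sqrt(sum((a - mean_x) ** 2 for a in x) / n)
std_y = math.sqrt(sum((b - mean_y) ** 2 for b in y) / n)

r = cov / (std_x * std_y)
print("Point-biserial correlation (DA vs. site-links): %.2f" % r)
```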
VII. The "Entity" Connection
One emerging school of thought is that named entities are getting more ranking power these days. A named entity doesn’t have to be a big brand, just a clear match to a user’s intent. For example, if I searched for “sam’s barber shop”, SamsBarberShop.com would be much more likely to match my intent than results for barbers who happen to be named Sam. Sam’s Barber Shop is an entity, regardless of its Domain Authority or other ranking signals. This goes beyond just an exact-match domain (EMD) connection, too.
I think that 7-result SERPs and other updates like the Knowledge Graph do signal a push toward classifying entities and generally making search reflect the real world. In five years, it won’t be enough simply to use keywords well in your content or inbound anchor links. Google is going to want to return rich objects that represent “real-world” concepts that people understand, even if those concepts exist primarily online. This fits well into the idea of the dominant interpretation, too (as outlined in Google’s rater guidelines and other documents). Whether I search for “Microsoft” or “Sam’s Barber Shop”, the dominant interpretation model suggests that the entity’s website is the best match, regardless of other ranking factors or the strength of its SEO.
There's only one problem with the entity explanation. Generally speaking, I'd expect an entity to be stable – once a query was classified as an entity and acquired expanded site-links, I'd expect it to stay that way. As mentioned above, though, the data is fairly unstable. This could indicate that entity detection is dynamic, based on some combination of on-page, link, social, and user signals.
VIII. The Secret Sauce is Ketchup
Ok, maybe “secrets” was a bit of an exaggeration. The question of what actually triggers a 7-result SERP is definitely complicated, especially as Google expands into Knowledge Graph and advanced forms of entity association. I'm sure the broader question on everyone's mind is "How do I get (or stop getting) a 7-result SERP?" I'm not sure there's any simple answer, and there's definitely no simple on-page SEO trick. The data suggests that even a strong link profile (i.e. authority) may not be enough. Ultimately, query intent and complex associations are going to start to matter more, and your money keywords will be the ones where you can provide a strong match to intent. Pay attention not only to the 7-result SERPs in your own keyword mix, but to queries that trigger Knowledge Graph and other rich data – I expect many more changes in the coming year.