MichaelC-15022
@MichaelC-15022
Job Title: SEO Consultant
Company: OzTech, Inc.
Founder of Visual Itineraries, a closing tool/lead generation tool for niche and high-end travel agents. Co-founder of TheBigDay honeymoon registry; avid traveler, photographer, and still plays with cars and motorcycles. As of 2010, plays with airplanes too :-). Living the life in Bend, Oregon.
Favorite Thing about SEO
Truly interesting and lively people in this industry.
Latest posts made by MichaelC-15022
- RE: Google Rich Snippets in E-commerce Category Pages
Google considers this to be spam. Sometimes pages get away with doing this, but generally you're going to eventually get a manual action reported in Search Console.
- RE: Miriam's 7 Local SEO Predictions for 2019
I think most of the community is currently Christmas shopping online...and making decisions based on fake reviews :-p.
- RE: Miriam's 7 Local SEO Predictions for 2019
Great predictions, Miriam!
I'll add one more...maybe it's more of a wish than a prediction...that Google will make some sort of serious strides towards cracking down on fake reviews (both positive and negative). Hopefully not as over-the-top as Yelp's approach (which throws a lot of babies out with the bathwater!) though.
- RE: Client wants to rebrand but insists on keeping their old website live as well...
I'll second Miriam's points, above. There's substantial risk here if both sites are going to be visible to Google.
I'd block the old site in robots.txt permanently. I'd never redirect the old site to the new one, even after cleanup has been done. From the penalty recovery work I've done, it sure feels like Google keeps some sort of permanent flag on your site, even after you've done the cleanup. New, good links don't seem to have as much effect as you'd expect.
For the new site, spend the $$ on PR/outreach to build some solid, strong links in addition to the core directory links you get via Moz Local. Do some community service work that gets a press mention; offer a scholarship to dentistry students from a specific school, so that the school will link to your scholarship page. A few really good links from newspaper stories will work wonders for getting the new site to rank, both in the 3-pack and in regular organic.
- RE: Does a JS script who scroll automaticaly into pages could make some content "hidden" ?
Depending on how you trigger the scroll, Google might render the page scrolled or unscrolled. Usually, if it's done via JavaScript in the onload() handler, Google will execute that script and render the page as it appears after the script has run. I've seen examples, though, where code in jQuery's document-ready function is NOT executed by Google when rendering the page.
Test in Google Search Console, using Fetch and Render as Googlebot.
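Roughly speaking, the two cases I'm describing look like this (the 1000px scroll amount is just a placeholder, not from your page):

    <!-- Variant 1: scroll fired from window.onload -->
    <script>
      window.onload = function () {
        // Scroll down once everything has loaded.
        window.scrollTo(0, 1000);
      };
    </script>

    <!-- Variant 2: the same scroll fired from jQuery's document-ready handler -->
    <script src="https://code.jquery.com/jquery-3.3.1.min.js"></script>
    <script>
      $(function () {
        // In my experience, Google doesn't always execute this when rendering.
        window.scrollTo(0, 1000);
      });
    </script>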
- RE: Location pages for Two location business
Hi Justin,
Don't sweat having the NAP of both locations on multiple pages, as long as you don't mark those up with schema.org. FYI, having multiple schema.org objects on a page is perfectly normal, even of the same type.
Be sure you have a dedicated page for each location, and on THOSE pages, mark the NAP up with schema. Then, in your Google My Business listings, you want to link to the specific location page that corresponds to that GMB listing, NOT to your home page.
You can link back to the GMB page from the location-specific page on your website, or from all pages (e.g. in the footer).
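As an illustration, the location-specific page could carry markup like this. JSON-LD is just one way to add it, and the business name, address, and URL below are made-up placeholders:

    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "LocalBusiness",
      "name": "Example Co - Downtown Office",
      "url": "https://www.example.com/locations/downtown/",
      "telephone": "+1-555-555-0101",
      "address": {
        "@type": "PostalAddress",
        "streetAddress": "123 Main St",
        "addressLocality": "Springfield",
        "addressRegion": "OR",
        "postalCode": "97477",
        "addressCountry": "US"
      }
    }
    </script>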
- RE: Is Pagination & thin text issue affecting our traffic?
It looks fine to me. You're using rel=next/prev correctly, you've got plenty of text on the page, and you're correctly setting each numbered page's rel=canonical to the numbered page itself. All looks good.
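For anyone else reading, that pattern looks like this in the head of, say, page 2 (URLs are placeholders, not from the site in question):

    <!-- In the <head> of page 2 of the paginated series -->
    <link rel="canonical" href="https://www.example.com/category?page=2">
    <link rel="prev" href="https://www.example.com/category?page=1">
    <link rel="next" href="https://www.example.com/category?page=3">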
- RE: Should I disavow local citation page links?
I wouldn't sweat it. There are a jillion 3rd-tier business listing directories out there that are pulling that sort of data from the major directories. Yes, it's an issue if ALL you have is super weak links, but you'll need to be doing outreach for link-building anyway, so that shouldn't be a big deal.
I'd only disavow links that are actual spam, not links that are merely weak but legitimate.
- RE: Query results being indexed and providing no value to real estate website - best course of action?
Ideally, you'd set the meta robots on those pages to noindex,follow. This will let link juice flow from all of those pages to the pages in your main navigation, while also removing them from the index.
If you cannot modify the <head> section of those pages, then, at a minimum, you could tell Webmaster Tools to ignore the pre and start parameters (specify that the parameters merely sort the data on the page). Then you'd end up with just 1 page indexed per city, which is probably a lot better than where you are now.
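The meta robots version is just one line in the head of each of those query-result pages:

    <!-- In the <head> of each query-result page you want out of the index -->
    <meta name="robots" content="noindex,follow">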
- RE: What is your opinion in the use of jquery for a continuous scroll type of page layout?
Google is NOT going to see the content that's rendered by scrolling. In general, more is better in terms of content on a single page (provided it's not crap of course). See this article from Search Engine Land.
For those same reasons, splitting the content across separate pages isn't as good an idea. If you think about how RankBrain is supposed to work, Google is going to be looking for terms on the page that commonly co-occur with the page's primary target search term on other pages on the web about that topic. So, by farming subsections of content out to other pages, you're shooting yourself in the foot: Google is only going to give that first page brownie points for the subtopics it actually covers itself.
A better way to do this:
- put all the content on one page
- in the onload() or jQuery document-ready function, hide all but the first page's worth of content
- now, you can react to a scroll by calling JavaScript functions to hide the currently shown content and show the next page's worth...all on the same URL (see the sketch below)
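Here's a rough sketch of that approach (the class name, jQuery version, and the 200px trigger distance are placeholders, not a tested implementation):

    <div class="content-chunk">First page's worth of content...</div>
    <div class="content-chunk">Second page's worth of content...</div>
    <div class="content-chunk">Third page's worth of content...</div>

    <script src="https://code.jquery.com/jquery-3.3.1.min.js"></script>
    <script>
      $(function () {
        var chunks = $('.content-chunk');
        var current = 0;

        // On document ready, hide all but the first chunk.
        chunks.slice(1).hide();

        // When the visitor scrolls near the bottom, hide the current chunk
        // and show the next one -- all on the same URL.
        $(window).on('scroll', function () {
          var nearBottom = $(window).scrollTop() + $(window).height()
                           > $(document).height() - 200;
          if (nearBottom && current < chunks.length - 1) {
            chunks.eq(current).hide();
            current++;
            chunks.eq(current).show();
          }
        });
      });
    </script>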
Best posts made by MichaelC-15022
- RE: How does a collapsed section affect on page SEO?
Hi Stephan,
Presuming the expand/collapse thing is done properly, it should be golden. You'll find a lot of sites use this approach when they have multiple pages of content, e.g. a product page with specifications, reviews, technical details, etc.
I do this on my travel website. A great way to test to see if the initially-collapsed content is being seen and indexed by Google is to take a block of text from the collapsed section and search for it in double-quotes.
Here's an example: search for "At the Bora Bora Pearl Beach Resort you can discover the sparkling magic of the lagoon". You'll find my site there at #3 (Visual Itineraries), along with the other 1000 websites who've also copied the resort's description straight from the resort's website (yeah, I really shouldn't do this). So much for Google's duplicate content detection when it comes to text chunks...BUT I DIGRESS. That content you see is on the More Info tab.
Now, on to what "done properly" means:
- each tab should be in a separate div
- assign every div EXCEPT the currently selected tab's div a class whose CSS rule is display:none;
- have onclick handlers for the tabs that set all of the divs' classes to the display:none class, and then set the newly selected tab's div class to one with display:block or display:inline
"Not done properly" would mean something like injecting the text into a div via a JavaScript onclick() handler...because Google won't see text that only exists in the JavaScript. It's got to be in the HTML.
That's about it. Not so tricky, really. And works well both for usability (no roundtrip to the server, not even an Ajax fetch!) and for SEO (lotsa yummy content on a single page for Panda).
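A bare-bones sketch of the done-properly pattern might look like this (tab names and IDs are made up for illustration):

    <style>
      .tab-hidden { display: none; }
      .tab-shown  { display: block; }
    </style>

    <!-- The tab "buttons" -->
    <a href="#" onclick="showTab('overview'); return false;">Overview</a>
    <a href="#" onclick="showTab('reviews'); return false;">Reviews</a>
    <a href="#" onclick="showTab('specs'); return false;">Specifications</a>

    <!-- All of the tab content lives in the HTML, so Google can see and index it -->
    <div id="overview" class="tab-shown">Overview content here...</div>
    <div id="reviews" class="tab-hidden">Review content here...</div>
    <div id="specs" class="tab-hidden">Specification content here...</div>

    <script>
      function showTab(id) {
        // Set every tab's div to the display:none class...
        var ids = ['overview', 'reviews', 'specs'];
        for (var i = 0; i < ids.length; i++) {
          document.getElementById(ids[i]).className = 'tab-hidden';
        }
        // ...then switch the newly selected tab's div to the visible class.
        document.getElementById(id).className = 'tab-shown';
      }
    </script>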
- RE: Do mobile and desktop sites that pull content from the same source count as duplicate content?
Be sure you follow Google's best practices for separate mobile sites. In short, you want the desktop pages to have a rel=alternate tag pointing at the mobile equivalent, and the mobile pages to have their rel=canonical pointing at the desktop equivalents.
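With placeholder URLs (example.com / m.example.com), that pairing looks like this:

    <!-- On the desktop page, e.g. https://www.example.com/page -->
    <link rel="alternate" media="only screen and (max-width: 640px)"
          href="https://m.example.com/page">

    <!-- On the corresponding mobile page -->
    <link rel="canonical" href="https://www.example.com/page">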
- RE: Why are these sites outranking me?
The social interaction counts are going to affect personalized results a lot more than depersonalized results (although this may be changing in the near future....see Eric Enge's post about this).
I'd say your backlink profile stats are right in line with the other 2 sites you mention: the differences in DA, PA, and linking root domains are really negligible. Your site and those 2 are all similarly tuned for the target phrase, from page title to URL to amount of content on the page.
You might try increasing the amount of unique content on your page. Big, original images, maybe shoot a little intro video of yourself talking about your game, and embed that. And crank up the total text on the page to over 2000 words. (See this study.)
- RE: Query results being indexed and providing no value to real estate website - best course of action?
Ideally, you'd set the meta robots on those pages to noindex,follow. This will let link juice flow from all of those pages to the pages in your main navigation, while also removing them from the index.
If you cannot modify the <head> section of those pages, then, at a minimum, you could tell Webmaster Tools to ignore the pre and start parameters (specify that the parameters merely sort the data on the page). Then you'd end up with just 1 page indexed per city, which is probably a lot better than where you are now.
- RE: Google reconsideration nightmare
From the comment "show more efforts", I'd say you'll want to show not just more success at removing links, but how many times you contacted each webmaster and how.
I've had experiences with a couple of clients where the kinds of links that kept getting pointed out by the Google spam team tended to be article marketing examples, where the pages linking to my client's site were not in the WMT links, not in OSE, etc.....far too weak. So you're not alone there.
I would advise looking at all the examples you can find of any article marketing that was done for your site, then try to find all related pages...i.e., don't JUST try to remove the examples they pointed out. In other words, if you find there's someone named "Andy Smith" authoring some of the article marketing posts they've pointed out, then do a Google search for "Andy Smith" and your brand name to try to find any other article this person wrote for you. In my case, I was able to find quite a collection of pages in the Google Index (not even supplemental...the regular index!) that weren't in the WMT links nor in OSE etc. Also, take a big block of text from the start of each article and search for that in double-quotes, to see if it was posted elsewhere under a different name.
Then, chase these down, try and get them taken down, ping the webmaster 3-4x each, then disavow them and submit your reinclusion request.
- RE: Miriam's 7 Local SEO Predictions for 2019
Great predictions, Miriam!
I'll add one more...maybe it's more of a wish than a prediction...that Google will make some sort of serious strides towards cracking down on fake reviews (both positive and negative). Hopefully not as over-the-top as Yelp's approach (which throws a lot of babies out with the bathwater!) though.
- RE: Are there any negative side effects of having millions of URLs on your site?
I'll echo Robert's concern about duplicate content. If those facet combinations are creating many pages with very similar content, that could be an issue for you.
If, let's say, there are 100 facet combinations that create essentially the same basic page content, then consider taking facet elements that do NOT substantially change the page content, and use rel=canonical to tell Google that those are all really the same page. For instance, let's say one of the facets is packaging size, and product X comes in boxes of 1, 10, 100, or 500 units. Let's say another facet is color, and it comes in blue, green, or red. Let's say the URLs for these look like this:
www.mysite.com/product.php?pid=12345&color=blue&pkgsize=1
www.mysite.com/product.php?pid=12345&color=green&pkgsize=10
www.mysite.com/product.php?pid=12345&color=red&pkgsize=100
You would want to set the rel=canonical on all of these to:
www.mysite.com/product.php?pid=12345
Be sure that your XML sitemap, your on-page meta robots, and your rel=canonicals are all in agreement. In other words, if a page has meta robots "noindex,follow", it should NOT show up in your XML sitemap. If the pages above have their rel=canonicals set as described, then your sitemap should contain www.mysite.com/product.php?pid=12345 and NONE of the three example URLs with the color and pkgsize parameters above.
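In other words, every color/pkgsize variation of that product URL would carry the same tag in its head (scheme added here for illustration):

    <!-- The same tag in the <head> of every color/pkgsize variation of product 12345 -->
    <link rel="canonical" href="http://www.mysite.com/product.php?pid=12345">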
- RE: Organic search traffic dropped 40% - what am I missing?
Possibilities:
- The layout of the product pages for the new shopping cart is pissing off Panda. If that's the case, the traffic to the home page shouldn't have changed much, but the product pages will have dropped.
- Panda now sees the pages in general as having less content than before. Perhaps images aren't being loaded in a way that Google can see them, whereas they were before, something like that...and Panda now thinks the entire site is less rich in content.
- It often seems to take Google a month or so to "settle out" all of the link juice flows when you do a bunch of redirects, have new URLs, etc. The link juice calculation is presumably iterative, which would be why it takes a number of iterations of the PageRank calculation for entirely new URLs to "get" all the link juice they should have.
- The site's backlinks were moderately dependent on a set of link networks, and those link networks have shut down all their sites (so neither Google nor Bing still sees the links from them).
Those are the ideas that come to mind so far.
- RE: Client wants to rebrand but insists on keeping their old website live as well...
I'll second Miriam's points, above. There's substantial risk here if both sites are going to be visible to Google.
I'd block the old site in robots.txt permanently. I'd never redirect the old site to the new one, even after cleanup has been done. From the penalty recovery work I've done, it sure feels like Google keeps some sort of permanent flag on your site, even after you've done the cleanup. New, good links don't seem to have as much effect as you'd expect.
For the new site, spend the $$ on PR/outreach to build some solid, strong links in addition to the core directory links you get via Moz Local. Do some community service work that gets a press mention; offer a scholarship to dentistry students from a specific school, so that the school will link to your scholarship page. A few really good links from newspaper stories will work wonders for getting the new site to rank, both in the 3-pack and in regular organic.
- RE: Homepage ranks worse than subpages
I'd agree with Monica. Panda's above-the-fold algo is absolutely going to slay your home page. You've got only 1 sentence of content above the fold. Your slider images are all clickable, and they don't seem to be foreground images (the Lego image is the exception on both counts)...Panda is likely going to see them as decoration.
Your video is probably not seen as video. I see no schema.org/VideoObject markup, and it doesn't seem to be one of the standard embeds (YouTube, Vimeo, Wistia) that Panda can likely recognize in the HTML.
Everything else on the page is clickable, which (this is my theory only) is likely to cause Panda to see it as navigation....not content.
So....I'd recommend:
- chuck your current slider and choose a different plugin (or write it from scratch: it's only a couple dozen lines of JavaScript and a few lines of plain old boring HTML, see the sketch at the end of this post), so that the images are seen as content AND they're not clickable, except the next/prev slide buttons
- redesign your layout to pull some of the text below up above the fold, including moving the Archive & Communicate section and its siblings above the giant buttons
- use Wistia to embed the video, and follow their instructions re: creation of a video sitemap
I'd also recommend going into Google Webmaster Tools, and doing a Fetch & Render on your home page, to make sure that Google is able to see your page laid out the way you expect.
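To give a sense of scale for the write-it-from-scratch option, here's a rough sketch (image paths, alt text, and the 5-second interval are placeholders): the slides are plain content images, and only the next/prev buttons are clickable.

    <div id="slider">
      <!-- Plain, non-clickable <img> tags in the HTML so they read as content -->
      <img class="slide" src="/images/slide-1.jpg" alt="First slide">
      <img class="slide" src="/images/slide-2.jpg" alt="Second slide" style="display:none;">
      <img class="slide" src="/images/slide-3.jpg" alt="Third slide" style="display:none;">
      <button onclick="moveSlide(-1)">Previous</button>
      <button onclick="moveSlide(1)">Next</button>
    </div>

    <script>
      var slides = document.querySelectorAll('#slider .slide');
      var current = 0;

      function moveSlide(step) {
        // Hide the current slide, then show the next/previous one (wrapping around).
        slides[current].style.display = 'none';
        current = (current + step + slides.length) % slides.length;
        slides[current].style.display = 'inline';
      }

      // Auto-advance every 5 seconds.
      setInterval(function () { moveSlide(1); }, 5000);
    </script>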