
Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Intermediate & Advanced SEO

Looking to level up your SEO techniques? Chat through more advanced approaches.


  • I've had a bit of a dilemma over whether to go for a full ecommerce site or a separate shop section. My main goal is to push our installation services, so I've decided to go with the latter option. The main categories will be focused solely on installation services, and then I'll have a separate category which will take the customer to mydomain.com/shop, where we'll have our products for them to buy and fit themselves. The only issue I see is that I'm going to have two pages competing against each other. They'll both have different content, but one will focus on us installing a particular product and the other will focus on the customer buying it to fit themselves. Will it make things more difficult to rank, or won't it make a difference?

    | paulfoz1609
    0

  • Hi, Here is a rather detailed overview of our problem; any feedback or suggestions are most welcome. We currently have 6 sites targeting the various markets (countries) we operate in. All websites are on one WordPress install but are separate sites in a multisite network; content and structure are pretty much the same barring a few regional differences. The UK site has held a pretty strong position in search engines for the past few years. Here is where we have the problem: our strongest page (from an organic point of view) has dropped off the search results completely for Google.co.uk. We picked this up through a drop in search visibility in SEMrush and confirmed it by looking at our organic landing page traffic in Google Analytics and Search Analytics in Search Console. Here are a few of the assumptions we've made and things we've checked: crawl or technical issues (nothing serious found); bad backlinks (no new spammy backlinks); geotargeting (fine for the UK site; however, the US site, a .com rather than a ccTLD, was not set to the US - we suspect this to be the issue, but more below); on-site issues (nothing wrong here - the page was edited recently, which coincided with the drop in traffic, but these changes did not impact things such as the title, H1, URL or body content; we replaced some call-to-action blocks from a custom one to one built into the framework); manual or algorithmic penalties (nothing reported by Search Console); HTTPS change (we transitioned to HTTPS at the start of June; the sites are not too big, around 6K pages, and all redirects were put in place). Here is what we suspect has happened: the HTTPS change triggered Google to re-crawl and reindex the whole site (we anticipated this); during this process an edit was made to the key page, and through some technical fault the page title was changed to match the US version of the page. Because geotargeting was not turned on for the US site, Google filtered the UK page out as duplicate content, thereby dropping it off the index. What further supports this theory is that a search on Google.co.uk returns the US version of the page, and with country targeting on (i.e. only return pages from the UK) the UK version of the page is not returned. Also, a site: query from Google.co.uk DOES return the UK version of that page, but with the old US title. All these factors lead me to believe it's a duplicate content filter issue due to incorrect geotargeting; what does surprise me is that the .co.uk site has much more search equity than the US site, so it was odd that Google chose to filter out the UK version of the page. What we have done to counter this: turned on geotargeting for the US site; ensured that the title of the UK page says UK and not US; edited both pages to trigger a new last-modified date so the two pages share fewer similarities; recreated the sitemap and resubmitted it to Google; re-crawled and requested a re-index of the whole site; and fixed a few of the smaller issues. If our theory is right and our actions help, I believe it's now a waiting game while Google re-crawls and reindexes. Unfortunately, Search Console is still only showing data from a few days ago, so it's hard to tell whether there have been any changes in the index. I am happy to wait it out, but you can appreciate that some of senior management are very nervous given the impact of losing this page, and they are keen to get a second opinion on the matter.
    Does the Moz Community have any further ideas or insights on how we can speed up the indexing of the site (see the sketch below)? Kind regards, Jason

    | Clickmetrics
    0
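
    To the "speed up indexing" question specifically: beyond resubmitting in Search Console, the sitemap can also be pinged directly. A minimal sketch in Python with the requests library, assuming a standard /sitemap.xml location (the domain is a placeholder):

        import requests

        # Google's sitemap ping endpoint; a 200 only confirms receipt,
        # not that anything has been recrawled or reindexed yet.
        sitemap_url = "https://www.example.co.uk/sitemap.xml"
        resp = requests.get("https://www.google.com/ping",
                            params={"sitemap": sitemap_url}, timeout=10)
        print(resp.status_code)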

  • Hello experts, For my ecommerce site's product listing pages, which functionality is better to implement: "load more" or numbered pagination (1, 2, 3...), and why? Thanks!

    | Johny12345
    1

  • Hi everyone, I have a problem with a website wherein all URLs (homepage and inner pages) are 302 redirected, according to a Screaming Frog crawl. But the weird thing is that they are 302 redirected to themselves, which doesn't make any sense. Example:
    https://www.example.com.au/ is 302 redirected to https://www.example.com.au/
    https://www.example.com.au/shop is 302 redirected to https://www.example.com.au/shop
    https://www.example.com.au/shop/dresses is 302 redirected to https://www.example.com.au/shop/dresses
    Have you encountered this issue? What did you do to fix it (see the sketch below)? Would be very glad to hear your responses. Cheers!

    | alex_goldman
    0
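
    One way to investigate those 302s from the outside is to fetch each URL without following redirects and inspect the Location and Set-Cookie headers; a "redirect to itself" is often a cookie or session check that only loops for clients (like crawlers) that don't keep cookies. A rough Python sketch using the requests library and the example URLs from the question:

        import requests

        urls = [
            "https://www.example.com.au/",
            "https://www.example.com.au/shop",
            "https://www.example.com.au/shop/dresses",
        ]

        for url in urls:
            # allow_redirects=False exposes the raw 302 and its target.
            resp = requests.get(url, allow_redirects=False, timeout=10)
            print(url, resp.status_code,
                  "Location:", resp.headers.get("Location"),
                  "Set-Cookie:", resp.headers.get("Set-Cookie", "")[:60])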

  • Hello, I'm cleaning up old posts on my website, and I've found that lots of the posts I have to delete have good PA. I want to know how to transfer the PA of an old post to a new one. Is it possible to do with a 301 redirect (see the example below)? Is there any way to transfer 100% of the link juice? Thank you

    | shadowava
    0
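
    A 301 is indeed the standard mechanism here, with the caveat that no redirect is documented to pass exactly 100% of equity. On an Apache host the per-post redirect is a one-liner in .htaccess; a minimal sketch with placeholder slugs:

        # mod_alias: permanently point the old post at its replacement
        Redirect 301 /old-post-slug/ https://www.example.com/new-post-slug/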

  • Hello Mozzers - I am looking at a site that deals with URLs that generate parameters (sadly unavoidable in the case of this website, given the resources they have available - none for redevelopment). They handle the URLs that include parameters with robots.txt - e.g. Disallow: /red-wines/? Beyond that, they use rel=canonical on every paginated parameter page (such as https://wine****.com/red-wines/?region=rhone&minprice=10&pIndex=2) in search results. I have never used this method on paginated "product results" pages - surely this is incorrect use of the canonical, because these parameter pages are not simply duplicates of the main /red-wines/ page? Perhaps they are using it in case the robots.txt directive isn't followed - as sometimes it isn't - to guard against the indexing of some of the parameter pages? I note that Rand Fishkin has commented on putting "a rel=canonical directive on paginated results pointing back to the top page in an attempt to flow link juice to that URL", because "you'll either misdirect the engines into thinking you have only a single page of results or convince them that your directives aren't worth following (as they find clearly unique content on those pages)". Yet I see this time and again on ecommerce sites, on paginated results - any idea why? Now, the way I'd deal with this is: meta robots tags on the parameter pages I don't want indexed (noindex - this is not duplicate content, so perhaps I should use follow rather than nofollow?), and
    rel="next" and rel="prev" links on paginated pages (see the sketch below) - that should be enough. Look forward to feedback, and thanks in advance, Luke

    | McTaggart
    0
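
    For reference, the pattern proposed at the end of the question would look like this in the head of the hypothetical page-2 URL - a self-referencing canonical plus prev/next pointers, rather than a canonical back to page 1:

        <link rel="canonical" href="https://wine****.com/red-wines/?region=rhone&minprice=10&pIndex=2">
        <link rel="prev" href="https://wine****.com/red-wines/?region=rhone&minprice=10">
        <link rel="next" href="https://wine****.com/red-wines/?region=rhone&minprice=10&pIndex=3">

    Note that while robots.txt disallows /red-wines/?, crawlers cannot fetch these URLs to see any head tags at all, which is one reason mixing the robots.txt and canonical approaches rarely behaves as intended.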

  • Hi all, I am looking for some really good, clear examples of sites that have excellent JSON-LD markup - not just the basics, but packed to the teeth with markup for every element. I am particularly interested in e-commerce applications (see the sketch below), as I am re-skinning our e-commerce platform, which was written from scratch in-house. It is far from perfect, not mobile friendly and, well, a bit backward, but it links into everything we have in a seamless way, all the way to our manufacturing plant. Take a look, have a little laugh, and then take pity 🙂 https://www.spurshelving.co.uk/shop/shop.aspx Thanks Pete

    | Eff-Commerce
    0
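
    As a baseline for comparison, a pared-down e-commerce Product block looks like the sketch below (all values are placeholders); the "packed to the teeth" implementations layer aggregateRating, brand, review and BreadcrumbList markup on top of this:

        <script type="application/ld+json">
        {
          "@context": "https://schema.org",
          "@type": "Product",
          "name": "Steel Shelving Bay",
          "sku": "SHELF-001",
          "image": "https://www.example.co.uk/images/shelving-bay.jpg",
          "description": "Heavy-duty steel shelving bay with five shelves.",
          "offers": {
            "@type": "Offer",
            "priceCurrency": "GBP",
            "price": "89.99",
            "availability": "https://schema.org/InStock",
            "url": "https://www.example.co.uk/shop/shelving-bay"
          }
        }
        </script>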

  • I work for a company that makes an important product in a category. The company has a website (www.company.org); the product is at www.company.org/product. We recently (early May) redesigned and rearchitected the product site for SEO purposes. The company site talks about the category a bit (imagine the Colgate site; it talks about "toothpaste" a bit). The blog (blog.company.org/product) also talks about the category quite a bit (and links to the company site, of course). The product is a major product in the category, among the top 3. The site and blog have been around for 15+ years. The site has approximately a billion backlinks, mostly branded links to the product. It's in the top 50 highest-ranked sites on the internet in the Ahrefs rank index. Imagine you are searching for our product category, "category". If you search for "category" in Bing today, my company's site is the 3rd result, and it's the 1st result from a company that makes a product in this category. If you search for "category" in Google today, our site is not in the top 150 results. In fact, the site keeps dropping out of Google's index. (See attached for what that looks like in Search Console.) What might cause a site to jump from "ranked in top 10" to "not ranked" in Google - back and forth every couple of days? Penalties? Our recent (early May) site rearchitecture? We're not making giant, index-shifting changes every day.

    | hoosteeno
    0

  • I've got a current domain and, after a bit of a rebrand, I'm considering 301 redirecting the current site to a newly purchased domain. I'd redirect each page to the identical page on the new domain. Am I likely to see any issues? I know this is the recommended approach from Google, but I'm just wondering how smoothly it works and whether I'm likely to see any ranking drops or other problems (see the sketch below).

    | paulfoz1609
    0
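
    Assuming every page keeps the same path on the new domain, the whole migration can be a single rewrite rather than page-by-page rules. A sketch for Apache .htaccess (hypothetical domains):

        RewriteEngine On
        RewriteCond %{HTTP_HOST} ^(www\.)?olddomain\.com$ [NC]
        RewriteRule ^(.*)$ https://www.newdomain.com/$1 [R=301,L]

    Some temporary ranking wobble in the weeks after such a move is commonly reported even when every redirect is correct.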

  • Hi Guys, We have product pages on our site which have duplicate content, and the search volume for people searching for these products is very, very small. Also, if we add unique content, we could face keyword cannibalisation issues with category/sub-category pages. Based on SEO best practice, we should add rel canonical tags from these product pages to the next relevant page. Pros: we can rank for product-oriented keywords, but search volume is very small; any link equity passed to these pages via the rel canonical tag would be very small, as these pages barely get any links. Cons: time and effort involved in adding rel canonical tags; even if we do add them, if Google doesn't deem them relevant it might ignore them, causing duplicate content issues; time and effort involved in making all the content unique - not really worth it given the very minimal searches - plus if we do make it unique, we face keyword cannibalisation issues. What do you think would be the optimal solution to this? I'm thinking of just implementing a: across all these product-based pages (see the sketch below). Keen to hear thoughts? Cheers.

    | seowork214
    0
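
    The tag at the end of the question appears to have been stripped by the forum; assuming the intent was a rel=canonical from each near-duplicate product page to the preferred page, the snippet (with a placeholder URL) would be:

        <link rel="canonical" href="https://www.example.com/category/preferred-page/">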

  • Hi guys, We're noticing a few alternate hostnames for a website rearing their ugly heads in search results, and I was wondering how everyone else handles them. For example, we've seen: alt-www.(domain).com, test.(domain).com, uat.(domain).com. We're looking to ensure that these versions all canonical to their live-page equivalents, and we're adding meta robots noindex, nofollow to all pages as an initial measure. Would you recommend a robots.txt crawler exclusion for these too (see the sketch below)? All feedback welcome! Cheers, Sean

    | seanginnaw
    0
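
    One caution: a robots.txt exclusion would stop crawlers from ever seeing the noindex and canonical tags, so already-indexed staging URLs could linger. A server-level alternative that needs no page edits is an X-Robots-Tag header set only in the alt-www/test/uat virtual hosts; a sketch for Apache with mod_headers enabled (hypothetical config):

        Header set X-Robots-Tag "noindex, nofollow"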

  • Good morning Moz peeps, I am new to this but intending on starting off right! I have heard a wealth of advice that the "post name" permalink structure is the best one to go with; however, I am wondering about a "custom structure" combining the post name, following the example structure below: www.professionalwarrior.com/bodybuilding/%postname/ where "professional" and "bodybuilding" are the focus/theme/keywords of my blog that I want ranked (see the note below). Thanks a mill, RO

    | RawkingOut
    0
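
    One small correction to the example: WordPress structure tags need a closing percent sign, so the custom structure field would read as below (the /bodybuilding/ prefix being the asker's own hypothetical choice, with debatable ranking value):

        /bodybuilding/%postname%/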

  • My company currently owns five different websites, and every day we download a list of Google crawl errors. I then crawl the downloaded list with Screaming Frog to double-check the redirects and make sure the pages are not 404s. We have two websites with nearly identical content; they used to cross-link to each other and redirect to the parent page via our breadcrumbs. (404 error) www.website1.com/productxx.html; (working site) www.website2.com/productxx.html. We then redirect the 404ing www.website1.com/productxx.html to the last parent page or a similar page. Is there a faster way to compare two websites besides opening 200 windows all at once? Is there a program that would allow us to compare two websites (see the sketch below)?

    | petmkt
    0
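
    The cross-check can be scripted instead of opened in browser windows. A rough Python sketch using the requests library, assuming the crawl-error export has been saved as a plain-text file of paths (file name and domains are placeholders):

        import requests

        with open("crawl_errors.txt") as f:  # one path per line, e.g. /productxx.html
            paths = [line.strip() for line in f if line.strip()]

        for path in paths:
            for host in ("https://www.website1.com", "https://www.website2.com"):
                try:
                    r = requests.head(host + path, allow_redirects=True, timeout=10)
                    status = r.status_code
                except requests.RequestException as exc:
                    status = f"error: {exc}"
                print(f"{host}{path} -> {status}")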

  • Hi there, We understand that hreflang tagging can be incorporated into an existing XML sitemap. That said, is there any inherent issue with having two sitemaps - your regular XML sitemap plus an international XML sitemap which lists many of the same URLs as your original XML sitemap? For example, one of our clients has an XML sitemap file they don't want to have to edit, but we want to implement international hreflang XML sitemaps for them. Can we add an "English" XML sitemap with the proper hreflang tagging, even though this new sitemap duplicates much of the existing XML sitemap file (see the sketch below)? Thank you!

    | FPD_NYC
    0
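
    There is no inherent problem with two sitemaps listing overlapping URLs as long as the annotations are consistent between them. The separate international sitemap would carry entries along these lines (placeholder URLs), with the xhtml namespace declared on the urlset element:

        <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
                xmlns:xhtml="http://www.w3.org/1999/xhtml">
          <url>
            <loc>https://www.example.com/en/page/</loc>
            <xhtml:link rel="alternate" hreflang="en" href="https://www.example.com/en/page/"/>
            <xhtml:link rel="alternate" hreflang="fr" href="https://www.example.com/fr/page/"/>
          </url>
        </urlset>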

  • I make a living off my site, so every time I have to make a critical decision on SEO, I have a lot of fear that I could make things worse and lose everything. I can find solutions here and on Google, but there are always a lot of different opinions, so I was thinking of hiring someone to pick their brain and do what he/she thinks is best in my case. What do you guys think about this? Do you have someone to recommend - someone whose work you have seen get good results? I have a WordPress blog (micro-business), so my budget is not big.

    | Glinski
    1

  • Hi, If the "content" is the same, but is written in different languages, will Google see the articles as duplicate content?
    If google won't see it as duplicate content. What is the profit of implementing the alternate lang tag?Kind regards,Jeroen

    | chalet
    0

  • Hello, I have an ecommerce site with all pages crawled and indexed by Google, but I have some pages with multiple URLs, like www.sitename.com/product-name.html and www.sitename.com/category/product-name.html. There is a canonical on all these pages pointing to the simplest URL (so Google indexes only one page). The duplicate pages are therefore not indexed, but Google still crawls them. My question is: is there any benefit in preventing Google from crawling these pages? Google crawls around 1,500 pages a day on my site, but there are only 800 real pages and they are all indexed in Google. There is no particular issue, so is it worth changing? Thanks

    | onibi29
    0

  • We have a site, blog.example.org, and another site, www.example.org. The most visited pages on www.example.org were redesigned; the redesign landed May 8. I would expect this change to have some effect on organic rank and conversions. But what I see is surprising; I can't believe it's related, but I mention this just in case. Between April 30 and May 7, Google stopped indexing roughly 1,000 pages on www.example.org, and roughly 3,000 pages on blog.example.org. In both cases the number of pages that fell out of the index represents appx. 15% of the overall number of pages. What would cause Google to suddenly stop indexing thousands of pages on two different subdomains? I'm just looking for ideas to dig into; no suggestion would be too basic. FWIW, the site is localized into dozens of languages.

    | hoosteeno
    0

  • Hello members. I have a question, and I am seeking to confirm whether or not I am on the right track. I am interested in purchasing a .ly domain, which is the ccTLD for Libya. The purpose of the .ly domain would be branding; however, at the same time I do not want to kill the website's ability to rank in Google.com (United States searches) because of this domain. Google does not consider .ly to be one of those generic ccTLDs like .io, .cc, .co, etc. that can rank anywhere, and Bitly has also moved away from the .ly extension to a .com extension. Back in 2011, when there was unrest in Libya, a few well-known sites that utilized the .ly extension had their domains confiscated, such as Letter.ly and Advers.ly (I think Bitly may have been on that list too); however, with the unrest behind us it is possible to purchase a .ly, so being able to obtain one is not an issue. From what I can tell, I should be able to specify in Google Search Console that the website utilizing the .ly extension is a US-based website. I can also do this with Google My Business, and I will keep the Whois info public so the Whois data can be seen as belonging to a US-based website. Based on everything I just said, do any of you think I will be OK if I register and use the .ly domain extension and will still be able to rank in Google.com (US searches)? Confirmation would help me sleep better. Thanks in advance everyone and have a great day!!

    | joemaresca
    0

  • I asked this question originally on Webmaster Central and tried RickRoll's solutions, but they don't seem to have solved the issue. The problem: I've been noticing for some time that certain pages of our site (https://www.renthop.com/boston-ma/apartments-for-rent) have been deindexed locally (or rank very low), but are indexed nationally (and rank well). In fact, the actual page isn't ranking, but the blog (https://www.renthop.com/blog) is. This huge mismatch between national and local rankings seems to only happen for Boston and Chicago; other parts of the country seem unaffected (the national and local rankings are very similar). A bit of background, and my personal theory as to what's happening: we used to have the subdomains boston.renthop.com and chicago.renthop.com. These subdomains stopped working when we moved the site to the directory format (https://www.renthop.com/boston-ma/apartments-for-rent). The subdomain URLs were inactive/broken for roughly 4 months. After the 4 months, we did a 301 from the subdomains to the main pages (because these subdomains had inbound external links). However, this seems to have caused the directory pages to exhibit the national/local mismatch effect instead of helping. Is there anything I'm doing wrong? I'm not sure if the mismatch is natural, if the pages are getting algorithmically penalized at a local level (negative SEO on myself), or if it's stuck in some weird state because of the bad subdomain move. Some things I've tried: created verified Search Console accounts for both subdomains; asked Google to crawl those links; done a 1-1 mapping between individual pages on the old site and the new directory format; tried 301, 302 and meta-refresh redirects from the subdomains to the directory pages; made sure the robots.txt on the subdomains is working properly; made sure the robots.txt on the directory pages is working properly. [A screenshot of the mismatch and deindexing in local search results was attached - taken via a SERPs location changer - showing the difference in ranking (and in which page ranks) when the search is done nationally vs in the actual location, Boston, MA.] I'd really appreciate any help - I've been tearing my hair out trying to figure this out (as well as experimenting).

    | lzhou
    0

  • Hey friends, I can't seem to figure out why https://feello.com/ isn't ranking on Google for its branded term (Feello). It's ranking in 1st position on Bing and Yahoo, but on page 2 (16th or so) on Google. I've gone through the checklist and can't come up with an answer. Metadata: yes. Indexed in Webmaster tools: yes. Fetched pages: yes. Google cache on May 27, 2017: check. Using canonicals and redirecting the non-www and HTTPS versions: yes and yes. Feello in domain name: yes. Set up social profiles and GMB: yes. Driving traffic: yes, some email and ads. Checked robots.txt: not created yet. Created and submitted sitemap: yes, HTTPS version. Checked for blocked resources: none. The list goes on... Any ideas would be appreciated.

    | GarrettDenham
    0

  • Hello, I have a WordPress blog that is more than 10 years old. When I created the blog, the permalinks included the date. Example: site.com/2007/02/02/my-post/. Do you guys think it is worth the risk of changing my URL structure to remove the date? Of course I would do the 301 redirects and such (see the sketch below)... What I want to know is whether this will have any significant SEO advantage, considering how much Google has evolved.
    Thank you very much for reading my question 🙂

    | Glinski
    0
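
    If the change does go ahead, the date-based URLs can usually be collapsed with one pattern rather than a redirect per post. A sketch for Apache's mod_alias, assuming the slug itself stays unchanged (the domain is a placeholder):

        RedirectMatch 301 ^/[0-9]{4}/[0-9]{2}/[0-9]{2}/(.+)$ https://site.com/$1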

  • Hi, I am running a WordPress site and our blog has grown to have a bit of a life of its own. I would like to use a more blog-oriented WordPress theme to take advantage of features that help with content discoverability, which the current theme I'm using doesn't really provide. I've seen sites like Canva, Mint and HubSpot put their blog on a subdomain, so the blog is sort of a separate site within a site. Advantages I see to this approach: use a separate WordPress theme; help the blog feel like its own site and increase user engagement; give the blog its own name and identity. My questions are: Are there any other SEO ramifications of taking this approach? For example, is a subdomain (blog.mysite.com) disadvantageous somehow, or inferior to mysite.com/article-title? Regarding redirects, I read a recent Moz article about how 301s now do not lose PageRank. I would also be able to implement HTTPS when I redirect, which is a plus. Is this an OK approach, assuming I have to create redirect rules manually for each post? Thanks!

    | mikequery
    0

  • Hi, just wondering: I'm using the same image across 20 pages which are optimized for SEO purposes. Are there any issues with this from an SEO standpoint? Will Google devalue the pages because the same image is being used? Cheers.

    | seowork214
    0

  • So, I have major concerns with this plan. My company has hundreds of facilities located all over the country, and each facility has its own website. We have a third-party company working to build a content strategy for us. What they came up with is to create a bank of content specific to each service line; if/when any facility offers that service, they then upload the content for that service line to that facility's website. So in theory, you might have 10-12 websites, all in different cities, with the same content for a service. They claim: "Google is smart, it knows the content is all from the same company, and because it's in different local markets, it will still rank." My contention is that duplicate content is duplicate content, and unless it is localized, Google is going to prioritize one page of it and the rest will get very little exposure in the rankings, no matter where you are. I could be wrong, but I want to be sure we aren't shooting ourselves in the foot with this strategy, because it is a major undertaking and too important to go off in the wrong direction. SEO experts, your help is genuinely appreciated!

    | MJTrevens
    1

  • I've been thinking about this as I go through my daily link-building activities for clients. Do we really know as much as we hope/think we do about how Google values inbound links, which links actually matter, and how much these link signals play into rankings? For example, does Google REALLY value the fact that a business is paying to sponsor a local sports team, or to join a local chamber? For local businesses, link building is rather difficult because they don't necessarily have the resources or ability to implement ongoing content marketing initiatives to earn links naturally. How can we be sure that the things we recommend actually make a difference? I had my family real estate business featured in almost a dozen articles as an expert source, with links from authoritative sites like Realtor.com and others. Does Google distinguish between a profile link on a site like Realtor.com vs. being featured as an expert source in home-page news? Just second-guessing a lot of this today. Can anyone share thoughts and insights?

    | RickyShockley
    0

  • Hi, Our site has a lot of similar/lower-quality product pages which aren't a high priority, so these probably won't get looked at in detail to improve performance, as we have over 200,000 products. Some of them do generate a small amount of revenue, but an article I read suggested no-indexing pages which are of little value, to improve site performance and overall structure. I wanted to find out if anyone has done this and what results they saw. Will this actually improve rankings of our focus areas? It makes me a bit nervous to just block pages, so any advice is appreciated (see the sketch below) 🙂

    | BeckyKey
    0
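
    If the noindex route is taken, the usual pattern for thin product pages is noindex,follow, so the pages drop out of the index while their internal links continue to be crawled and pass signals; a minimal sketch of the head tag:

        <meta name="robots" content="noindex, follow">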

  • Hi, I'm working with a Shopify site that has about 10x more URLs in Google's index than it really ought to. This equals thousands of URLs bloating the index. Shopify makes it super easy to make endless new collections of products, where none of the new collections has any new content... just a new mix of products. Over time, this makes for a ton of duplicate content. My response, aside from making other new/unique content, is to select some choice collections with keyword/topic opportunities in organic and add unique content to those pages, while noindexing the other 90% of excess collection pages. The thing is, there's evidently no method that I could find of just uploading a list of URLs to Shopify to tag noindex, and it's too time-consuming to do this one URL at a time, so I wrote a little script to add a noindex tag (not nofollow) to pages that share identical title tags, since many of them do. This saves some time, but I have to be careful not to inadvertently noindex a page I want to keep. Here are my questions: Is this what you would do? To me it seems a little crazy that I have to do this by title tag, although it's faster than one at a time. Would you follow it up with a deindex request (one URL at a time) with Google, or just let Google figure it out over time? Are there any potential negative side effects from noindexing 90% of what Google is already aware of? Any additional ideas (see the sketch below)? Thanks! Best... Mike

    | 94501
    0
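
    A common Shopify pattern that avoids matching on titles: flag the thin collections with a metafield (settable in bulk via the API or a bulk-editor app) and let the theme read it. A sketch for theme.liquid, where the seo.noindex metafield namespace/key is a hypothetical choice:

        {% if collection and collection.metafields.seo.noindex == "1" %}
          <meta name="robots" content="noindex, follow">
        {% endif %}

    On the deindex-request question: one-by-one removal requests are temporary by design, so letting Google recrawl and drop the pages naturally is the usual route.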

  • A week ago, I took a canonical off of a page that was pointing to the homepage for a very big, generic search term for my brand, as we felt it could have been harming our rankings (it wasn't a true canonical page). A week in, our rankings for the term have dropped 7 positions out of page 1, and the page we want to rank instead is nowhere to be seen. Do I hang fire? As it's such a big search term, it's affecting traffic, but I don't want to make any rash decisions. Here's a bit more info. For argument's sake, let's call the search term we're going after 'Boots', with the canonical having been placed on the URL /boots; the canonical went to the root domain because we sell, well... boots. At the time, the homepage was ranking for 'Boots' on page 1, and we wanted to change this so that the /boots page ranked for that term... all logical, right? We did the following: took mentions of 'Boots' out of the meta on the homepage and made sure the term was optimised for on the /boots page; took the canonical off of /boots; used GSC to fetch and ask Google to recrawl /boots; resubmitted the sitemap. Do I hang fire, or run back to the safety of ranking for 'Boots' on the homepage? Do I risk keyword cannibalisation by adding the search terms back to the homepage?

    | Kelly_Edwards
    0

  • Hello guys, At the moment we have 3 websites; basically, the websites have the same content and appearance. We have a UK website, a New Zealand website and a USA website, for different business purposes. I have some questions about multisite for SEO: with similar content, will it harm website rankings? If it is bad, what should we do to deal with a multisite setup? Thank you

    | kelvinbongcn85
    0

  • Hi, I'm very new to SEO and want to get links to my website: www.warningbroker.com. How can I get links to my website?

    | marketing66
    0

  • Hi, I'm still stuck on the subject of SEO-friendly facets. Firstly, is it worth investing time in over things like SEO campaigns/content marketing, given that I'm the only one working on SEO and am trying to prioritise all tasks 🙂 Can I set up facets so they are SEO-friendly, or should they simply be blocked? My concern is wasting crawl budget and duplicate pages. Here's an example of a page on the site: https://www.key.co.uk/en/key/lift-tables. Here's an example of a facet URL: https://www.key.co.uk/en/key/lift-tables#facet:-1002779711011711697110,-700000000000001001651484832107103,-700000000000001057452564832109109&productBeginIndex:0&orderBy:5&pageView:list& What would be the best course of action to make them SEO-friendly? Tips would be appreciated 🙂

    | BeckyKey
    0

  • We own www.homemenorca.com, a real estate website based in Spain. Pages from this domain are not being indexed: https://www.google.com/search?q=site%3Awww.homemenorca.com&oq=site%3Awww.homemenorca.com&aqs=chrome..69i57j69i58j69i59l2.3504j0j7&sourceid=chrome&ie=UTF-8. Please notice that the URLs are Home Menorca, but the titles are not Home Menorca; they are Fincas Mantolan, a completely different domain and company: http://www.fincasmantolan.com/. Furthermore, when we look at Google's cache of Home Menorca, we see a different website: http://webcache.googleusercontent.com/search?q=cache%3Awww.homemenorca.com%2Fen&oq=cache%3Awww.homemenorca.com%2Fen&aqs=chrome..69i57j69i58j69i59.1311j0j4&sourceid=chrome&ie=UTF-8. We reviewed Google Search Console, Google Fetch, the canonical tags, the XML sitemap, and many more items. Google Search Console accepted our XML sitemap, but is only indexing 5-10% of the pages. Google is fetching and rendering the pages properly; however, we are not seeing the correct content being indexed in Google. We have seen issues with page loading times (content taking longer than 4 seconds to load), but are unsure why Google would be indexing a different domain. If you have suggestions or thoughts, we would very much appreciate them. Additional language issue: when a user searches "Home Menorca" from America or the UK with "English" selected in their browser as the default language, they are given a Spanish result. The site seems to have accurate hreflang annotations within the head section of the HTML pages, but they are not working properly. Furthermore, Fincas Mantolan's search result is listed immediately below Home Menorca's Spanish result. We believe that if we fix the issue above, we will also fix the language issue. Please let us know any thoughts or recommendations that can help us. Thank you very much!

    | CassG1234
    0

  • Hi Guys, We have a site which appends ?v=6cc98ba2045f to all its URLs. Example: https://domain.com/products/cashmere/robes/?v=6cc98ba2045f. Just wondering: does Google ignore what comes after the "?", and any ideas what that parameter is? Cheers.

    | CarolynSC
    0

  • Hello, I was wondering what the best way would be to implement canonical tags in kind of an unusual situation. The company I work for creates single-property websites for real estate agents. We register a URL such as 123MainSt.com; however, through DNS we redirect that to a path. For example, http://www.944milmadadr.com redirects to: https://www.qwikvid.com/realestate/go/v1/home/?idx=wDg1Gdwt7wnQiR3LMeCx28qPnWTKM0JV If we wanted to rank high in the search engines for our client's "944 Milmada Dr", would it be best practice to canonical to http://www.944milmadadr.com? Thanks in advance for any feedback on this!! Jason

    | Qwikvid
    0

  • Hello Moz Team, I would like to implement AMP for a single blog post, not the whole blog. Is it possible? If yes, then how (see the sketch below)? Note: I am already using GTM for my website abcd.com, but I would like to use AMP for one blog post only; my blog lives at abcd.com/blog, and by "blog post" I mean a URL like abcd.com/blog/my-favorite-dress. Thanks!

    | Johny12345
    0
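
    AMP is a per-page pairing rather than a site-wide switch, so a single post is fine. The canonical post and its AMP version reference each other, as in this sketch using the question's example URL (the /amp/ path is a hypothetical choice):

        <!-- In the head of abcd.com/blog/my-favorite-dress -->
        <link rel="amphtml" href="https://abcd.com/blog/my-favorite-dress/amp/">

        <!-- In the head of the AMP version of that post -->
        <link rel="canonical" href="https://abcd.com/blog/my-favorite-dress">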

  • I've seen a fairly substantial drop in Google Search Console. I'm still looking into it and comparing things, but does anyone know if there's been a Google update within the past few days? Or has anyone else noticed anything? Thanks

    | seoman10
    0

  • Hi Guys, Noticed this recently for the keyword "granny flat prices" on Google Australia. See screenshot: https://prnt.sc/fmp4is. Any ideas why Google is showing a 257-character description like this? Cheers.

    | CarolynSC
    0

  • My client has 2 WordPress sites (A and B). Each site is 20 pages, with similar site structures, and 12 of the pages on A have nearly 100% duplicate content with their counterparts on B. I am not sure to what extent A and/or B is being penalized for this. In 2 weeks (July 1) the client will execute a rebrand, renaming the business, launching C, and taking down A and B. Individual pages on A and B will be 301 redirected to their counterparts on C. C will have a similar site structure to A and B. I expect the content will be freshened a bit, but it may initially be very similar to the content on A and B. I have 3 questions: Given that only 2 weeks remain before the switchover, is there any purpose in resolving the duplicate content between A and B prior to taking them down? Will 301 redirects from penalized pages on A or B actually hurt the ranking of the destination page on C? If a page on C has the same content as its predecessor on A or B, could it be penalized for that, even though the page on A or B has since been taken down and replaced with a 301 redirect?

    | futumara
    0

  • Hello, I am doing an SEO audit for a website which only has a few pages. I have no cPanel credentials, no FTP, no WordPress admin account - I'm just watching it from the outside. The site works, the Moz crawler didn't report any problems, and I can reach every page from the menu. The problem is that, except for the few actual pages, no matter what you type after the domain name, you always reach the home page and never get a 404 error. E.g. http://domain.com/oiuxyxyzbpoyob/ (there is no such page, but I don't get a 404 error; the home page is displayed and the URL in the browser remains http://domain.com/oiuxyxyzbpoyob/, so it's not a 301 redirect). http://domain.com/WhatEverYouType/ (same). Could this be an important SEO issue (i.e. resulting in an infinite number of duplicate-content pages)? Do you think I should require the owner to prevent this from happening? Should I look into the .htaccess file to fix it (see the sketch below)? Thank you Mozers!

    | DoMiSoL
    0
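
    What's described is a classic soft 404: the server answers 200 with the home page body for any unknown path, which can indeed spawn unlimited indexable duplicates of the home page. It can be confirmed without server access; a minimal Python sketch using the nonsense path from the question:

        import requests

        resp = requests.get("http://domain.com/oiuxyxyzbpoyob/", timeout=10)
        # A healthy server returns 404 here; a 200 confirms the soft-404 behaviour.
        print(resp.status_code)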

  • Hi there! So I work in an industry where there are different conventions for referring to, searching on, and spelling the industry name. For example, let's pretend there were a variety of different conventions for referring to the SEO industry: someone could search for S-EO, SEO, sEO, etc., and those would all be accepted and understood ways of referring to the industry. If we use the SEO example as a comparison for our industry, the two most common conventions would be S-EO and SEO. Using this example, we rank on the first page for the term "SEO" but do not rank AT ALL for the term "S-EO". We have a high-value piece of content targeted in the following way: "S-EO (SEO): The Basics Guide", so it is more targeted at the hyphenated word, yet it does not rank at all for the hyphenated version, whereas it is page one for the non-hyphenated term. As additional pieces of context: In general, our site is more targeted at the hyphenated term, and there are places where we rank in the top spot for both the hyphenated and non-hyphenated versions. For example, we rank in a top-2 position for both "S-EO software" and "SEO software", but do not rank at all for the broader "S-EO" term. There are times when we do appear on page one for the term "S-EO", but it's typically only for a matter of hours or days, and then we disappear entirely from the SERPs for that term. We consistently appear for "SEO". I currently do not believe we are dealing with a penalty of any sort - our link profile is clean and our spam score per Moz is 2/17. Any thoughts or ideas as to what is going on here and how we can potentially rank for the term "S-EO"?

    | dpayne1
    0

  • I'm thinking of using rel=canonical for similar products on my site. Say I'm selling pens and they are all very similar: a big pen in blue, a pack of 5 blue Bic pens, a pack of 10, 50, 100, etc. Should I rel=canonical them all to the best seller, as it's almost impossible to make the pages unique? (I realise these should really be attributes and not products, but I'm sure you get my point.) It seems sensible to have one master canonical page for Bic pens on the site - with a great description, video content, good images, plus linked articles etc. - rather than loads of duplicate-looking pages. Would love to hear thoughts from the Moz community.

    | mark_baird
    0

  • I just read a great article on the SEO benefits of external links to relevant, authoritative sites, but it didn't state whether the benefits still exist if the external links are nofollow.
    The article concluded: "Outgoing relevant links to authoritative sites are considered in the algorithms and do have a positive impact on rankings." I found this older article on the subject, but opinions on the nofollow issue were mixed:
    https://moz.rankious.com/_moz/blog/external-linking-good-for-seo-whiteboard-friday Can anyone shed any light? Thanks! ~Caro

    | Caro-O
    1

  • We are currently handling search for a global brand, www.example.com, which has a presence in many countries worldwide. To help Google understand that there is an alternate version of the website available in another language, we have used hreflang tags. There is also a mother website (www.example.com/global) which is given the attribution of x-default in the hreflang tag. For Malaysia as a geolocation, the mother website is ranking instead of the local website (www.example.com/my) for the majority of the products. The code used for hreflang tag execution on a product page being: These hreflang tags are also present in the XML sitemap of the website, for example: <loc>http://www.example.com/my/product_name</loc> <lastmod>2017-06-20</lastmod> Is this implementation of hreflang tags fine (see the sketch below)? The implementation is the same across all geolocations, but the mother website is outranking us only in the Malaysia market. If the implementation is correct, what could be other reasons for this ranking issue, given that all other SEO elements have been thoroughly verified and seem fine?

    | Starcom_Search
    0
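
    Since the tag block was stripped from the question, for reference a typical product-page head setup with an x-default fallback looks something like this sketch (URLs are placeholders modelled on the structure described). Every version, including the /global x-default page, must carry the same reciprocal set of annotations, or they tend to be ignored - a one-sided set is a common reason an x-default page keeps outranking a local version:

        <link rel="alternate" hreflang="x-default" href="http://www.example.com/global/product_name" />
        <link rel="alternate" hreflang="en-my" href="http://www.example.com/my/product_name" />
        <link rel="alternate" hreflang="en-gb" href="http://www.example.com/uk/product_name" />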

  • I was wondering how old the 404 data in Google Search Console actually is. Does anyone know what kind of timespan a site's 404 data is compiled over? How long do 404s tend to take to disappear from Google Search Console once they are fixed?

    | McTaggart
    0

  • Hi all, Our website has more than 30 pages which are duplicates, so canonicals have been deployed so that only 10 of these pages show up. Does having more of these duplicate pages impact rankings? Thanks

    | vtmoz
    0

  • Hi Guys, I'm currently optimizing my e-commerce store, which has around 100 words of content on average for each category page. Based on this study by Backlinko, the more content the better: http://backlinko.com/wp-content/uploads/2016/01/02_Content-Total-Word-Count_line.png Would you say this is true for e-commerce pages - for example, a page like this: http://www.theiconic.com.au/yoga-pants/ What benefits would you get from adding more content? Is it basically that more content leads to more long-tail opportunity and more organic traffic, assuming the content is solid and not built just for SEO reasons? Cheers.

    | seowork214
    0

  • I want to target different keywords for the same e-commerce product. What's the best SEO practice? I'm aware of the pitfalls of keyword stuffing. The product example is the GoPro Hero 5 Action Camera. The same action camera can be used in many different activities, e.g. surfing, auto racing, mountain biking, sky diving, search & rescue, law enforcement etc. These activities target completely different markets, so naturally the keywords are different. I have three strategies to tackle the issue. Please let me know which one you think is best. 1) Create different keyword landing pages with a call-to-action to the same conversion page. Each landing page will be optimized for the targeted keywords, e.g. surfing, auto racing, mountain biking, sky diving, search & rescue etc. Obviously this will be a big task because there will be numerous landing pages. Each page will show how the product can be used in these activities. For surfing, the content would include surfing images with the GoPro Hero 5, instructions on how to mount the camera to a surfboard, waterproof tests, surfing testimonials and surfing owner reviews, etc. The call-to-action leads to a generic product conversion page displaying product information such as specs, weight, video formats, price, shipping, warranty etc. The same product page will be the call-to-action for all keyword landing pages. Positives: a vast number of long-tail keywords targeted across numerous landing pages; a good, specific user experience for someone looking for an "underwater action camera" (specific mounting instructions related to surfboards etc.); less duplicate content, as there is only one product page showing the same information. Negatives: challenging to come up with a page for each of the vast number of activities. Inbound link considerations:
    Inbound links from publications can link directly to the product page or the keyword landing page. Surf Magazine may link to:
    "Surfing Action Camera | GoPro Hero 5 | GoPro.com" - gopro.com/hero5/underwater-surf-camera
    "GoPro Hero 5 Action Camera | GoPro.com" - gopro.com/hero5
    2) Create different keyword landing pages with a call-to-action to add the product directly to the cart. Similar to the first option, but the call-to-action on the landing page is to add the Hero 5 to the cart. The user experience will be similar and the content creation challenges will be similar, but the techy product info (specs, price, video format, etc.) will be displayed on the same landing page. Positives: the same long-tail keyword targeting and the same good, specific user experience. Negatives: the same challenges in creating each long-tail keyword landing page; since there is no aggregate "product page", inbound links will be split between the landing pages, splitting Page Authority across each landing/conversion page.
    Surf Magazine will link to:
    "Surfing Action Camera | GoPro Hero 5 | GoPro.com" - gopro.com/hero5/underwater-surf-camera
    Cycling Magazine will link to:
    "Cycling Action Camera | GoPro Hero 5 | GoPro.com" - gopro.com/hero5/cycling-camera
    3) Create a conversion-focused product page with a casual blog about the keywords. This is currently what GoPro has chosen - GoPro Hero 5. The product page displays the many different types of activities on the same page. The page is focused on the user experience, with images of the action camera being used in different cool activities, showing its versatility. Note there is very little long-tail keyword targeting on this page; instead they could use the broad keyword "action camera". To target long-tails, maybe a blog can be used, with brand ambassadors showing the product being used in the various activities. Positives: user-experience focused; higher conversion rate; less content creation work; inbound links go to the same product page, building Page Authority. Negatives: poor ranking for the short-tail keyword (GoPro is not even in the top 10 SERPs for "action camera"); poor ranking for long-tail keywords (GoPro doesn't rank for "diving camera", "cycling camera" or "surf camera"); and as for blogging the long-tail keywords - who really converts from landing on a blog by the actual seller?! I hope those three strategies were explained clearly enough and are differentiated enough. Please let me know what you think!

    | ChrisCK
    0

  • Hi Moz Community, We changed the URL structure of our new site 6 months ago, and we have experienced a ranking drop since then. From my understanding, changing URL structure and using 301 redirects loses some link juice. We think the ranking drop is because of that loss, assuming other factors remain constant. Here are my questions: How do those link juice losses impact our ranking? Would changing the URL structure back to the original version regain the lost link juice, with all the redirects done properly? Would it take a lot of effort, and is it recommended to change it back? Thank you so much in advance. Any thoughts and opinions are appreciated! Best, Raymond

    | raymondlii
    0

  • You have a full webpage with a great amount of content, images and media - a social blogging site where other members can leave their comments and reactions to the article. Over time there are, say, 1,000 comments on this page. So we set the canonical URL and use rel prev/next to tell the bots that each subsequent block of 100 comments is attributed to the primary URL. Or... we allow the newest 10 comments to exist on the primary URL, with a "see all" comments link that points to a new URL, where the rest of the comments are paginated. Which option does the community feel would be most appropriate and would adhere to best practices for managing this type of dynamic comment growth? Thanks

    | HoloGuy
    0
