Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Intermediate & Advanced SEO

Looking to level up your SEO techniques? Chat through more advanced approaches.


  • Hi mozzers, We have decided to migrate the blog subdomain to a subfolder of the main domain (blog.example.com to example.com/blog). To do this the most effective way and avoid impacting SEO negatively, I believe I have to follow this checklist: create a list of all 301 redirects from blog.example.com/post-1 to example.com/post-1; make sure title tags remain the same on the main domain; make sure internal links remain the same. Is there something else I am missing? Any other best practices? I would also like to have all blog posts as AMP pages. Any recommendations on whether this is something we should do, since we are not a media site? Any other tips on successfully implementing those types of pages? Thanks

    | Ty1986
    1
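The redirect list in the first checklist item can be generated and sanity-checked with a small script before it becomes server rules. A minimal sketch, assuming a one-to-one slug mapping; the hosts and slugs are placeholders from the question, and whether posts land under /blog/ or the root depends on the structure you choose:

```python
# Sketch: build a one-to-one 301 redirect map for a
# blog.example.com -> example.com/blog migration.
# Hosts and slugs are hypothetical examples.

OLD_HOST = "https://blog.example.com"
NEW_HOST = "https://www.example.com/blog"

slugs = ["post-1", "post-2", "seo-checklist"]

def build_redirect_map(slugs):
    """Map every old blog URL to its new subfolder URL."""
    return {f"{OLD_HOST}/{slug}": f"{NEW_HOST}/{slug}" for slug in slugs}

redirects = build_redirect_map(slugs)

for old, new in redirects.items():
    # Each pair would become one server-side 301 rule.
    print(f"301: {old} -> {new}")
```

Exporting this dict as .htaccess `Redirect 301` lines (or an nginx map) keeps the migration auditable: every old URL has exactly one destination.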

  • Hi all, I have a strange issue: someone redirected the website http://bukmachers.pl to ours, https://legalnibukmacherzy.pl. We don't know exactly what to do with it. I checked backlinks, and that website had some links which now redirect to us. I also checked the website on the Wayback Machine: back in 2017 it had some low-quality content, and in 2018 they set up a similar redirect to a different website (our competitor). Can such a redirect be harmful to us? Should we do something about it, or leave it, since Google has stopped encouraging the disavowing of low-quality links?

    | Kahuna_Charles
    1

  • Hi Guys, Are there any free site speed testing tools for sites in production which are password protected? We want to test site speed on top-priority pages before the new site goes live. The site is on Shopify – we tried Google PageSpeed Insights while logged into the production site, but believe it's just recording the speed of the password page. Cheers.

    | brandonegroup
    1

  • I'm getting the same structured data error in Search Console for most of my websites: Invalid value in field "itemtype". I removed all the structured data but am still having this problem. According to Search Console it is a syntax problem, but I can't find what is causing it. Any guess, suggestion, or solution for this?

    | Alexanders
    0

  • Hi Mozers, Are in-page tabs still detrimental for SEO? In-page tabs allow you to alternate between views within the same context, not to navigate to different areas – as in one long HTML page that just looks like it's divided into different pages via tabs that you can click between. Each tab has its own URL, which I guess is for analytics tracking purposes? https://XXX https://XXX?qt-staff_profile_tabs=1 https://XXX?qt-staff_profile_tabs=2 https://XXX?qt-staff_profile_tabs=3

    | yaelslater
    0

  • Hello, Once I have done my keyword research, is there a way to write other than "naturally", which is what everyone says? Could someone explain what they mean by "naturally"? For example, let's say my keyword is "Piedmont bike tour", and some of the words I find through my research are cycle, routes, Piedmont, Barolo, wine, etc. Is there a way to integrate those so that Google understands what I mean? I imagine that Google parses sentences for a reason, and that if I only sprinkle those words in, like in the sentence below, it won't work: "Piedmont bike tour, cycle, routes, piedmont, barolo, wine all this is cool!" Thank you,

    | seoanalytics
    0

  • Hi Everyone, I've been racking my brain around this one and am not sure why it is happening. Basically, Google is showing the "www" version of the homepage, when 99% of the site is "non-www". It also says "No Information Available". I have tried submitting it through GSC, but it tells me it is blocked by the robots.txt file, and I don't see anything in there that would block it. Any ideas? shorturl.at/bkpyG I would like to get it to change to the regular "non-www" and actually be able to show information.

    | vetofunk
    0

  • Hi, for guest posting, which anchor text is better for my brand website: 1) a naked anchor (e.g. "according to www.example.co.uk"), or 2) a branded anchor (e.g. "according to Example", where "Example" links to www.example.co.uk)? Thank you

    | KINSHUN
    0

  • Hello Everybody! After the June Core Update was released, we saw an insane drop in traffic/revenue and indexed pages on GSC (image attached below). The biggest problem: our pages that dropped out of the index were shown as "Blocked by robots.txt", and when we run the "Fetch as Google" tool it says "Crawl Anomaly", even though our robots.txt is completely clean (without any disallow or noindex rules). So I strongly believe the reason this error pattern is showing is the June Core Update. I've come up with some solutions, but none of them seems to work: 1) Add hreflang to the domain: we have other sites in other countries, and ours seems to be the only one without this tag. The June update was primarily made to limit domains to two SERP results (or more if Google thinks it's relevant). Maybe the other sites have "taken our spot" in the SERPs; our domain is considerably newer than the other countries' domains. 2) Manually index all the important pages that were lost: the idea was to renew the content on the page (title, meta description, paragraphs, and so on) and use the manual GSC index tool. But none of that seems to work either; all it says is "Crawl Anomaly". 3) Create a new domain: if nothing works, this should. We would look for a new domain name and treat it as a whole new site. (But frankly, there should be some other way out; this is for an EXTREME case and only if nobody can help us.) I'm open to ideas, and as the days have gone by, our organic revenue and traffic don't seem to be coming back. I'm desperate for a solution. Any ideas? gCi46YE

    | muriloacct
    0

  • I'd like to utilize AMP for faster loading for one of my clients. However, it is essential that this client have chat. My developer is having trouble incorporating chat with AMP, and he claims that it isn't possible to integrate the two. Can anyone advise me as to whether this is accurate? If it is true that AMP and chat aren't compatible, are there any solutions to this issue? I'd appreciate any leads on this. Thanks!

    | Joseph-Green-SEO
    0

  • Hi, I am migrating an old WordPress site to a custom PHP site and the URL profiles will be different, so I want to retain all link profiles and, more importantly, make sure that if a user visits the old URLs via search they are seamlessly transferred to the new equivalent page. For example, www.domain.com/about-us will need to redirect to www.domain.com/aboutus.php, www.domain.com/furniture will need to redirect to www.domain.com/furniture-collections.php, etc. What is the best way of achieving this apart from .htaccess, as I'm not 100% confident doing it that way? Could it be done via PHP or using meta tags?

    | ocelot
    0
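A mapping like the one described (old path to new path, one entry per page) is the core of the fix however it is implemented. This sketch expresses it in Python for clarity; the paths are the question's own examples, and in production the same lookup would live in .htaccess rules or in a PHP front controller sending a real 301 header (not a meta refresh, which passes no redirect signal):

```python
# Sketch: explicit old-URL -> new-URL lookup returning an HTTP
# status and location, mirroring what a 301 rule would do.
# Paths are the examples from the question.

REDIRECTS = {
    "/about-us": "/aboutus.php",
    "/furniture": "/furniture-collections.php",
}

def resolve(path):
    """Return (status, location) for an incoming request path."""
    if path in REDIRECTS:
        return 301, REDIRECTS[path]
    return 200, path  # no mapping: serve the page normally

print(resolve("/about-us"))  # (301, '/aboutus.php')
```

In PHP the equivalent per-page redirect would be roughly `header("Location: /aboutus.php", true, 301); exit;` at the top of the old script, though a single server-level rule set is easier to maintain.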

  • Hi everyone! Let's use this as an example URL: https://www.example.com/marvel/avengers/hulk/ We have done a 301 redirect for the "Avengers" page to another page on the site. Sibling pages of the "Hulk" page now live off "marvel" (ex: /marvel/thor/ and /marvel/iron-man/). Is there any benefit in doing a 301 for the "Hulk" page to live at /marvel/hulk/ like its sibling pages? Is there any harm long-term in leaving the "Hulk" page under a permanently redirected page? Thank you! Matt

    | amag
    0

  • I noticed that when I request indexing in the webmaster tool, my new content is live within minutes. Does it take longer to update the ranking, or is the ranking updated as soon as the new page has been indexed? Thank you,

    | seoanalytics
    0

  • Hi mozzers, When checking the coverage report in GSC I am seeing over 649,000 valid pages (https://cl.ly/ae46ec25f494), but when performing site:domain.com I am only seeing 130,000 pages. Which one is the source of truth, especially since I have checked some of these "valid" pages and noticed they're not even indexed?

    | Ty1986
    0

  • Hi, I have a sitemap (or several) which is very large (i.e. 60,000 links). Is it recommended to have so many links, and how come when I do a site search (site:mydomain) the number of links is less than in my sitemap?

    | FreddyKgapza
    0

  • Hey everyone, I am curious to know if anyone has tried to implement faceted navigation in a website's footer. I am asking because the top navigation is a sensitive topic and can't be touched. Please share whether this is something that works or not. Thanks

    | Ty1986
    0

  • Hi Everyone, Google is displaying www.domain.com instead of domain.com. We have our preferred URL set up as domain.com, and even redirect www.domain.com to domain.com, but in the search results it is showing www.domain.com. Problem is we are seeing referral data from www.domain.com and in Google it says "No information is available for this page." Anyone seen a way to resolve this?

    | vetofunk
    0

  • Hi, I would like to block all URLs with the parameter '?filter=' from being crawled by including them in the robots.txt. Which directive should I use:
    User-agent: *
    Disallow: ?filter=
    or
    User-agent: *
    Disallow: /?filter=
    In other words, is the forward slash at the beginning of the disallow directive necessary? Thanks!

    | Mat_C
    1
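Both variants can be checked against Python's built-in robots.txt parser. A minimal sketch (example.com and the filter value are placeholders); the robots.txt standard expects rule paths to start with `/`, which is why the slash-less rule matches nothing here, though individual crawlers may be more forgiving:

```python
from urllib.robotparser import RobotFileParser

def blocks(disallow_rule, url):
    """True if a robots.txt containing this single Disallow rule blocks url."""
    parser = RobotFileParser()
    parser.parse(["User-agent: *", f"Disallow: {disallow_rule}"])
    return not parser.can_fetch("*", url)

url = "https://www.example.com/?filter=red"

print(blocks("/?filter=", url))  # True: with the leading slash, blocked
print(blocks("?filter=", url))   # False: without it, the rule matches nothing
```

So `Disallow: /?filter=` is the safe form; it matches any URL whose path-plus-query begins with `/?filter=`.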

  • Google recently added related icons at the top of the image search results page. Some of the icons may be unrelated to the search. Are there any best practices to influence what is positioned in the related image icons section?  Thank you.

    | JaredBroussard
    1

  • Looking for some advice on an area that I can't seem to find much research about online. Since starting our website, it's always been hosted in the UK and targeting UK visitors. That means we always had the date/time format of the website as DD.MM.YY for example. We've now changed business focus and are targeting US visitors. We recently moved the site over to US hosting, and our web developers have instructed that we change to US date/time format (MM.DD.YY). My question is, are there any implications on doing this from an SEO perspective? Obviously, all our historic blog posts will need to have their date updated from, for example, 9 July to July 9. Does this make any difference at all? Anyone got any insights as to what best practice with this is? Cheers.

    | PeteratS2
    0

  • When I search on Google, site:alexanders.co.nz is still showing over 900 results. There are over 600 nonexistent pages, and the 404/410 errors aren't working. The only way I can think to fix it is doing it manually in Search Console using the "Remove URLs" tool, but that is going to take ages. Any idea how I can take down all those zombie pages from the search results?

    | Alexanders
    1

  • Hi, On a webshop we are optimizing, the main navigation consists of the 5 main categories to which all of the products can be assigned. However, the main tabs in the navigation just activate a drop down with all of the subcategories. For example: the tab in the navigation is 'Garden equipment' and when you click on this tab, the drop down is shown with subcategories like 'Lawn mowers', 'Leaf blowers' and so on. Now, the page 'Garden equipment' is one of the main category pages and we want this page to rank of course. This shouldn't be a problem, since there is a separate URL for this page that can be indexed and that can be reached through internal links on the website. However, this page can not be reached when a visitor initially comes on the homepage of the webshop, since the tab in the navigation isn't clickable. This page will only be reached when a subcategory is selected, and then when the visitor goes back to the category page through the breadcrumb or through an internal link. Is it a problem that these important overview category pages can not be reached immediately? Thanks.

    | Mat_C
    0

  • Hi mozzers, Our blog is located at blog.example.com, powered by WordPress, and unfortunately we currently can't migrate it to the main domain. Since we would like to grow the main domain's organic traffic, we would like to test an option that could help us leverage the traffic of the top blog posts. There is a WordPress API that would allow us to pull 100-200 words (a snippet of each blog post) into the main domain, with a "Read more" link pointing back to the blog.
    Is this even a good idea, assuming we make sure the content is not identical?

    | Ty1986
    0
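The API in question is presumably the WordPress REST API, which modern WordPress installs expose by default at /wp-json/wp/v2/posts. A minimal sketch of the snippet side; the endpoint path is the standard one, but the 150-word cutoff and the teaser logic are assumptions for illustration:

```python
# The live request (requires network) would look roughly like:
# https://blog.example.com/wp-json/wp/v2/posts?per_page=5&_fields=title,excerpt,link

def snippet(text, max_words=150):
    """Trim a post body to ~150 words for the main-domain teaser,
    so the teaser is never a full duplicate of the post."""
    words = text.split()
    if len(words) <= max_words:
        return text
    return " ".join(words[:max_words]) + "…"

post = {"title": "Example post", "content": " ".join(["word"] * 300)}
teaser = snippet(post["content"])
print(len(teaser.split()))  # 150
```

Keeping the teaser well under the full post length, with a canonical-free "Read more" link back to the blog URL, is what makes the content non-identical as the question requires.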

  • Which is better practice, using 1/2" or ½"? The keyword research suggests people search for "1 2" with the space being the "/". How does Google handle fractions? Would ½ be the same as 1/2?

    | Choice
    2
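On the character level, ½ (U+00BD) has a Unicode compatibility decomposition to the three-character sequence 1⁄2, using the fraction slash U+2044 rather than the ASCII /. Search engines generally apply this kind of normalization, but treating ½ and 1/2 as exact equivalents is not guaranteed, so testing both forms in titles is still worthwhile. A small demonstration with Python's standard library:

```python
import unicodedata

half = "\u00bd"  # the single character ½

# NFKC normalization expands the vulgar-fraction character.
expanded = unicodedata.normalize("NFKC", half)

print(expanded)              # 1⁄2 (three characters)
print("\u2044" in expanded)  # True: FRACTION SLASH, not ASCII "/"
print("/" in expanded)       # False
```

So even after normalization, ½ and the typed 1/2 are not byte-identical, which supports matching keyword research ("1 2" with the slash treated as a separator) rather than assuming equivalence.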

  • Hello! We are doing an image optimization audit, and are therefore trying to find a way to get a list of all images on a site. Screaming Frog seems like a great place to start (as per this helpful article: https://moz.rankious.com/_moz/ugc/how-to-perform-an-image-optimization-audit), but unfortunately, it doesn't include images in CSS. 😞 Does the community have any ideas for how we might otherwise get a list of images? Thanks in advance for any tips/advice.

    | mirabile
    0

  • Why do search engines parse sentences?

    | seoanalytics
    0

  • Hello, I am wondering if, to write and rank, I should split my expression. Example: for "Alsace bike tour", should I write a paragraph with keywords that match "Alsace" and then a paragraph with keywords related to "bike tours", or should I write something with keywords related to "Alsace bike tour" as a whole? I imagine that when it is something that exists I don't need to split it – for example "la loire velo", because it is a network of bike paths in the Loire valley – but for other things like the above I do need to split it? Any input on that would be great. Thank you,

    | seoanalytics
    1

  • Hi there, We are now in the process of implementing a JSON-LD markup solution and are marking cruises up as events. Will this work, and can we get away with it without penalty? Previously they have been marking their cruises as events using the Data Highlighter, and this has displayed correctly in the SERP. The ideal schema would be Trip, but this is not supported by Google rich results yet; hopefully they will support it in the future. Another alternative would be Product, but this does not display rich results as we would like. Event has the best result in terms of how the information is displayed. For example, someone might search "Cruises to Spain" and the landing page would display the next 3 cruises that go to Spain, with dates and prices. The event location would be the cruise terminal, the offer would be the starting price, and the start and end dates would be the cruise duration; these are fixed dates. I am interested to hear the community's opinion on and experience with this problem.

    | NoWayAsh
    1
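For reference, the cruise-as-Event markup being described would look roughly like the JSON-LD below, built here with Python's json module. The names, dates, and prices are invented placeholders, and whether cruises qualify as Events under Google's structured-data guidelines is exactly the open question in the post:

```python
import json

# Hypothetical cruise marked up as a schema.org Event,
# with the cruise terminal as location and the starting
# price as the offer, as the question describes.
event = {
    "@context": "https://schema.org",
    "@type": "Event",
    "name": "7-Night Cruise to Spain",              # placeholder
    "startDate": "2025-06-01",                      # placeholder dates
    "endDate": "2025-06-08",
    "location": {
        "@type": "Place",
        "name": "Southampton Cruise Terminal",      # placeholder
    },
    "offers": {
        "@type": "Offer",
        "price": "799.00",                          # placeholder price
        "priceCurrency": "GBP",
    },
}

markup = json.dumps(event, indent=2)
print(markup)  # goes inside a <script type="application/ld+json"> tag
```

The trade-off the post raises still applies: this renders well today, but markup that misdescribes the item type risks a manual action if Google judges it misleading.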

  • Hello there,  our site is on a Flatsome Wordpress theme (which is responsive and does not support AMP), and we are currently using the AMP for Wordpress plugin on our blog and other content rich pages. My question is - is a plugin sufficient to make our pages AMP friendly? Or should we consider switching to a theme that is AMP enabled already? Thanks!
    Katie

    | tnixis
    0

  • I have a site, natvest.com, with which I sell real estate in Alabama and Georgia. I need to show up in an "Alabama Land for Sale" search.  Same thing for Georgia. If I re-index my site, I show up for roughly one day, before disappearing again.  Happens every time I re-index. Ideas?

    | natvest
    0

  • What's the best practice these days for handling indexing of WooCommerce product subcategories? Example: in the sitemap we have:
    /product-category-a/
    /product-category-a/subcategory-1/
    /product-category-a/subcategory-2/
    etc. Should the /subcategory-*/ be noindexed, canonical to parent, or stay as indexed? Thanks!

    | btetrault
    2
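The two common answers to the subcategory question differ only in which URL the canonical tag points at. A minimal sketch of both, using the question's URL pattern (shop.example.com is a placeholder host); the third option, noindex, is a meta robots tag rather than a canonical:

```python
# Sketch: the <link> tag each subcategory page would emit
# under the two canonical strategies.

def canonical_tag(url):
    return f'<link rel="canonical" href="{url}" />'

sub = "https://shop.example.com/product-category-a/subcategory-1/"
parent = "https://shop.example.com/product-category-a/"

# Option 1: keep subcategories indexed (self-referencing canonical).
print(canonical_tag(sub))

# Option 2: consolidate signals to the parent category.
print(canonical_tag(parent))
```

If the subcategories target distinct search queries and have their own content, option 1 (indexed, self-canonical, in the sitemap) is the usual default; canonicalizing to the parent only makes sense when the subcategory pages are near-duplicates of it.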

  • Hi all, Most of the URLs that are created by using the internal search function of a website/web shop shouldn't be indexed since they create duplicate content or waste crawl budget. The standard way to go is to 'noindex, follow' these pages or sometimes to use robots.txt to disallow crawling of these pages. The first question I have is how these pages actually would get indexed in the first place if you wouldn't use one of the options above. Crawlers follow links to index a website's pages. If a random visitor comes to your site and uses the search function, this creates a URL. There are no links leading to this URL, it is not in a sitemap, it can't be found through navigating on the website,... so how can search engines index these URLs that were generated by using an internal search function? Second question: let's say somebody embeds a link on his website pointing to a URL from your website that was created by an internal search. Now let's assume you used robots.txt to make sure these URLs weren't indexed. This means Google won't even crawl those pages. Is it possible then that the link that was used on another website will show an empty page after a while, since Google doesn't even crawl this page? Thanks for your thoughts guys.

    | Mat_C
    0

  • Hi all, How bad is it to have a link in the breadcrumb that 301 redirects? We had to create some hidden category pages in our ecommerce platform, BigCommerce, to create a display on our category pages in a certain format. But while the category page was set to not visible in the BigCommerce admin, the URL still showed in the live site's breadcrumb. So we set a 301 redirect on it so it didn't produce a 404. However, we have lost a lot of SEO ground over the past few months. Could this be why? Is it bad to have a 301 redirect in the breadcrumb?

    | oceanstorm
    0

  • My client has a subdomain from their main site where their online waiver tool lives. Currently, all the waivers generated by users are creating indexed pages, I feel they should deindex that subdomain entirely. However, a lot of their backlinks are from their clients linking to their waivers. If they end up deindexing their subdomain, will they lose the SEO benefit of backlinks pointing to that subdomain? Thanks! Jay

    | MCC_DSM
    0

  • I have my regular site blog at www.Guideyourhealth.org and a blog on www.medium.com. Should I try to get backlinks for my Medium articles as well, given they are on topics not competing with my site?

    | BuyKratomPowderInfo
    0

  • Hi all! Does anyone have advice for getting a software product to appear in the card results at the top of SERPs? Example https://www.google.com/search?q=budgeting+software&rlz=1C1CHBF_enUS784US784&oq=budgeting+software&aqs=chrome..69i57j0l5.2194j0j7&sourceid=chrome&ie=UTF-8 dzTpe2B

    | SimpleSearch
    0

  • So I have a weird situation, and I was hoping someone could help. This is for an ecommerce site. 1. Parameters are used to tie Product Detail Pages (PDPs) to individual categories. This is represented in the breadcrumbs for the page and the use of a categoryid; one product can thus be included in multiple categories. 2. All of these PDPs have a canonical that does not include the parameter/categoryid. 3. With very few exceptions, the canonical URLs for the PDPs are not linked to. Instead, the parameter URL is used to tie the product to a specific category, primarily for the sake of breadcrumbs it seems. One of the big issues we've been having is the canonical URLs not being indexed for a lot of the products. In some instances the canonicals are indexed alongside parameter URLs, or just parameter URLs are indexed. It's all very... mixed up, I suppose. My theory is that the majority of canonical URLs not being linked to anywhere on the site is forcing Google to put preference on the internal link instead. My problem? I have no idea what to recommend to the client (who will not change the parameter setup). One of our technical SEOs recommended we "use cookies instead of parameters to assign breadcrumbs based on how the PDP is accessed." I have no experience with this. So... yeah. Any thoughts? Suggestions? Thanks in advance.

    | Alces
    0

  • Hello everyone, I have a website in the travel field at this address: https://goo.gl/4gaoAn Let me know what you think about it, and please give me some advice on improving its Google rankings. If you are able to take the time to give me some advice based on what you see on the website, that would be great. Also, what would work best for me as a link building strategy after the Penguin 4.0 update, and what does my site lack right now? Thanks, and waiting to hear from you asap.

    | BahadorGh
    0

  • Hope someone can shed some light on this: we moved our smaller site into the main site (different domains). The smaller site that was moved: https://www.bluegreenrentals.com
    Directory where the site was moved: https://www.bluegreenvacations.com/rentals Each page from the old site was 301 redirected to the appropriate page under /rentals, but we are seeing a significant drop in rankings and traffic, and I am unable to request a change of address in Google Search Console (a separate issue that I can elaborate on). Lots of the new destination pages (301 redirect targets) are not indexed. When inspected, I got the message: Indexing allowed? No: 'noindex' detected in 'robots' meta tag. Yet all pages are set to index/follow and there are no restrictions in robots.txt. Here is an example URL: https://www.bluegreenvacations.com/rentals/resorts/colorado/innsbruck-aspen/ Can someone take a look and share an opinion on this issue? Thank you!

    | bgvsiteadmin
    0

  • We are removing a large number of URLs permanently. We care about rankings in search engines other than Google, such as Yahoo/Bing, whose documentation doesn't even list the HTTP 410 status code: https://docs.microsoft.com/en-us/bingmaps/spatial-data-services/status-codes-and-error-handling Does anyone know how search engines other than Google handle 410 vs. 404? For pages being permanently removed, John Mueller at Google has stated: "From our point of view, in the mid term/long term, a 404 is the same as a 410 for us. So in both of these cases, we drop those URLs from our index. We generally reduce crawling a little bit of those URLs so that we don't spend too much time crawling things that we know don't exist. The subtle difference here is that a 410 will sometimes fall out a little bit faster than a 404. But usually, we're talking on the order of a couple days or so. So if you're just removing content naturally, then that's perfectly fine to use either one." Any information or thoughts? Thanks

    | sb1030
    0
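Serving 410 for a known set of removed URLs is a small amount of server logic. A minimal sketch using Python's standard http.server (the paths are hypothetical); the same decision table ports directly to .htaccess `Redirect gone` lines or an nginx `return 410;` location block:

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical paths that were permanently removed.
GONE = {"/old-category/", "/discontinued-product/"}

def status_for(path):
    """410 Gone for removed URLs, 200 otherwise."""
    return 410 if path in GONE else 200

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        # 410 is an explicit "removed on purpose" signal;
        # 404 merely says "not found".
        self.send_response(status_for(self.path))
        self.end_headers()

# HTTPServer(("", 8000), Handler).serve_forever()  # uncomment to run

print(status_for("/old-category/"))  # 410
```

Since non-Google engines document their handling less clearly, 410 is the semantically precise choice here, and per the quoted Google guidance it costs nothing relative to 404.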

  • Hi all, Scenario: an ecommerce website selling a food product has its store on a subdomain (store.website.com). A GOOD chunk of the URLs – primarily parameters – are blocked in robots.txt. When I search for the products, the main domain ranks almost exclusively, while the store only ranks in deeper SERPs (several pages deep). In the end, only one variation of the product is listed on the main domain (ex: Original Flavor 1oz 24 count), while the store itself obviously has all of them (most of which are blocked by robots.txt). Can anyone shed a little insight into best practices here? The platform for the store is Shopify, if that helps. My suggestion at this point is to recommend they allow crawling in the subdomain's robots.txt and canonicalize the parameter pages. As for keywords, my main concern is cannibalization, or rather forcing visitors to take extra steps to get to the store on the subdomain, because hardly any of the subdomain pages rank. In a perfect world, they'd have everything on their main domain and no silly subdomain. Thanks!

    | Alces
    0

  • As we are growing fast, more and more websites go online. When they do, we always put a link in the footer which says: 'Webdesign by Conversal'. But this is creating a substantial number of backlinks to our root domain with the same anchors. Recently, we've moved our websites to different servers to spread the risk of a server crashing. I think Google now sees the backlinks from different IPs as artificial rather than natural, while Semrush and Moz are giving us a toxic score. What is your advice on this? Will we need to add a 'nofollow' attribute to each link on every website? Or would it be better to write a small case article for each client and link to that in the footer?

    | conversal
    0

  • Hi all, A fairly common problem in webshops is having the same subcategory in multiple main categories. Let's take the following example: example.com/legal/economic-law/company-law example.com/tax/companies/company-law I came across this interesting article on the topic: https://moz.rankious.com/_moz/community/q/e-commerce-site-one-product-multiple-categories-best-practice Although I understand that the answer to the above question is the most thorough method, I don't see a problem with just using canonicals either. On the webshop we are restructuring, only a few of these subcategories recur in multiple main categories, so generating a path via user activity and storing it in a cookie doesn't seem really necessary to me. Is it OK to just use canonicals, or can this cause issues? Thanks.

    | Mat_C
    2

  • Hi Everyone, I am in the middle of an SEO contract with a site that is partially HTML pages, with the rest PHP and part of an ecommerce system for digital delivery of college classes. I am working with a web developer who has worked with this site for many years. On the PHP pages, there are also 6 different parameters that are currently filtered via URL parameters in the old Google Search Console. When I came on board, part of the site was HTTPS and the remainder was not. Our first project was to move completely to HTTPS, and it went well. 301 redirects were already in place from a few legacy sites they owned, so the developer expanded the 301 redirects to move everything to HTTPS. Among those legacy sites is an old site that we don't want visible, but it is extensively linked to from the new site, and some of our top keywords are branded keywords that originated with that site. The developer says the old site can go away, but people searching for it are still prevalent in search. The biggest part of this project is now to rewrite the dynamic URLs of the product pages and the entry pages to the class pages. We attempted to use 301 redirects to the new URLs to prevent the draining of link juice, but in the end, according to the developer, it just isn't going to be possible without losing all the existing link juice. So it's lose all the link juice at once (a scary thought) or try canonicals. I am told canonicals would work, and we can switch to that. My questions are the following: 1. Does anyone know of a way that might make the 301s work with the URL rewrite? 2. With canonicals and Google parameters, are we safe to delete the parameters after we have ensured everything has a canonical URL (parameter pages included)? 3. If we continue forward with 301s and lose all the existing links – since this is only half of the pages on the site (not counting the parameter pages), and there are only a few links per page if that – how much of an impact would it have on the site, and how can I avoid that impact? 4. Canonicals seem to be recommended heavily these days; would canonical URLs be a better way to go than sticking with 301s? Thank you all in advance for helping! I sincerely appreciate any insight you might have. Sue (aka Trudy)

    | TStorm
    1

  • Hey guys and gals, I'm having a frustrating time with an issue. Our site has around 10 pages that are coming up as duplicate content/duplicate title. I'm not sure what I can do to fix this. I was going to attempt to 301 redirect the uppercase URLs to lowercase, but I'm worried how this will affect our SEO. Can anyone offer some insight on what I should be doing? Update: What I'm trying to figure out is what I should do with our URLs. For example, when I run an audit I'm getting two different pages: aaa.com/BusinessAgreement.com and also aaa.com/businessagreement.com. We don't have two pages, but for some reason Google thinks we do.

    | davidmac
    1
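The usual fix for case-duplicate URLs is a single rule that 301s any path containing uppercase letters to its lowercase form (with a canonical tag on the lowercase version as a safety net), rather than one redirect per page. Sketched here in Python; the /BusinessAgreement path is the question's example, and on Apache or nginx the same normalization would be one rewrite rule:

```python
def normalize(path):
    """Return (status, path): 301 to the lowercase form when needed."""
    lower = path.lower()
    if path != lower:
        return 301, lower  # one hop, straight to the canonical case
    return 200, path       # already canonical: serve normally

print(normalize("/BusinessAgreement"))   # (301, '/businessagreement')
print(normalize("/businessagreement"))   # (200, '/businessagreement')
```

Done this way, the 301 consolidates rather than harms: both case variants resolve to one indexable URL, which is exactly what removes the duplicate-title warnings.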

  • I'm wondering whether I should put any effort into the meta keywords tag or normal tags for my pages (they're separate in Drupal), since apparently the former is not considered by most search engines, while I'm not sure about normal tags. Obviously search engines have to determine part of the value of a page from its content, and thus consider keywords/tags to some extent. What's your opinion on that? Thank you.

    | Optimal_Strategies
    1

  • Hi all– I've legitimately never seen this before, in any circumstance. I just went to check the google webcache of a product page on our site (was just grabbing the last indexation date) and was immediately redirected away from google's cached version BACK to the site's standard product page. I ran a status check on the product page itself and it was 200, then ran a status check on the webcache version and sure enough, it registered as redirected. It looks like this is happening for ALL indexed product pages across the site (several thousand), and though organic traffic has not been affected it is starting to worry me a little bit. Has anyone ever encountered this situation before? Why would a google webcache possibly have any reason to redirect? Is there anything to be done on our side? Thanks as always for the help and opinions, y'all!

    | TukTown
    1

  • Hi there, We have an e-commerce shopping site with over 8000 products and over 100 categories. Some subcategories belong to multiple categories – for example, a Christmas tree can be under "Gardening > Plants > Trees" and under "Gifts > Holidays > Christmas > Trees". The product itself (example: Scandinavian Xmas Tree) can naturally belong to both these categories as well. Naturally, these two (or more) categories have different breadcrumbs, different navigation bars, etc. From an SEO point of view, to avoid duplicate content issues, I see the following options: 1) Use the same URL and change the content of the page (breadcrumbs and menus) based on the referral path – a kind of cloaking. 2) Use the same URL and display only one "main" version of breadcrumbs and menus, possibly adding the other, non-main categories as links on the category/product page. 3) Use a different URL based on where we came from and do nothing (this creates essentially the same content on different URLs except for breadcrumbs and menus; there's a possibility to change the category text and page title as well). 4) Use a different URL based on where we came from, with different menus and breadcrumbs, and use rel=canonical pointing to the "main" category/product pages. This is a very interesting issue and I would love to hear what you guys think, as we are finalizing plans for a new website and would like to get the most out of it. Thank you all!

    | arikbar
    0

  • In this situation I have SiteA, and SiteB on completely separate domains.  SiteA is the marketing front for the company and SiteB is an app that company owns.  SiteB receives a fair amount of backlinks as it has the login page of the application where customers link to a branded version for their members to login. Additionally none of that domain is indexable including the login page. SiteB's domain can't be changed to be a subdomain of SiteA as it isn't technically feasible. Initially I was reluctant to use canonical because as it isn't really duplicate content.  Is there a method for forwarding any link-juice from SiteB to SiteA without the use of a redirect and would canonical be appropriate in this case?  Additionally would SiteB's not being indexed negate any link benefit? Edit: Typo

    | OCN
    0

  • I am a bit boggled about HTTPS-to-HTTPS redirects. We redirected olddomain.com to https://www.newdomain.com, but redirecting https://www.olddomain.com (or the non-www HTTPS version) is not possible, because no certificate exists at the level you are redirecting from. Only if I set up a new host and add an .htaccess file will this work. What should I do? Just redirect the rest and hope for the best?

    | waqid
    0
