
Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Intermediate & Advanced SEO

Looking to level up your SEO techniques? Chat through more advanced approaches.


  • Hello community, Should I be using canonical tags on every job posted on my job board, and also on every job category page? I currently use no canonicals on my job board, but I still rank well organically.
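For context, a self-referencing canonical is a single line in each job page's head; the URL below is hypothetical:

```html
<!-- On each job (or category) page, point the canonical at the page's own
     preferred URL so parameter/duplicate variants consolidate to it -->
<link rel="canonical" href="https://www.example-jobboard.com/jobs/senior-seo-manager">
```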

    | SO_UK
    1

  • Hi Everyone, I am not sure how this is all happening. We have been online for about 15 years, and we are now at our lowest traffic in about 10 years. Our sites are www.bestpricenutrition.com and www.mysupplementstore.com. We sell commodity items, but I have focused on unique product descriptions, tons of UGC, and blog posts and guides for a while now, and it has always done us well. Until recently. This is what I feel led up to this, but I am hoping there is something I missed.
    May 1st, 2018: Migrated www.bestpricenutrition.com and www.mysupplementstore.com from Shopify. Similar sites, but almost all unique content. We purchased www.mysupplementstore.com about 8 years ago. A ton of traffic and sales, which is why we didn't just redirect it.
    Around May 25th: www.mysupplementstore.com took a big hit and lost almost 40% of its traffic. Nothing happened to www.bestpricenutrition.com; we actually increased traffic.
    Aug 1st update: www.mysupplementstore.com lost another 25% of its traffic, and www.bestpricenutrition.com lost about 40% of its traffic.
    Sept 28th: Nothing happened to www.mysupplementstore.com, but www.bestpricenutrition.com lost another 30% of its traffic.
    So I have been trying to figure out whether there is anything technically wrong, but it doesn't seem so. These are the issues we discovered in August: During the migration, the reviews from each site were syndicated to both websites. There were thousands. This was resolved in mid-August. During the migration, the company doing the migration pushed our blog posts to both websites; hundreds of blog posts were duplicated to each website. This was also resolved in mid-August. We found that a disgruntled employee, instead of writing unique content for our product pages, was copying them from one another. This was about 100 product pages, which we have since resolved.
    What's left: I noticed that www.bestpricenutrition.com has hundreds of blog posts that are getting hardly any traffic. I had trimmed www.mysupplementstore.com of this low-traffic content and am still working on www.bestpricenutrition.com. I have been in this industry since 2003 and survived 2012, but I have exhausted everything I know to figure this out. It's another sob story, I know, but I am trying to keep everyone's job alive here, and it doesn't look like it's going to happen. Any help would be greatly appreciated.

    | vetofunk
    0

  • We have a client's website that was on page 1 for 2 years, and then in September it fell off, while a new website with virtually no visitors, never before showing in organic search, shot to #1. I have never seen anything like it. Today, my client is back where they were two weeks ago, and the #1 listing I mentioned is not even on page 1 at all; in fact, it's at the bottom of page 2! It seems to us, having read about Google organic changes made around July 3rd, 2018, that even more emphasis is now on the domain name (all the results on page 1 have my client's keyword in their domain name), and that the importance of H1 tags and title tags has risen to trump many other factors. Can anyone shed some light on changes you may have seen in the past few months? Along with huge changes to AdWords and Ad Grants, Google seems all over the place (at least to us), and it is more challenging than ever. Thanks!!

    | Teamzig
    0

  • Hello All, Our site www.xxxx.com has long had a forum subdomain, forum.xxxx.com. We have decided to sunset the forum. We find that the 'Ask a Question' function on product pages and our social media presence are more effective ways of answering customers' product & project technical questions. Simply shutting down the forum server is going to return thousands of 404s for forum.xxxx.com, which I can't imagine would be helpful for the SEO of www.xxxx.com, even though my understanding is that subdomains are handled somewhat differently from the main site. We rely tremendously on natural search traffic for www.xxxx.com, so I am loath to make any moves that would hurt us. I was thinking we should just keep the forum server up but return 410s for everything on it, including the roughly ~3,000 indexed pages, until they are removed from the index, then shut it down. The IT team also gave the option of simply pointing the URL to our main URL, which sort of scares me because it would then return a 200 and serve the same experience from forum.xxxx.com as from www.xxxx.com, which sounds like a very bad idea. (Yes, we do have canonicals on www.xxxx.com.) In your opinion, what is the best way to handle this matter? Thank You
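For the 410 approach, a minimal sketch of what the forum vhost could serve (assuming the forum runs on Apache with mod_rewrite; nginx would use `return 410;` instead):

```apache
# Hypothetical vhost for forum.xxxx.com: answer every request with
# "410 Gone" while the ~3,000 indexed pages drop out of the index.
<VirtualHost *:80>
    ServerName forum.xxxx.com
    RewriteEngine On
    # The [G] flag makes Apache return 410 Gone for any path
    RewriteRule ^ - [G]
</VirtualHost>
```

Once crawl activity on the subdomain dies down and the pages are deindexed, the server can be shut down entirely.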

    | jamestown
    0

  • One of my clients has been invited to feature his blog posts here: https://app.mindsettlers.com/. Here is an example of what his author page would look like: https://app.mindsettlers.com/author/6rs0WXbbqwqsgEO0sWuIQU. I like that he would get the exposure; however, I am concerned about duplicate content with the feed. If he has a canonical tag on each blog post pointing to itself, would that be sufficient for the search engines? Is there something else that could be done? Or should he decline? Would love your thoughts! Thanks.
    Cindy T.

    | cindyt-17038
    0

  • Hi Guys, Our developer forgot to add a noindex, nofollow tag on the pages he created in the back end, so we have now ended up with lots of back-end pages being indexed in Google. My question is: since many of those are now indexed in Google, is it enough to just place a noindex, nofollow on those pages, or should we do a 301 redirect on all of them to the most appropriate page? If a noindex, nofollow is enough but we then remove the pages, that would create lots of 404 errors, so could those affect the site negatively? Cheers Martin
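For reference, the tag in question is a one-liner in each back-end page's head (a sketch; the page itself is hypothetical):

```html
<!-- Ask search engines to drop this back-end page from the index
     and not follow its links -->
<head>
  <meta name="robots" content="noindex, nofollow">
</head>
```

Note that the page must stay crawlable (not blocked in robots.txt) for the noindex to be seen and honored.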

    | martin1970
    0

  • Hello, My site had a 50% traffic decrease in the last 48 hours (9/26/18), and I am looking for ideas/reasons for what would cause such a dramatic drop. Year-over-year organic traffic has been up 40%, and September was up 30%. The site has a domain authority of 39 according to Moz, and keyword positions have been flat for a few months. I made a change to the code and robots.txt file on Monday, pre-drop. The category pagination pages had a "noindex" with a rel=canonical, and I removed the "noindex" per: https://www.seroundtable.com/google-noindex-rel-canonical-confusion-26079.html. I also removed "Disallow" in the robots.txt for things like "/?dir" because those pages have the rel=canonical. Could this be the reason for the drop? Other possible reasons:
    1. Google Update: I don't think this is it, but it looks like the last one was August 1st: "Medic" Core Update  —  August 1, 2018
    2. Site was hacked
    3. All keyword positions dropped overnight: I don't think this is it, because Bing has also dropped by the same percentage. Any help, thoughts or suggestions would be awesome.

    | chuck-layton
    0

  • Website niche: Animation and 3D Rendering Studios. Backlink from: http://www.adamfrisby.com/create-home-design-and-interior-decor-in-2d-3d.html. The anchor is an image URL from one of the many images in that post. Please let me know whether such links are good or bad.

    | varunrupal
    0

  • We currently have a database of content across about 100 sites. All of this content is exactly the same on all of them, and it is also found all over the internet in other places. So it's not unique at all and it brings in almost no organic traffic. I want to remove this bloat from our sites. Problem is that this database accounts for almost 60,000 pages on each site and it is all currently indexed. I'm a little bit worried that flat out dumping all of this data at once is going to cause Google to wonder what in the world we are doing and we are going to see some issues from it (at least in the short run). My thought now is to remove this content in stages so it doesn't all get dropped at once. But would deindexing all of this content first be better? That way Google would still be able to crawl it and understand that it is not relevant user content and therefore minimize impact when we do terminate it completely? Any other ideas for minimizing SEO issues?

    | MJTrevens
    1

  • For some reason my website https://xyz.com is not redirecting to my main website domain, https://www.xyz.com, so our tech team suggested that we host the non-www name on a different IP and 301 redirect it to https://www.xyz.com. If it works, will it affect our website from an SEO point of view? Please let me know.
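If the non-www host ends up on an Apache box, the redirect itself is a short .htaccess rule (a sketch; assumes mod_rewrite and the hypothetical xyz.com domain from the question):

```apache
# 301 redirect every request for xyz.com to www.xyz.com,
# preserving the requested path and query string.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^xyz\.com$ [NC]
RewriteRule ^(.*)$ https://www.xyz.com/$1 [R=301,L]
```

A host-level 301 like this is the standard way to consolidate www/non-www, and search engines treat it as an ordinary canonicalization signal rather than a penalty risk.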

    | BPLLC
    0

  • Is this backlink good or bad? All of the website's links are of the same type. Website niche: Animation and 3D Rendering Studios. Backlink from: http://www.adamfrisby.com/create-home-design-and-interior-decor-in-2d-3d.html. The anchor is an image URL from one of the many images in that post.

    | varunrupal
    0

  • Hi, I have been researching the best way to migrate six sites into one, since I have never done it, and I am frankly overwhelmed. Some resources say to do it incrementally and A/B test, but I would prefer not to, as wouldn't that present a disjointed experience for visitors? The previous sites are older and a bit clumsy compared to the design and functionality of the new site. Can someone please tell me the right way to approach this? Or point me to the best resource for a step-by-step prep, migrate, and monitor process? Thanks so much in advance!

    | lfrazer123
    0

  • Hello all, I work for a university, and my small team is responsible for the digital marketing, website, etc. We recently had a big initiative on SEO and generating traffic to our website. The issue I am having is that my department only "owns" the www subdomain. There are lots of other subdomains out there. For example, a specific department can have its own subdomain at department.domain.com, and students can have their own webpages at students.domain.com, etc. I know about the possibility of domain cannibalization, but has anyone run into long-term problems with a similar situation, or had success in altering the views of a large organization? If I do get the opportunity to help some of these other subdomains, what is the best way to help our overall domain authority? Should the focus be on removing content similar to the www subdomain or on cleaning up errors? Some of these subdomains have hundreds of 4XX errors.

    | Jeff_Bender
    0

  • We migrated my site from HTTP to HTTPS in Sep 2017, but I noticed soft 404s gradually increasing after the migration. Example of a soft 404 page: https://bit.ly/2xBjy4J. These soft 404 error pages are real pages, but Google still detects them as soft 404s. When I check the Google cache, it shows me the cache, but of the HTTP page. We've tried all possible solutions but are unable to figure out why Google is still indexing the HTTP pages and detecting the HTTPS pages as soft 404 errors. Can someone please suggest a solution or possible cause for this issue, or has anyone seen the same issue in the past?

    | bheard
    0

  • Hi - I am seeking an onsite search engine that is SEO friendly - which do you recommend? And has anyone tried doofinder.com - that specific search engine - if you have, is it well aligned/attuned to the SEO aspects of your site? Thanks as ever, Luke

    | McTaggart
    1

  • I am looking for an SEO strategy, a step by step procedure to follow to rank my website https://infinitelabz.com . Can somebody help?

    | KHsdhkfn
    0

  • I'm in the process of a redesign and upgrade to Drupal 8 and have used Drupal's taxonomy feature to add a fairly large database of Points of Interest, Services, etc. Initially this was just for a Map/Filter for site users. The developer also wants to use teasers from these content types (such as a scenic vista description) as a way to display the content on relevant pages (such as the scenic vistas page, as well as other relevant pages). Along with the content, it shows GPS coordinates and icons related to the description. In short, it looks cool, can be used in multiple relevant locations, and creates a great UX. However, many of these teasers would basically be pieces of content from pages with a lot of SEO value, like descriptive paragraphs about scenic viewpoints from the scenic viewpoints page. Below is an example of how the descriptions of the scenic viewpoints would be displayed on the scenic viewpoints page, as well as other potentially relevant pages. HOW WILL THIS AFFECT THE SEO VALUE OF THE CONTENT?? Thanks in advance for any help; I can't find an answer anywhere. About 250 words worth of content about a scenic vista. There are about 8 scenic vista descriptions like this on the scenic vistas page, so a good chunk of valuable content. There are numerous long-form content pages like this that have descriptions and information about sites and points of interest that don't warrant having their own page. For more specific content with a dedicated page, I can just use the intro paragraph as a teaser and link to that specific page of content. Not sure what to do here.

    | talltrees
    0

  • Does anyone have experience removing /product/ and /product-category/, etc. from URLs in WordPress? I found this WooCommerce document which explains why this shouldn't be done, but I would like some opinions from those who have tried it, please. https://docs.woocommerce.com/document/removing-product-product-category-or-shop-from-the-urls/

    | moon-boots
    0

  • My question is: if I have several links going to different landing pages, will the one at the top of the content pass more value than the ones at the bottom, assuming there is not more than one instance of the same link in the content? The ultimate question is whether link position in the content/HTML code makes a difference in how much value it passes. This question comes in response to this Whiteboard Friday: https://www.youtube.com/watch?v=xAH762AqUTU. Rand talks about how, if there are 2 links going to the same URL from the same content page, Google will only inherit the value of the anchor text from the first link on the page and not both of them, meaning that Google will treat the second link as if it doesn't exist. There are lots of resources showing this was true, but there isn't much content newer than 2010 saying it is still true, and we all know that things have changed a lot since then. Does that make sense?

    | 97th_Floor
    0

  • I am working in Magento 2, which has all sorts of difficulties. My product page is example.com/testproduct, and the canonical is the same. But in the sitemap it is example.com/testproduct/. In a perfect world I would get rid of the trailing slash, but I can't because of this issue: https://magento.stackexchange.com/questions/205337/unique-constraint-violation-found-when-remove-suffix-html-magento-2-2-0. The trailing-slash version does 301 redirect properly. Is it an issue having the sitemap URLs differ by the trailing slash?

    | Tylerj
    0

  • I have noticed that some e-commerce sites don't worry about their store working when JS is switched off, yet some do. Are there any SEO implications of losing faceted navigation/filtering functionality when JS is disabled? With JS disabled, I tried M&S (didn't work) and Tesco (did work).

    | McTaggart
    0

  • Hi there, I'm hoping someone can help here... I'm new to a company where, due to the limitations of their WordPress instance, they've been creating what would ordinarily be considered pages in the standard sitemap as landing pages in their Pardot marketing automation platform. The URL subdomain is slightly different. Just wondering if anybody could quickly outline the SEO implications of doing this externally instead of directly on their site? Hope I'm making some sense... Thanks,
    Phil

    | philremington
    1

  • I run a McAfee technical support website. It has been 2-3 months since I started practicing SEO on it. Progress was smooth until it reached the second page of Google, but now its rank is frozen. Can I get any advice and suggestions for my website to break out of the second-page cage? My website: mcafee.com/activate

    | six_figures
    0

  • Hi to all, I found 2 sites that mirror my site and send backlinks to all my pages. Do you think this is bad for my SEO? My site is https://android-apk.org. The mirror sites (by who links the most): fryeboysent.com (1,342,613 links) and ficyexp.cl (934,654 links).

    | moztabliq1
    0

  • Hi All, As our company becomes a bigger and bigger entity, I'm trying to figure out how I can create more autonomy. One of the key areas that needs fixing is briefing the writers on articles based on keywords. We're not just trying to go after the low-hanging fruit or the big-money keywords, but to comprehensively cover every topic and provide genuinely good-quality, up-to-date info (surprisingly rare in a competitive niche) and eventually cover pretty much every topic there is. We generally work on a 3-tier system at a folder level: topics and then sub-topics. The challenge is getting an agency to: a) be able to pull all of the data without being knowledgeable in our specific industry. We're specialists and thus target people that need specialist expertise as well as more mainstream stuff (the stuff that run-of-the-mill people wouldn't know about). b) know where it all fits topically, as we organise the content on a hierarchy basis, and we generally cover multiple smaller topics within articles. Am I asking for the impossible here? It's the one area of the business I feel the most nervous about creating autonomy with. Can we become as extensive and comprehensive as a wiki-type website without somebody within the business who knows it providing the keyword research? I did a search for all data using the main two seed keywords for this subject on Ahrefs, and it came up with 168,000 lines of spreadsheet data. Obviously this went way beyond the maximum I was allowed to export. Interested in feedback and, if any agencies are up for the challenge, do let me know! I've been using Moz Pro for a long time but have never posted, and I apologise if what I'm describing is explained badly here. Requirements: Keywords to cover all (broad niche) related queries in the UK; no relevant UK (broad niche) keywords will be missed. Organised in a way that can be interpreted as article briefs and folder-structure instructions.
Questions: How would you ensure you cover every single keyword? Assuming no specialist X knowledge, how will you be able to map content and know which search queries belong in which topics and in what order? Also (where there is keyword leakage from other regions), how will you know which are UK terms and which aren't? With minimal X knowledge, how will you know whether you've missed an opportunity or not (what you don't know you don't know)? What specific resources will you require from us in order for this to work? What format will the data be provided to us in, and how will you present the finished work so that it can be turned into article briefs?

    | d.bird
    0

  • Hi everyone, We are changing a website's domain name. The site architecture will stay the same, but we are renaming some pages. How do we treat redirects? I read this on Search Engine Land: The ideal way to set up your redirects is with a regex expression in the .htaccess file of your old site. The regex expression should simply swap out your domain name, or swap out HTTP for HTTPS if you are doing an SSL migration. For any pages where this isn’t possible, you will need to set up an individual redirect. Make sure this doesn’t create any conflicts with your regex and that it doesn’t produce any redirect chains. Does the above mean we are able to set up a domain redirect on the regex for pages that we are not renaming and then have individual 1:1 redirects for renamed pages in the same .htaccess file? So have both? This will not conflict with the regex rule?
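As a sketch of how both can coexist in one .htaccess (Apache mod_rewrite assumed; domains and paths below are hypothetical), the 1:1 rules for renamed pages go first and the regex catch-all comes last:

```apache
RewriteEngine On

# Individual 301s for pages that were renamed on the new domain...
RewriteRule ^old-services\.html$ https://www.newdomain.com/services/ [R=301,L]
RewriteRule ^about-us\.html$ https://www.newdomain.com/company/about/ [R=301,L]

# ...then the catch-all simply swaps the domain for everything else.
# The [L] flag stops processing once a rule matches, so the renamed
# pages never reach the catch-all: no conflicts and no redirect chains.
RewriteRule ^(.*)$ https://www.newdomain.com/$1 [R=301,L]
```

Because rules run top to bottom and [L] ends matching, ordering the specific redirects before the catch-all is what prevents the conflicts and chains the article warns about.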

    | nhhernandez
    0

  • Hi, First question: Working on the indexation of all pages for a specific client, there's one page that refuses to be indexed. Google Search Console says it's blocked by a robots.txt file, but I can't find any trace of that in the back end, nor in the code itself. Could someone reach out to me and tell me why this is happening? The page: https://www.brody.be/nl/assistentiewoningen/ Second question: Google is showing a different meta description than the one our client entered in the Yoast Premium snippet. Could there be another plugin overwriting this description? Or do we have to wait for it to change after a certain period of time? Hope you guys can help

    | conversal
    0

  • My firm has hired an SEO to create links to our site. We asked the SEO to provide a list of domains that they are targeting for potential links. The SEO did not agree to this request on the grounds that the list is their unique intellectual property. Alternatively I asked the SEO to provide the URL that will be linking to our site before the link is activated. The SEO did not agree to this. However, they did say we could provide comments afterwards so they could tweak their efforts when the next 4-5 links are obtained next month. The SEO is adamant that the links will not be spam. For whatever it is worth the SEO was highly recommended. I am an end user; the owner and operator of a commercial real estate site, not an SEO or marketing professional. Is this protectiveness over process and data typical of link building providers? I want to be fair with the provider and hope I will be working with them a long time, however I want to ensure I receive high quality links. Should I be concerned? Thanks,
    Alan

    | Kingalan1
    0

  • Recently, in the last two weeks, I started seeing a lot of odd 404 errors in GSC for my site. Upon investigation, the URLs are for fairly new articles, and the URLs are chopped in various places, from missing one character at the end to missing about 10 characters at the end of the URL. (A similar older issue is that GSC reports duplicate content on weird subdomains that we've never used, like 'smtp', 'ww1', or even random ones like 'bobo'.) GSC doesn't report any 'linked from' for those odd URLs, and I know for sure these links aren't on the site itself. They're definitely not errors in the CMS. The site is long established (started 1997-1998), and we've been subject to a lot of negative SEO. I recently had to disavow about 1,000 .ru domains linking to us, with some domains containing over a million links each. Could these chopped links be a new tactic of negative SEO? How do I find these seemingly intentionally broken links to us?

    | Lazeez
    2

  • Hello, Is it OK to use the homepage of a website directly as a product page, where you present all your products on the homepage, or can doing that penalise you? And in that case, is it better to have a homepage that you don't try to rank and to create a subpage as your product page? Thank you,

    | seoanalytics
    1

  • I'm completely rebranding a website but keeping the same domain. All content will be replaced, and it will use a different theme and mostly new plugins. I've been building the new site as a different site in dev mode on WP Engine. This means it currently has a made-up domain that needs to replace the current site. I know I need to somehow redirect the content from the old version of the site, but I'm never going to use that content again. (I could transfer it to be a dev site for the current domain and automatically replace it with the click of a button, just as another option.) What's the best way to replace blahblah.com with a completely new blahblah.com if I'm not using any of the old content? There are only about 4 URLs, such as blahblah.com/contact, that will remain the same, with all content replaced. There are about 100 URLs that will no longer be in use, nor have any part of them ever used again. Can this be done safely?

    | brickbatmove
    1

  • Hi, I just noticed a huge problem in our rankings. Our rankings suddenly dropped by more than 50%. Of course, I immediately started to research the issue, and under Links I found that we have somehow lost all of our internal links! They have dropped from 9k to 0. Now, I am sure that we do have internal links on our site (since I put them there myself). Could you please tell me what is going on and how I can fix this issue? Our site is 1solarsolution.com, and I will also attach screenshots below from Link Explorer. Thank you.

    | alisamana
    0

  • I have a quick-view modal for all products on my website. How should I deal with these in the page setup? E.g., should I rel=canonical to the full product page and noindex in robots.txt, or are they OK in Google's eyes as they are part of the UX?

    | ColesNathan
    0

  • Hi,
    I was wondering what this would be used for, as it's in the robots.txt of a recruitment agency website that posts jobs. Should it be removed? Disallow: /jobs/?
    Disallow: /jobs/page/*/ Thanks in advance.
    James
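For reference, my reading of the two rules (Google matches Disallow values as URL prefixes and treats `*` as a wildcard; the example URLs are hypothetical):

```text
Disallow: /jobs/?        # blocks /jobs/ URLs carrying a query string, e.g. /jobs/?page=2&sort=date
Disallow: /jobs/page/*/  # blocks paginated job listings, e.g. /jobs/page/3/
```

Whether to remove them depends on whether those filtered/paginated URLs should be crawlable; rules like these are a common way to keep thin parameter pages out of the crawl.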

    | JamesHancocks1
    0

  • Hi, I have a company, for example abc.com, and they have a subdomain, def.abc.com, with a lot of errors. I believe these errors affect the parent domain, abc.com, so my company would like to redirect the subdomain to another domain altogether, e.g. xyz.com. Can I redirect the subdomain def.abc.com to another website's domain? Would this affect the parent domain in a good or bad way? Or should I be using external links to point to a new domain from the subdomain? Trying to think what's best for SEO and the parent domain. Thanks!

    | crodriguez89
    0

  • When would you recommend using a canonical tag on a large site?

    | Cristiana.Solinas
    0

  • Hi 🙂 I run a small WordPress multisite network where the main site is an informative portal about the Langhe region in Italy, and the subsites are websites of small local companies in the tourism and wine/food niche. As an additional service for those who build a website with us, I was thinking about giving them the possibility to use some of our portal's content (such as sights, events, etc.) on their websites, in an automatic way. Not as an "SEO" plus, but more as a service for their current user/visitor base: so if you have a B&B, you can have on your site an "events" section with curated content, or a section about things to see (monuments, parks, museums, etc.) in the area, so that your visitors can enjoy reading some content about the territory. I was wondering if, apart from NOT being beneficial, it would be BAD from an SEO point of view... i.e., could they actually be penalized by Google? Thanks 🙂 Best

    | Enrico_Cassinelli
    0

  • The website I'm working on has no canonical tags. There is duplicate content, so rel=canonical tags need adding to certain pages, but is it best practice to have a tag on every page?

    | ColesNathan
    0

  • What are the effects of changing URLs during a site redesign, following all of the important processes (i.e. 301 redirects, reindexing in Google, submitting a new sitemap)?

    | jennifer-garcia
    0

  • We are considering adding, at the end of all our 1,500 product pages, answers to the 9 most frequently asked questions. These questions and answers will be 90% identical for all our products; personalizing them further is not an option, and not so necessary, since most questions relate to the process of reserving the product. We are convinced this will increase users' engagement with the page and time on page, and it will be genuinely useful for visitors, as most will not visit the separate FAQ page. It will also add more related keywords/topics to the page.
    On the downside, it will reduce the percentage of unique content per page and add duplication. Any thoughts on whether, in terms of Google rankings, we should go ahead, and whether the benefits in engagement may outweigh the downside of duplicated content?

    | lcourse
    0

  • Hi Moz Community, The keyword that http://customsigncenter.com/ is ranking for is "custom sign", the keyword difficulty is 38 (according to Moz Keyword Explorer). Here are the link metrics for the page and domain: Page authority: 27 Domain authority: 18 Facebook shares: 50 Linking RDs to the page: 7 Linking RDs to the Root Domain: 8 From the SERP, a lot of its competitors have better link profile than this guy. How come the page http://customsigncenter.com/ can rank 6th for the keyword "custom sign". Are there any important "hidden factors" behind the scene? Thank you for any help and support. Best, Raymond

    | raymondlii
    1

  • You can request re-indexing of a single page via Google Search Console. It would seem to me you could use this feature to experiment with on-page changes and watch the rank change to determine which changes have the most effect. For the sake of this thread, let's temporarily forget that the relative importance of various on-page factors has already been reverse engineered to a degree, so we already have a general idea to some extent. It would seem to me that if I were Google, I would introduce either a random delay period or temper the rank change after reindexing. What I mean by the latter point is: say a reindex takes a page from position 20 to 10. If it is 'tempered', so to speak, on day 2 after reindexing it might be at 18, on day 5 at 16, and on day 7 still at 16, until it reaches the actual "real" rank. Both the delay and the tempering of rank change would make it more difficult to reverse engineer the relative importance of on-page factors. OR, does Google realize there are large SEO firms doing SEO over several years for many sites, which can examine aggregate data to determine these factors, so Google doesn't delay (aka sandbox) or temper rank changes after manual re-indexing?

    | Semush
    0

  • We have followers, following, friends, etc. pages for each user who creates an account on our website. When a new user signs up, they may have 0 followers, 0 following, and 0 friends, but over time those lists can grow. We have separate pages for followers, following, and friends, which Google is allowed to index. When a user doesn't have any followers/following/friends, those pages look empty, and we get issues of duplicate content and too-short descriptions. So is it better to add noindex to those pages temporarily and remove the noindex tag when there are at least 2 or more people on them? What are the side effects of adding noindex when there is no data on those pages, and what are the benefits of it?

    | swapnil12
    0

  • Howdy folks, excited to be part of the Moz community after lurking for years! I'm a few weeks into my new job (Digital Marketing at Rewind), and about 10 days ago the product team moved our Help Center from Zendesk to Intercom. Apparently the import went smoothly, but it's caused one problem I'm not really sure how to go about solving: https://help.rewind.io/hc/en-us/articles/***    is where all our articles used to sit https://help.rewind.io/***    is where all our articles now are. So, for example, the following article has moved as such: https://help.rewind.io/hc/en-us/articles/115001902152-Can-I-fast-forward-my-store-after-a-rewind- https://help.rewind.io/general-faqs-and-billing/frequently-asked-questions/can-i-fast-forward-my-store-after-a-rewind This has created a bunch of broken URLs in places like our Shopify/BigCommerce app listings, in our email drips, in external resources, etc. I've played whack-a-mole cleaning many of these up, but the old URLs are still indexed by Google; we're up to 475 crawl errors in Search Console over the past week, all of which are 404s. I reached out to Intercom about this to see if they had something in place to help, but they just said my "best option is tracking down old links and setting up 301 redirects for those particular addressed". Browsing the Zendesk forums turned up some relevant-ish results, with the leading recommendation being to configure JavaScript redirects in the Zendesk document head (thread 1, thread 2, thread 3) of individual articles. I'm comfortable setting up 301 redirects on our website, but I'm in a bit over my head trying to determine how I could do this with content that's hosted externally and sitting on a subdomain. I have access to our Zendesk admin, so I can go in and edit things there, but I don't have experience with JavaScript redirects and have read that they might not be great for such a large-scale redirection.
Hopefully this is enough context for someone to provide guidance on how you think I should go about fixing things (or if there's even anything for me to do), but please let me know if there's more info I can provide. Thanks!

    | henrycabrown
    1
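
Whatever mechanism ends up serving the redirects, the first step is a mapping from old Zendesk paths to new Intercom paths. The new collection segments (e.g. `general-faqs-and-billing/...`) can't be derived from the old URL alone, so a lookup table is needed; here is a minimal Python sketch, where `NEW_PATHS` is a hypothetical hand-built (or Intercom-exported) mapping:

```python
import re

# Hypothetical mapping from article slug to its new Intercom path;
# in practice this would be exported from Intercom or compiled by hand.
NEW_PATHS = {
    "can-i-fast-forward-my-store-after-a-rewind":
        "/general-faqs-and-billing/frequently-asked-questions/"
        "can-i-fast-forward-my-store-after-a-rewind",
}

def old_url_to_new(path):
    """Map an old Zendesk article path to its new Intercom path, or None.

    Old paths look like /hc/en-us/articles/<numeric-id>-<Slug-Text>;
    we strip the numeric ID, lowercase the slug, and look it up.
    """
    m = re.match(r"/hc/en-us/articles/\d+-(.+?)-?$", path)
    if not m:
        return None
    slug = m.group(1).lower()
    return NEW_PATHS.get(slug)

print(old_url_to_new(
    "/hc/en-us/articles/115001902152-Can-I-fast-forward-my-store-after-a-rewind-"
))
```

A table like this can then feed whichever redirect layer is available (a reverse proxy in front of the subdomain, or per-article scripts), and also doubles as a checklist for fixing the hard-coded links in app listings and email drips.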

  • I think I know the answer, but I need to ask anyway in case I am wrong. The www version has a PA of 29; the http (non-www) version has a PA of 21. Should I start using the www one? A number of years ago, they hired an agency that built a ton of links to the www version. I should also point out that most of the site URLs are for the http version, so I would have to redirect all the other pages. Advice? Thanks, Nails

    | matt.nails
    0
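
Whichever version is chosen as canonical, the usual approach is a site-wide 301 so every URL resolves on one preferred host. The rewrite itself is just host normalization, sketched here in Python (the `example.com` domain and the https scheme are placeholders, not details from the question):

```python
from urllib.parse import urlsplit, urlunsplit

def canonicalize(url: str) -> str:
    """Rewrite a URL to the single preferred host/scheme, mirroring
    what a site-wide 301 redirect rule would do at the server level.
    """
    scheme, netloc, path, query, frag = urlsplit(url)
    host = netloc.lower()
    if not host.startswith("www."):
        host = "www." + host
    # Preferred scheme/host are assumptions for illustration.
    return urlunsplit(("https", host, path, query, frag))

print(canonicalize("http://example.com/page?a=1"))
```

In practice this logic lives in the web server or CDN configuration rather than application code; the point is that a single rule covers "all the other pages" at once, so there is no need to enumerate them.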

  • Hi, I am working on a plan to divide a mid-DA website into multiple sites. The current site's content will be split among these new sites. We can't share anything going forward because each site will be independent. The current homepage will change to just link out to the new sites and have minimal content. I expect the websites will take a hit in rankings, but I don't know how big the drop will be or how long it will last. I know that if you redirect an entire domain to a new domain the impact is negligible, but in this case I'm only redirecting parts of a site to a new domain. Say we rank #1 for "blue widget" on the current site, and that page is going to be redirected to a new site on a new domain. How much of a drop can we expect? How hard will it be to rank for other new keywords, say "purple widget", that we don't rank for now? How much link juice can I expect to pass from the current website to the new websites? Thank you in advance.

    | timdavis
    0

  • Hi MOZ community! We're wondering what the implications for organic rankings would be of having 2 pages with quite different functionality optimised for the same keyword. So, for example, one of the pages in question is
    https://www.whichledlight.com/categories/led-spotlights
    and the other page is
    https://www.whichledlight.com/t/led-spotlights Both of these pages are basically geared towards the keyword "led spotlights". The first link essentially shows the options for LED spotlights and the different kinds of fittings available, and the second link is a product search/results page for all products that are spotlights. We're wondering what the implications of this could be, as we are currently looking to improve the site's ranking, particularly for this keyword. Is this even safe to do? Especially since we're at the bottom of the hill of climbing the ranking ladder for this keyword. Give us a shout if you want any more detail on this to answer more easily 🙂

    | TrueluxGroup
    0

  • Hi, Will regex expressions work in a disavow file? If I include website.com/* will that work, or would you recommend just website.com? Thanks.

    | Fubra
    0
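
For context, Google's disavow file format accepts one entry per line: either a full URL or a `domain:example.com` entry, with `#` comments; wildcard patterns like `website.com/*` are not part of the format. A quick validator can catch such lines before upload; this is a rough Python sketch, and the regex is a simplification rather than Google's exact grammar:

```python
import re

# Accept comments, full http(s) URLs, or "domain:" entries.
# Wildcards such as website.com/* deliberately fail this check.
VALID_LINE = re.compile(r"^(#.*|https?://\S+|domain:[a-z0-9.-]+)$")

def invalid_disavow_lines(text):
    """Return the non-blank lines a disavow file should not contain."""
    return [
        line for line in text.splitlines()
        if line.strip() and not VALID_LINE.match(line.strip())
    ]

sample = """# bad neighbourhood
domain:website.com
https://website.com/spammy-page
website.com/*
"""
print(invalid_disavow_lines(sample))
```

Running it on the sample flags only the wildcard line; to cover a whole site, the `domain:website.com` form is the supported equivalent of `website.com/*`.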

  • A lot of our competitors come up, but we aren't coming up. What do we need to do so that Google considers us related? Our website is culinarydepotinc.com. And I believe not being related to those big competitors affects our SEO; is that correct?

    | Sammyh
    2

  • Hi Mozers, Does having a Google+ page really impact SEO? Thanks, Yael

    | yaelslater
    1
