
Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Intermediate & Advanced SEO

Looking to level up your SEO techniques? Chat through more advanced approaches.


  • I built my website via Squarespace. It is my name. If you google my name, it was the number one hit. Suddenly, two weeks ago, it stopped showing up AT ALL. I went through Squarespace's SEO checklist, secured my site, etc. It still doesn't show up. Why would this happen all of a sudden, and what can I do? Thank you!

    | Jbark
    0

  • I have placed content on a partner site using the same content that is on my site. I want the link juice from that site, and the canonical tag points back to my page. However, they are also using the original-source tag, as they publish a lot of news. If their original-source tag points to the page on their site while the canonical points to mine, is this killing the link juice from the canonical and putting me in jeopardy of a duplicate content penalty? Google has already started indexing the page on their site with the same content. (See the canonical sketch below this post.)

    | SecuritiesCE
    0
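
    A minimal sketch of the cross-domain canonical discussed above, assuming the original article lives at a hypothetical URL on your site; the tag sits in the <head> of the partner's copy:

      <!-- In the <head> of the partner's syndicated copy; the URL is a placeholder -->
      <link rel="canonical" href="https://www.yoursite.com/original-article/" />
      <!-- If the partner also uses an "original source" style meta tag, it should point
           to the same URL so the two signals don't contradict each other. -->

    With the canonical pointing at your page, the syndicated copy consolidates to your URL rather than competing with it; a conflicting original-source annotation mostly muddies that hint, and syndicated duplicates are generally filtered rather than penalized.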

  • We have a website where we publish job postings. We add the data to our website manually. The job postings are also covered by various other websites, including the original recruiting organisations, and the details of a posting remain the same everywhere: the eligibility criteria, the exam pattern, the syllabus, etc. We create pages where we list the jobs, and we keep the detailed pages, which contain the duplicate data, disallowed in robots.txt. Lately we have been thinking of letting these pages be indexed as well, as the number of non-indexed pages is very high and some of our competitors have these pages indexed. But we are not sure whether doing this is the right move, or whether there is a safe way to deal with it. Additionally, some job posts have very little data (fees, age limit, salary, etc.), which is thin content and might contribute to a quality issue. Secondly, we want to use enriched result snippets for our job postings, and Google doesn't want the markup used on the listing page: "Put structured data on the most detailed leaf page possible. Don't add structured data to pages intended to present a list of jobs (for example, search result pages). Instead, apply structured data to the most specific page describing a single job with its relevant details." Now, how do we handle this situation? Is it safe to stop disallowing in robots.txt the detailed pages that have duplicate, and sometimes not very high quality, job data? (A structured data sketch follows this post.)

    | dailynaukri
    0
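
    On the structured data point, a minimal JobPosting sketch for a single detailed job page (all values are placeholders), reflecting the guideline quoted above that the markup belongs on the leaf page rather than the listing page:

      <script type="application/ld+json">
      {
        "@context": "https://schema.org/",
        "@type": "JobPosting",
        "title": "Junior Clerk",
        "description": "<p>Eligibility criteria, exam pattern, syllabus...</p>",
        "datePosted": "2019-06-01",
        "validThrough": "2019-07-31T00:00",
        "employmentType": "FULL_TIME",
        "hiringOrganization": {
          "@type": "Organization",
          "name": "Example Recruiting Organisation",
          "sameAs": "https://www.example.org"
        },
        "jobLocation": {
          "@type": "Place",
          "address": {
            "@type": "PostalAddress",
            "addressLocality": "New Delhi",
            "addressCountry": "IN"
          }
        }
      }
      </script>

    Note that the leaf pages have to be crawlable and indexable (i.e. not disallowed in robots.txt) for this markup to be picked up at all, which is part of the trade-off being asked about.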

  • Hello, I looked through the forum but couldn't find an answer, so here is my question: a client has two subfolders that sell the same things, for example furniture, office furniture, bedroom furniture, etc. The website looks like this: www.website.com/subfolder1/ and www.website.com/subfolder2/. They are like two different brands that just sell the same kind of products. The company wants to put everything in subfolder1 and stop subfolder2, which means stopping the second brand. The thing is that right now subfolder2 has better SEO positions than subfolder1 for most of the keywords. How can I make all the traffic go from subfolder2 to subfolder1?
    Would a 301 redirect do the trick, in addition to improving the SEO (title, H1, meta, etc.) of subfolder1? Thanks for your help.

    | Sodimaccl
    0

  • According to the Moz rank checking tool, my blog ranks in the top 3 for my name "James Crowley" on Bing and Yahoo (both in the US and UK), and also on DuckDuckGo (though Moz can't tell me that). And yet it doesn't rank anywhere on Google. I don't have any penalties, and for other keywords it appears fine on Google. Does this seem strange to you? Am I going wrong somewhere? The blog is https://www.jamescrowley.net/. Many thanks, James

    | james.crowley
    0

  • Hello! We keep getting "critical crawler" notifications on Moz because of URLs returning 404 codes. We've checked each page and know that we are not linking to them anywhere on our site, they are not published, and they are not indexed on Google. This has only happened since we migrated our blog to HubSpot, so we think it has something to do with the test pages their developers had set up and that they are just lingering in our code somewhere. However, we are still concerned that having these codes fire has negative consequences for our SEO. Is this the case? Should we be concerned about these 404 codes even though the pages at those URLs don't actually exist? Thank you!
    Chloe

    | DebFF
    0

  • Hi Guys,
    I am working on a site at the moment.
    The previous developer used an API to pull in healthcare content (HSE).
    The API basically generates landing pages within the site, and generates their content.
    To date it has generated over 2k pages.
    Some rank organically and some don't. New site being launched: a new site is being launched, and the "health advice" section where this content used to live will not be included, so this content will have no place to be displayed. My query: would you let the old content die off in the migration process and just become 404s?
    Or
    Would you 301 redirect all of the pages, or only the ranking ones, to the homepage? Other considerations: the site will be moved to https://, so it will be submitted to Search Console and re-indexed by Google. I would love to hear if anyone has had a similar situation, or any suggestions.
    Best Regards
    Pat

    | PaddyM556
    0

  • Last year we purchased a $79 theme and had a designer code a new version of our real estate website. The database of listings was transferred to the new theme. A year later, we realize the new theme is not that fast and does not perform well, so despite optimizing our server we are not getting very fast performance. So, my question is: can we take the design and CSS of our current theme (and the database) and transfer them to a better-performing theme? We are in a very competitive niche and our website must perform quickly on both desktop and mobile. If this is feasible, is it a major production? Note that we are very happy with the design, and this would be solely to improve load speeds, improve the user experience, and get better rankings. Thanks, Alan

    | Kingalan1
    0

  • With respect to a safety clothing manufacturer I manage SEO on behalf of, I've noticed that product distributors own 85% of the page 1 SERPs, leaving product manufacturers such as my client largely underrepresented for the vast majority of search queries such as 'safety boots'. I'd love to hear your opinion on why this is and how I can combat it. TIA!

    | resolved
    0

  • I have heard a lot about having a solid internal linking structure so that Google can easily discover pages, understand your page hierarchies and correlations, and pass equity. Often it's mentioned that it's good to have optimized anchor text, but not too optimized. You hear a lot of warnings about how over-optimization can be perceived as spammy: https://neilpatel.com/blog/avoid-over-optimizing/ But you also see posts and news like this saying that the internal link over-optimization warnings are unfounded or outdated:
    https://www.seroundtable.com/google-no-internal-linking-overoptimization-penalty-27092.html So what's the tea? Is internal linking over-optimization a myth? If it's real, what's the tipping point? Does it have to be super invasive and keyword-stuffy to negatively impact rankings? Or does simple, light optimization of internal links on every page trigger this?

    | SearchStan
    1

  • Hi, can anyone help me understand whether having a category folder in the URL matters or not? How does Google treat a URL? For example, I have the URL www.protoexpress.com/pcb/certification, but I'm not sure whether Google treats it as a whole or in separate parts. If in separate parts, is it safe to use pcb/pcb-certification, or will it be considered keyword stuffing? Thank you in anticipation,

    | SierraPCB
    1

  • I know that in order to rank for any keyword I need to talk about different "concepts / topics" (everyone has a different word for it, but let's say, to keep it simple, that I need to cover multiple subjects). My question is how to find those subjects. In some industries it is pretty straightforward: you go to related searches or a keyword tool such as Moz and you find what you need. Example: if "title tag" is my main topic, the subtopics that I find and need to cover on the same page are "title tag length", "title tag checker", "mobile title tag", "title tag example", etc. For my keywords, such as "Alsace bike tours", all I find in related searches and in all the tools out there, such as Moz Keyword Explorer, is "Alsace cycling vacations", "cycling Colmar", "Alsace bike trip", etc. Not really anything exciting; it all means the same thing and is just variations of the keyword. I have used other tools such as MarketMuse, and they give me related topics such as "Strasbourg", "Colmar", "half-timbered houses", "Alsace wine", and I am not sure that is any better, because to cover those I have no other option than writing definitions or describing them in detail, which is probably not what someone typing "Alsace bike tour" is looking for. I have the feeling that all these tools are great for keywords like "content marketing" or "title tag" with a lot of searches, but that they fail for everything else. Can someone give me insight into how they write on multiple topics when they are in my situation, and, based on the example I gave, which topics they would cover? Thank you,

    | seoanalytics
    1

  • We are adding existing customer reviews to product detail pages (PDPs). There are about 300 reviews per product, so we're going to have to paginate reviews off of the PDP. I'm wondering what the best URL structure for review pages is to get the most SEO benefit. For example, would it be something like this: site.com/category/product/reviews/page-1, or something that uses parameters, such as site.com/reviews?product=a? Also, what is the best way to show that the internal link on the PDP to "All Reviews" is a higher-priority link than the other links on the page?

    | katseo1
    0

  • Let me give you an example. Imagine I want to rank for the keyword "title tag". I go to Keyword Explorer, or to the related searches at the bottom of Google, where there are many questions people have. I find expressions (with the same user intent) such as "title tag length", "title tag generator", and "why are title tags important" (I found this one using the questions drop-down menu of Keyword Explorer). With this in hand I can create a page where I answer all those questions. I would make each of those expressions an H2 and answer the question, using related phrases and context words that I find with Keyword Explorer, in the paragraph below it. Now let's take one of my keywords, "Sicily bike tours". If I type this expression into Keyword Explorer, the only related phrases (with the same user intent) that I find are "Sicily bike tour", "Sicily cycling tours", "Sicily bike trips", etc. (The first thing I notice is that these are just variations of my main expression, not really questions.) If I look at questions, I find "what is the highest elevation in Sicily" or "how safe is Sicily for tourists". I can't imagine a page that sells bike tours in Sicily having H2 tags that answer those questions, and this is not what the people who rank do; they describe their tour, and this is what is confusing to me. Let's now take a secondary keyword related to the main one, say "Sicily cycling tours". Based on Keyword Explorer, the phrases related to "Sicily cycling tours" are "tour of Sicily", "trips to Sicily", etc. (Isn't it going to be boring and look unnatural to use all those expressions?) They are all synonyms of my expression and not really different, which is my worry. Or can I use an expression such as "Sicilian villages" or "Sicily maps" (even though they don't have the same user intent) as a secondary keyword related to "Sicily cycling tours"? Thank you,

    | seoanalytics
    0

  • Hello, can you rank on page 1 for a keyword like "Loire Valley bike tours" (plural version) with a page that describes just one tour and not a list of tours? Thank you,

    | seoanalytics
    0

  • Is anyone familiar with class=X-hidden-focus? Do these links still pass link juice, or are they similar to nofollow? (See the example markup below this post.)

    | Colemckeon
    0
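
    For context, a CSS class on an anchor does not by itself change how the link is treated; the rel attribute (or removing the link from the served/rendered HTML) is what changes crawling and equity. A minimal illustration with placeholder URLs:

      <!-- Ordinary followed link; the class only affects styling and script hooks -->
      <a href="https://www.example.com/page/" class="x-hidden-focus">Anchor text</a>

      <!-- This is the kind of attribute that actually withholds equity -->
      <a href="https://www.example.com/page/" rel="nofollow">Anchor text</a>

    The caveat is that if the class is used to hide the link from users, Google may give it less weight, so it is worth checking what the styling actually does on the page in question.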

  • I have the categories Whirlpools, Saunen, Dampfduschen, etc., and subcategories such as TS-Serie Whirlpool Modelle and T15 Serie Whirlpool Modelle. I have changed the titles of the head categories and also of the subcategories. Google picked up the new titles of my subcategories very quickly, but now, four days later, the head categories still haven't changed. What does that mean? Is Google indexing my head navigation badly? Regards,
    Marcel

    | HolgerL
    0

  • Hi there, we are currently redesigning the following site: https://tinyurl.com/y37ndjpn The local-pages links in the main menu do provide organic search traffic. In order to preserve this traffic, would it be wise to keep these links in the main menu? Or could we have a secondary menu list (perhaps in the header or footer), featured on every page, which links to these pages? Many thanks in advance for any responses.

    | ruislip18
    0

  • Hi all, I need a suggestion on this. For buttons, I am using links in a div tag instead of an a href. Can Google read links inside a div tag? Do they pass ranking juice? It would be great if you could provide a reference if possible. (See the comparison below this post.)

    | pujan.bikroy
    0
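
    A minimal comparison of the two patterns (URL and class names are placeholders). Google's guidance is to use real <a> elements with an href for anything you want crawled; div-based "links" are generally not followed and don't pass equity:

      <!-- Crawlable: a real anchor with an href can be discovered and can pass equity -->
      <a href="/pricing/" class="button">See pricing</a>

      <!-- Generally not treated as a link: no href, navigation only happens via JavaScript -->
      <div class="button" onclick="window.location.href='/pricing/'">See pricing</div>

    If the button appearance is the concern, keeping the <a href> and styling it with CSS gives the same look without losing the link.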

  • We have a potential client who operates a jobs board in a niche sector in the UK. They want to start a blog but don't want to set it up on the same server as the main jobs site. Discussion started around WordPress, and their preference is for the WP.com hosted version in a directory or subdomain of the TLD. Our concerns are around the different locations of the two sites (the impact of two different server locations and IP addresses?) but also whether WP.com is too limited to interlink the two sites enough to provide a decent customer experience. Thoughts, musings, advice - all welcome! Tks

    | AB-Marketing
    0

  • Hi guys, just want to confirm whether we need to take some SEO action on two of our sites. Example:
    https://www.example.com.au/collections/dresses
    https://exampleamerica.com/collections/dresses Will the domain naming alone fix the issue of possible duplication?
    Do we still need to implement hreflang markup? (A sketch follows this post.)

    | brandonegroup
    0
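
    If both sites stay live, different domains on their own don't tell Google which audience each page is for; hreflang (and/or Search Console geotargeting) does. A minimal sketch, assuming the .com.au site targets English/Australia and the other English/US; the same pair of tags goes in the <head> of both URLs so the annotations are reciprocal and self-referencing:

      <link rel="alternate" hreflang="en-au" href="https://www.example.com.au/collections/dresses" />
      <link rel="alternate" hreflang="en-us" href="https://exampleamerica.com/collections/dresses" />

    With valid hreflang in place, near-identical English content across the two markets is generally treated as a set of alternates rather than problematic duplication.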

  • Hi all, we are looking at how to optimize a website that has four language versions in subfolders. When setting the self-referencing canonicals and the hreflang tags, we came across a particular problem. Both the URLs example.com and example.com/nl are being indexed and have the same content in the same language. For the other language versions it is quite straightforward, but what to do with these two URLs? Currently there is a canonical tag from example.com to example.com/nl. Is a simple 301 redirect to the URL with the language subfolder the best solution? Something to consider: if a backlink points to example.com (without specifying the language subfolder), all the link juice will go to the /nl version and not to the other versions (with a canonical as well as with a 301, of course). Thanks!

    | Mat_C
    0

  • Dear team Moz, I'm investigating SEO issues for a site that dropped in rankings over a period of 4-6 months after conversion from an old platform (xenForo) to a new custom-developed platform. The old version of the site was a simple xenForo-based forum, with threads having a standard URL structure like www.domain.com/threads/thread-title.{thread_id}/. Notice the trailing slash. We chose to keep the URLs intact during conversion to the new platform; however, the site still lost rankings. I'm sure there could be multiple reasons for it, but I wish to know whether I should adjust the URLs: 1. By 301 redirecting all the URLs with a trailing / to the URLs without it, or 2. Leave the URLs as they were. I must also mention that the new site has several new sections, and the old forum is just one part of it. The rest of the site follows URLs without a trailing /, as it's the URL structure recommended by Google. I'd really appreciate your suggestions on this.

    | KaustubhKatdare
    0

  • Hello Moz Community! I'm a bit stuck with this one and have read a few varying answers! Basically, I have an eCommerce website on a .com domain name, currently selling to the UK and Europe. I have recently created a very similar site selling the same products, but solely to the US, on a .us domain name. What is the best practice here: will the two separate sites be okay left as they are, or do I need to credit the UK site from the US site, as they are incredibly similar? The sites are: www.rhinox-group.com www.rhinox-group.us Thanks in advance!

    | josh.sprakes
    0

  • Hello, We have received permission from a consultant we partner with to publish one of his articles on our site (listing him as the author, of course). However, he currently has the article published on his site, so if I put it on my site will I get penalized for stealing content? Is there some sort of tagging that will provide him/his site credit? Maybe a canonical tag?

    | AliMac26
    0

  • I have two questions. I have bought a domain that is a misspelled version of my domain. I have created an A record with my DNS provider to point to my main domain's IP, and on my main site I modified the .htaccess file to do a 301 redirect if the referrer is that misspelled domain. I also bought an expired domain with some relevant backlinks. I intend to create a simple page for that domain and add a link to my main site. Which of these two approaches is better from an SEO point of view? Thanks

    | usabiliTEST_ux
    1

  • I have numerous clients who were at the top of page one, in the top 3 spots. They all dropped to page 2, 3, or 4, and now they are number 1 in maps or in the top 3 there. Content is great on all these sites. Backlinks are high quality, and we do not build in high quantity; we always focus on quality. The sites have authorship information and trust. We have excellent content written by professionals in the industry for each of the websites. The sites load super fast. They are very mobile friendly. We have a CDN installed. Content is organized per topic. All of our citations are set up properly, with no duplicate or missing citations. The code on the websites is good. We do not have anchor-text links pointing to the sites from guest posts or the like. We have plenty of content. Our DA/PA is great. Audits of the websites are great. I've been doing this a long time and I've never been so dumbfounded as to what Google has done this time. Or better yet, what exactly is wrong with our clients' websites today that was working perfectly for the last five years? I really am getting frustrated. I'm comparing my sites to competitors' and everything is better. Please someone guide me here and tell me what I'm missing, or tell me what you have done to recover from this nonsense.

    | waqid
    0

  • Hi all, Wondered if there was any wisdom on this that anyone could impart my way? I'm moving a set of pages from one area of the site to another - to bring them up the folder structure, and so they generally make more sense.  Our URLs are very long in some cases, so this ought to help with some rationalisation there too. We will have redirects in place, but the pages I'm moving are important and I'd like the new paths to be indexed as soon as possible.  In such an instance, can I submit an additional sitemap with just these URLs to get them indexed quicker (or to reaffirm that indexing from the initial parse)?  The site is thousands of pages. Any benefits /  disadvantages anyone could think of?  Any thoughts very gratefully received.

    | ceecee
    0

  • Hi, we have a sitemap on AWS that is retrievable via a URL that looks like ours: http://sitemap.shipindex.org/sitemap.xml. We have notified Google it exists, and it found our 700k URLs (we are a database of ship citations with unique URLs). However, it will not index them. It has been weeks and nothing. The weird part is that it did do some of them before; it said so, about 26k. Then it said 0. Now that I have redone the sitemap, I can't get Google to look at it, and I have no idea why. This is really important to us, as we want not just general keywords to lead to our front page, but also specific ship names to show links to us in results. Does anyone have any clues as to how to get Google's attention and index our sitemap? Or even just crawl more of our site? It has crawled 35k pages, but stopped.

    | shipindex
    0

  • We have a very large Knowledge center that is indexed. Is there any reason I should not exclude this subdomain from indexing? Thank you

    | NikCall
    2

  • I'm working on a site that was hacked in March 2019 and in the process, nearly 900,000 spam links were generated and indexed. After remediation of the hack in April 2019, the spammy URLs began dropping out of the index until last week, when Search Console showed around 8,000 as "Indexed, not submitted in sitemap" but listed as "Valid" in the coverage report and many of them are still hack-related URLs that are listed as being indexed in March 2019, despite the fact that clicking on them leads to a 404. As of this Saturday, the number jumped up to 18,000, but I have no way of finding out using the search console reports why the jump happened or what are the new URLs that were added, the only sort mechanism is last crawled and they don't show up there. How long can I expect it to take for these remaining urls to also be removed from the index? Is there any way to expedite the process? I've submitted a 'new' sitemap several times, which (so far) has not helped. Is there any way to see inside the new GSC view why/how the number of valid URLs in the indexed doubled over one weekend?

    | rickyporco
    0

  • I haven't seen this much, but I wondered what you think?

    | Chris2918
    0

  • Hello, I have created a new webpage and asked Google in Search Console to crawl it. Within minutes it was ranked at a certain spot. I made changes to it to increase the ranking, and right away I could see variations in ranking, either up or down. I have done the same thing for a page that has existed on my website for many years: I changed the content and asked Search Console to re-crawl it. It picked up the new content within minutes, but the ranking doesn't seem to change. Maybe my content isn't good enough, but I doubt it. Could it be that on old pages it takes a couple of weeks to see ranking changes, whereas on new pages it is instantaneous? Has anyone experienced something similar? Thank you,

    | seoanalytics
    1

  • Hi everyone, I wanted to see how people submit their URLs to Google and ensure they are all being indexed. I currently have an ecommerce site with 18,000 products. I have sitemaps set up, but I noticed that the various product pages haven't started ranking yet. If I submit an individual URL through the new Google Search Console, I see the page ranking in a matter of minutes. Before the new Google Search Console, you could just ask Google to fetch/render an XML sitemap and crawl all the links. I don't see the same functionality in Google Search Console today and was wondering if there are any new techniques people could share. Thanks,
    Anthony

    | abiondo
    1

  • I've been doing some research on a keyword with Page Optimization. I'm finding there are a lot of suggested changes. I'm wondering, because of the number of changes required, whether it is better to create a new page entirely from scratch with all the suggestions implemented, or to change the current page? Thanks, Chris

    | Chris2918
    1

  • Hi guys, we have noticed trailing-slash vs non-trailing-slash duplication on one of our sites. Example:
    Duplicate: https://www.example.com.au/living/
    Preferred: https://www.example.com.au/living So, SEO-wise, we suggested placing a canonical tag on all trailing-slash URLs pointing to the non-trailing-slash version. However, the devs have advised against removing the trailing slash from some URLs with a blanket rule, as this may break functionality in Magento that depends on the trailing slash, and the full site would need to be tested after implementing a blanket rewrite rule. Is there any other way to address this trailing-slash duplication issue without breaking anything in Magento? (See the canonical sketch below this post.) Keen to hear from you guys. Cheers,

    | brandonegroup
    0
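
    If the blanket rewrite is too risky to ship right away, the canonical on its own is a common low-risk first step; a minimal sketch for the example above, emitted identically on both versions of the URL (self-referencing on the preferred one):

      <!-- In the <head> of https://www.example.com.au/living/ and .../living -->
      <link rel="canonical" href="https://www.example.com.au/living" />

    That consolidates the duplicate signals without touching Magento's routing; a server-level 301 rule can follow later once it has been properly tested.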

  • A good SEO practice is to have your keywords in your URLs. I am thinking of doing some optimization and changing my URLs to more effective keywords. I am using Shopify, and there is an option (a tick box) that you can check while changing a URL (e.g. for a category, a product, or a blog post) that will create a redirect from the old URL to the new one. Is this good practice? Is it risky in terms of losing SEO, or will it help me rank higher because I will have better keywords in my URLs?

    | Spiros.im
    0

  • If a website doesn't have a true folder structure, how much does having the page path structured like
    /shoes/rain-boots/ actually help establish hierarchy and the flow of equity,
    since /rain-boots/ doesn't actually live in a /shoes/ folder? Or will you simply have to use internal linking to get the same effect for the search engine?

    | SearchStan
    1

  • Question: which URL structure is better in 2019 for SEO best practice? Example A) https://www.fishingtackleshop.com.au/soft-plastic-lures/ or B) https://www.fishingtackleshop.com.au/fishing/fishing-lures/soft-plastic-lures/ We're on the BigCommerce platform and used to have https://www.fishingtackleshop.com.au/categories/soft-plastic-lures/ Last year we went from the long BigCommerce URLs to short ones to stop link juice being sent to /categories. Now, after a bit of a steady decline since September 2018, we have an SEO company trying to sell me their services; they told me that we should have the URL structure in example B and that the short structure is likely the reason for the dip, due to breadcrumbing. True or false?
    I explained that I already have breadcrumbs, as shown on https://www.fishingtackleshop.com.au/berkley-powerbait-t-tail-minnow/, but the SEO guy said no, it needs to be in the URL structure too. I was under the impression that short URLs, as opposed to long ones, are better these days and that link juice is passed better if the URL is short and direct to the point. Am I wrong? (See the breadcrumb markup sketch below this post.)

    | oceanstorm
    1
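
    On the breadcrumb point, the hierarchy the SEO company is describing can be expressed with breadcrumb structured data instead of extra URL folders; a minimal BreadcrumbList sketch for the product page mentioned above (the intermediate category URL and names are assumptions):

      <script type="application/ld+json">
      {
        "@context": "https://schema.org",
        "@type": "BreadcrumbList",
        "itemListElement": [
          { "@type": "ListItem", "position": 1, "name": "Fishing Lures",
            "item": "https://www.fishingtackleshop.com.au/fishing-lures/" },
          { "@type": "ListItem", "position": 2, "name": "Soft Plastic Lures",
            "item": "https://www.fishingtackleshop.com.au/soft-plastic-lures/" },
          { "@type": "ListItem", "position": 3, "name": "Berkley PowerBait T-Tail Minnow" }
        ]
      }
      </script>

    Google's documentation does not require the breadcrumb trail to be mirrored in the URL path, so visible breadcrumbs plus this markup cover the hierarchy signal that example B is meant to provide.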

  • I have a client who is closing down his local business because he's moving to another state. When he gets there he will launch a new website. On his current website, he put in a lot of work and has a ton of good copy, including blog posts that have helped gain him excellent rankings. He's asking me if he can use that copy on his new site and get original author credit for it, like he did on his current site. Can he use the same copy from his current website on his new website without any problems, and get original author credit for it? Would it be best to shut down the old site or to 301 all of the pages being moved to the new corresponding pages? If 301s are the way to go, how long should he leave those in place? Thanks! Kirk

    | kbates
    1

  • I am cataloguing the pages on our website in terms of which focus keyword has been used on each page. I've noticed that some pages repeat the same keyword / term. I've heard that this isn't really good practice, as it's like giving Google conflicting information: the pages with the same keywords will be competing against each other. Is this correct? If so, is the alternative to use various long-tail keywords instead? If not, meaning it's OK to repeat the keyword on different pages, is there a maximum recommended number of times we would want to repeat the word? Still new-ish to SEO, so any help is much appreciated! V.

    | Vitzz
    1

  • On our blog listings page, we limit the number of posts that can be seen on the page to 10. However, all of the posts are loaded in the HTML of the page, and page links are added at the bottom. Example page: https://tulanehealthcare.com/about/newsroom/ When a user clicks the next page, it simply filters the content on the same page to the next group of postings and displays those to the user. Nothing in the HTML or URL changes; this is all done via JavaScript. So the question is: does Google consider this hidden content, because all listings are in the HTML but only a handful are shown on the page? Or is Googlebot smart enough to know that the content is being filtered by JavaScript pagination? If this is indeed a problem, we have two possible solutions: (1) not building the HTML for the next pages until you click on the 'next' page, or (2) adding parameters to the URL to show that the content has changed (sketched below this post). Any other solutions that would be better for SEO?

    | MJTrevens
    1
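
    A minimal sketch of option (2), assuming a hypothetical ?page= parameter: each page state gets its own URL and plain anchor links, so discovering the older posts doesn't depend on Google executing the filtering JavaScript:

      <!-- Pagination rendered as real anchors with distinct, crawlable URLs -->
      <nav class="pagination">
        <a href="https://tulanehealthcare.com/about/newsroom/">1</a>
        <a href="https://tulanehealthcare.com/about/newsroom/?page=2">2</a>
        <a href="https://tulanehealthcare.com/about/newsroom/?page=3">3</a>
      </nav>

    Ideally each of those URLs would then serve only its own ten listings in the HTML, which also resolves the "all posts loaded but hidden" concern.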

  • Hi, we have a slight issue with our website. We have been proactively doing SEO for the past year, but we have run into a problem: our website is ranking for search terms everywhere except our local area (the UK). We have tried creating separate sections of our site targeted just at the UK in Search Console, as well as setting the whole site's preferred target country to the UK and setting the hreflang tags to en-GB. Nothing seems to be working. Any ideas? Thanks in advance!

    | SEODale
    1

  • Hi mozzers, we just redesigned our homepage and discovered that our main nav is built with JS; when JS is disabled, no main nav links show up. Is this still considered bad practice for SEO? (See the plain-HTML sketch below this post.) https://cl.ly/14ccf2509478 thanks

    | Ty1986
    1
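
    Googlebot does render JavaScript, but a main nav that also works without it removes both the risk and the rendering delay. A minimal sketch using plain anchors (labels and URLs are placeholders) that the JS can then enhance:

      <nav>
        <ul>
          <li><a href="/features/">Features</a></li>
          <li><a href="/pricing/">Pricing</a></li>
          <li><a href="/blog/">Blog</a></li>
        </ul>
      </nav>

    As long as the rendered HTML Google sees contains real <a href> links to the main sections, JS-driven behaviour layered on top of them is not a problem in itself.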

  • Howdy, I have a small dilemma. We built a new site for a client, but the old site is still ranking/indexed and we can't seem to get rid of it. We set up a 301 from the old site to the new one, as we have done many times before, but even though the old site is no longer live and the hosting package has been cancelled, the old site is still indexed. (The new site is at a completely different host.) We never had access to the old site, so we weren't able to request URL removal through GSC. Any guidance on how to get rid of the old site would be very appreciated. BTW, it's been about 60 days since we took these steps. Thanks, Kirk

    | kbates
    0

  • If one URL has two 302 redirects, what are the implications?

    | TripIt
    1

  • We've just added Italian and German translations to an originally English website. Users can switch between them with a right-hand drop-down language selection menu; the entire page is then translated (including menu, body, and footer), but the URLs remain the same. The Italian pages have some metadata (titles and descriptions) translated as well. Is having the translated pages share the same URLs going to have a significantly negative effect on SEO?

    | D2i
    0

  • Hi there, does anyone here have any experience filtering views in Google Analytics by TLD? I thought the hostname filter type would have done what I was looking for, but it hasn't, and I can only find information online about doing this for subdomains rather than top-level domains. Many thanks in advance.

    | BAO.Agency
    0

  • Hello, to protect our website against scraping, visitors are redirected to a reCAPTCHA page after two pages visited. But for SEO purposes Googlebot is not included in that restriction, so it could be seen as cloaking. What is the best practice in SEO to avoid a penalty for cloaking in that case?
    I am thinking about adding the paywall JSON-LD schema to our NewsArticle markup, but the content is accessible for free, so it's not a paywall, more of a CAPTCHA protection wall. What do you recommend?
    Thanks,

    | clementjaunault
    1

  • Hi
    I've started working on our website and I've found millions of "search" URLs which I don't think should be getting crawled and indexed (e.g. .../search/?q=brown&prefn1=brand&prefv1=C.P. COMPANY|AERIN|NIKE|Vintage Playing Cards|BIALETTI|EMMA PAKE|QUILTS OF DENMARK|JOHN ATKINSON|STANCE|ISABEL MARANT ÉTOILE|AMIRI|CLOON KEEN|SAMSONITE|MCQ|DANSE LENTE|GAYNOR|EZCARAY|ARGOSY|BIANCA|CRAFTHOUSE|ETON). I tried to disallow them in the robots.txt file, but our sessions dropped about 10% and our average position in Search Console dropped 4-5 positions over one week. It looks like over 50 million URLs were blocked; all of them look like the example above and aren't getting any traffic to the site. I've allowed them again, and we're starting to recover. We've been fixing problems with getting the site crawled properly (sitemaps weren't added correctly, products were blocked from spiders on category pages, canonical pages were blocked from crawlers in robots.txt), and I'm thinking Google was doing us a favour and using these pages to crawl the product pages, as it was the best/only way of accessing them. Should I be blocking these "search" URLs, or is there a better way of going about it (one alternative is sketched below this post)? I can't see any value in these pages except Google using them to crawl the site.

    | Frankie-BTDublin
    0
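
    One commonly used alternative to a blanket robots.txt disallow is a robots meta tag on the internal search templates; a minimal sketch (for the noindex to be seen, the pages must remain crawlable, i.e. not also blocked in robots.txt):

      <!-- In the <head> of internal search / facet result pages -->
      <meta name="robots" content="noindex, follow" />

    That keeps the search URLs out of the index while, at least initially, still letting Googlebot follow the product links on them; once the category pages and sitemaps reliably expose every product, these URLs matter far less as a crawl path and can be restricted more aggressively.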


