
Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Technical SEO

Discuss site health, structure, and other technical SEO strategies.


  • H1 tags on my client's website follow the template [Service] + [Location]. Each part has its own span, meaning there are two spans inside one H1 tag: <h1><span class="what">Truck Repair near </span><span class="where">California, CA</span></h1>. How do crawl bots see this? Is that okay for SEO?

    | kevinpark191
    0

  • Hi! I always get the notification 'Avoid Too Many Internal Links' on my pages when I run the Page Optimization Score, and this is the message it gives on how to fix it: "Scale down the number of internal links on your page to fewer than 100, if possible. At a minimum, try to keep navigation and menu links to fewer than 100." On my website I have a desktop navigation menu and a mobile variant, so the source shows more internal links. If I hide those links with CSS so they are not in view, is the problem then solved? Does Google then see fewer internal links, or does Google crawl everything? I'm curious how I can fix this duplicate internal links issue with my navigation menu.
    What are your ideas / experiences with this?

    | Tomvl
    0

  • Hi, My client's website occasionally asks users to verify their usage by checking the CAPTCHA box. This is only done once per session and is done randomly. Does having CAPTCHA before content loads block crawl bots from properly indexing pages? Does it have any negative impact on SEO? Thanks, Kevin

    | kevinpark191
    0

  • When inserting a link with a "nofollow" rule, "noopener" keeps getting added as well. Can I ask if this will cancel the initial rule? I am not sure what noopener entails.
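    For reference, the markup in question usually ends up looking something like this minimal sketch (the href and anchor text are placeholders). The rel attribute can hold several space-separated values, so noopener sits alongside nofollow rather than replacing it; noopener is a browser security hint for target="_blank" links and has no bearing on how the nofollow is treated:

        <a href="https://example.com/some-page/" rel="nofollow noopener" target="_blank">Example link</a>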

    | ElizaMaria
    0

  • My URL is: https://bit.ly/2hWAApQ We have set up a CDN on our own domain: https://bit.ly/2KspW3C We have a main XML sitemap: https://bit.ly/2rd2jEb and https://bit.ly/2JMu7GB is one of the sub-sitemaps with images listed within it. The image sitemap uses the CDN URLs. We verified the CDN subdomain in GWT. The robots.txt does not restrict any of the photos: https://bit.ly/2FAWJjk. Yet GWT still reports that none of our images on the CDN are indexed. I've followed all the steps and still none of the images are being indexed. My problem seems similar to this ticket https://bit.ly/2FzUnBl, but different because we don't have a separate image sitemap; instead we have listed the image URLs within the sitemaps themselves. Can anyone help please? I will promptly respond to any queries. Thanks
    Deepinder

    | TNZ
    0

  • Hi, I have an issue with Google indexing the US version of our website rather than the UK version on Google.co.uk. I have added hreflang tags to both sites (https://www.pacapod.com/ and https://us.pacapod.com/), have updated and submitted an XML sitemap for each website, and have checked that the country targeting in Search Console is set up correctly, but Google is still indexing the wrong website. I would be grateful for any assistance with this issue. Many thanks Eddie

    | mypetgiftbox
    0

  • We switched from HTTP to HTTPS. My home page is facing a redirect issue from HTTP to HTTPS. Should I noindex the HTTP version, or find the redirect and delete it? Thank you
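    For context, a minimal .htaccess sketch of the kind of rule normally used to force the HTTPS version (assuming an Apache server with mod_rewrite; whether this matches the site's actual setup is an assumption):

        RewriteEngine On
        # Send any request that arrives over plain HTTP to the HTTPS equivalent of the same URL
        RewriteCond %{HTTPS} off
        RewriteRule ^ https://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]

    Generally the aim is one clean 301 from the HTTP home page to the HTTPS one, rather than noindexing the HTTP version.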

    | LandmarkRecovery2017
    0

  • Is the rel=nofollow tag on this article a true NoFollow for the whole article (and all the external links to other sites in the article), or is it just for a specific part of the page? Here is the article: https://www.aplaceformom.com/blog/americans-are-not-ready-for-retirement/ The reason I ask is that I'm confused about the code since it has "printfriendly.com..." as a portion of the URL. Your help is greatly appreciated. Thanks!

    | dklarse
    0

  • Hi, I have a question for SEO experts and web developers. I want to set up a job website for 5 countries. For each country I will provide daily job listings on the basis of:
    1. Jobs by category - for example: accounting jobs, IT jobs, sales jobs
    2. Jobs by city - for example: jobs in Boston, jobs in Chicago
    3. Jobs by company - for example: jobs in Facebook, jobs in Emirates
    Case: a company named "Emirates", located in "Boston", has a vacancy for an "accounting job" with a full-time position. In this case the job will be present in the following categories:
    1. Accounting jobs in Boston
    2. Jobs in Boston
    3. Jobs in Emirates
    On opening any of the above options there will be a filter box on the left side showing:
    position, i.e. full time
    salary, i.e. 1000-1500
    location, i.e. Boston, Chicago
    Q1. I want to know: when users search these terms on Google - "accounting jobs in boston", "jobs in boston" or "jobs in emirates" - the same job will display, so which URL structure is recommended for each search term?
    Q2. How can we do on-page SEO for these terms, given that the job listings will be changing daily because of new job additions and the content keeps changing?
    Q3. Should I create the website on separate domains for each country, or on the same domain with different folders in it - .co.uk or .com/uk for the UK, and .ae or .com/uae for the UAE?
    Note: I will also attach a blog to it, and each blog will focus on country-specific knowledge - for example, for the USA, how to find jobs in New York, and for the UAE, how to find jobs in Dubai. Thanks in advance

    | Shahjahaaan
    0

  • We implemented title tags and meta descriptions for one of our clients using GTM and some JS/jQuery. It had been working well for months. Then rankings started dropping and nothing had been changed. We tore our hair out. I finally noticed that Google doesn't show our titles/descriptions in the SERPs anymore. So I double-checked in the developer console that everything was working okay, and you can even see our title in the browser tab. Anyone else seen this or have any ideas? Thanks!

    | bbarber57
    0

  • Hi Moz Community, I have a question about personalization of content: can we serve personalized content without being penalized for serving different content to robots vs. users? If content starts in the same initial state for all users, including crawlers, is it safe to assume there should be no impact on SEO, because personalization will not happen for anyone until there is some interaction? Thanks,

    | znotes
    0

  • A client recently rolled out their UK business to the US. They decided to deploy with 2 WordPress installations:
    UK site - https://www.clientname.com/uk/ - robots.txt location: https://www.clientname.com/uk/robots.txt
    US site - https://www.clientname.com/us/ - robots.txt location: https://www.clientname.com/us/robots.txt
    We've had various issues with /us/ pages being indexed in Google UK, and /uk/ pages being indexed in Google US. They have hreflang tags in place across all pages. We changed the x-default page to .com 2 weeks ago (we've tried both /uk/ and /us/ previously). Search Console says there are no hreflang tags at all. Additionally, we have a robots.txt file on each site which links to the corresponding sitemap files, but when viewing the robots.txt tester in Search Console, each property shows the robots.txt file for https://www.clientname.com only, even though when you actually navigate to this URL (https://www.clientname.com/robots.txt) you'll get redirected to either https://www.clientname.com/uk/robots.txt or https://www.clientname.com/us/robots.txt depending on your location. Any suggestions how we can remove UK listings from Google US and vice versa?
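    As a point of reference, a minimal sketch of the hreflang annotations such a UK/US setup would normally carry on every page (the post's actual tags were not preserved, so the URLs and language codes below are illustrative; each URL would also need to self-reference in its own set):

        <link rel="alternate" hreflang="en-gb" href="https://www.clientname.com/uk/" />
        <link rel="alternate" hreflang="en-us" href="https://www.clientname.com/us/" />
        <link rel="alternate" hreflang="x-default" href="https://www.clientname.com/" />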

    | lauralou82
    0

  • Hello all, should I write my URLs in my own language (Greek), in Greeklish, or in English? And is it good to put .html at the end of the URL, for example www.test.com/test/test-test.html? What is the best way to get great rankings? I am a new digital marketing manager and it's my first time working with a programmer, who doesn't know either. I need to know as soon as possible, because they want to be "on air" tomorrow! Thank you very much for your help! Regards, Marios

    | marioskal
    0

  • Hi Moz folks! We are redesigning a website of 30,000+ pages and pulling together a spreadsheet for 301 redirects. So basically this: http://www.mywildlifesite.org/site/PageServerpagename=priorities_wildlife_endangered_species_protection#.Ws54SNPwbAw/mexican-spotted-owl will redirect here (this is the nav architecture):
    https://mywildlifesite.org/wildlife-conservtion/endangered-species-act-protections/endangered-species-list/birds/mexican-spotted-owl My question is, can I and should I truncate that new destination URL to make it easy for Google to see that the page topic is really the owl, like this:
    https://mywildlifesite.org/endangered-species-list/mexican-spotted-owl Your input is greatly appreciated! Jane
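    For illustration, one old-to-new mapping from that spreadsheet might look like the hedged Apache sketch below (assuming .htaccess with mod_alias; note that the #fragment part of the old URL never reaches the server, and if the old address really carries a ?query string the rule would instead need mod_rewrite with a RewriteCond on %{QUERY_STRING}):

        Redirect 301 /site/PageServerpagename=priorities_wildlife_endangered_species_protection https://mywildlifesite.org/endangered-species-list/mexican-spotted-owl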

    | CalamityJane77
    0

  • Our business is heavily dependent on SEO traffic from long tail search. We have over 400,000 pieces of content, all of which we found scraped and published by another site based out of Hong Kong (we're in the US). Google has a process for DMCA takedown, but doing so would be beyond tedious for such a large set of urls. The scraped content is outranking us in many searches and we've noticed a drastic decrease in organic traffic, likely from a duplicate content penalty. Has anyone dealt with an issue like this? I can't seem to find much help online.

    | Kibin
    0

  • Hi Moz Community, Are Bing/Yahoo crawlers different from Google’s crawler in terms of how they process client side JavaScript and especially content/data loaded by client side JavaScript? Thanks,

    | znotes
    0

  • The image I've attached is a screenshot of what shows up on mobile when I do a branded search for one of my clients. We have a YouTube video on the homepage featuring Harry Connick Jr. The video is directly under the hero image on the homepage, and the thumbnail for the video is the image that Google is pulling into the SERP. I don't want this image in the SERPs. I've marked up the logo with schema but that doesn't seem to be helping. Is there any other schema markup I can use to remedy this? What else can I do to influence which image Google pulls?
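    For what it's worth, the usual way to hint at a preferred brand image is Organization markup with a logo property, sketched below with placeholder names and URLs; Google treats this as a hint rather than a guarantee, so if this is already in place the remaining levers are mostly indirect (where the video embed sits on the page, and what its thumbnail is):

        <script type="application/ld+json">
        {
          "@context": "https://schema.org",
          "@type": "Organization",
          "name": "Client Name",
          "url": "https://www.example.com/",
          "logo": "https://www.example.com/images/logo.png"
        }
        </script>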

    | tdastru
    0

  • Hi Moz Community, Is there a proper way to do an SPA (client-side rendered) and PWA without having a negative impact on SEO? Our dev team is currently trying to convert most of our pages to a client-side rendered Angular single page application. I told them we should use a prerendering service for users that have JS disabled, or use server-side rendering instead, since this would ensure that most web crawlers would be able to render and index all the content on our pages even with all the heavy JS use. Is there an even better way to do this, or some best practices? In terms of the PWA that they want to add along with changing the pages to an SPA, I told them this is pretty much separate from SPAs because they are not dependent on each other; adding a manifest and service worker to our site would just be an enhancement. Also, if we do a complete PWA with JS populating the content/data within the shell - meaning not just the header and footer, but making the body a template with dynamic JS as well - would that affect our SEO in any way? Any best practices here as well? Thanks!

    | znotes
    0

  • Hi! I'm working on a client site at the moment and I've discovered a couple of pages that are 404ing but producing a 200 OK response. However, I have checked these URLs again and some are now producing a 404 Error response. No changes have been made (that I'm aware of) so it appears that the URLs are returning both 200 OK and 404 Error responses intermittently. Any ideas what could cause this and the best solution? Thanks!

    | daniel-brooks
    0

  • Does anyone have tips on how to work in an event system to avoid duplicate content in regards to recurring events? How do I best utilize on-page optimization?

    | megan.helmer
    0

  • Hi Moz community, Our tech team has recently decided to try switching our product pages to be JavaScript dependent; this includes links, product descriptions and things like breadcrumbs in JS. Given my concerns, they will create a proof of concept with a few product pages in a QA environment so I can test the SEO implications of these changes. They are planning to use Angular 5 client-side rendering without any prerendering. I suggested Universal, but they said the lift was too great, so we're testing to see if this works. I've read a lot of the articles in this guide to all things SEO and JS and am fairly confident in understanding when a site uses JS and how to troubleshoot to make sure everything is getting crawled and indexed: https://sitebulb.com/resources/guides/javascript-seo-resources/ However, I am not sure I'll be able to test the QA pages, since they aren't indexable and live behind a login. I will be able to crawl the pages using Screaming Frog, but that's generally regarded as what a crawler should be able to crawl, and not really what Googlebot will actually be able to crawl and index. Any thoughts on this - is this concern valid? Thanks!

    | znotes
    0

  • I noticed earlier this week that this page - https://www.ihasco.co.uk/courses/detail/bomb-threats-and-suspicious-packages?channel=care was being indexed instead of this page - https://www.ihasco.co.uk/courses/detail/bomb-threats-and-suspicious-packages for its various keywords We have rel=canonical tags correctly set up and all internal links to these pages with query strings are nofollow, so why is this page being indexed? Any help would be appreciated 🙂
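    For reference, the canonical the ?channel=care variant would be expected to carry is sketched below (a pattern, not the site's actual source). It is worth confirming the tag appears exactly like this in the served HTML, since a canonical is a hint that Google can override when other signals (internal links, sitemaps, redirects) point at the parameterised URL:

        <link rel="canonical" href="https://www.ihasco.co.uk/courses/detail/bomb-threats-and-suspicious-packages" />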

    | iHasco
    0

  • Hi, I've a website with 6 pages identified as 'duplicate content' because they are very similar. These pages look similar because they share the same layout, but each one shows a few pictures of its own product category, which is why every page looks like the others even though they are not 'exactly' the same. So, is there any way to indicate to Google that the content is not duplicated? I guess it's been marked as duplicate because the code is 90% or more the same across the 6 pages. I've been reviewing the 'canonical' method, but I think it is not appropriate here as the content is not the same. Any advice (that is not "add more content")?

    | jcobo
    0

  • Hello, A little over a month ago we switched our domain from A to B. The original was a .com and the new one is a .tech, for good measure. We have meticulously done 301 redirects page by page, but there has been a massive drop in traffic, and I checked here to see that we've gone from a domain authority of 30 to 9. The company is a few months over 1 year old, but we're really looking at traffic accumulated in just about 8 months on the old domain. Is there any way to recover some of the old juice other than the redirects? Our number of backlinks has also severely dropped despite the redirects.

    | SteveSaf
    0

  • I'm trying to be mature and apply the Topic Cluster strategy to my content. In doing so I realized there are a few URL options, some more difficult to execute than others. -Is it important to call out the Pillar Topic in your subtopic URL?
    -Does the Pillar Topic need to have its own landing page? (As opposed to just being part of the blog.) Here's an Example: My Pillar is: Inbound vs. Outbound
    My subtopic is: Marketing Platforms Here are the URL options I can think of... Option 1: https://pipelineinbound.com/blog/inbound-vs-outbound-marketing-platforms/ Option 2: https://pipelineinbound.com/blog/which-marketing-platforms/ Option 3: https://pipelineinbound.com/blog/marketing-platforms-inbound-vs-outbound/ Option 4 (Hardest): https://pipelineinbound.com/inbound-vs-outbound/marketing-platforms/ Are there some fundamental best practices for URL structure and Link Building as it pertains to Topic Clusters? Thanks!

    | dkellyagile
    0

  • Hi, I have a WordPress website that has articles/news posts which contain imagery. I've noticed in the Media Library that when you upload an image to a blog post it generates a new permalink: ...article-name/article-image-01.jpg I have the Yoast SEO plugin and have the option to set a canonical URL for this image. Should I point it back to the actual article? Thanks for any help with this.

    | Easigrass
    0

  • Hello Moz, I need urgent help. I removed a tonne of product pages and put everything into one product page to deal with duplicate content. I thought this was a good thing to do until I got an email from Google saying: "Googlebot identified a significant increase in the number of URLs on ****.com that return a 404 (not found) error." I checked it out and found the problem:
    4 Soft 404s
    41 Not Founds
    What do I need to do to fix this? Is it a problem or should I just ignore it? I removed all the pages in WordPress, but do I need to do something manually through Google as well? I have worked so hard on my SERPs that this will destroy me if I'm penalised. Please can someone advise?

    | crocman
    0

  • Hi all, I look after a website which sells a range of products. Each of these products has different applications, so each product has a different product page. For example:
    Product one for x application
    Product one for y application
    Product one for z application
    Each variation page has its own URL as if it were a page of its own. The text on each of the pages is slightly different depending on the application, but generally very similar. If I were to have a generic page for product one, and add canonical tags to all the variation pages pointing to this generic page, would that solve the duplicate content issue? Thanks in advance, Ethan

    | Analoxltd
    0

  • Hey there, I have a website which is doing well on Google, but the last 2-3 times I have been getting an error in Google Search Console: "Removed by request". Here is a screenshot: http://prntscr.com/iva5y0 My question is, I have not requested that Google remove my homepage URL, so: 1. Why am I getting this warning again and again? 2. Will this harm my site's current traffic and ranking status? 3. What steps do I need to take to stop this warning appearing again and again? Thanks in advance, and please help out with this query.

    | innovative.rohit
    0

  • Hello, I am being pushed to consolidate the 6k+ redirects that have accumulated over the course of 4 years. These redirects are one of the many factors causing extensive load times for our website. Many to most are over a year old, have not been used, or simply redirect back to the home page. Other than looking to keep the pages that have external links (also looking for recommendations/tools), are there other best practices from an SEO standpoint to ensure there are no major hits to our website? A little more info: I am looking to pare the 6K down by:
    Removing all redirects that have not been used
    Removing all redirects that are over 1 year old
    Removing all redirects that point simply to the home page or to a smaller big-bucket subfolder
    This should take the number from 6K to around 300. Are there any major concerns? Pat

    | Owner_Account
    0

  • It feels good to be BACK. I miss Moz. I left for a long time but happy to be back! 🙂 My client is a local HVAC company. They sell Lennox systems. Lennox provides a tool that we hooked up to that allows visitors to the site to 'see' 120+ different kinds of air quality, furnace and AC units. The problem (I think it's a problem) is that Google and other crawl tools are seeing these 100+ pages, which are not unique, helpful or related to my client. There is a little bit of cookie-cutter text, images and specs, and that's it. Are these pages potentially hurting my client? I can't imagine they are helping. Best way to deal with these? Thank you! Thank you! Matthew

    | Localseo4144
    0

  • I cannot understand which is the best way to target similar keywords. Is the best way to create a landing page for each long-tail keyword, or one page that includes all the keywords? On the site I have landing pages: 1. Metal doors 1.2. Steel doors for private houses 1.3. Durvis 1.4. Metal doors for technical rooms, and so on. In Latvian it sounds OK. Some time ago this worked well for other sites, but now it just does not work. I see Google messes these results up and SEO performance is bad. Can you suggest a correct structure? Thanks  metāla durvis dzīvoklim un mājai m24 metāla durvis sos serviss durvju atvēršana

    | Felter
    0

  • I have a few questions related to home page URLs being indexed, canonicalization, and GA reporting... 1. I can view the home page by typing in domain.com , domain.com/ and domain.com/index.htm There are no redirects and it's canonicalized to point to domain.com/index.htm  -- how important is it to have redirects? I don't want unnecessary redirects or canonical tags, but I noticed the trailing slash can sometimes be typed in manually on other pages, sometimes not. 2. When I do a site search (site:domain.com), sometimes the HP shows up as "domain.com/", never "domain.com/index.htm" or "domain.com", and sometimes the HP doesn't show up period. This seems to change several times a day, sometimes within 15 minutes. I have no idea what is causing it and I don't know if it has anything to do with #1. In a perfect world, I would ask for the /index.htm to be dropped and redirected to .com/, and the canonical to point to .com/ 3. I've noticed in GA I see / , /index.htm, and a weird Google referral URL  (/index.htm?referrer=https://www.google.com/) all showing up as top pages. I think the / and /index.htm is because I haven't setup a default URL in GA, but I'm not sure what would cause the referrer. I tracked back when the referrer URL started to show up in the top pages, and it was right around the time they moved over to https://, so I'm not sure what the best option is to remove that. I know this is a lot - I appreciate any insight anyone can provide.
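    A minimal .htaccess sketch of the index.htm-to-root redirect described in point 2, assuming an Apache server with mod_rewrite (file name as given in the question; the canonical tag would then also point at the bare / URL):

        RewriteEngine On
        # Only match direct browser requests for /index.htm (not internal rewrites), then 301 to the root
        RewriteCond %{THE_REQUEST} ^[A-Z]{3,}\s/index\.htm [NC]
        RewriteRule ^index\.htm$ / [R=301,L]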

    | DigMS
    0

  • Hello, We've been following the best practices outlined in this Moz article for title tags for a long time, however I have a question about the spelling of the brand name when the brand name is composed of keywords. For example, let's say your company is called FitnessGearUSA and your website URL is www.FitnessGearUSA.com. How would you spell the brand name in the title tag? As a single word, 'FitnessGearUSA', or as three words, 'Fitness Gear USA'? We've been using the first approach for a long time, but I am starting to think that the second approach would be better, especially when the customer might include a single keyword that's part of the brand name in their search. For example, a product page title tag would then look like this:
    Model ABC Weight Lifting Bar | Fitness Gear USA I am inclined to think that this page would rank better for a search like:
    "Model ABC Lifting Bar USA" Thoughts?

    | yacpro13
    0

  • Hi all, We've recently re-launched one of our sites with a substantial redesign, refreshed content, meta data, descriptions and functionality. We noticed in SERPs that some of the page titles are showing the old name for the site, which hasn't been used for a few years; the site's been through a few updates and a URL change since then. All the meta titles are showing up as they should in crawls through Search Console and Moz, and it's my understanding that if Google were pulling a cached version of a title it would have gone for a more recently cached one? Any thoughts on why Google's turned back the clock on our site's name would be greatly appreciated! -Jamie

    | JamieCMF
    0

  • So this is my situation: I want to redirect example.com/post1/ to example.com/post1/?m=yes
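    A minimal sketch of one way to do this in .htaccess, assuming an Apache server with mod_rewrite and that the path really is /post1/; the empty-query-string condition stops the rule from redirecting its own output in a loop:

        RewriteEngine On
        # Only redirect when the request has no query string yet, otherwise ?m=yes would trigger it again
        RewriteCond %{QUERY_STRING} ^$
        RewriteRule ^post1/$ /post1/?m=yes [R=301,L]

    Whether a 301 is the right tool here (rather than serving the ?m=yes version internally) depends on what the parameter actually does.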

    | CarlLSweet
    0

  • When searching "Cambridge Savings Bank" (the specific name of a bank) in Bing, the search results return them as number one with the correct URL. The issue is that Bing has titled the search result "East Cambridge Savings Bank" (which is another entity altogether) while still showing the correct www.cambridgesavings.com URL. So basically Bing is putting another corporation's name on our website. This issue only occurs in Bing. Any idea how to correct this?

    | BOD2008
    1

  • Hi all, We are looking at changing our current e-commerce store to a new platform and, in doing so, thinking of making some changes to how we list products in sub-categories. We have seen related questions asking about splitting a single product into multiple products to rank for different terms, but we are wondering about combining multiple products into a single product page. The examples we have seen have been about fashion items with variants of colour and size. However, the products we sell have variances that change the appearance, dimensions and technical specification, so we would like to ask the Moz community if combining products with these variances would still be deemed good practice. We sell wood burning stoves, and a good example of a product that we are considering combining is the Scan 85 stove, which is available in eight different configurations: 85-1, 85-2, 85-3 etc. Scan themselves refer to each version as a separate product, and they are bought, stocked and sold as separate products. Wood burning stoves like this typically have a firebox in the centre and then design options that can change the top, side, base, door, colour and fuel. In this example, the firebox is the Scan 85 and the variation is the last number, each of which corresponds to a different design option changing both the appearance and dimensions (see attached image). We have them listed as eight different products on our current site, one for each version, primarily because each option has its own name (albeit with only a one-digit difference), and when we created the pages we thought that more pages would present us with more ranking opportunity. However, we have since learnt that these eight pages are all so similar that it is difficult to write unique content about each product (between the 85-1 and 85-2, the only difference is the black trim on the 85-1 versus the silver trim on the 85-2), especially as anything about the firebox itself (how well the fire burns, how controllable it is, etc.) will be the same for all versions. Likewise, earning backlinks to eight separate pages is also very difficult. Exploring this led us to the question: when is a variant a variant and when is it a separate product? Are there hard and fast rules for what defines variants and products? Or does it simply vary from industry to industry and product to product, and if so, should we be looking at it from a UX or SEO POV when making that decision? Our hope is that if we combine these eight products into a single high-quality page, it will present us with a greater ranking opportunity for that one page than for eight individual pages. We also hope that doing so will allow us to create a more intuitive UX on a single page with a unique description, more reviews focused on one page and an explanation of the options available, all of which should lead to more conversions. Finally, by creating a better UX and a unique, detailed description, we hope that there is a higher chance of us earning product-level backlinks than we have with eight lower-quality pages. One of the issues in creating a single product page for all the variants is the sub-category/results pages, as we would be removing eight simple products and replacing them with one complex product.
    We have questions over how this would work at a filter/facet level, whereby when you apply a filter there is an expectation that the image shown will match the criteria; so if we filter for stoves with a silver trim, for example, there is an expectation to only see stoves that have a silver trim in the results. When you have separate product pages you have separate listings, which makes it easier to only bring back the models matching the criteria. However, when you have a single page this is more complex, as you will need a default image for non-filtered results and then the ability to assign an image to lots of different attributes so that the correct image is always shown matching the criteria selected. All of which we have been assured is do-able, but it adds an extra level of complexity to the process from an admin side. The alternative would be to create eight simple/child products and link them to one configurable/parent product. We could then list the simple products in the results pages and have them all linking back to the main configurable product, which could load with the options of the simple product that was selected. From an SEO POV this brings in some more work, redirecting each page to the parent, but ultimately this could provide a better UX and might be the better solution. Has anyone got any experience in doing either of these options before? Both options above will affect the number of products we have available, so does the number of products in a sub-category affect the ability of that category page to rank? We currently have around 500 products in our wood burning stoves category, with perhaps an additional 300 to add. If we go down the route of combining into a single product page, this will reduce the number of products by around a third. If we keep all the simple/child products, then this will stay around the same. So, have we missed something obvious? Is there a glaring issue that we have overlooked from an SEO point of view as well as from the customer experience? We would appreciate your thoughts on this. Thanks, Reece

    | fireproductsuk
    0

  • Hi all, first question here so be gentle, please. My question is around geo-targeted dynamic content. At the moment we run a .com domain with, for example, an article about running headphones, and then at the end - taking up about 40% of the content - is a review of some headphones people can buy, with affiliate links. We have a .co.uk site with the same page about running headphones and then 10 headphones for the UK market. Note: rel alternate is used on the pages to point to each other, therefore (hopefully) removing duplicate content issues. This design works well, but it involves having to build links to two pages, in the case of this example. What we are thinking of doing is to just use the .com domain and have the product part of the page served dynamically, i.e. people in the UK see UK products and people in the US see US products. What are people's thoughts on this technique, please? From my understanding, it wouldn't be a problem with Google for cloaking etc., because a Googlebot and a human from the same country will see the same content. The site is made in WordPress and has <html lang="en-US"> (for the .com) in the header. Would this cause problems for the page ranking in the UK etc.? The ultimate goal of doing this would be to reduce link building efforts by halving the number of pages which links would have to be built for. I welcome any feedback. Many thanks

    | TheMuffinMan
    0

  • Hi, on the page I'm working on, I encountered a tag in an image, rather than in text form. Do you think it's an issue when it comes to SEO?
    What do you suggest I should do if there is an issue? Keen to hear from you!

    | nerdieb
    0

  • Two versions of the same page are being served at the moment. Certain pages on the site redirect to the https version whilst others don't. I am being flagged for duplicate content because of this. Is this a simple fix? As in just redirect all to the https version and set the preferred version in WMT?

    | TAT100
    0

  • Hi all, I have a query I am struggling to find the answer to. I am unable to retrieve a URL using a "site:" query on the Google SERP. However, when I enter the direct URL or use an "info:" query, a snippet appears. I am not able to understand why Google is not showing the URL for the "site:" query. Is the page indexed or not? Or is it soon going to be deindexed? Secondly, I would like to mention that this is a dynamic URL. The index file which we are using to generate this URL is not available to Googlebot. For instance, there are two different URLs:
    http://www.abc.com/browse/ --- the parent page
    http://www.abc.com/browse/?q=123 --- the URL generated at run time using the browse index file
    Google is unable to crawl the index file of the browse page, as it cannot run independently until some value is passed in the parameter, and it is not indexed by Google. Earlier, the dynamic URLs were indexed and showing up in Google for the "site:" query, but now they are not. Can anyone help me understand what is happening here? Please advise. Thanks

    | SameerBhatia
    0

  • My client's website was built on Wix and despite having substantial content on web pages, Moz is claiming that there is very little-to-no content: Sufficient Characters/Words in Content. Does anyone know how to fix this issue? I've run a word count tool and it gives the same results. Website: pesclinical.com

    | Perfect-Pixel
    1

  • I've just started working on a website that has generated lots (100s) of broken internal links. Essentially, specific pages have been removed over time and nobody has been keeping an eye on what internal links might have been affected. Most of these are internal links that are embedded in content which hasn't been updated following the page's deletion. What's the best way to approach fixing these broken links? My plan is currently to redirect where appropriate (from a specific service page that doesn't exist to the overall service category, maybe?), but there are lots of pages that don't have a similar or equivalent page. I presume I'll need to go through the content removing the links or replacing them where possible. My example is a specific staff member who no longer works there and is linked to from a category page; should I be redirecting from the old staff member page and updating the anchor text, or just straight up replacing the whole thing to link to the right person? In most cases, these pages don't rank and I can't think of many that have any external websites linking to them. Am I overthinking all of this? Please help! 🙂

    | Adam_SEO_Learning
    0

  • I keep running into a redirect chain issue trying to get a non-https/non-www domain to forward directly to the https/www domain on an Apache server. For background, we are forcing https and forcing www, but it appears the non-https/non-www domain is first redirecting to https/non-www and then redirecting again to the desired final https/www version of the domain. (Hope I am making sense here) I am trying to find code to add to my .htaccess file that will perform the following... 301 Redirect
    http://example.com directly to https://www.example.com (without 1st redirecting to https://example.com)
    http://www.example.com directly to https://www.example.com Any experts in this with any thoughts? Thanks,
    Fitz
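    A commonly used single-rule sketch for this, assuming Apache with mod_rewrite and example.com standing in for the real domain; the host and scheme checks feed one rule, so each non-canonical variant should reach https://www in a single hop:

        RewriteEngine On
        # Fire when the request is not HTTPS, or the host is missing www
        RewriteCond %{HTTPS} off [OR]
        RewriteCond %{HTTP_HOST} !^www\. [NC]
        # Capture the host without any leading www so %1 is always the bare domain
        RewriteCond %{HTTP_HOST} ^(?:www\.)?(.+)$ [NC]
        RewriteRule ^ https://www.%1%{REQUEST_URI} [R=301,L]

    One caveat: if TLS terminates at a CDN or load balancer, %{HTTPS} can read "off" even on secure requests, in which case the condition usually has to check X-Forwarded-Proto instead.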

    | FitzSWC
    0

  • Hi folks, I've just completed a straightforward olddomain -> newdomain migration. All the redirects were done on 7th Feb. I submitted the change of domain request on 7th Feb. All seemed fine - as can be seen in the attached. It's now 19th March and our pals at GSC are still saying that the domain migration is ongoing. I've never had this take so long before; 2-3 days tops. The results are tanking as I can't geo-target, and more features in GSC are out of action as it's 'locked' due to this migration (I just get a screen as per the attached). Thoughts? Shall I risk withdrawing the request and starting anew? The old "turn it off and on again"? Thanks!

    | tonyatfat
    0

  • Why don't the vast majority of sites using Drupal list keywords in the head section? Is there another convention used in Drupal that serves the same purpose for SEO? I noticed most of the Drupal info pages about keywords seem to drop off around 2010

    | fxarechiga
    0

  • Brief history: our company made a change to a new domain. Both domains had an SSL configured, but the old domain's SSL was controlled and created by Shopify, which gave us limited control, because we couldn't redirect the old https:// to the new https://. So basically we duplicated our new HTML website and put canonical references on all duplicate pages pointing to the final domain, to help search engines navigate to the newer domain. Question: in the near future I would like to take down the old domain and do a 301 domain forward. What is the correct course of action to complete this? Our old domain was indexed and SERP results were tied to its https:// URLs.

    | bnewt
    1

  • If I have 4 versions of my site:
    http://www
    http://
    https://www
    https://
    What is the best way to redirect without losing SEO positions? I have been mainly using http://www, but I have recently added my SSL so https works also. I heard at MozCon that I should get https working. All of my marketing and ads are going to http://www. Do I 301 redirect 3 of them? Which 3? If https is becoming important, should that be my main URL? Will it hurt my SEO to switch? Thank you so much in advance!

    | bhsiao
    0

  • A client has a single page app website that shows https://example.com/example when you visit https://example.com . I don't think this is a redirect; I think it's a URL rewrite. My questions: Is this setup common with single page apps? What are the SEO benefits or drawbacks of having a domain's homepage load, rewrite, or redirect to a subfolder?

    | Kevin_P
    0

Got a burning SEO question?

Subscribe to Moz Pro to gain full access to Q&A, answer questions, and ask your own.



