
Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Moz Q&A is closed.

After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we’re not completely removing the content (many posts will remain viewable), we have locked both new posts and new replies. More details here.

Latest Questions

Have an SEO question? Search our Q&A forum for an answer; if not found, use your Moz Pro subscription to ask our incredible community of SEOs for help!


  • Hi, I have a site where they have: Disallow: /*? The problem is we need the following indexed: ?utm_source=google_shopping What would the best solution be? I have read:
    User-agent: *
    Allow: ?utm_source=google_shopping
    Disallow: /*?
    Any ideas?

    Intermediate & Advanced SEO | | vetofunk
    0
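
For questions like the one above, Google resolves Allow/Disallow conflicts by the longest matching rule, with Allow winning ties. Below is a minimal, hypothetical Python sketch of that precedence logic (illustrative only, not Google's actual parser; the rule set and paths are made up for the example). Note the Allow rule is written with a full path form like /*?utm_source= so it can out-match Disallow: /*?.

```python
# Minimal sketch of Google's robots.txt precedence: the longest
# matching rule wins, and Allow beats Disallow on a length tie.
# (Illustrative only -- not Google's actual parser.)
import re

def rule_matches(rule: str, path: str) -> bool:
    # Translate robots.txt wildcards: * matches any characters,
    # $ anchors the end of the URL path.
    pattern = re.escape(rule).replace(r"\*", ".*").replace(r"\$", "$")
    return re.match(pattern, path) is not None

def is_allowed(rules, path):
    # rules: list of ("allow" | "disallow", pattern) pairs.
    best = ("allow", "")  # no matching rule -> allowed by default
    for kind, pat in rules:
        if rule_matches(pat, path) and len(pat) >= len(best[1]):
            # A longer rule always wins; on a tie, prefer "allow".
            if len(pat) > len(best[1]) or kind == "allow":
                best = (kind, pat)
    return best[0] == "allow"

# Hypothetical rule set based on the question:
rules = [
    ("disallow", "/*?"),
    ("allow", "/*?utm_source=google_shopping"),
]
print(is_allowed(rules, "/product?utm_source=google_shopping"))  # True
print(is_allowed(rules, "/product?sort=price"))                  # False
```

Under this logic the longer, more specific Allow rule overrides Disallow: /*? for the google_shopping URLs while every other parameterised URL stays blocked, which mirrors Google's documented precedence.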

  • We have an ecommerce site, we'll say at https://example.com. We have created a series of brand new landing pages, mainly for PPC and Social, at https://sub.example.com, but we would also like for these to get indexed. These are built on Unbounce, so there is an easy option to simply uncheck the box that says "block page from search engines"; however, I am trying to speed up this process but also do this the best/correct way. I've read a lot about how we should build landing pages as a sub-directory, but one of the main issues we are dealing with is long page load time on https://example.com, so I wanted a kind of fresh start. I was thinking a potential solution to index these quickly/correctly was to make a redirect such as https://example.com/forward-1 -> https://sub.example.com/forward-1, then submit https://example.com/forward-1 to Search Console, but I am not sure if that will even work. Another possible solution was to put some of the subdomain links on the root domain, say right on the pages or in the navigation. Also, will I definitely be hurt by 'starting over' with a new website? Even though my MozBar on my subdomain https://sub.example.com shows the same domain authority (DA) as the root domain https://example.com? Recommendations and steps to be taken are welcome!

    Intermediate & Advanced SEO | | Markbwc
    0

  • I have a client with multiple business addresses: 3 across 3 states. From an SEO perspective, what would be the best approach for displaying a NAP on the website? So far I've read that it's best to: set up 3 GMB accounts pointing to 3 location pages, use a local phone number as opposed to a 1300 number, and display all 3 locations in the footer, site-wide.

    Local Website Optimization | | jasongmcmahon
    1

  • I have 5 URLs that are "missing titles"; however, all 5 are landing pages that were created in Pardot. How would I go about adding the missing titles? Would I need to add them on our website platform or in Pardot?

    Technical SEO | | cbriggs
    0

  • There are 2 or 3 URLs and one image file that dozens of toxic domains are linking to on our website. Some of these pages have hundreds of links from 4-5 domains. Rather than disavowing these links, would it make sense to simply break them: change the URLs that they link to and not create a redirect? It seems like this would be a surefire way to get rid of these links. Any downside to this approach? Thanks,
    Alan

    Intermediate & Advanced SEO | | Kingalan1
    1

  • The vast majority of the 140 domains that link to our website are very low quality directories and other toxic links. Only about 20-30 domains are not toxic (according to Link Research Tools, confirmed by our manual inspection of these links). Would removing some of these links improve our Moz Domain Authority? What if we cannot remove them; can Moz detect a disavow file? In general, would improving the ratio between good quality and poor quality links improve domain authority? Thanks,
    Alan

    Moz Bar | | Kingalan1
    2

  • At the moment, for most of our sites we have both a desktop and a mobile version. They both show the same content and use the same URL structure as each other. The server detects which device you're visiting from and displays the relevant version of the site. We are in a predicament over how to properly use the canonical and alternate rel tags. Currently we have a canonical on mobile and an alternate on desktop, both of which have the same URL, because both mobile and desktop use the same URLs as explained in the first paragraph. Is the way we are doing it at the moment correct?

    Intermediate & Advanced SEO | | JH_OffLimits
    3
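
For the dynamic-serving setup described above (one URL, server-side device detection), the rel=canonical/rel=alternate annotation pair does not apply; that pattern is for separate mobile URLs such as m.example.com. Google's guidance for dynamic serving is instead to send the Vary HTTP response header so crawlers know the HTML differs by device. A minimal sketch of the response, assuming the same URL serves both versions:

```http
HTTP/1.1 200 OK
Content-Type: text/html
Vary: User-Agent
```

With this header in place, no canonical or alternate tags are needed between the two versions, since there is only one URL.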

  • At the moment, I am moving from olddomain.com (niche site) to newdomain.com (multi-niche site). For certain reasons, I do not want to use a 301 right now and am planning to use a canonical pointing to the new domain instead. Would Google rank the new site instead of the old site? From what I have learnt, the canonical tag lets Google know which URL is the main source of the content. Thank you very much!

    Intermediate & Advanced SEO | | india-morocco
    0

  • We are using Link Detox (Link Research Tools) to evaluate our domain for bad links. We ran a domain-wide Link Detox Risk report. The report showed a "High Domain DETOX RISK" with the following results:
    - 42% (292) of backlinks with a high or above average detox risk
    - 8% (52) of backlinks with an average detox risk
    - 12% (81) of backlinks with a low or very low detox risk
    - 38% (264) of backlinks reported as disavowed.
    This looks like a pretty bad link profile. Additionally, more than 500 of the 689 backlinks are "404 Not Found", "403 Forbidden", "410 Gone", or "503 Service Unavailable". Is it safe to disavow these? Could Google be penalizing us for them? I would like to disavow the bad links; however, my concern is that there are so few good links that removing the bad ones will kill link juice and really damage our ranking and traffic. The site still ranks for terms that are not very competitive. We receive about 230 organic visits a week. Assuming we need to disavow about 292 links, would it be safer to disavow 25 per month while we are building new links, so we do not radically shift the link profile all at once? Also, many of the bad links are 404 or page-not-found errors. Would it be OK to run a disavow of these all at once? Any risk to that? Would we be better off just building links and leaving the bad links up? Alternatively, would disavowing the bad links potentially help our traffic? It just seems risky because the overwhelming majority of links are bad.

    Intermediate & Advanced SEO | | Kingalan1
    0

  • I have had a huge increase in direct traffic to our website, but I'm not sure why this suddenly happened (no promos during this time period). Traffic is up 200%+ according to Google Analytics.

    Reporting & Analytics | | Julia_a1a
    1

  • I have been getting conflicting advice on the best way to implement schema for the following scenario. There is a central e-commerce store that is registered to its own unique address, which is "head office". There are a few physical shops, each of which has its own location and address. Each shop has its own landing page within /our-stores/. So each page on the website has the Organisation schema for the central 'organisation', something like: Then on each physical store landing page is something like the following, as well as the Organisation schema: Is this correct? If it is, should I extend LocalBusiness with the store URL and sameAs for the GMB listing and maybe the Companies House registration? It's also been suggested that we should use LocalBusiness for the head office of the company, then Department with the type Store. But I'm not sure on that?

    Technical SEO | | MickEdwards
    0
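
One common pattern for the multi-location setup described above is Organisation schema site-wide, plus a LocalBusiness subtype (for example, Store) on each /our-stores/ landing page, linked back via parentOrganization. A hypothetical JSON-LD sketch for one store page; every name, address, and URL here is a placeholder, not taken from the original post:

```json
{
  "@context": "https://schema.org",
  "@type": "Store",
  "name": "Example Shop - Townsville",
  "url": "https://www.example.com/our-stores/townsville/",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "1 High Street",
    "addressLocality": "Townsville",
    "postalCode": "AB1 2CD"
  },
  "sameAs": [
    "https://g.page/example-shop-townsville"
  ],
  "parentOrganization": {
    "@type": "Organization",
    "name": "Example Ltd",
    "url": "https://www.example.com/"
  }
}
```

As the poster suggests, sameAs can list the matching Google Business Profile (GMB) and Companies House URLs for that location.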

  • What does "optimal use of keywords in header tag" actually mean, given that you indicate it as a factor that hurts SEO?

    Technical SEO | | Serg155
    0

  • My website contains 110+ pages, of which 70 return CONNECTION TIMEOUT when checked in Screaming Frog. Can someone help me in getting this solved? My website home page: Sanctum Consulting.

    Moz Pro | | Manifeat9
    0

  • Hey Everyone, I've been searching for data on the percentage of people who click on paid vs organic results. My last stats, which are now outdated, show that 60% of people click on organic on average and 40% click on paid. Any help/links would be greatly appreciated.

    Competitive Research | | JohnSammon
    0

  • I hope I am explaining this correctly; if I need to provide any clarity, please feel free to ask. We currently use a domain mask on an external platform that points back to our site. We are a non-profit, and the external site allows users to create peer-to-peer fundraisers that benefit our ministry. Currently we get many meta issues related to this site, as well as broken links when fundraisers expire etc. We do not have a need to rank for the information from this site. Is there a way to mark these pages so that they are not part of the search engine crawls as they relate to our site?

    Technical SEO | | SamaritansPurse
    0

  • I recently migrated from https://whitefusemedia.com to https://whitefuse.com. The website URL structure and content remained the same and I followed all the best practice guidance regarding checks on the new domain and appropriate 301 redirects. I have seen traffic drop by about 50% and the traffic that is still coming through is mainly coming through links still listed by Google under the old domain (https://whitefusemedia.com). Is this normal? Should I expect to see this bounce back, or is there anything I can do now to regain the rankings?

    Technical SEO | | wfm-uk
    0

  • Our company wanted to experiment on whether it truly is more beneficial to use Yext for citations rather than to do them ourselves. The thought process here is that when we do the citations manually, some of our listings would increase in quality. The problem we have been running into is that Yext has exclusive deals with nearly half of the sources we were previously listed under. Is there a way around this, or is Yext truly worth the cost?

    Local Listings | | rburnett
    1

  • Due to the restraints of category page layout many of the products in certain categories have the product titles truncated, in some cases missing off 2-5 words depending on the product in question.  The product name which displays on the category page is lifted straight from the product page itself, so not possible to do something like "product name including spec..."  to place ... to indicate a bit more. I'm assuming not but just wanted to check that Google will not frown on this.  Text is not being hidden it just does not render fully in the restricted space.  So there is a scenario of 'bits of' text in the source not displaying on the rendered page.

    Technical SEO | | MickEdwards
    0

  • We are working with a content-based startup that needs to 301 redirect a lot of its pages to other websites. I'll give you an example to help you understand. Assume this is the startup's domain and URL structure: www.ourcompany.com/brand1/article. What they want to do is a 301 redirect of www.ourcompany.com/brand1/ to www.brand1.com. I have never seen a 301 as a problem for SEO or link juice, but in this case, where all the major URLs are getting redirected to other sites, I was wondering if it would have a negative effect. Right now they have just 20-30 brands, but they are planning to hit a couple of hundred this year.

    Intermediate & Advanced SEO | | aaronfernandez
    0

  • Our primary website https://domain.com has a subdomain https://subDomain.domain.com and on that subdomain we have a jive-hosted community, with a few links to and fro. In GA they are set up as different properties but there are many SEO issues in the jive-hosted site, in which many different people can create content, delete content, comment, etc. There are issues related to how jive structures content, broken links, etc. My question is this: Aside from the SEO issues with the subdomain, can the performance of that subdomain negatively impact the SEO performance and rank of the primary domain? I've heard and read conflicting reports about this and it would be nice to hear from the MOZ community about options to resolve such issues if they exist. Thanks.

    Intermediate & Advanced SEO | | BHeffernan
    1

  • Hi there, In October, a programmer for one of our customers made a change on their website to optimize its loading speed. Since then, all the SEO metrics have dropped. Apparently, the change was to move to Cloudflare and to add Gzip compression. I was talking with the programmer and he told me he had no idea why that happened. Now, 5 months later, the SEO metrics haven't come back yet. What seems so weird is that two keywords in particular had the most massive drop. Those two keywords were the top keywords (more than 1k impressions a month) and now it's like there are no impressions or clicks at all. Has anyone had the same event occur to them? Do you have any idea what could help this case?

    Technical SEO | | H.M.N.
    0

  • Hello, In the site crawl report we have a few pages that are status 430 - but that's not a valid HTTP status code. What does this mean / refer to?
    https://en.wikipedia.org/wiki/List_of_HTTP_status_codes#4xx_Client_errors If I visit the URL from the report I get a 404 response code, is this a bug in the site crawl report? Thanks, Ian.

    Product Support | | ianatkins
    0

  • The reviews services advertise that your reviews and stars will be placed in your Google search results and this helps with rankings. Does anyone have experience using Yotpo or Reviews.io with a brick and mortar business? Or, any business for that matter? Thanks,

    Reviews and Ratings | | Jarod4566
    0

  • Update: Domain Authority 2.0 has arrived! Check it out over in Link Explorer or in your Campaigns, and visit our resource center for more information about the change.

    Hey Moz friends, I’m excited to share some news from the Moz product team. In the last few months our team of data scientists has been hard at work developing an improvement to one of the favorite SEO metrics used in digital marketing: Domain Authority, also referred to as “DA.” On March 5, 2019, we’ll release the new and improved Domain Authority algorithm, which includes a number of new factors that make this score even more accurate, trustworthy, and predictive than ever before. Having worked with marketing clients in the past and reported on Domain Authority during monthly reviews, I wanted to make sure we give our community enough advance notice to understand what is changing, why it’s changing, and what it might mean for your reporting. Sudden, unexpected fluctuations in any core metric you use in reporting have the potential to make your job more difficult, so we want to help you start the conversation about this change with your stakeholders. Let’s start with the “why” ...

    Why is Moz changing the DA algorithm? The Search Engine Results Page (SERP) is constantly changing. Rankings change, and the algorithms that drive those rankings change. For Moz to ensure you have the most accurate prediction possible, we need to update our algorithm from time to time to ensure it delivers on its promise. You trust Moz, in part, because of the accuracy of the data we create. We want to make sure that we’re providing you with the best data to make your work easier. To ensure that DA continues to accurately predict the ability of sites to rank, and to remain reliable over time, we’ve decided to make some improvements.

    What can I expect from the DA algorithm update? Many sites should expect to see a change to their current Domain Authority score. Depending on the site, this change might be insignificant, but it’s possible the new algorithm will cause material adjustments. The new Domain Authority takes into consideration a number of additional factors, such as link pattern identification and Moz’s Spam Score metric, to help you deploy your SEO strategy.

    How can I prepare for this algorithm update? I recommend that you reach out to your stakeholders or clients prior to the March 5th launch to discuss this upcoming change. This can be an opportunity both to refresh them on the utility of Domain Authority and to plan how to use it for additional link building or ranking projects. Visit this page to check out resources that may help you to have conversations with your stakeholders. If you feel inclined to save a snapshot of your current Domain Authority and history, you can consider exporting your historical data from your Moz Pro account.

    Is historical data changing? Yes. When the new DA algorithm goes into place, all historical data will be affected. However, anyone who has an active Moz Pro campaign will be able to see a historical representation of the old DA line for reference for an interim period. As the “Metrics over time” chart is designed to help track your work over time, we believe applying the update to both past and present DA scores will help you to best track linear progress.

    Is Domain Authority an absolute score or a relative one? Domain Authority is a relative, comparative metric. Moz evaluates over 5 trillion pages and more than 35 trillion links to inform Domain Authority. Your site’s links are evaluated amongst those trillions of links. Because of this, it is important to compare your DA to your competition, peers, and other sites that show up in search results important to your strategy. In terms of how to use Domain Authority, nothing is changing. If you use it to evaluate domains to purchase, it will function exactly the same. If you use it to find hidden keyword ranking opportunities, it will still be your best friend. It’s the same trusty tool you used before; we just sharpened it for you.

    I saw a change to my DA when Link Explorer launched last April. What’s the difference between that change and this one? In April 2018, Moz released its new link index along with its new research tool, Link Explorer. Because the link index was so much larger than the previous index, and because Domain Authority is based on attributes discovered in that index, scores changed. Any changes that occurred were due to the upgrade of that link index, not to how the algorithm calculated scores. The change coming in March 2019 will be an actual algorithm update to how Domain Authority is calculated.

    How will Page Authority (PA) be affected by this update? Page Authority will not be impacted by the March 2019 update. This particular algorithm update is specific to Domain Authority only.

    Will API users be affected at the same time? Yes. The Domain Authority metric in all of our products, including our API, will be affected by this update on March 5th.

    Check out this page for more resources about the Domain Authority algorithm update. You can also read more in Russ Jones’s announcement post on the blog. We’d love to hear from you here in this Q&A thread, or you can send an email over to [email protected] with any questions.

    API | | BrianChilds
    22

  • Hi, I am working on a large global site which has around 9 different language variations. We have set up the hreflang tags and referenced the corresponding content as follows: (we have not implemented an x-default reference, as we felt it was not necessary). Using DeepCrawl and Search Console, we can see that these language variations are causing duplicate title issues. Many of them. My assumption was that the hreflang would have alleviated this issue and informed Google what is going on; however, I wanted to see if anyone has any experience with this kind of thing. It would be good to understand what the best practice approach is to deal with the problem. Is it even an issue at all, or are the tools just being over-sensitive? Thank you in advance.

    Technical SEO | | NickG-123
    0

  • Hello everyone! I hope an expert in this community can help me verify that the canonical code I'll add to our store is correct. Currently, in our Shopify store, the subsequent pages in the collections are not indexed by Google; however, the canonical URLs on these pages aren't pointing to the main collection page (page 1), e.g. the canonical URLs of page 2, page 3 etc. are used instead of the first page of the collections. I have the code attached below; it would be much appreciated if an expert could urgently verify that it is good to use and will solve the above issues. Thanks so much for your kind help in advance!
    -----------------CODES BELOW---------------
    <title>{{ page_title }}{% if current_tags %} – tagged "{{ current_tags | join: ', ' }}"{% endif %}{% if current_page != 1 %} – Page {{ current_page }}{% endif %}{% unless page_title contains shop.name %} – {{ shop.name }}{% endunless %}</title>
    {% if page_description %} {% endif %} {% if current_page != 1 %} {% else %} {% endif %}
    {% if template == 'collection' %}{% if collection %}
    {% if current_page == 1 %} {% endif %}
    {% if template == 'product' %}{% if product %} {% endif %}
    {% if template == 'collection' %}{% if collection %} {% endif %}

    Intermediate & Advanced SEO | | ycnetpro101
    0

  • Hi, I've seen a fair number of topics discussing the difference between domain names ending with or without trailing slashes, the impact on crawlers, and how it behaves with canonical links.
    However, that discussion sticks to domain names only.
    What about subfolders and pages? How does it behave with those? Say I have a site structured like this:
    https://www.domain.com
    https://www.domain.com/page1 And for each of my pages, I have an automatic canonical link ending with a slash.
    E.g. rel="canonical" href="https://www.domain.com/page1/" /> for the above page. SEMrush flags this as a canonical error. But is it, exactly?
    Are all my canonical links wrong because of that slash? And as a subsidiary question, both domain.com/page1 and domain.com/page1/ are accessible. Is this a mistake, or does it not make any difference (I've read that those are considered different pages)? Thanks!
    G

    Technical SEO | | GhillC
    0
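
If the canonical tags end in a trailing slash, the simplest fix for the /page1 vs /page1/ duplication described above is to 301 one form to the other, so the live URL always matches the canonical. A sketch in Apache .htaccess terms (hypothetical rules, assuming an Apache host; adapt the file-extension exclusion to the actual site):

```apache
# Force a single trailing-slash form: /page1 -> /page1/ (301 redirect),
# skipping requests that look like files (contain a dot).
RewriteEngine On
RewriteCond %{REQUEST_URI} !/$
RewriteCond %{REQUEST_URI} !\.
RewriteRule ^(.*)$ /$1/ [R=301,L]
```

With the redirect in place, the two URL forms are no longer both accessible, so the canonical "error" resolves itself regardless of which form the tags use.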

  • This is the second time I have posted this question and never got a satisfactory result. I have an SEO client in Tacoma, WA, and when you type "Dispensaries Near Tacoma" they are in the top 3 snack pack and Google Maps shows 20 other similar businesses. However, when you search "Dispensary Near Tacoma" only 3 or 5 recreational marijuana shops show up and my client disappears. Someone earlier suggested it could be because of the category selection, but that can't affect ALL the other shops, and like I said, it happens in other cities too; for example, "Dispensary Near Olympia" vs "Dispensaries Near Olympia". I have the full write-up and pictures and diagrams on my blog. Please HELP! This could affect your future clients also. https://isenselogic.com/local-business-disappearing-on-google-maps-when-plurals-used/

    Local Listings | | isenselogic
    0

  • How does Google view vertical bar (pipe) separation in content? For example, I want to create highlights. If I write something like: Sentence A | Sentence B | Sentence C | Sentence D | Is it considered the same paragraph or different paragraphs? Thank you

    Intermediate & Advanced SEO | | seoanalytics
    0

  • Hello, We run two sites with the same products, product descriptions and URL structure. Essentially, the two sites are the same except for the domain name and minor differences on the home pages. We've run this way for quite a few years. Both sites have a domain authority of 48 and there are not a large number of duplicate incoming links. I understand the "book" says we should combine the sites with 301s to the similar pages. I am concerned about doing this because "site 2" still does about 20% of our business. We have been losing organic traffic for a number of years. I think this mainly has to do with a more competitive environment. However, where Google used to serve both our sites for a search term, it now will only show one. How much organic benefit should we see if we combine? Will it be significant enough to merge the two sites? Understandably, I realize the future can't be predicted, but I would like to know if anyone has had a similar experience or opinion. Thanks

    Intermediate & Advanced SEO | | ffctas
    0

  • Hey everybody, I have been testing the inurl: feature of Google to try to gauge how long ago Google indexed our page. So, this brings me to my question. If we run inurl:https://mysite.com, all of our domains show up. If we run inurl:https://mysite.com/specialpage, the domain shows up as being indexed. If I add the "&as_qdr=y15" string to the URL, https://mysite.com/specialpage does not show up. Does anybody have any experience with this? Also, on the same note, when I look at how many pages Google has indexed, it is about half of the pages we see on our backend/sitemap. Any thoughts would be appreciated. TY!

    On-Page Optimization | | HashtagHustler
    1

  • I am currently perplexed over a client's search results. They are an established company and well known in their field. (Unfortunately, I am not comfortable providing a link or their name.) The company is a consulting firm; let's assume it is an accounting firm, which it is not. When you search on BSC Accounting, the results give them the first result, but the next 18 results are around education - BSc Accounting. Consider that the DA on the site is 34 and the PA for the homepage is 39. Is there a chance that, when someone is searching on accounting firms, having the BSC in the name skews what they are able to rank for? Forget about searches for their exact name; I am more interested in thoughts as to how the BSC affects general searches for their specialties.

    Branding | | RobertFisher
    1


