
Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Intermediate & Advanced SEO

Looking to level up your SEO techniques? Chat through more advanced approaches.


  • Hi Guys, Is having a single h1 tag still best practice for SEO? Guessing multiple h1 tags dilute the value of the tag and keywords within the tag. Thoughts? Cheers.

    | kayl87
    0

  • We are migrating 13 websites into a single new domain. Some pages will be terminated or moved to a new folder path, so we need custom 301 redirects built for these. However, we have a huge database of pages that will NOT be changing folder paths, and it's far too many to write custom 301s for. One idea was to use domain forwarding or a wildcard redirect so that all the pages would be redirected to the same folder path on the new domain. The problem this creates, though, is that we would then need to build the custom 301s for content that is moving to a new folder path, creating 2 redirects on these pages (one for the domain forwarding, and then a second for the custom 301 pointing to the new folder). Any ideas on a better solution to this?

    | MJTrevens
    0
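The two-tier redirect scheme described above can be expressed as a single-pass lookup: consult the custom 301 map first and fall back to the path-preserving wildcard only when no custom rule matches, so no URL ever goes through two hops. A minimal sketch; the domain names and paths are hypothetical, and most servers (e.g. Apache's RewriteMap or an nginx map) can express the same precedence natively.

```python
# Resolve each old-domain path to exactly one destination URL:
# the explicit 301 map wins; otherwise the folder path is preserved.
NEW_DOMAIN = "https://new-site.example"

# Hypothetical custom mappings for content moving to new folder paths.
CUSTOM_301S = {
    "/old-folder/page-a": "/new-folder/page-a",
}

def resolve_redirect(path: str) -> str:
    """Return the single redirect target for a path on the old domain."""
    return NEW_DOMAIN + CUSTOM_301S.get(path, path)
```

Because the custom map is checked before the wildcard fallback, the moved pages never pass through the domain-forwarding hop first.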

  • I updated my site's title and description about a week ago; however, for some reason it's still not reflected in Google search results. Here's my site: try searching for 'shopious directory'. Any idea why this is? I looked at Webmaster Tools and it seems that Google didn't report any errors. Why is it still showing the old data?

    | herlamba
    0

  • I've got a website with a slider, and each of the 6 slides has a 5-second video background. The website is B2B, and the user profile for the website is employees at Fortune 1000 companies in the United States using desktop computers to browse. The videos are highly optimized, and we did testing using various browsers and bandwidth connections to determine that the videos loaded fast enough down to a 15 Mbit/s connection (which is pretty low by today's average U.S. business bandwidths). We tried hosting the videos on Vimeo and YouTube, but it caused issues in the timing of the slide show display. (I've not seen any other website do what we do the way we do it. Most sites have a single video background with a single text overlay on top.) The downside to this is that loading all those videos produces a lot of bandwidth usage for our server. The website is serving a niche service industry, though, so we're not exceeding our current limits. I'm wondering, though: might there be some benefit to hosting just the video files on a CDN? Obviously that would mean less bandwidth usage for our server, and possibly quicker load times where the CDN server is closer to the user than our server. But are there benefits or downsides from an SEO perspective, noting that I'm proposing putting only the videos on the CDN, not the entire web page?

    | Consult1901
    0

  • Hi, I'm working with a site that has created a large group of URLs (150,000) that have crept into Google's index. If these URLs actually existed as pages, which they don't, I'd just noindex tag them and over time the number would drift down. The thing is, they created them through a complicated internal linking arrangement that adds affiliate code to the links and forwards them to the affiliate. GoogleBot would crawl a link that looks like it's to the client's own domain and wind up on Amazon or somewhere else with some affiliate code. GoogleBot would then grab the original link on the client's domain and index it... even though the page served is on Amazon or somewhere else. Ergo, I don't have a page to noindex tag. I have to get this 150K block of cruft out of Google's index, but without actual pages to noindex tag, it's a bit of a puzzler. Any ideas? Thanks! Best... Michael P.S. All 150K URLs seem to share the same URL pattern... exmpledomain.com/item/... so /item/ is common to all of them, if that helps.

    | 94501
    0
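Since the /item/ URLs share one pattern but have no real pages to tag, one common approach is to answer them at the server level: match the pattern and return a 410 (and/or attach an X-Robots-Tag: noindex header), so Google can drop them without any physical page existing. A sketch of the routing logic only; the handler shape and the example path are assumptions, not tied to any particular framework.

```python
import re

# All 150K cruft URLs reportedly share the /item/ prefix.
ITEM_PATTERN = re.compile(r"^/item/")

def response_for(path: str) -> tuple[int, dict]:
    """Return (status_code, extra_headers) for an incoming request path."""
    if ITEM_PATTERN.match(path):
        # 410 marks the URL as intentionally gone; the header covers any
        # crawl that still receives a 200 from a cached/edge response.
        return 410, {"X-Robots-Tag": "noindex"}
    return 200, {}
```

Note that a robots.txt disallow on /item/ would not remove already-indexed URLs; the pattern has to produce a response Google can act on.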

  • Is it still possible to use anchor text to rank for a keyword that is not present on the landing page? Or are there any alternatives?

    | seoman10
    0

  • Hi, I wonder if anyone could help me with a canonical link query/indexing issue. I have given an overview, intended solution and question below. Any advice on this query will be much appreciated. Overview: I have a client who has a .com domain that includes blog content intended for the US market, using the correct lang tags. The client also has a .co.uk site without a blog, but is looking at creating one. As the target keywords and content are relevant across both UK and US markets, and so as not to duplicate work, the client has asked whether it would be worthwhile centralising the blog, or for any other efficient blog site structure recommendations. Suggested solution: As the domain authority (DA) on the .com/.co.uk sites is in the 60s, it would be risky moving domains/subdomains at this stage, and it would be a waste not to utilise the DA that has built up on both sites. I have suggested they keep both sites and share the same content between them using a content-curation WP plugin, using the canonical link to reference the original source (US or UK) so as not to get duplicate content issues. My question: let's say I'm a potential customer in the UK and I'm searching using a keyword phrase where the content that answers my query is on both the UK and US site, although the US content is the original source.
    Will the US or UK version of the blog appear in UK SERPs? My gut says the UK blog will, as Google will try to serve me the most appropriate version of the content, and as I'm in the UK it will be this version, even though I have identified the US source using the canonical link?

    | JonRayner
    2

  • Is it actually even possible to compete against Amazon to be #1 in Google SERPs? If so, how? I run a boutique business selling a niche product; from 2008 to 2013 I was always #1 for my keywords.
    But since Amazon started selling the same type of products, I have always been right under Amazon's results, which occupy positions 1, 2 and 3. Is it even possible to get to the #1 position any more? Thank you.

    | loginid
    0

  • Hi, A follow up question from another one  I had a couple of months ago: It has been almost 2 months now that my hreflangs are in place. Google recognises them well and GSC is cleaned (no hreflang errors). Though I've seen some positive changes, I'm quite far from sorting that duplicate content issue completely and some entire sub-folders remain hidden from the SERP.
    I believe it happens for two reasons: 1. Fully mirrored content - as per the link to my previous question above, some parts of the site I'm working on are 100% similar. Quite a "gravity issue" here as there is nothing I can do to fix the site architecture nor to get bespoke content in place. 2. Sub-folders "authority". I'm guessing that Google prefers sub-folders over others due to their legacy traffic/history. Meaning that even with hreflangs in place, the older sub-folder would rank over the right one because Google believes it provides better results to its users. Two questions from these reasons:
    1. Is the latter correct? Am I guessing correctly re "sub-folders" authority (if such thing exists) or am I simply wrong? 2. Can I solve this using canonical tags?
    Instead of trying to fix and "promote" hidden sub-folders, I'm thinking of actually reinforcing the results I'm getting from stronger sub-folders.
    E.g.: if a user based in Belgium is Googling something relating to my site, the site.com/fr/ sub-folder shows up instead of the site.com/be/fr/ sub-sub-folder.
    Or if someone based in Belgium is using Dutch, they would get site.com/nl/ results instead of the site.com/be/nl/ sub-sub-folder. Therefore, I could canonicalise /be/fr/ to /fr/ and do something similar for that second one. I'd prefer traffic coming to the right part of the site for tracking and analytics reasons. However, instead of trying to move a mountain by changing Google's behaviour (if ever I could do this?), I'm thinking of encouraging the current flow (also because it's not completely wrong, as it brings traffic to pages featuring the correct language no matter what). That second question is the main reason why I'm looking for the Moz community's advice: am I going to damage the site badly by using canonical tags that way? Thank you so much!
    G

    | GhillC
    0

  • Hello, I have done a redirect and can still see my old page in Google's index after 3 weeks. My new page is there as well. Is it normal that the old page hasn't been dropped from the index yet? Thank you,

    | seoanalytics
    0

  • We have a blog for our e-commerce site. We are posting about 4-5 blog posts a month, most of them 1500+ words. Within the content, we have around 10-20 links pointing out to other blog posts or products/categories on our site. Except for the products/categories, the links use non-optimized generic anchor text (e.g. guide, sizing tips, planning resource). Are there any issues or problems, as far as SEO goes, with this practice? Thank You

    | kekepeche
    0

  • A script of ours had an error that caused some pages we didn't wish 410'd to be 410'd. We caught it in about 12 hours, but for some pages it was too late. My question is: will those pages be reindexed again, and how will that affect their rankings? Will they eventually be back where they were? Would submitting a sitemap with them help, or what would be the best way to correct this error (submit the links to Google's indexer maybe)?

    | Wana-Ryd
    0

  • Our relaunched website has a much lower bounce rate (66% before, now 58%), increased pages per session (1.89 before, now 3.47) and increased session duration (1:33 before, now 3:47). The relaunch was December 20th. Should these improvements result in an improvement in Google rank? How about in Moz authority? We have not significantly changed the content of the site, but the UX has been greatly improved. Thanks, Alan

    | Kingalan1
    1

  • I have a multi-country website that uses country subfolders to separate countries. When I run a Moz scan, I am getting canonical-related alerts (this is probably related to some of our US content being duplicated on the other country websites). Shouldn't I be using hreflang instead, since I am telling search engines that a certain article in country B is just a copy of the same article in country A?

    | marshdigitalmarketing
    0

  • We are an e-commerce marketplace for alternative fashion and home decor, with over 1,000 stores on the marketplace. In March 2018 we switched the website from HTTP to HTTPS, and also added noindex and nofollow tags to the store about pages and store policies (mostly boilerplate content). Our traffic dropped by 45% and we have since not recovered. I am wondering: could these tags be affecting our rankings?

    | JimJ
    0

  • We plan to implement an exit intent layer.
    For mailing list expansion and/or to display personalized third-party advertising. Is there any negative impact this may have on Google rankings?
    Shall we limit the exit intent layer to show only to users who have visited at least 2 pages on our site?
    Shall we reduce the size of the layer on mobile?
    Anything else to consider to minimize potential negative impact on Google rankings?

    | lcourse
    0

  • Hi all I see that Moz gives data on Linking Domains and also External Links. The former being the number of domains which have one or more links pointing at your site and the latter the total number of links, including multiple links on the same domain. Apart from the potential benefit of people clicking the links and coming to your site and so increasing traffic, is there any SEO ranking benefit from multiple links? The only one I can think of is that you MAY get benefit from different anchor text for each link...? I'd be interested to hear any comment or experience on this. Bob

    | BobBawden1
    0

  • Hello, I uploaded a new website with new web addresses, and my old addresses don't work anymore. I don't want to do redirects. Should I just remove the old addresses from Google's index using their tool, or let Google do it on its own? Thank you,

    | seoanalytics
    1

  • Hi, I have new web addresses for my subpages. None of them have external links. Should I do redirects to the new pages, or just leave the old pages as 404s and let Google crawl and rank the new pages? I am asking because my current pages don't have a good ranking and I am thinking starting with a clean URL is better. Thank you,

    | seoanalytics
    1

  • Hey Moz Fam! I have a question regarding adding excerpts from guest posts/expert columns to your website. The reason this came up to begin with is we want to find a way to track ALL our social media posts' engagement. When we share guest post links, we obviously cannot track when someone clicks, since it's not going to our own website. The idea was: if we create an individual page with an excerpt from the guest post, then add a "read full post" button that links back to it, we could instead share that excerpt post from our website to social media so we can track it. The user could then go view the full post on the actual guest article from there if they wish. We wanted to create a page titled "Media" and have all those posts categorized to fall on that page, then canonicalize all of the individual posts to the top-level Media page. My question is: we've spent a lot of time and money on creating pages with unique quality content and updating any thin content to make it not thin. These new excerpt pages will be thin content; is that going to negatively impact our SEO? We're not trying to rank these excerpt pages or anything. I just want to make sure that adding these excerpt posts with only 100-200 words each is not going to hurt our overall SEO strategy.

    | LindsayE
    0

  • Moderator's Note: URL NSFW We have been desperately trying to understand over the last 10 days why our homepage disappears for a few days in the SERPs for our most important keywords, before reappearing again for a few more days and then gone again! We have tried everything. Checked Google Webmaster Tools: no manual actions, no crawl errors, no messages. The site is being indexed even when it disappears, but when it's gone it will not even appear in the search results for our business name; other internal pages come up instead. We have searched for bad backlinks and duplicate content. We put a 301 redirect on the non-www version of the site. We added an H1 tag that was missing. Still, after fetching as Google and requesting reindexing, we were going through this cycle of disappearing in the rankings (an internal page would actually come in at 6th position as opposed to our home page, which had previously spent years in the number 2 spot) and then coming back for a few days. Today I tried fetch and render as Google and was only getting a partial result: it was saying the video we have embedded on our home page was temporarily unavailable. Could this have been causing the issue? We have removed the video for now, fetched and rendered, and it returned a complete status. I've now requested reindexing and am crossing everything that this fixes the problem. Do you think this could have been at the root of the problem? If anyone has any other suggestions, the address is NSFW https://goo.gl/dwA8YB

    | GemmaApril
    2

  • Hi Everyone, We just did a site migration (URL structure change, site redesign, CMS change). During migration, the dev team messed up badly on a few things, including SEO. The old site had pages canonicalized and self-canonicalized; the new site doesn't have anything (CMS dev error), so we are working retroactively to add a canonicalization mechanism. The legacy site had URLs ending with a trailing slash "/"; these got redirected to a set of URLs without "/". New site actions: all robots are allowed, and a new sitemap is submitted to Google Search Console. So here is my problem (it's been a long 24hr night for me 🙂): 1. When I look at the GSC homepage URL, it says that the old page is self-canonicalized and currently in the index (the old page with a trailing slash at the end of the URL). 2. When I try to perform a live URL test, I get the message "No: 'noindex' detected in 'robots' meta tag", so indexation can't be done. I have no idea where the noindex is coming from. 3. Robots.txt in Search Console is still showing the old file (no noindex there). I tried to submit the new file but the old one still comes up. When I click on "See live robots.txt" I get the current robots. 4. I see that the old page is still canonicalized, and attempting to index the redirected old page might be confusing Google. Hope someone can help to get the new page indexed! I really need it 🙂 Please ping me if you need more clarification. Thank you!

    | bgvsiteadmin
    1
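For the mystery noindex in point 2 above, it can help to inspect the rendered HTML directly rather than rely on GSC's message. A minimal stdlib sketch that only does the parsing (fetch the live HTML however you prefer); note GSC reports a noindex sent via the X-Robots-Tag HTTP header under the same wording, so check response headers too.

```python
from html.parser import HTMLParser

class RobotsMetaFinder(HTMLParser):
    """Records whether any <meta name="robots"> tag contains noindex."""
    def __init__(self):
        super().__init__()
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name", "").lower() == "robots":
            if "noindex" in (a.get("content") or "").lower():
                self.noindex = True

def has_noindex(html: str) -> bool:
    """True if the page's HTML carries a robots meta noindex directive."""
    finder = RobotsMetaFinder()
    finder.feed(html)
    return finder.noindex
```

Running this against the live page (and comparing with the CMS template output) narrows down whether the directive is in the HTML, in a header, or a stale GSC report.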

  • Hi, I am doing extensive keyword research for the SEO of a big webshop. Since this shop sells technical books and software (legal books, tax software and so on), I come across a lot of very specific keywords for separate products. Isn't it better to try and rank in the SERP's with all the separate product pages, instead of with the landing (category) pages?

    | Mat_C
    0

  • Hey Folks, So I have a 1,000-word article talking about, say, Dubai Holiday. Is it okay to have 4-5 instances of "Dubai Holiday" as anchor text linked to the same page, or should it only be used once?

    | SAISEO
    0

  • On a category page the products are listed via/in connection with the search function on the site. Page source and front-end match as they should. However, when viewing a browser-rendered version of a Google cached page, the URL for the product has changed from, as an example, https://www.example.com/products/some-product to https://www.example.com/search/products/some-product. The source is a relative URL in the correct format, so /search/ is therefore added at browser rendering. The developer insists that this is OK, as the query string in the Google cache page result URL is triggering the behaviour and confusing the search function, all locally. I can see this, but just wanted feedback: will Google internally only ever see the true source, or could its internal rendering mechanism possibly trigger similar behaviour?

    | MickEdwards
    1

  • I'm trying to nail down why a site would be consistently showing a search visibility of 1-2% whilst similar competitors are in the 20-30s. There are no site errors reported. Sitemap, robots.txt, meta titles & descriptions, keyword presence, alt attributes, load times, canonicals, etc. all check out as fine. The backlink profile is healthier than competitors'. Yet even searching for our main product, the result links to an obscure blog page rather than our main site, despite the presence of identical and similar keywords on our homepage, in our title, H1 tags, web address... Site content and design seem subjectively good and at the very least match better-performing competitor sites'. Does anyone know of any less visible reason why a site would be tanking so badly in search rankings? I have checked using other SEO tools and they all report the same as Moz.

    | SimonZM
    1

  • Hi, We have a current category set up that is starting to rank OK, but we are going through a site re-build and this category URL will now better describe a new category of products. My dilemma is: if I 301 redirect the current URL to my new category, I won't be able to use the URL for the new one. But if I don't redirect, will the pages that have already ranked under this URL then confuse customers and search engines? For example, products and sub-categories under the URL /personalised-toys will now become /personalised-toys-for-boys, but I want to use the /personalised-toys URL for a different set of sub-categories and products. Any assistance, ideas, or even a 'definitely don't do it that way' would be greatly appreciated.

    | neil_stickergizmo
    0

  • I am working on a site that went through a migration to Shopify at the very same time as Google did an update in October, so problems from day 1. All main menu categories have subsequently, over the past 6 weeks, fallen off a cliff. All aspects of the site have been reviewed in terms of technical, link profile and on-page, with the site in better shape than several ranking competitors. One issue I'd like some feedback on is the main menu, which has 4 iterations in the source: desktop, desktop (sticky), mobile, and mobile (sticky - appears as a second desktop sticky, but I assume it's for mobile). These "duplicated" menus are the top-level menu items only; the rest of the nested menu items are included within the last mobile menu option. So the desktop menu in the source doesn't include any of the sub-menu items, the mobile version carries all of these, and there are 4 versions of the top-level main menu items in the source. Should I be concerned? Considering we have significant issues, should this be cleaned up?

    | MickEdwards
    0

  • Hi guys: I'm doing SEO for a boat accessories store. They have many marine AC systems, and while the part number, number of BTUs, voltage, and accessories change on some models, the description stays exactly the same across the board on many of them. People often search on Google by model number, and I worry that if I put rel=canonical on these, the result for the specific model they're looking for won't come up, just the one that everything is being redirected to (and people do this much more than entering a site and searching by product model nowadays; it's easier). Excuse my ignorance on this stuff; I'm good with link building and content creation, but the behind-the-scenes aspects... not so much. Can I rel=canonical only part of the page of the repeat models (the long description), so people can still search by model number and reach the model they are looking for? Am I misunderstanding something here about rel=canonical? (Interesting thing: I rank very high for these pages with tons of repeat descriptions, number one in many places... but I wonder if Google applies a sort of site-wide penalty for the repeated content... but wouldn't ranking number 1 for these pages mean nothing's wrong? Thanks)

    | DavidCiti
    1

  • Hello, I have a large number of product pages on my site that are relatively short-lived: probably in the region of a million+ pages that are created and then removed within a 24 hour period. Previously these pages were being indexed by Google and did receive landings, but in recent times I've been applying a NoIndex tag to them. I've been doing that as a way of managing our crawl budget but also because the 410 pages that we serve when one of these product pages is gone are quite weak and deliver a relatively poor user experience. We're working to address the quality of those 410 pages but my question is should I be no-indexing these product pages in the first place? Any thoughts or comments would be welcome. Thanks.

    | PhilipHGray
    0

  • Hello, I am trying to use Data Highlighter, but my web pages are not indexed yet and Google says I can't highlight anything. However, when I go to the Structured Data Markup Helper, it lets me mark up... Why is that, and which one should I use? Thank you,

    | seoanalytics
    0

  • Hi everyone.
    We are doing a CMS migration and site redesign with some structural changes. Our temporary beta site (one of the staging environments, and the only one that is not behind a firewall) started appearing in search. The site got indexed before we added robots.txt, due to a dev error (at that time all pages were index,follow due to the nature of the beta site; it is a final stage that mirrors the live site). As a remedy, we implemented robots.txt for the beta version as: User-Agent: *
    Disallow: / We also removed beta from search for 90 days and changed all pages to noindex/nofollow. Those blockers will be changed once the code for beta gets pushed into production. However, we already have all links redirected (301) from the old site to the new one; this will go into effect once migration starts (we will go live with the completely redesigned site that is now in beta, in a few days). After that, beta will be deleted completely and become 404 or 410. So the question is: should we delete the beta site and simply make it 404/410 without any redirects (the site as-is existed for only a few days)? What is the best thing to do? We don't want to hurt our SEO equity. Please let me know if you need more clarification. Thank you!

    | bgvsiteadmin
    0

  • Hi All, I recently did a 301 redirect, page to page, and notified Google via its console. It's been 6 days since. The home page and one other high-traffic page swapped out to the new domain in Google's search index, with 3-4 drops in ranking for each. The rest of the site's pages have been indexed but still reflect the old domain when searched. Today my home page dropped even further, to the second page of Google's results for the specific keyword. Can you share similar experiences: how long it took you to recover rank fully, and how long for all pages to swap out in Google's index? Regards Mike

    | MikeBlue1
    0

  • Recently one of my clients was hesitant to move their new store locator pages to a subdomain. They have some SEO knowledge and cited the Whiteboard Friday article at https://moz.rankious.com/_moz/blog/subdomains-vs-subfolders-rel-canonical-vs-301-how-to-structure-links-optimally-for-seo-whiteboard-friday. While it is very possible that Rand Fishkin has a valid point, I felt hesitant to let this be the final verdict. John Mueller from Google Webmaster Central claims that Google is indifferent towards subdomains vs subfolders: https://www.youtube.com/watch?v=9h1t5fs5VcI#t=50 Also, this SEO disagreed with Rand Fishkin's post about using subfolders instead of subdomains. He claims that Rand Fishkin ran only 3 experiments over 2 years, while he has tested multiple subdomain vs subfolder experiments over 10 years and observed no difference: http://www.seo-theory.com/2015/02/06/subdomains-vs-subfolders-what-are-the-facts-on-rankings/ Here is another post from Website Magazine. They too believe that there are no SEO benefits of a subdomain vs subfolder infrastructure; proper SEO and infrastructure is what is most important: http://www.websitemagazine.com/content/blogs/posts/archive/2015/03/10/seo-inquiry-subdomains-subdirectories.aspx Again, Rand might be right, but I would rather provide a recommendation to my client based on an authoritative source, such as a Google engineer like John Mueller. Does anybody else have any thoughts and/or insight about this?

    | RosemaryB
    3

  • Hi all, I'm running an international website across 5 regions using a correct hreflang setup. A problem I think I have is that my blog structure is not standardized and also uses hreflang tags for each blog article. This has naturally caused Google to index each of the pages across each region, meaning a massive number of pages are being crawled. I know hreflang solves any issues with duplication penalties, but I have another question. If I have legacy blog articles that are considered low quality by Google, is that counting against my site once, or multiple times for each time the blog is replicated across each region? I'm not sure if hreflang is something that would tell Google this. For example, if I have low quality blog posts: blog/en-us/low-quality-article-1 
    blog/en-gb/low-quality-article-1 
    blog/en-ca/low-quality-article-1 Do you think Google counts this as 3 low quality articles, or just 1 if hreflang is correctly implemented? Any insights would be great, because I'm considering culling the international setup of the blog articles and using just /blog across each region.

    | MattBassos
    0

  • Hello, For the keyword "Normandy cycling", it seems according to the results that people are looking for bike routes. My question: can I rank by presenting my favourite bike routes (personal routes), or do I need to stick to what is already considered the best biking routes in Normandy: the Tour de Manche, the Veloscenic, the Velo Francette and so forth? Thank you,

    | seoanalytics
    1

  • Hi Friends, We have recently (about a month ago) launched a new website, and during the review of that site spotted a serious misconfiguration of our old, terrible WP site. This misconfiguration, which may have come from either sitemaps or internal links or both, led to displaying our French, German and English sites on each other's domains. This should be solved now, but they still show in SERPs. The big question is: what's the best way to safely remove those from SERPs? We haven't performed as well as we wanted for a while, and we believe this could be one of the issues. Try searching, for instance, "site:pissup.de stag do -junggesellenabschied" to find English pages on our German domain, each link showing either 301 or 404. This was cleaned up to show 301 or 404 when we launched our new site 4 weeks ago, but I can still see the results in SERPs, so I assume they still count negatively? Cheers!

    | pissuptours
    0

  • Hi everyone, I have a question about embedding videos on a website: if you optimize the title and description for the video in YouTube, will these be taken into account for the ranking of the page where the video is embedded? Or will only the YouTube link for the video show in SERPs, instead of the page itself? I've read in a post by Phil Nottingham that it's usually not a good idea to embed a YouTube video on your own site, but to use Wistia instead, exactly to avoid cannibalisation of your own rankings. Is this correct? Thanks!

    | Mat_C
    0

  • Hello Moz community, As a means of a portfolio, we upload these PowerPoint exports, which are converted into HTML5 to maintain interactivity and animations. Works pretty nicely! We link to these exported files from our product pages. (We are a presentation design company, so they're pretty relevant.) For example: https://www.bentopresentaties.nl/wp-content/portfolio/ecar/index.html However, they keep coming up in the crawl warnings, as the exported HTML file doesn't contain text (just code), so we get errors for: thin content, no H1, missing meta description, missing canonical tag. I could manually add the last two, but the first warnings are just unsolvable. Therefore I figured we'd probably better noindex all these files… They don't appear to contain any searchable content, and even then, the content of our clients' work is not relevant for our search terms etc. They're mere examples, just in the form of HTML files. Am I missing something, or should I noindex these/such files? (And if so: is there a way to noindex a whole directory automatically, so I don't have to manually 'fix' all the HTML exports with a noindex tag in the future? I read that using disallow in robots.txt wouldn't work, as we will still link to these files as portfolio examples.)

    | BentoPres
    0
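A whole directory of exports can be noindexed without editing the files by attaching an X-Robots-Tag: noindex response header to everything under the portfolio path (on Apache this would be a directory/Header rule; the sketch below shows only the decision logic, with the handler shape as an assumption). The prefix is taken from the example URL in the question. Unlike a robots.txt disallow, the header still lets Google crawl the files and see the noindex.

```python
# Any response whose path falls under these prefixes gets a noindex
# header, so individual HTML exports never need to be edited.
NOINDEX_PREFIXES = ("/wp-content/portfolio/",)

def extra_headers(path: str) -> dict:
    """Headers to add to the response for the given request path."""
    if path.startswith(NOINDEX_PREFIXES):
        return {"X-Robots-Tag": "noindex"}
    return {}
```

New exports dropped into the portfolio directory are covered automatically, which addresses the "don't want to fix each file" concern.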

  • Interested to know if anybody has any experience of FireCheckout Magento 1.9? The built-in Magento checkout doesn't seem to be mobile friendly and is a bit clunky, hoping to achieve a responsive checkout and a more user-friendly interface.

    | seoman10
    0

  • We are a company that sells pipe and fittings. An example of a part that someone will search for is: 3/4" PVC Socket. I am not sure how best to represent the fraction in the title of the page that has such a product. I am concerned that if I use the forward slash it will be misinterpreted by search engines (although it will be interpreted properly by users). A lot of folk search for the product by the fraction size, so it would be good to be able to represent it in the title, but I don't want to get "punished" by confusing search engines. I could replace the forward slash with a hyphen or pipe symbol, but then it may look a bit weird to our users... Any recommendations? Bob

    | BobBawden1
    1
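One option worth considering is the Unicode vulgar-fraction characters (¾, ½, etc.), which keep the title readable for users while avoiding the forward slash entirely. A small sketch of how titles could be normalized, assuming a hypothetical `clean_title` helper (the fraction list is illustrative, not exhaustive):

```python
# Map common ASCII fractions to their Unicode vulgar-fraction
# equivalents so product titles avoid the "/" character.
FRACTIONS = {
    "1/2": "\u00BD",  # ½
    "1/4": "\u00BC",  # ¼
    "3/4": "\u00BE",  # ¾
    "3/8": "\u215C",  # ⅜
}

def clean_title(title: str) -> str:
    # Replace each ASCII fraction with its single-character form.
    for ascii_frac, unicode_frac in FRACTIONS.items():
        title = title.replace(ascii_frac, unicode_frac)
    return title

print(clean_title('3/4" PVC Socket'))  # → ¾" PVC Socket
```

Whether search engines treat `¾` and `3/4` as equivalent is not guaranteed, so it may be safest to keep the ASCII fraction in the body copy and test both variants.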

  • I have a large international website. The content is subdivided in 80 countries, with largely the same content all in English. The URL structure is: https://www.baumewatches.com/XX/page  (where XX is the country code)
    Language annotations (hreflang) seem to be set up properly. In Google Search Console I registered https://www.baumewatches.com, plus the 80 instances of https://www.baumewatches.com/XX, in order to geo-target the directory for each country. I have declared a single global sitemap for https://www.baumewatches.com (https://www.baumewatches.com/sitemap_index.xml, structured hierarchically). The problem is that the site has been online for more than 8 months and only 15% of the sitemap URLs have been indexed, with no new indexation in the last 3 months. I cannot think of a solution for this.

    | Lvet
    0
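With 80 country directories serving largely identical English content, one common culprit is non-reciprocal hreflang: every page variant must list all of its alternates, including itself, or Google may ignore the annotations and simply decline to index the duplicates. A minimal sketch of what each page's head would need (the `GB`/`US` codes and `x-default` target are illustrative assumptions, not taken from the site):

```html
<!-- On https://www.baumewatches.com/GB/page AND on every other
     variant of the same page, the full reciprocal set: -->
<link rel="alternate" hreflang="en-gb" href="https://www.baumewatches.com/GB/page" />
<link rel="alternate" hreflang="en-us" href="https://www.baumewatches.com/US/page" />
<!-- ...one line per country... -->
<link rel="alternate" hreflang="x-default" href="https://www.baumewatches.com/page" />
```

If the annotations check out, the remaining 85% may simply be treated as duplicates; the "Excluded" reasons in Search Console's Index Coverage report would confirm that.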

  • Our annual hosting plan expires soon. Our website is hosted on a shared server. Is there an SEO benefit to hosting our site on a dedicated server? Could this result in faster download times, which is a ranking factor? Our traffic is currently low (only about 20 visits per day). Thanks!!
    Alan

    | Kingalan1
    0

  • Advice needed, please. We have rankings coming along nicely on a website that uses page content, but we now need to start online shopping with WooCommerce. The URL structure has always been a bit of a mess, but it's quite in depth. We are looking to move the small paragraphs about each product category (formerly on Pages) into the Product Category pages, move the product information into Product pages, and redirect the old URLs to the new URLs. It would also mean updating the permalinks. My concern is whether there is less leverage with product categories: do these rank just as well as pages, and are we going to see our rankings change dramatically by doing this? Added to that, is it best to make this change gradually, or all at once (using a staging site to get the setup ready, then pushing live)?

    | KellyDSD86
    0
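Whichever way the migration is staged, each old URL should 301 to its single new equivalent. A minimal sketch, assuming Apache with `mod_alias` and the default WooCommerce permalink bases (the paths are placeholders, not the site's real structure):

```apache
# Hypothetical one-to-one 301s from old Page URLs to the new
# WooCommerce structure; one rule per moved URL.
Redirect 301 /widgets/            /product-category/widgets/
Redirect 301 /widgets/blue-widget /product/blue-widget/
```

Keeping the redirects one hop (old URL straight to final URL, no chains) preserves the most link equity regardless of whether the switch happens gradually or all at once.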

  • My website is only a week or so old, but I have no pages showing in the SERPs, or at least not in the first few hundred results. I have many other hobby websites whose pages appeared in the top 100 results almost instantly, and the niche of this new website is tiny and not saturated, so it should be up there already. All pages are indexed but none are showing in the results. It feels like I have been penalised, but I don't see how or why. My website is www.magnet-fishing.co.uk if anyone can see anything obvious that I am missing. Regards, Andy

    | Onlytopheadsets
    0

  • News articles on distributed.com are being indexed by Google but not showing up for any search queries. In Google Search, I can copy and paste the entire first paragraph of an article and the listing still won't show up in the results. For example, https://distributed.com/news/dtcc-moves-closer-blockchain-powered-trades doesn't rank AT ALL for "DTCC Moves Closer to Blockchain-Powered Trades", the title of the article. We've tried the following so far: re-submitted the sitemap in Search Console, checked for manual actions in Search Console, and checked for any noindex/nofollow tags. Please help us solve this SEO mystery!

    | BTC_Inc
    0

  • Hi, Community! We've had a client for many years, MN Plumbing & Appliance Installation: mnplumbingandappliance.com. We're re-branding as MN Plumbing & Home Services and want to change the domain to: mnplumbingandhomeservices.com. Problem is, we have some 20+ great backlinks, a DA of 46 (pretty good for us), and a domain age of over 10 years! Will switching this domain be SEO suicide?

    | Quistdesigns
    0
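A domain change doesn't have to forfeit those backlinks if every old URL 301s to its counterpart on the new domain. A minimal sketch, assuming Apache with `mod_rewrite` on the old domain (domain names taken from the question; the rule preserves the full path and query string):

```apache
# In the OLD domain's .htaccess: path-preserving 301 for every URL.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?mnplumbingandappliance\.com$ [NC]
RewriteRule ^(.*)$ https://www.mnplumbingandhomeservices.com/$1 [R=301,L]
```

Pairing this with Search Console's Change of Address tool on the old property helps Google transfer the signals to the new domain.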

  • Hello, I am building a new website with new addresses for the subpages; the domain name stays the same. I am wondering if I should set up redirects for the few pages that have an outside link going to them. I noticed that all my subpages without any external links have an authority of 18. I only have one subpage with 2 external links: one of them has a spam score of 32, the other a spam score of 1. My website is about 100 pages. What should I do for my subpages: redirect them all, redirect none, or redirect only the ones that have external links? Thank you,

    | seoanalytics
    0

  • A common question from anxious webmasters who have separate/smaller mobile websites is: "Will Google move my site over to mobile-first, even though I'm not 'ready' yet?" It's not an easy question to answer. Here's what we know so far: Google is already migrating websites that have a strong correlation between mobile and desktop content; Google is taking the migration process very slowly and, for now, is not migrating sites that are not "ready"; Google has not announced any firm timeline for migrating the remainder of all websites. In answering this question, I typically mention all of the above to help allay any fears. I then state that it could be anywhere from one year to several years before the process is over, with a huge disclaimer that this is pure speculation and that only Google knows for sure. Lastly, I reiterate that in the meantime Google is strongly encouraging webmasters with mobile sites to ensure that they match the desktop version (URLs, schema, video, metadata, etc.). So the choice is either to upgrade to responsive/adaptive or to upgrade the mobile site; this is where the future is going. STILL - any additional feedback / thoughts / ideas / tips on this are welcome, because I continue to struggle with answering this question for clients. Thanks!

    | mirabile
    1

  • Hi, how will Google+ disappearing after this year affect the rel=publisher markup? Is it still relevant? Thanks!

    | rascordido
    0
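For context, rel=publisher was a single link tag connecting a site to its Google+ page; with the profile URL it points at going away, the markup has nothing left to reference. A sketch of what the (now-defunct) tag looked like, with a placeholder profile:

```html
<!-- Historical rel=publisher markup; the Google+ profile URL
     below is a placeholder, and such profiles no longer resolve. -->
<link rel="publisher" href="https://plus.google.com/+ExamplePage" />
```

Structured data such as the `Organization` schema `publisher` property is the usual replacement for declaring who publishes a site's content.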
