Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Intermediate & Advanced SEO

Looking to level up your SEO techniques? Chat through more advanced approaches.


  • Hey everyone, hoping to get your take on this: we have some very high-demand products that usually sell out in minutes (lucky us, eh?!). We are implementing a queue function on the product page: if too many people try to check out at the same time, we put them in a queue. The queue could kick in before or after search engines have indexed the product page. The product page has markup and on-page content relating to the product; the queue page lives on an external (yes, external) site and has none of the product info, markup, or optimised page title. The product page 302s to the queue page and starts a series of 302 redirects. Here's the sequence when the queue is active:
    1. Canonical product page (with markup, on-page product info, optimised page title, etc.)
    2. 302 to the queue page on the external domain (zero markup, product info, or page title)
    3. 302 to the same queue page, but with a hashed queue ID in the URL (basically giving you your place in the queue)
    4. Held in queue for a few minutes
    5. 302 to the non-canonical product page (with markup, on-page product info, optimised page title, etc.)
    I can foresee two scenarios. First, the search engine indexes the product page before the queue kicks in; the queue then 302s the crawler to the queue page, and because it's a 302 the thin queue-page content is indexed against the originating product URL. That causes search engines to drop the product page, because all the product-specific markup and content has been overwritten with thin queue-page content. Second, the search engine doesn't manage to index the product page before the queue kicks in; it crawls the product URL, gets a 302 to the queue page, indexes the thin queue content, decides the product page is poor, and sends it no traffic. It will recrawl the product page once the queue is turned off, only to discover the product has sold out - boo. I very much doubt search engines will 'wait a few minutes', so they may never reach the product page again. I'm trying to get the markup, product info, and optimised metadata injected into the queue page so it remains present at all points in the journey, in the hope that search engines can continue to rank and send traffic to the product page (a sketch of that injection follows below this question). What's your take? Any suggestions on how we might overcome these issues? (Before you ask: avoiding the queue system is impossible, sorry!) Thanks!

    | TSEOTEAM
    1
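
    A minimal sketch of the kind of product markup that could be injected into the queue page so crawlers still see product data mid-queue. Every value here (name, URL, image, price) is a hypothetical placeholder, not a detail from the post:

        <!-- Hypothetical Product JSON-LD injected into the queue page -->
        <script type="application/ld+json">
        {
          "@context": "https://schema.org",
          "@type": "Product",
          "name": "Example High-Demand Product",
          "url": "https://www.example.com/products/example-product",
          "image": "https://www.example.com/images/example-product.jpg",
          "offers": {
            "@type": "Offer",
            "price": "99.00",
            "priceCurrency": "GBP",
            "availability": "https://schema.org/LimitedAvailability"
          }
        }
        </script>

    Pairing this with the product page's original title and meta description on the queue page would keep the crawl-time snapshot close to the canonical page's content.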

  • Hi all, I am looking for a conclusive answer on how to handle images on WordPress websites. Most of the time we encounter the same problems with images. There are several options to keep images from inflating page load time: page caching and compression (standard); lazy loading (helps decrease load time, but Google might not crawl the images, so it can be bad for SEO - see the article on Googlebot scrolling); and choosing the correct image format (for example WebP, though I've tried it several times and it didn't help much). What is best practice? Are there standards or preferred options for image dimensions and quality (max height, width, number of pixels, rectangular or square) before you upload, also with responsiveness in mind? Is it better to use .jpg, .png, or WebP? To sum up: what should you do by default to handle images so you can keep good page speed even with lots of images? (A sketch of one common pattern follows below.) Thanks for your answers!

    | Mat_C
    0
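
    A minimal sketch of one common pattern, assuming nothing about the poster's theme: native lazy loading, a WebP source with a JPEG fallback, and explicit dimensions to avoid layout shift. The file names and sizes are hypothetical:

        <picture>
          <source srcset="/images/hero-800w.webp 800w, /images/hero-1600w.webp 1600w"
                  type="image/webp">
          <img src="/images/hero-800w.jpg"
               srcset="/images/hero-800w.jpg 800w, /images/hero-1600w.jpg 1600w"
               sizes="(max-width: 800px) 100vw, 800px"
               width="800" height="450"
               loading="lazy"
               alt="Descriptive alt text">
        </picture>

    Native loading="lazy" is rendered by Googlebot (unlike some scroll-triggered JS lazy loaders), which addresses the crawlability worry raised in the question.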

  • I want to know which things increase the spam score of a website. What are the drawbacks of a high spam score? How can we reduce our website's spam score? And do backlinks from high-spam-score websites affect our website?

    | hdsiwake
    0

  • Hi all, We've recently migrated a WordPress website from staging to live, but the robots.txt was deleted. I've created the following new one: User-agent: *
    Allow: /
    Disallow: /wp-admin/
    Disallow: /wp-includes/
    Disallow: /wp-content/plugins/
    Disallow: /wp-content/cache/
    Disallow: /wp-content/themes/
    Allow: /wp-admin/admin-ajax.php However, the SEMrush site audit now warns that a lot of pages have internal resources blocked by the robots.txt file. These blocked internal resources are all cached and minified CSS elements: links, images, and scripts. Does this mean that Google won't render parts of these pages correctly, and thus won't be able to follow those links and index the images? In other words, is this any cause for concern for SEO? Of course I can change the robots.txt again (a less restrictive sketch follows below), but will URLs like https://example.com/wp-content/cache/minify/df983.js end up in the index? Thanks for your thoughts!

    | Mat_C
    2
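
    A minimal sketch of a less restrictive robots.txt, on the assumption that Google should be allowed to fetch the CSS and JS it needs to render pages; blocking /wp-content/ and /wp-includes/ is widely considered outdated advice for exactly this reason:

        User-agent: *
        Disallow: /wp-admin/
        Allow: /wp-admin/admin-ajax.php

    Allowed-but-unlinked assets like minified .js files rarely earn a place in the index on their own; the bigger risk runs the other way, with blocked resources degrading how Google renders and evaluates the pages that use them.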

  • I run Magento 2 and have two stores, one intended for the EU and one for the US. 99% of the products appear in both stores, and an automatic redirect sends visitors to one store or the other depending on location. I think Google is seeing these as duplicate products/stores. Should I add an index,nofollow tag to one of the two stores? My issue is that I want both stores to rank in their geographical locations, and I am concerned that adding the nofollow tag will stop that dead in its tracks for one location (an hreflang sketch follows below). Any advice would be helpful.

    | moon-boots
    2
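
    A minimal sketch of hreflang annotations as an alternative to noindexing either store, assuming hypothetical /us/ and /eu/ store paths (with en-GB standing in for the EU store's language-region); hreflang marks the stores as regional alternates rather than duplicates:

        <link rel="alternate" hreflang="en-US" href="https://www.example.com/us/product-a" />
        <link rel="alternate" hreflang="en-GB" href="https://www.example.com/eu/product-a" />
        <link rel="alternate" hreflang="x-default" href="https://www.example.com/product-a" />

    One caution rather than a rule: automatic IP-based redirects can also keep Googlebot, which mostly crawls from US IP addresses, from ever seeing the EU store at all.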

  • Hello mozers! I have quite the conundrum. My client used to have the unfortunate brand name "Meetoo" - which, by the way, they had before the movement happened! So naturally, they rebranded to the name Vevox in March 2019 to avoid confusing users. However, when you search for their old brand name "Meetoo", the first organic link that pops up is their domain www.vevox.com. This wouldn't normally be a problem, but it is whenever #MeToo news appears in the media and we get a sudden influx of wrong traffic. I've searched the HTML and content for the term "Meetoo" but can only find one trace of the name, in a widget - not enough to hold an organic spot. My only other thought is that www.vevox.com is redirected from www.meetoo.com, so I'm assuming this is why Vevox appears for the search term "Meetoo". How can I stop the homepage www.vevox.com from appearing for the search term "meetoo"? Can anyone help?

    | Virginia-Girtz
    3

  • Hi all, I have a website with 1k+ pages and schema markup for reviews and FAQs. I need to update the markup across all pages in one go without using Tag Manager, since updating every page manually is next to impossible. Is there any way to achieve this? My website is built on WordPress (a site-wide injection sketch follows below). Awaiting your earliest reply - thanks!

    | atiagr123
    2
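
    A minimal sketch of how site-wide JSON-LD could be injected from a WordPress theme or small plugin, assuming the rating data lives in post meta; the meta keys here are hypothetical, not the poster's actual fields:

        // In functions.php or a small plugin: print JSON-LD on every single post/page.
        add_action( 'wp_head', function () {
            if ( ! is_singular() ) {
                return;
            }
            $rating = get_post_meta( get_the_ID(), '_review_rating', true ); // hypothetical meta key
            if ( ! $rating ) {
                return;
            }
            $schema = array(
                '@context' => 'https://schema.org',
                '@type'    => 'Product',
                'name'     => get_the_title(),
                'aggregateRating' => array(
                    '@type'       => 'AggregateRating',
                    'ratingValue' => $rating,
                    'reviewCount' => get_post_meta( get_the_ID(), '_review_count', true ), // hypothetical
                ),
            );
            echo '<script type="application/ld+json">' . wp_json_encode( $schema ) . '</script>';
        } );

    Many schema plugins do essentially this; rolling your own only makes sense if the data is already structured in post meta.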

  • I noticed that every link on my site is being flagged as a 302 temporary redirect in Moz. The reason is that we have a multi-store and use GeoIP to redirect visitors coming from their respective countries. I'm guessing a 302 is the wrong way to do this - can anyone share advice on best practice for redirecting customers to geo-specific stores?

    | moon-boots
    0

  • Hi guys! Almost four months ago I performed a WordPress domain migration. Three pet-based sites were migrated into a new pet-based one that incorporated them all - the new site is petskb.com, and 240 posts were migrated. The migration was performed via 301 wildcard redirects using the .htaccess files in the old domains, which are still in place and working. I also used the site move tool in GSC. Afterwards, I audited the new site to ensure that all the old URLs were being redirected to the new one, which they were (and are). No manual actions have been reported in GSC.
    The results have been very poor. A few of the articles that were in the top 10 moved over and quickly claimed the same positions on the new site. Most did not, though, and still sit >100 in the SERP, or nowhere at all (or even omitted), for their main keyword. I've created about 60 new articles since then (using the same SEO analysis I did previously), and not one of them has ranked <100 in all that time, whereas on the old sites they would initially rank somewhere in the top 50 after a couple of days and work their way up over the months. These new posts haven't moved, though. The posts published on the new site four months ago are still in the exact same positions.
    So, I've created new content, re-submitted the sitemap, and manually requested re-indexing of the posts. Nothing has changed. I've hired SEOs, and not one has found any problems with my site or with how I performed the migration. Clearly there is a problem, though. The original posts that were ranking previously and all the new posts have not moved in the SERP. There were a few spammy links pointing to the new domain, but nothing significant, and I disavowed them - no more than on the old sites, though.
    As a test, I created a post on another domain targeting the same long-tail keyword as one that has been on my new site for almost four months. The one on the other domain out-ranked the one on petskb after just two days.
    Can anyone help? If you can, I will personally travel to where you live and buy you several beers. Thanks, Matt

    | mattpettitt
    1

  • I use Screaming Frog to generate sitemaps for my Magento 2 multi-store, but I recently noticed two issues. First, each category/page has two URLs: one with a trailing slash and one without. Second, every product has two URLs: /product-name and /shop/product-name. The URLs are canonicalised, but this is still a problem, and I'm not sure exactly how to fix it in the .htaccess file. So I need to: (1) redirect all URLs without the trailing slash to the version with it, or vice versa; and (2) 301 redirect every single product (there are over 400) from /shop/product-name to /product-name. How do I do this en masse in the .htaccess file? (A sketch follows below.)

    | moon-boots
    0
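
    A minimal sketch of mod_rewrite rules for both jobs, assuming Apache and that the trailing-slash version is canonical; test on staging first, since Magento ships its own rewrites and a two-hop chain is possible when both rules fire:

        RewriteEngine On

        # 1. 301 /shop/product-name to /product-name (covers all 400+ in one rule).
        RewriteRule ^shop/(.+)$ /$1 [R=301,L]

        # 2. Append a trailing slash to extensionless URLs that are not real files.
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteCond %{REQUEST_URI} !/$
        RewriteCond %{REQUEST_URI} !\.[a-zA-Z0-9]{2,4}$
        RewriteRule ^(.*)$ /$1/ [R=301,L]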

  • Hey all, I have an old static HTML site, and the crawl errors show "http://www.website.com" and "http://website.com" as two separate pages because there is no canonicalisation. Can I fix that with a rel="canonical" tag? There is just a folder of HTML files to add the tag to, so if the www version is the true version, can I just add it to all the pages? Or is there a better way to do this? (A sketch follows below.)

    | mbodine
    0
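
    A minimal sketch, assuming Apache and that the www version is canonical: a site-wide 301 in .htaccess usually beats hand-editing every file, though adding rel="canonical" tags as well does no harm. The domain is the placeholder from the question:

        RewriteEngine On
        RewriteCond %{HTTP_HOST} ^website\.com$ [NC]
        RewriteRule ^(.*)$ http://www.website.com/$1 [R=301,L]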

  • Hi there, I was hoping somebody has an answer to this, or that someone else has experienced this issue. Our website was recently hit by a manual penalty (structured data wasn't matching the content on the page). After working hard to fix the issue across the site, we submitted a reconsideration request, which Google approved a few days later. I understand that not all websites recover and that approval doesn't guarantee rankings will return to normal, but traffic seems to be dropping at an even quicker rate. A number of small technical optimisations have been briefed to the dev team, such as redirecting duplicate versions and fixing redirects on internal links. There's also on-page work running in the background: fixing keyword cannibalisation, consolidating content keyword mapping, and ensuring the internal link structure is sound. Has this happened to anyone else? If so, how did you recover? Any suggestions/advice would be really appreciated. Thank you

    | dbutler912
    0

  • I am very confused about domain age. I have read many articles about it; some experts say domain age matters for ranking and some say it doesn't. Kindly guide me on domain age.

    | MuhammadQasimAttari
    0

  • Is there any significant benefit to creating online directory listings that only provide nofollow links to our domain? For context, whilst doing link gap analysis I've found our competitors listed in local government directories such as getsurrey.co.uk and miltonkeynes.co.uk. Whilst these aren't spam directories, it's still highly unlikely we'll receive much traffic through them, and the links they provide are nofollow. So I wonder whether there's any other benefit to investing the time in creating these listings? Would be interested to hear your thoughts. Many thanks in advance

    | Opera-Care
    1

  • Let's say you were in the market to buy a domain for a large city in the UK - Manchester, for example. Would you prefer to own the .co.uk or the .com version? Does .co.uk have higher CTR/ranking signals than .com for geo-targeted websites in the UK?

    | Mokdiiek
    0

  • Hi Mozers, More than 200 pages on our site have a "noindex" meta tag but are STILL being indexed. What else can I do to remove them from the index? (A common cause and an alternative are sketched below.)

    | yaelslater
    0
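
    One common cause, offered as a hedge rather than a diagnosis: if those pages are also blocked in robots.txt, the crawler can never fetch them to see the noindex tag. An alternative is an X-Robots-Tag response header; a minimal sketch for Apache server config (mod_headers required), with a hypothetical URL pattern:

        # Send a noindex header for everything under /private/ (hypothetical path).
        <LocationMatch "^/private/">
          Header set X-Robots-Tag "noindex"
        </LocationMatch>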

  • Do you guys think a guest post close to the root domain has more link juice than one in a subfolder? example.com/123 vs example.com/nov/123. Both pages have the same number of internal links, and neither has external links.

    | arango20
    1

  • Hi, How long does it take to de-rank once you remove everything from a page? Thank you,

    | seoanalytics
    0

  • Hello, I have read in the past that during a major update Google puts all its resources into the update, and it seems they stop refreshing regular search results. Has anyone noticed that too? How long does it take for an update to be fully rolled out and for everything to get back to normal? Thank you,

    | seoanalytics
    0

  • Hi Everyone, I know there are a lot of tools like Siteliner that can check the uniqueness of body copy, but are there any that can restrict the check to title tags alone? Alternatively, is there an Excel or Google Sheets function that would let me do the same thing? (A sketch follows below.) Thanks, Andy

    | AndyRSB
    0
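
    A minimal sketch, assuming the title tags have been exported (e.g. from a crawler) into column A of a sheet; COUNTIF flags duplicates and the same formula works in both Excel and Google Sheets:

        =IF(COUNTIF(A:A, A2) > 1, "duplicate", "unique")

    Fill the formula down column B, then filter or sort on "duplicate".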

  • For the past 2-3 years, my site's organic rankings have been dramatically shifting: ranking well for a few days, then dropping for a month or two, then ranking again for a few days, then dropping again. During the periods of higher rankings, form submissions increase significantly. The swings are typically 10-30 spots each time. I've done everything I can think of to address any issues: improved speed, limited 404s, changed the site architecture, updated link anchor text, etc. Nothing seems to work. The site gets 88% of its traffic from desktop, 9% from mobile, and 3% from tablet. The disparity between desktop and mobile leads me to believe the ranking issues are mobile-related, especially now that Google is using mobile-first indexing. I thought dwell time could be an issue, but session duration is 2:07, bounce rate is under 60%, and there's an average of 2.27 pages per session - that doesn't indicate a traffic quality problem to me. There are no warnings in Google Search Console, and PageSpeed Insights scores are 58 for mobile and 86 for desktop. I've been doing SEO for 12 years, but this has me stumped. My top suspicions are: (1) speed issues on mobile; (2) penalties for redirects from the old website to the new one; (3) penalties for anchor text using the old brand name instead of the new one. The site is https://dragonflydm.com - has anyone seen anything like this? Any ideas?

    | dragonflydm
    0

  • We are using Amazon Web Services for www.mastersindia.co. Please help me understand the idle timeout of the server load balancer on AWS. (A sketch follows below.)

    | AnilTanwarMI
    0
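
    For reference, the default idle timeout on AWS Elastic Load Balancers is 60 seconds, and it is configurable per load balancer (up to 4000 seconds). A minimal sketch with the AWS CLI, assuming an Application Load Balancer; the ARN is a placeholder:

        aws elbv2 modify-load-balancer-attributes \
          --load-balancer-arn arn:aws:elasticloadbalancing:us-east-1:123456789012:loadbalancer/app/my-alb/abc123 \
          --attributes Key=idle_timeout.timeout_seconds,Value=120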

  • Hi there, We have 5 separate sites which handle the different regions/niches we work in, and we are planning to merge them into one so we have a logical path for 301 redirects. The sites have DAs as follows: Site 1 - DA 36
    Site 2 - DA 31
    Site 3 - DA 29
    Site 4 - DA 27
    Site 5 - DA 20 Does anyone have experience with how the DA would flow through to the new site? Each site currently relates to a different niche, and we are planning to keep the content structured similarly, probably like this: https://newtoplevelsite/site1/products, https://newtoplevelsite/site2/products, and so on. That makes 301 redirects easy and also gives us more control in managing users and different teams in WordPress. We would link the different niches through the top menu and links within the pages. Is there a better solution? Would it make more sense to have https://newtoplevelsite/products/site1, https://newtoplevelsite/products/site2, and so on? Thanks for the ideas

    | ben10001
    0

  • I am wondering about the best way to mark up an event page with multiple occurrences. For example, we have an event that happens over the course of 4 sequential weekends:
    9/28-9/29
    10/5-10/6
    10/12-10/13
    10/19-10/20 Our website lets us enter multiple occurrences, which results in a single event listing page that outputs all the dates (to eliminate duplicate content, titles, metas, etc.) while still letting each occurrence appear individually on our events calendar under its respective date. Each time the event is shown, it links to the same listing page. I am wondering if we can add event schema to a single listing multiple times to cover each occurrence; in the above example, we would have 4 schemas on the listing page, one per date range/weekend (a sketch follows below). With our current schema we end up with a start and end date of 9/28-10/20, but that doesn't convey that the event happens only on the weekends, with gaps in between. Any suggestions are welcome; however, we are really trying NOT to list each occurrence as an individual event, both because of the duplicate content issue and because of the extra burden on our client, who lists events for a very large geographic area.

    | Your_Workshop
    0
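
    A minimal sketch, assuming the listing page can carry one JSON-LD block containing an array of Event objects, one per weekend, all pointing at the same URL. Names, venue, and URL are placeholders; the dates echo the question (weekends 3 and 4 repeat the same pattern and are trimmed here for length):

        <script type="application/ld+json">
        [
          {
            "@context": "https://schema.org",
            "@type": "Event",
            "name": "Example Festival - Weekend 1",
            "startDate": "2019-09-28",
            "endDate": "2019-09-29",
            "url": "https://www.example.com/events/example-festival",
            "location": { "@type": "Place", "name": "Example Venue",
                          "address": "123 Example St, Example City" }
          },
          {
            "@context": "https://schema.org",
            "@type": "Event",
            "name": "Example Festival - Weekend 2",
            "startDate": "2019-10-05",
            "endDate": "2019-10-06",
            "url": "https://www.example.com/events/example-festival",
            "location": { "@type": "Place", "name": "Example Venue",
                          "address": "123 Example St, Example City" }
          }
        ]
        </script>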

  • A client is having their site redeveloped on a new platform in sections, and the sections already on the new platform are being moved to a temporary subdomain until the entire site is migrated, over the course of 2-3 months. During this period, is it best to use 302 temporary redirects (the URL path isn't changing), or to 301 to the temporary subdomain and then 301 back to the original once the migration is complete? Thanks!

    | Matt312
    0

  • Please suggest how to improve First Contentful Paint (FCP) and First Input Delay (FID) for my website (http://www.mastersindia.co/) on mobile. All the JavaScript has been moved into the footer, with async and defer added. We have also applied everything the Google PageSpeed Insights tool suggests, but haven't seen much improvement. We tried to defer and async all the CSS, but doing so breaks the website (a gentler pattern is sketched below). Please help me solve this.

    | AnilTanwarMI
    0
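
    A minimal sketch of the common preload pattern for deferring non-critical CSS without breaking the page, assuming the critical above-the-fold styles are inlined first; this is gentler than deferring all CSS at once. The file name is a placeholder:

        <!-- Inline the critical above-the-fold CSS in a <style> block here first. -->
        <link rel="preload" href="/css/site.min.css" as="style"
              onload="this.onload=null;this.rel='stylesheet'">
        <noscript><link rel="stylesheet" href="/css/site.min.css"></noscript>

    The onload swap applies the stylesheet once it has downloaded, and the noscript fallback keeps the page styled when JavaScript is off.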

  • Hey everyone, I have found a few code issues with our new website and wanted to see how bad the problems are and whether I have missed anything. If someone could take a look and help me, it would mean the world. Thank you all! We hired an agency to design a new site for us, and it's almost ready, but the other day I found some problems that made me wonder whether this new site might not be as good as I thought. I want someone who understands SEO and web design to look at our code and point out what might be wrong. Here is a link to the actual site, which is on a new server: http://209.50.54.42/
    Problem 1: Each page has 3 title tags; I guess whatever template they are using creates them automatically. For example, on http://209.50.54.42/washington-dc-transportation, if you View Page Source, lines 16, 19, and 20 each contain a title tag, which in my opinion is wrong - there should be only one. Could this hurt our SEO?
    Problem 2: Infinite duplicate URLs. The following pages have an INFINITE number of duplicate URLs. Example: http://209.50.54.42/privacy-policy/8, http://209.50.54.42/privacy-policy/1048, http://209.50.54.42/privacy-policy/7, http://209.50.54.42/privacy-policy/1, http://209.50.54.42/privacy-policy - you can add any number to the URL and it shows the same page. I really think this second problem is huge, as it will create duplicate content. There should be only one URL per page, and adding any number to the end should give a 404 error.
    I managed to find these 2 issues, but I am not sure what else could be wrong with the code. Would you be able to look into this and possibly tell us what else is incorrect? I really like the design, and we worked hard on this for almost 5 months, but I want to make sure that when we launch the new site it doesn't tank our rankings and only helps us. Thanks in advance, Davit

    | Davit1985
    0

  • Hello everyone! I hope an expert in this community can help me verify that the canonical code I'll add to our store is correct. Currently, in our Shopify store, the subsequent pages in the collections are not indexed by Google; however, the canonical URL on these pages isn't pointing to the main collection page (page 1). Instead, the canonical URLs of page 2, page 3, etc. point to themselves rather than to the first page of the collection. The code is below (the conditional bodies were stripped when this was posted, so a plausible reconstruction of the canonical logic is shown); it would be much appreciated if an expert could urgently verify that it will solve the above issue. Thanks so much for your kind help in advance!
    -----------------CODE BELOW---------------
    <title>
      {{ page_title }}{% if current_tags %} – tagged "{{ current_tags | join: ', ' }}"{% endif %}{% if current_page != 1 %} – Page {{ current_page }}{% endif %}{% unless page_title contains shop.name %} – {{ shop.name }}{% endunless %}
    </title>
    {% if page_description %}<meta name="description" content="{{ page_description }}">{% endif %}
    {% if template contains 'collection' and current_page != 1 %}
      <link rel="canonical" href="{{ shop.url }}{{ collection.url }}">
    {% else %}
      <link rel="canonical" href="{{ canonical_url }}">
    {% endif %}

    | ycnetpro101
    0

  • Hi everyone, I have a weird problem that has been bothering me for the last few months. One of our URLs, https://www.dcacar.com/lax-car-service.html, ranks for keywords such as "car service to lax" and "lax car service". It does pretty well from any location we check: page 1, positions 5-7. But here is the interesting part: when a searcher is actually in Los Angeles, it does not show up at all, not even on page 4. So from anywhere else in the USA you'll see our landing page on page 1, positions 5-7, but in Los Angeles, where we actually want people to see the landing page, it's nowhere to be found. I added our local office address at the bottom and a link to our local Yelp page, hoping that might send some kind of signal to Google, but so far no luck. Has anyone experienced anything like this, and what do we have to do to fix this weird problem? Thanks in advance

    | Davit1985
    1

  • On Oct. 21, our direct traffic increased 3x and our organic traffic decreased 3x, and it has been that way ever since - almost like they flip-flopped. That was also the same day I started retargeting ads to our site. I have tagged all the links from the ads, they're being counted as Google paid clicks in GA, and our accounts are linked. I am just dumbfounded as to how this could happen.

    | Eric_OWPP
    1

  • Hi Mozzers, I am working for an international client in a highly regulated industry, so their international set-up is slightly confusing. They currently operate websites across multiple countries (with ccTLDs) as well as a global .com, e.g.: domain.co.uk, domain.it, domain.es, domain.com, etc. Additionally, they offer multiple languages on each of these domains, which often overlap, e.g.: domain.co.uk/en/, domain.co.uk/fr/, domain.co.uk/de/; domain.es/en/, domain.es/es/; domain.it/en/, domain.it/it/; domain.com/en/, domain.com/es/, domain.com/fr/, domain.com/de/. They are not currently using hreflang of any sort. Using EN as an example, this results in 6 URLs showing the same content, albeit for different languages/locations. Main URL: domain.co.uk/en/category-A/ hreflang="en-GB". Multi-lingual variants on the same domain: domain.co.uk/fr/category-A/ hreflang="fr-GB"; domain.co.uk/de/category-A/ hreflang="de-GB". Cross-domain variants from the other ccTLDs: domain.es/en/category-A/ hreflang="en-ES"; domain.it/en/category-A/ hreflang="en-IT"; domain.com/en/category-A/ hreflang="en". Can anyone cleverer than me confirm that the above would be the most effective set-up for this scenario, with each URL referencing all the others in this way?

    | Pan1234
    0

  • Is there any benefit, in terms of passing link juice, to using a URL tracking engine on a domain - i.e. xxxx.com?$id=1111 redirecting on to ShareASale? The client has an affiliate program and is thinking of running one in-house as well. Is there a benefit to a "redirect engine" that uses the website's root domain?

    | KellyBrady
    1

  • Firstly, I understand what this percentage is: it's the ratio of external links that are "follow" compared to those that are "nofollow". Four questions: (1) This is definitely not accurate - I have loads of nofollow links. (2) Does anyone have ideas or techniques for adding more healthy nofollow links? (3) Am I completely misunderstanding this? (4) Will this high score negatively affect my ranking? I could definitely use some help. Thanks so much in advance. I don't think my website address should matter, but if you need it for context, it's estatediamondjewely.com.

    | SamCitron
    0

  • Hello, Let's say I want to rank for "Alsace bike tour". Whatever tool I use - Moz Keyword Explorer, Google Suggest, keyword.io, Answer the Public - there are no questions for it. So what do I need to answer? I imagine that for Google some questions are more relevant than others. Should I answer whether you need to bring your own bike, or where the tour goes? And will Google give me "points" for answering those questions even though people aren't asking them? For a keyword like "title tag" it is easy - people ask about the character limit, title tag generators, and so on - but for many keywords like the ones I am targeting, people have NO questions! Thank you,

    | seoanalytics
    0

  • Hi Everyone, I was wondering how Google counts page depth on paginated pages. DeepCrawl is showing our primary pages as being 6+ levels deep, but without the blog, or with infinite scroll on the /blog/ page, I believe it would be only 2 or 3 levels deep. Using Moz's blog as an example, is https://moz.rankious.com/_moz/blog?page=2 treated as being on the same level, in terms of page depth, as https://moz.rankious.com/_moz/blog? If so, is it the <link rel="prev" href="https://site.com/blog" /> and <link rel="next" href="https://site.com/blog?page=3" /> code that helps Google recognise this? Or does Google treat page depth the way DeepCrawl shows it, with the posts on page 2 being +1 in depth compared to the ones on page 1, for example? Thanks, Andy

    | AndyRSB
    0

  • I have heard rumours that AddThis isn't good for SEO - is that correct? I'm just thinking about adding it to my site.

    | seoman10
    0

  • Hello Moz! A massive site that you've all heard of is looking to syndicate some of our original editorial content. This content is our bread and butter, and is one of the primary reasons why people use our site. Note that this site is not a competitor of ours - we're in different verticals. If this massive site were to use the content straight up, I'm fairly confident that they'd begin to outrank us for related terms pretty quickly due to their monstrous domain authority. This is complex because they'd like to use bits and pieces of the content interspersed with their own content, so they can't just implement a cross-domain canonical. It'd also be difficult to load the content in an iframe with noindex,nofollow header tags since their own content (which they want indexed) will be mixed up with ours. They're also not open to including a link back to the product pages where the corresponding reviews live on our site. Are there other courses of action that could be proposed that would protect our valuable content? Is there any evidence that using schema.org (Review and Organization schemas) pointing back to our review page URLs would provide attribution and prevent them from outranking us for associated terms?

    | edmundsseo
    1

  • We have PDF files uploaded to the WordPress media library and used on our website. As these PDFs duplicate content from the original publishers, we have marked links to their URLs as nofollow, and the pages are also disallowed in robots.txt. Now Google Search Console shows these pages excluded as "Duplicate without user-selected canonical". As it turns out, we cannot use a canonical tag on PDF pages to point to the original PDF source. If we embed a PDF viewer on our website and fetch the PDFs from the original publisher's URLs, would the PDFs still be read as text by Google and again create a duplicate content issue? Another thing: when a PDF expires and is removed, it leads to a 404 error, and if we send our users to the third-party website instead, it adds to our bounce rate. What is the appropriate way to handle duplicate PDFs? (One common approach is sketched below.) Thanks

    | dailynaukri
    1
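
    A minimal sketch of the usual approach for non-HTML files, assuming Apache with mod_headers: an X-Robots-Tag response header noindexes the PDFs, since they can't carry a meta robots tag. Note that if robots.txt blocks the files, Google never fetches them and so never sees this header, so the disallow would need to be lifted for it to work:

        <FilesMatch "\.pdf$">
          Header set X-Robots-Tag "noindex, nofollow"
        </FilesMatch>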

  • I have a client who wants a website in Spanish and one in English. Typically we would use a multi-language plugin on a single site (brandA.com/en or /es), but this client markets to their Spanish-speaking constituents under a different brand. So I am wondering: if we have BrandA.com in English and the exact same content in Spanish at BrandB.com, will there be negative SEO implications, and/or will search engines treat it as duplicate content?

    | Designworks-SJ
    1

  • If I have no reviews/ratings on the page itself and no special/limited-time offers - just a regular product page with a standard price - is there any way to add product schema without it getting flagged for errors? Google's Structured Data Testing Tool threw me an error when I tested without any of those: "One of offers or review or aggregateRating should be provided." And even if it's possible, is there any point? (A sketch follows below.)

    | SearchStan
    0
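
    A minimal sketch, assuming a standard-price product page: Google's Product rich result does require at least one of offers, review, or aggregateRating, so the practical workaround is to mark up the regular price as an Offer even when nothing is discounted. All values are placeholders:

        <script type="application/ld+json">
        {
          "@context": "https://schema.org",
          "@type": "Product",
          "name": "Example Product",
          "image": "https://www.example.com/images/example.jpg",
          "offers": {
            "@type": "Offer",
            "price": "49.99",
            "priceCurrency": "USD",
            "availability": "https://schema.org/InStock"
          }
        }
        </script>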

  • Hi Moz Pros! I have been reading the board for quite some time, enjoying all the insights you share with the SEO world. I have a nut I can't crack and thought I would ask: does any guru here know the factors Google uses when choosing sites to add to geo-specific SERPs for a generic query? Here is an example, "car insurance companies": https://www.google.com/search?sxsrf=ACYBGNRvHrdh6z6sNatu-Pgbh-tbgiiQLQ%3A1569462097607&ei=UReMXbPfJIfwtAX_77aYCQ&q=car+insurance+companies+&oq=car+insurance+companies+&gs_l=psy-ab.3..0i71l8.321.321..471...0.2..0.0.0.......0....1..gws-wiz.8MmdfVjHrt4&ved=0ahUKEwjzy8P2re3kAhUHOK0KHf-3DZMQ4dUDCAs&uact=5
    I live in Austin, obviously 🙂 Thanks for any input you'd be willing to share!

    | TicksTire
    0

  • Hi Mozers, We have images on their own separate pages that are then pulled onto content pages. Should the standalone pages be indexable? On the one hand, it seems good for an image to have its own page with its own title. On the other hand, it may be better for SEO for the crawler to find the image on a content page dedicated to that topic. Unsure - would appreciate any guidance! Yael

    | yaelslater
    1

  • The website FAQ page we are working on has more than 50 FAQs. The FAQ schema guidelines say the markup must be an exact match with the content. Does that mean all 50+ FAQs must be in the markup? Or does it mean that the few FAQs we decide to put in the markup must each exactly match their on-page text? (A sketch follows below.)

    | PKI_Niles
    0
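
    A minimal sketch of FAQPage markup covering a subset of questions, on the assumption that the marked-up Q&As appear verbatim on the page; the requirement is that marked-up questions match visible content, not that every on-page FAQ be marked up. The text is placeholder:

        <script type="application/ld+json">
        {
          "@context": "https://schema.org",
          "@type": "FAQPage",
          "mainEntity": [
            {
              "@type": "Question",
              "name": "Example question one?",
              "acceptedAnswer": { "@type": "Answer", "text": "Example answer one." }
            },
            {
              "@type": "Question",
              "name": "Example question two?",
              "acceptedAnswer": { "@type": "Answer", "text": "Example answer two." }
            }
          ]
        }
        </script>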

  • Hello, My website, www.musillawfirm.com, was recently hacked and has been de-listed by Google. It had some sort of crypto-mining script on it, which I was able to remove. The site shows up if you type in the domain, but even a generic search for "musil law firm" does not show it - it used to rank #1 for that term and #1 or #2 for "immigration lawyer" in my local area. If anyone can assist me in getting it re-indexed, please let me know, along with how much it would cost. I tried getting it re-indexed through Search Console, but no luck. Thank you kindly

    | musillawfirm
    0

  • We have a real estate website on which agents and builders can create their profiles. My question: should we use h1 or h2 tags on business profile pages, or structure them according to web 2.0 standards? If header tags are used and two agents have the same name, with an h2 for each, a search results page will end up with two identical h2s. Can someone please tell me the right way to manage business profiles on a website? Thanks

    | dailynaukri
    1

  • Hey there, I have a site with many languages, so here are my questions concerning sitemaps. The correct way of creating a sitemap for a multilingual site, per Google's official documentation, looks like this:
    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
            xmlns:xhtml="http://www.w3.org/1999/xhtml">
      <url>
        <loc>http://www.example.com/</loc>
        <xhtml:link rel="alternate" hreflang="en" href="http://www.example.com/" />
        <xhtml:link rel="alternate" hreflang="de" href="http://www.example.com/de" />
        <xhtml:link rel="alternate" hreflang="fr" href="http://www.example.com/fr" />
      </url>
    </urlset>
    So here is my first question: my site has over 200,000 pages, each supporting around 5-6 languages. Am I supposed to repeat this example 200,000 times? My second question: my root domain is www.example.com, but it 301-redirects to www.example.com/en. Should the sitemap be at www.example.com/sitemap.xml or www.example.com/en/sitemap.xml? My third question: in WMT, do I submit my sitemap for all versions of my site? I have all my languages there. Thanks in advance for taking the time to respond to this thread; I hope many people will solve their own questions by reading it.

    | Angelos_Savvaidis
    0

  • BACKGROUND: We are developing a new multi-language website that is going to have:
    1. Multiple directories for various languages: /en-us, /de, etc.
    2. Hreflang tags
    3. Universal footer links so users can select their preferred language
    4. Automatic JS detection of location on the homepage only, so that when a user lands on /, it redirects them to the correct location. Currently the auto-detection happens only on /, on no other pages of the website, and the user can always override it on the homepage using the language-selector links at the bottom.
    QUESTION: Should we 301 / to /en-us? Someone recommended this to us, but my thinking is NO - we do NOT want to 301 /. Instead, I feel we should keep / accessible to Google, because it is the most authoritative page on the website and where all incoming links point. In most cases, users / journalists / publications IMHO will just link to /, not dilly-dally around with a language directory. My hunch is to keep / as is, but also work to help Google understand the relationship between the different language-specific directories (a sketch follows below). I know Google doesn't officially advocate meta refresh redirects, but this only happens on the homepage, and the user can override it at any time (and, again, universal footer links point both search engines and users to all the other locations). Thoughts? Thanks for any tips/feedback!

    | mirabile
    2
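
    A minimal sketch of annotations that could express that relationship without a 301, assuming the detector stays on / only: an x-default hreflang declares the auto-detecting homepage as the intended catch-all. The domain is a placeholder; the paths echo the question:

        <link rel="alternate" hreflang="en-US" href="https://www.example.com/en-us/" />
        <link rel="alternate" hreflang="de" href="https://www.example.com/de/" />
        <link rel="alternate" hreflang="x-default" href="https://www.example.com/" />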

  • Okay, I thought I was following best practices. In our industry, electronic hardware, we were ranking well for a particular product line (/spacers), but we wanted to do better. We first addressed several concerns that Moz found: duplicate page titles, missing meta descriptions, and an overall lack of targeted keywords. We also took a new approach to the structure of the site: instead of presenting a list of part numbers, we wanted the user to learn more about our products through content. So we added a /products page with content and a product-specific page (/spacers) that is almost a definitive buyer's guide, attempting to answer the questions we think our customers find most relevant. Well, our customers might find it relevant, but Google sure didn't: after we deployed the new content, our rankings for targeted keywords fell from 10-15 to 80-95. As an open-ended question, could somebody explain to me why our ranks fell off a cliff? Homepage: https://www.lyntron.com
    New catalog summary page: https://www.lyntron.com/products
    New content intended to rank high: https://www.lyntron.com/spacers

    | jandk4014
    1

  • Hi there, Google is mass-mailing again; the latest notice refers to an Enhancements > Breadcrumbs report. The message reads: "...Google systems show that your site is affected by 24 instances of Breadcrumbs markup issues. This means that your Breadcrumbs pages might not appear as rich results in Google Search. Search Console has created a new report just for this rich result type..." I've used their Structured Data Testing Tool and no errors were highlighted. Can anyone fathom what they're referring to, please? (A reference sketch follows below.)

    | jasongmcmahon
    0
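
    A minimal sketch of valid BreadcrumbList JSON-LD to diff against the site's own markup; one hedged possibility is that the report is flagging legacy data-vocabulary.org breadcrumbs, which the Structured Data Testing Tool still accepted at the time even as Search Console pushed sites toward schema.org. URLs and names are placeholders:

        <script type="application/ld+json">
        {
          "@context": "https://schema.org",
          "@type": "BreadcrumbList",
          "itemListElement": [
            { "@type": "ListItem", "position": 1, "name": "Home",
              "item": "https://www.example.com/" },
            { "@type": "ListItem", "position": 2, "name": "Category",
              "item": "https://www.example.com/category/" },
            { "@type": "ListItem", "position": 3, "name": "Current Page" }
          ]
        }
        </script>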


