
Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Intermediate & Advanced SEO

Looking to level up your SEO techniques? Chat through more advanced approaches.


  • Hi, how will Google+ disappearing after this year affect the rel=publisher markup (see the snippet below)? Is it still relevant? Thanks!

    | rascordido
    0
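
    For reference, the markup in question is a head-level link that pointed at a brand's Google+ page, as in this placeholder snippet. Once Google+ profiles go away, the href resolves to nothing, and the general consensus is that Google stopped using the annotation some time ago, so the tag becomes inert rather than harmful:

    ```html
    <link rel="publisher" href="https://plus.google.com/+ExampleBrand" />
    ```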

  • I have a client who sells retirement homes. Their current schema for each property is LocalBusiness; should this in fact be Product schema? (A sketch of the Product version follows below.)

    | Adido-105399
    0
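
    For comparison, a minimal JSON-LD sketch of what the Product version might look like; every value below is a placeholder rather than the client's real data, and LocalBusiness arguably still fits the selling office itself, as opposed to the individual properties:

    ```html
    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "Product",
      "name": "Two-bedroom retirement home, Example Court",
      "description": "A two-bedroom property in a managed retirement village.",
      "image": "https://www.example.com/images/example-court.jpg",
      "offers": {
        "@type": "Offer",
        "price": "250000",
        "priceCurrency": "GBP",
        "availability": "https://schema.org/InStock"
      }
    }
    </script>
    ```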

  • Hi, I am doing link cleaning, am still a bit new to this, and would appreciate the community's help 🙂 So, I have a site which has quite a lot of low-DA (or no-DA) follow backlinks. BUT: the links are from my niche, the sites are not spammy, the anchors are okay, and they are from a good geo location for me. The only negative thing is that these sites are a bit "dead", meaning there is no new content, and thus no traffic or clicks coming from them. Should I keep those links or disavow them? (The disavow file format is sketched below.) To me these links are natural, but do they help me at all? FYI, I have plenty of good-DA links. But what do you think: if I disavow all these low-DA backlinks, does Google think that I am trying to manipulate my backlink structure to look better than it naturally is? Cheers, guys and girls! 🙂

    | RistoM
    0
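
    For anyone weighing the same call: if the decision does come down on the side of disavowing, the file is plain text with one entry per line. A made-up example:

    ```text
    # Dormant low-DA sites, reviewed November 2018
    domain:dead-niche-blog.example.com
    domain:old-regional-directory.example.net
    # A single URL can be listed instead of a whole domain:
    https://forum.example.org/old-thread-123
    ```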

  • Hello! I have a website with over 1300 landing pages for specific products. These individual pages update on a 24hr cycle through our API, which pulls reviews/ratings from other sources and then writes/updates that content onto the page. Is that "bad"? Can that be viewed as spammy or dangerous in the eyes of Google? (My first thought is no, it's fine.) Is there such a thing as "too much content"? For example, if we are adding roughly 20 articles to our site a week, is that OK? (I know news websites add much more than that on a daily basis, but I figured I would ask.) On that note, would it be better to stagger our posting? For example, 20 articles each week for a total of 80, or 80 articles once a month? (I feel like trickle posting is probably preferable, but I figured I would ask.) Are there any negatives to the process of an API writing/updating content? Should we have 800+ words of static content on each page? Thank you all, Mozzers!

    | HashtagHustler
    0

  • Hi All, I am really concerned about doing a 301 redirect. This is my situation: both the current and the new domain are registered with a local domain registrar (similar to GoDaddy, but a local version). Current domain: servers are pointing to Wix servers and the website is built and hosted with Wix. I would like to do a 301 redirect, with a couple of factors to keep in mind: 99% of my links point to the home page/root domain only, not to subdirectories. New domain: I will register this with Wix on a new plan, but keep the exact sitemap and composition of the current website and launch with the new domain. Current domain: I want to change the server pointing from Wix to the local domain registrar's servers, then do a 301 redirect for only the home page/root domain to point to the new domain hosted with Wix. So the 301 is done via the local registrar and not via Wix (see the .htaccess sketch below for what a root-only redirect looks like). Another point to mention: it will also change from HTTP to HTTPS, as well as the name change. Your comments on the above will be greatly appreciated, and also on whether there is risk in trying to do a 301 redirect as above. Doing it this way is also cheaper: if I do the 301 via the Wix platform, I will need to register a full new premium plan and run it concurrently with the old plan, whereas doing it as mentioned above will only cost the additional annual domain fee. Look forward to your comments. Mike

    | MikeBlue1
    0
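
    For illustration only: on a host where you control Apache, a home-page-only 301 to the new HTTPS domain can be a two-line .htaccess rule like the sketch below (the domain is a placeholder). Registrar-level and Wix redirects are configured through their own interfaces instead, and note that with a root-only rule, any deep URLs on the old domain won't be forwarded:

    ```apache
    RewriteEngine On
    # Redirect only the home page / root to the new HTTPS domain
    RewriteRule ^$ https://www.newdomain.com/ [R=301,L]
    ```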

  • Hi, Wondering if someone could possibly shed some light on why some of our pages are not ranking properly on Google. For example, this page https://www.mypetzilla.co.uk/dog-breeds targets the keyword "Dog Breeds", but we can't be found for it, and we are absolutely baffled as to why. Could it be that we are listing all 100-and-something dog breeds on one page? Should we introduce pagination or a "load more" as the user scrolls down? This page has been up for at least 4 years. Any suggestion or advice would be much appreciated. Many thanks

    | Mypetzilla
    0

  • Hi, I have a couple of fashion clients who have very active blogs and post lots of fashion content and images, like 50+ images weekly. I want to check whether these images have been used by other sources, in bulk. Are there any good reverse image search tools which can do this, or any recommended ways to do this efficiently for a large number of images? (One scripted approach is sketched below.) Cheers

    | snj_cerkez
    0
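
    One scripted route is the Cloud Vision API's web detection feature, which reports other pages where a matching image appears. A rough Python sketch, assuming the google-cloud-vision client library (2.x), a configured GCP project with billing and credentials, and a made-up local folder name:

    ```python
    # pip install google-cloud-vision
    from pathlib import Path

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    for path in sorted(Path("blog_images").glob("*.jpg")):
        # Send each local image for web detection
        image = vision.Image(content=path.read_bytes())
        response = client.web_detection(image=image)
        pages = response.web_detection.pages_with_matching_images
        if pages:
            print(f"{path.name}: appears on {len(pages)} other page(s)")
            for page in pages[:5]:  # show the first few matching pages
                print("   ", page.url)
    ```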

  • Hello, When I do this Google search, this page (the amandine roses category) appears before the one it is canonicalized to (this multi-product version of amandine roses). This happens often with this multi-product template: the multi-product pages don't rank as well as their category versions (which are canonicalized to the multi-product version). Can someone maybe point us in the right direction on what the issue may be? What can be improved?

    | globalrose.com
    0

  • On my company's events calendar page, when you click an event it populates an overlay using AJAX, and the link in that overlay then takes you to the actual event page. I see this as a problem with Google, because it can't follow the AJAX link to the true event page, so right now nothing on those pages is getting indexed and we can't utilize our schema to get events into Google's rich snippets or the knowledge graph. Possible solutions I've considered (see the sketch below): 1. Remove the AJAX overlay and allow the link from the events calendar to go directly to the individual event. 2. Leave the AJAX overlay and try to get the individual event pages directly indexed in Google. Thoughts and suggestions are greatly appreciated!

    | MJTrevens
    0
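
    The two options aren't mutually exclusive: a common pattern is to keep a real, crawlable href on each calendar entry and let JavaScript intercept the click to open the overlay, so bots follow the link while users get the AJAX experience. A rough sketch (the class name and openOverlay() are hypothetical):

    ```html
    <a href="/events/spring-gala" class="event-link">Spring Gala</a>

    <script>
      // Crawlers see a normal link; JS hijacks the click for the overlay
      document.querySelectorAll('.event-link').forEach(function (link) {
        link.addEventListener('click', function (e) {
          e.preventDefault();
          openOverlay(link.getAttribute('href')); // hypothetical overlay loader
        });
      });
    </script>
    ```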

  • Hi all, I have been looking into this for about a month and haven't been able to figure out what is going on with this situation. We recently did a website re-design and moved from a separate mobile site to responsive. After the launch, I immediately noticed a decline in pages crawled per day and KB downloaded per day in the crawl stats. I expected the opposite to happen, as I figured Google would be crawling more pages for a while to figure out the new site. There was also an increase in time spent downloading a page; this has since gone back down, but pages crawled has never gone back up. Some notes about the re-design: URLs did not change; mobile URLs were redirected; images were moved from a subdomain (images.sitename.com) to Amazon S3; we had an immediate decline in both organic and paid traffic (roughly 20-30% for each channel). I have not been able to find any glaring issues in Search Console, as indexation looks good and there is no spike in 404s or mobile usability issues. Just wondering if anyone has an idea or insight into what caused the drop in pages crawled? Here is the robots.txt (crawl stats screenshot: https://ibb.co/fSAOL0):

    ```
    User-agent: ShopWiki
    Disallow: /

    User-agent: deepcrawl
    Disallow: /

    User-agent: Speedy
    Disallow: /

    User-agent: SLI_Systems_Indexer
    Disallow: /

    User-agent: Yandex
    Disallow: /

    User-agent: MJ12bot
    Disallow: /

    User-agent: BrightEdge Crawler/1.0 ([email protected])
    Disallow: /

    User-agent: *
    Crawl-delay: 5
    Disallow: /cart/
    Disallow: /compare/
    ```

    | BandG
    0

  • Hi, I am doing the SEO for a webshop, which has a lot of linking and related websites on the same root domain. So the structure is for example: Root domain: example.com
    Shop: shop.example.com
    Linking websites to shop: courses.example.com, software.example.com,... Do I have to check which keywords these linking websites are already ranking for and choose other keywords for my category and product pages on the webshop? The problem with this could be that the main keywords for the category pages on the webshop are mainly the same as for the other subdomains. The intention is that some people immediately come to the webshop instead of going first to the linking websites and then to the webshop. Thanks.

    | Mat_C
    0

  • I have a client with a site with an m-dot mobile version. They will move it to a responsive site sometime next year, but in the meanwhile I have a massive doubt. This m-dot site has some 30k pages indexed in Google. Each of these pages is bidirectionally linked to the www. version (rel="alternate" on the www, rel="canonical" on the m-dot; see the snippet below). There is no noindex on the m-dot site, so I understand that Google might decide to index the m-dot pages regardless of the canonical to the www site. But my doubt stays: is it a bad thing that both versions are indexed? Is this having a negative impact on the crawl budget? Or risking some other bad consequence? And how is mobile-first going to impact this? Thanks

    | newbiebird
    0
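
    For reference, the standard bidirectional annotation described above looks like this (placeholder URLs). The pairing is meant to tell Google the two URLs are equivalents so their signals get consolidated, which is why both being technically indexed is normal rather than a duplicate-content problem:

    ```html
    <!-- On https://www.example.com/page -->
    <link rel="alternate" media="only screen and (max-width: 640px)"
          href="https://m.example.com/page" />

    <!-- On https://m.example.com/page -->
    <link rel="canonical" href="https://www.example.com/page" />
    ```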

  • Hello Moz Community :) Back story: for the past 8 years we have been running our main e-commerce site on Magento 1; for the past year we have been building out a new, far superior version on Magento 2. The new Magento 2 site is essentially done; we have a couple of tiny last tweaks/improvements to complete, and all signs point to it being 100% done by January 31, 2019. The new site is currently totally functional and fully operational: you could complete a checkout no problem. We currently have the site blocked from being crawled; although we have rewritten all the site content and taken all new photos, we've been concerned with duplicate content issues.
    I have attached an image that is a very good representation of our average sales pattern in a typical year. We are a seasonal business, so there are big highs and lows. Concerns: my hope was to launch the site around March 1, 2019. I have the 301s meticulously prepared in a spreadsheet and ready to go. My concern, though, is making this change right before the busy season hits; on the other hand, there never really is an ideal time to make the change, as cash flow is equally important, if not more so, during the slow season. Ideally, I would allow the site to be crawled and run them both in tandem for a while, but I'm concerned about potential duplicate content issues even though the content has been significantly altered. Main question: what's my best bet? 1) Do the 301s in March and hope for the best. 2) Allow the site to be indexed, risk duplicate content problems, and run them both in tandem for a while. Final thoughts: the old site is really starting to fall apart, and every day I keep it open and running we steal resources from moving forward with the new site. Any thoughts, direction, suggestions or input would be greatly appreciated.

    | Shop-Sq
    0

  • I've been seeing great results with my efforts in the last few months. But I think my workflow is a mess, if not non-existent! Does anyone have a specific workflow that I could use as a base?

    | Madstrategist
    1

  • Hi, I have seen a number of websites that keep their homepage as the first link from every page in their subfolders, and sometimes even from subdomain pages. For example, linking to the website homepage from the top navigation menu on every page of their blog. Will this help boost the ranking of the homepage if we link to it from subfolder or subdomain pages that hold related content like blogs or help guides? Thanks

    | vtmoz
    0

  • Hi Guys, So we have a whole load of mystery URLs showing in Analytics. The URLs are completely irrelevant and have somehow been created; however, when you click on them, they all go to 404 pages, pages not found. The website is a travel website but is showing pages like /overcome-fatigue-during-mesothelioma-treatment/ in Analytics. Webmaster tools is not showing any of these pages, but Analytics is showing traffic for them. My initial thought was that it was a spam URL injection, but they are not pages; they don't exist. Our database is fine and WP admin seems fine; none of these supposed pages have been created in WP. So why are they showing in Analytics as having driven traffic? None of the URLs are indexed on Google. It's a mystery! Can anyone help? Has anyone seen this before?

    | CayenneRed89
    0

  • Dear All, Have a question. We've a client (pharma) who has a prescription medicine approved only in the US, and has only one global site at .com, which is accessed by all their target audience all over the world.
    For the rest of the world, we can create a replica of the home page (which actually features that drug), minus the existence of the medicine, and set an IP filter so that non-US traffic sees the duplicate of the home page. The question is how best to tackle this semi-duplicate page. Possibly noindex won't do, because that will block the page for the non-US geography. Hreflang possibly won't work here, because we are not dealing with different languages; we are dealing with the same language (En) but different geographies (though see the x-default sketch below). Canonical might be the best way to go? Wanted to have an insight from the experts. Thanks,
    Suparno (for Jeff)

    | jrohwer
    1
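
    For what it's worth, hreflang values are language or language-region codes plus an x-default catch-all, so same-language, different-geography targeting is within its remit, provided the two variants live at distinct URLs (if the content is swapped by IP on a single URL, Googlebot, which crawls mostly from US IPs, would only ever see the US version). A placeholder sketch, assuming separate URLs for the two variants:

    ```html
    <!-- Referenced from both the US page and the rest-of-world page -->
    <link rel="alternate" hreflang="en-us" href="https://www.example.com/us/" />
    <link rel="alternate" hreflang="x-default" href="https://www.example.com/" />
    ```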

  • Hello Everyone! I'm new here! My husband and I are working on creating a website: https://sacwellness.com. The site is an online therapist directory for the Sacramento, California area. Our problem is this: in WordPress, our category system is being used for blog posts, while our theme uses a custom taxonomy system to categorize different therapist specialties, therapeutic approaches, etc. We've found ourselves in a position where our custom taxonomies and categories are near duplicates. For example, we have the blog categories ADHD counseling, Anxiety therapy, and Career counseling; our corresponding custom taxonomy/therapist categories are ADHD, Anxiety, and... (oops) Career counseling. My understanding is that Google doesn't see a difference between identically named categories and custom taxonomies and will just choose one to rank and disregard the other, effectively leaving you competing against yourself. Is this true in a case like this? Can Google maybe understand the difference because of the custom taxonomy and/or URL paths? If this is a problem, is it OK to have near duplicates, like ADHD vs. ADHD counseling? This has been our solution so far... but now we're questioning it... derp x_x. I thought about tagging the categories with noindex, but I think the archive pages would be useful for people. Essentially we have 2 sets of archives for each keyword: one is for blog posts, and one is for therapists who work with that particular issue, along with the 6 most recent blog posts in that category. Because we are putting the 6 most recent blog posts at the bottom of the therapist pages, I feel like it wouldn't be as terrible a loss if we had to noindex the category pages. What do you think? Thank you!

    | angelamaemae
    0

  • Hello, If you look at the top keyword in one of our companies, and across the long tail, you'll see one thing. Not better companies for the industry. Not companies with better information. Just trusted names. Trust, trust, trust: big guys you can trust and little guys you can trust. So since everything's about branding trust in this niche, how do you revamp a website to instill new trust and thus increase its rank? Resources are always helpful, and direct answers are more than welcome. Thanks.

    | BobGW
    1

  • Hi everyone, I am trying to add span tags in the H1, break it onto 2 lines with a break tag, and style each line of the H1 differently (see the sketch below). I might use a smaller font for line 2 as well. Is this SEO friendly? Will crawlers read the entire text, or can the markup interfere and block it? Thank you!

    | bgvsiteadmin
    0
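
    Spans and breaks inside a heading are ordinary inline markup; crawlers read the heading's full text content regardless of how it is styled or split across lines. A minimal sketch (class names are made up):

    ```html
    <h1>
      <span class="h1-main">First line of the heading</span><br>
      <span class="h1-sub">Second line, styled smaller</span>
    </h1>

    <style>
      .h1-sub { font-size: 0.6em; }
    </style>
    ```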

  • Hello, I manage https://globalrose.com. When I search on Google for "Yellow Roses", "Yellow Roses Globalrose", or any search that might bring up one of our pages, sometimes our search results appear with dates right before the description. Does anyone know what this means? Why do dates appear on some pages and not others? Here is a search result for example: Example Google Search. Can someone please help clarify this for us?

    | globalrose.com
    0

  • Hello! I did something dumb back in the beginning of September: I updated Yoast and somehow noindexed a whole set of custom taxonomies on my site. I fixed this and then asked Google to validate the fixes on September 20. Since then they have gotten through only 5 of the 64 URLs... is this normal? Just want to make sure I'm not missing something that I should be doing. Thank you! ^_^

    | angelamaemae
    0

  • Hello! I have created a directory website with a pretty active blog. I probably messed this up, but I pretty much have categories (for my blog) and custom taxonomies (for different categories of services) that are very similar. For example, I have the blog category "anxiety therapists" and the custom taxonomy "anxiety". 1. Is this a problem for Google? Can it tell the difference between archive pages in these different categories even though the names are similar? 2. Should I noindex my blog categories, since the main purpose of my site is to help people find therapists, i.e. my custom taxonomy?

    | angelamaemae
    0

  • Hi, I have launched a very new website and am tackling this term: nlp techniques (3K) with this page. It's a 1-year target. I'm developing 100 new processes for the page; I've put up 11 so far. It looks like the top contenders have a lot to offer in terms of the number and quality of techniques. They are honestly just dumping their regular training material onto a big area of their website, whereas what I'm doing is developing new, innovative techniques from scratch that do the same thing and hopefully more. Let me know what you think I can do to move up the ranks. I can't do outreach for link building since this is a social community.

    | BobGW
    0

  • Hi all, My website uses relative URLs, with PHP that reads a user's IP address and updates the page's referenced canonical tag to a region-specific absolute URL for ranking/search results. E.g. www.example.com/category/product is the relative URL referenced for internal links/external link building. If a US IP address hits this link, the URL is the same, but the canonical in the source is updated to reference www.example.com/us/category/product, so all ranking considerations are pointed to that page instead. None of these region-specific pages are actually used internally within the site. This decision was made so external links/blog content would fit a user no matter where they were coming from. I'm assuming this is an issue in trying to pass link equity with Googlebot, because it splits the strength between different absolute canonical pages depending on what IP it's using to crawl said links (as the relative URL will dynamically alter the canonical reference, which is what ranks in SERPs). Any assistance or information, no matter how small, would be invaluable. Thanks!

    | MattBassos
    0

  • On 26 October 2018 my website had around 1 million pages indexed on Google, but an hour later, when I checked, my website was banned from Google and all pages were removed. I checked my GWT and I did not receive any message. Can anyone tell me what the possible reasons are and how I can recover my website? My website link is https://www.whoseno.com

    | WhoseNo
    0

  • Hi, everyone. Basically I am editing my website pages' URLs for SEO optimisation and I am not sure which URL structure is best for SEO. The main difference is the separator (dash or slash) before the product code. Here are two examples: www.example.com/long-tail-keyword-product-code and www.example.com/long-tail-keyword/product-code. To get more of an idea of my pages, here is one of the products from my website: http://www.okeus.co.uk/pro_view-3.html. My website sells my own products; as a result, the only keyword to be found was the name of the product, and I separate the different designs by different codes. Any experts who are willing to help would be very much appreciated.

    | chrisyu78
    1

  • I work with an organization that is ranking #2 for a branded search term, second to a competitor. They have zero similarity between their names, and we've worked with them to up their SEO game around all major areas (one drawback: SquareSpace is killing their site speed). Their DA is 59, the competitor's DA is 77. What are some smart, specific ways that we can help our client come back out on top?

    | ogiovetti
    0

  • Basically we get a lot of users uploading photos as part of their reviews, but many photos aren't moderated onto our pages and therefore are never displayed: things like selfies rather than photos of the product, or random Google images that are completely unrelated to our products or services. Is there any benefit in cleaning up the gallery, since the images we don't use just sit there in admin? With our SEO hat on: when a page loads, would it be quicker if we had less content in the gallery, or does it not matter, since that content (the photos) isn't being loaded anyway?

    | Fubra
    0

  • Hello, I really hope someone can help. We recently moved our website from a shared server with one host to a VPS with another. At the same time we decided it would be right to switch from the .co.uk to the .com and also purchase an SSL certificate. Since the switch (3 weeks ago) we have had zero enquiries, when we would normally average one a day. According to Google Analytics, I cannot see that traffic has been adversely affected, and rankings, though dropping very slightly, have not dramatically fallen. We have tested the site rigorously and there are no issues with it that we can see. I ensured at domain level that there was a 301 redirect on the .co.uk site as well. Does anyone have any suggestions as to why this would be the case? And/or whether switching it all back to the .co.uk would be a foolish idea? Many thanks!

    | Opus4Marketing
    0

  • Hi All, I'm trying to find more information on what IP address Googlebot would use when arriving to crawl your site from an external backlink. I'm under the impression Googlebot uses international signals to determine the best IP address to use when crawling (US / non-US) and then carries on with that IP when it arrives at your website. E.g. Googlebot finds www.example.co.uk; due to the ccTLD, it decides to crawl the site with a UK IP address rather than a US one. As it crawls this UK site, it finds a subdirectory backlink to your website and continues to crawl your website with the aforementioned UK IP address. Is this a correct assumption, or does Googlebot look at altering the IP address as it enters a backlink / new domain? Also, are ccTLDs the main signal for determining whether Google switches to an international IP address to crawl, rather than the standard US one? Am I right in saying that hreflang tags don't apply here at all, as their purpose is to be used in SERPs, helping Google determine which page to serve to users based on their IP etc.? If anyone has any insight this would be great.

    | MattBassos
    0

  • Hello! Though I browse MoZ resources every day, I've decided to directly ask you a question despite the numerous questions (and answers!) about this topic as there are few specific variants each time: I've a site serving content (and products) to different countries built using subfolders (1 subfolder per country). Basically, it looks like this:
    site.com/us/
    site.com/gb/
    site.com/fr/
    site.com/it/
    etc. The first problem was fairly easy to solve:
    Avoid duplicated content issues across the board considering that both the ecommerce part of the site and the blog bit are being replicated for each subfolders in their own language. Correct me if I'm wrong but using our copywriters to translate the content and adding the right hreflang tags should do. But then comes the second problem: how to deal with duplicated content when it's written in the same language? E.g. /us/, /gb/, /au/ and so on.
    Given the following requirements/constraints, I can't see any positive resolution to this issue:
    1. Need for such structure to be maintained (it's not possible to consolidate same language within one single subfolders for example),
    2. Articles from one subfolder to another can't be canonicalized as it would mess up with our internal tracking tools,
    3. The amount of content being published prevents us from producing bespoke content for each region of the world with the same spoken language. Given those constraints, I can't see a way to sort this out, and it seems that I'm cursed to live with those duplicated content red flags right up my nose.
    Am I right, or can you think of anything to sort that out (see the hreflang sketch below)? Many thanks,
    Ghill

    | GhillC
    0
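
    Same-language regional variants are what hreflang's language-region codes exist for, and duplicate-content flags from third-party tools don't necessarily translate into a Google problem once the cluster is annotated. A minimal sketch for one article, with the same block repeated on every regional version of it (URLs follow the subfolder structure described above but are otherwise placeholders):

    ```html
    <link rel="alternate" hreflang="en-us" href="https://site.com/us/article/" />
    <link rel="alternate" hreflang="en-gb" href="https://site.com/gb/article/" />
    <link rel="alternate" hreflang="en-au" href="https://site.com/au/article/" />
    <link rel="alternate" hreflang="fr-fr" href="https://site.com/fr/article/" />
    <link rel="alternate" hreflang="x-default" href="https://site.com/us/article/" />
    ```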

  • Hello! I posted about this a week ago but haven't solidly figured it out yet. I'm building a website that is a directory of local therapists. I have categories for my blog and a custom taxonomy to classify therapists. My problem is that my categories and my custom taxonomy overlap by necessity. For example, I have the category "anxiety therapy" and the custom taxonomy "anxiety". Will this confuse Google? Do you think Google will be able to figure out the differences between my blog archives and my therapist listing archives, even though their names are similar and in a couple of cases the same? Should I noindex my categories, because the point of my site is to get customers to the directory, not the blog, even though the blog has lots of useful content? I should note here that I have my custom taxonomy pages set up so that they display the 6 most recent blog posts in the corresponding category at the bottom of the page, so maybe that makes noindexing the categories more OK? Thank you for your help!

    | angelamaemae
    0

  • Hello, Just wondering if frames are read by Googlebot, and if so, how are they treated differently from anything else? Good or bad for SEO? Thank you,

    | seoanalytics
    0

  • Hello everyone! I'm here for the first time, and glad to be part of the Moz community! Jumping right into my question: for one type of page we have on our website, there are multiple tabs on each page. To give an example, say a page is for information about a place called "Ladakh". The various URLs that the page is accessible from can take the form of: mywanderlust.in/place/ladakh/, mywanderlust.in/place/ladakh/photos/, mywanderlust.in/place/ladakh/places-to-visit/, and so on. To keep the UX smooth when the user switches from one tab to another, we load everything in advance with AJAX, but it remains hidden until the user switches to the required tab. Now, since the content is actually there in the HTML, does Google count it as duplicate content? I'm afraid this might be the case, as when I Google for a text that's visible only on one of the tabs, I still see all tabs in the Google results. I also see internal links in GSC to, say, a page mywanderlust.in/questions which is only supposed to be linked from one tab, but GSC reports internal links to this page from all three tabs. Also, Moz Pro crawl reports informed me about duplicate content issues, although surprisingly it says the issue exists only on a small fraction of our indexable pages. Is it hurting our SEO? Any suggestions on how we could handle the URL structure better to make it optimal for indexing? FWIW, we're using a fully responsive design with the displayed content being exactly the same for both desktop and mobile web. Thanks a ton in advance!

    | atulgoyal
    0

  • Hello SEOs, Recently some of my VIPs (Very Important Pages) have slipped, and all the pages above them are AMP. I've been putting off switching to AMP for as long as possible because I've heard it's a very mixed bag. As of Oct 2018, what do people think? Is it worth doing? Is there a preferred plugin for WordPress? Are things more likely to go right than wrong? The page that has been hit the hardest is https://humanfoodbar.com/plant-paradox-diet/plant-paradox-diet-full-shopping-list-for-lectin-free-diet/. It used to bring in ~70% of organic traffic. It was #1 and is now often near the bottom of the page. 😞 Thanks all! Remy

    | remytennant
    1

  • I’ve been given the task of optimizing a company’s websites (15 in total) that has multiple websites selling the same product. In terms of optimizing them, can I use the same set of meta descriptions, page title tags and key words for them all or do I need to produce a different set for each? The sites are for independently branded companies that are set up in a franchise-like arrangement. They all exclusively sell the parent companies joinery products

    | aplnzoctober18
    0

  • Hi everyone! We are ranked #1 for about 30 product pages at www.oldsite.com/product1 and we want to move those pages to a new site at www.newsite.com/product1 (new domain and hosting, which we own). What is the best way to do this? I'm confused about whether you recreate those pages on the new domain vs. FTP-move them, set up 301 redirects, etc. Looking for the things we must do and the sequence to do them in (see the redirect sketch below). Thanks so much!

    | Jamesmcd03
    0
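
    On the sequencing question, the usual order is: recreate (or copy) the pages on the new domain, verify they render, then 301 each old URL to its new counterpart and update the sitemap. On an Apache host, the per-page redirects can be as simple as this .htaccess sketch on the old domain (paths are placeholders):

    ```apache
    # In oldsite.com's .htaccess, one rule per moved page
    Redirect 301 /product1 https://www.newsite.com/product1
    Redirect 301 /product2 https://www.newsite.com/product2
    ```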

  • Hello, We are a healthcare company with strong domain authority and several thousand pages of service-related content at brand.com. We've been operating an ancillary ecommerce store that sells related 3rd-party products at brand.com/shop for a little over a year. We recently invested in a platform upgrade and moved our store to a new domain, brandshop.com. We implemented page-level 301 redirects, including all category pages, product detail pages, canonical and non-canonical URLs, etc., with the understanding that there would not be any loss in page rank. What we're seeing over the last 2 months is an initial dive in organic traffic, followed by a ramp-up period of impressions (but not position) in the following weeks, then another drop, and we've been steady at this low for the last 2 weeks. Another area that might have hurt us: the 301 redirects were implemented correctly immediately post-launch (on a Wednesday), but it was discovered the following Monday that our .htaccess file had reverted to an old version without the redirect rules. For 3-4 days, all traffic was being redirected from brand.com/shop/url to brandshop.com/badurl. Can we expect to recover our organic traffic given the launch screw-up with the .htaccess file, or is it more of an issue with us separating from the brand.com domain? Thanks,
    Eugene

    | eugene_p
    0

  • Hello, My company is based in Switzerland, with a Swiss address and a US number, but my clients are only in the USA. I only have links from US websites and none from Swiss websites. Can I be penalised by Google for that? Thank you,

    | seoanalytics
    0

  • A client of mine lost their domain when an ex-business partner sold it out from under them. They've filed with WIPO, but in the meantime we're trying to figure out how to help them out. They had two really excellent links: one from the NY Times and one from a .edu website. I'm going to reach out to the authors of those articles (the articles are pretty old, so I doubt they'll change the links), but does anyone have any advice on how to let search engines know the new domain replaces the old without the ability to do redirects? The content on the site is exactly the same; we were able to get the files over, happily. I've re-submitted the site for indexing, changed the domain links in Moz Local, changed it in Analytics, and on all their social sites. Is there anything I'm not thinking of that can be done to let Google know that this new domain replaces the old? Thank you!

    | newwhy
    0

  • We just launched a website about 1 month ago and noticed that Google was indexing, but not displaying, URLs with "?location=" parameters such as http://www.castlemap.com/local-house-values/?location=great-falls-virginia and http://www.castlemap.com/local-house-values/?location=mclean-virginia. Instead, Google has only been displaying our root URL http://www.castlemap.com/local-house-values/ in its search results, which we don't want, as the URLs with specific locations are more important and each has its own unique list of houses for sale. We have Yoast set up with all of these ?location values added to our sitemap, which has successfully been submitted to Google's Sitemaps: http://www.castlemap.com/buy-location-sitemap.xml. I also tried going into the old Google Search Console and setting the "location" URL parameter to Crawl Every URL with the Specifies Effect enabled... and I even see the two URLs I mentioned above in Google's list of Parameter Samples... but the pages are still not being added to Google. Even after requesting indexing again after making all of these changes a few days ago, these URLs are still displaying as allowing indexing but "Not on Google" in the Search Console, and not showing up on Google when I manually search for the entire URL. Why are these pages not showing up on Google, and how can we get them to display? The only solution I can think of would be to set our main /local-house-values/ page to noindex in order to have Google favor all of our other URL parameter versions... but I'm guessing that's probably not a good solution, for multiple reasons.

    | Nitruc
    0

  • Hi everyone, We have more than 20 websites for different regions, and all the sites have their specific ccTLD. The thing is, we are having conflicts in the SERPs for our English sites, and almost all the English sites have the same content; I would say 70% of the content is duplicated. Despite having proper hreflang, I see .co.uk results in Google US, and not only .co.uk but also other sites showing up (xyz.in, xyz.ie, xyz.com.au). The tags I'm using are below. If the site is for the US, I'm using a canonical and hreflang tag:

    ```html
    <link rel="canonical" href="https://www.xyz.us/" />
    <link rel="alternate" href="https://www.xyz.us/" hreflang="en-us" />
    ```

    and for the UK sites:

    ```html
    <link rel="canonical" href="https://www.xyz.co.uk/" />
    <link rel="alternate" href="https://www.xyz.co.uk/" hreflang="en-gb" />
    ```

    I know we have ccTLDs so we don't have to use hreflang, but since we have duplicate content, we added hreflang just to be safe, and from what I have heard/read there is no harm in having hreflang (of course, if implemented properly). Am I doing something wrong here? Or is it conflicting due to canonicals for the same content in different regions, so we are confusing Google (and Google is showing the most authoritative and relevant results)? Really need help with this. Thanks,

    | shahryar89
    0

  • Hey guys, I would love to hear your thoughts on how you think SEO will change in the 2020s. The 2010s saw some pretty cool stuff like Panda, Penguin, and penalties for non-mobile-friendly, non-secure and slow-loading sites. What will be more or less important for SEOs in the 2020s than today? How will machine learning and AI change SEO?

    | GreenHatWeb
    0

  • Hello, For example, let's say I do hiking tours in different regions, and all my pages are presented the same way, with the highlights, hotels, what is included, the price, the level and the dates. I guess that across my pages the meta description is going to be nearly the same; the only thing that changes is the destination. Is it OK to do it this way? I know duplication isn't recommended, but with this type of setup I have no idea how to write different metas, since all the pages present the same things (see the templated example below). Thank you,

    | seoanalytics
    0
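
    Since meta descriptions are generally treated as a click-through factor rather than a direct ranking factor, one workable compromise is a template that swaps in the details that genuinely differ per tour: destination, duration, level, price, dates. A hypothetical pair:

    ```html
    <!-- /tours/alps -->
    <meta name="description" content="7-day guided hiking tour in the Alps: moderate level, mountain hotels, meals included, from EUR 1,450. Dates June to September.">

    <!-- /tours/pyrenees -->
    <meta name="description" content="5-day guided hiking tour in the Pyrenees: easy level, mountain lodges, breakfast included, from EUR 980. Dates May to October.">
    ```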

  • Hi Guys, So we manage a client's website, doing their SEO and PPC. The site has become a success, so the client has now asked if we would like to create our own site and become an affiliate of theirs. The idea is to target the same set of keywords etc. My question is: in the world of Google, is this OK? I know about Google penalising same-business owners for having two websites targeting the same keyword... but in this case the websites are owned by different owners, with different hosting, different domain ownership, different analytics code, different code development, and a different about-us. Everything is different, but I am just a little paranoid that Google knows we SEO the client's website. Does anyone have any advice? Thanks, Duncan

    | CayenneRed89
    0

  • Hello, I use the Divi theme and have pages that were automatically generated with images. Is Google going to penalise me because of those and consider it thin content? Should I remove them? Thank you,

    | seoanalytics
    0

  • I realise this is incredibly controversial! And I also realise I'll get a ton of trolls tearing me to shreds, but… I need to run a short-term experiment, and to do the experiment I need to get a test site ranking high very quickly (I'm not worried if it gets penalised; it is only a short-term test).

    | seoman10
    0

  • Hi all, I have pages with schema on, but there are some gaps. Rather than ask my dev team / wait for the changes to be made, can I use the data highlighting tool in GSC to fill in these gaps? Will it let me add these, and will Google generally consider both the schema and the highlighted data? To note: if I highlight data in GSC and then test the page in Google's Structured Data Testing Tool, the highlighting won't show, so I understand it may be difficult to test whether it's working or not. Any advice would be appreciated. Thanks!

    | KJH-HAC
    0

  • How does Google decide which images show up in the image search section? Is it based on the alt tag of the image, or is Google able to detect what the image is about using neural nets? If it is using neural nets, are the images you put on your website taken into account to rank a page? Let's say I do walking tours in Italy and put a picture of the Leaning Tower of Pisa as the top image: will I be penalised because, even though the picture is in Italy, you don't see anyone walking? Thank you,

    | seoanalytics
    1


