Category: Intermediate & Advanced SEO
Looking to level up your SEO techniques? Chat through more advanced approaches.
-
Keyword Cannibalization on a Professional Service Firm Site
Hi all: We do ongoing SEO for a tax law firm. Their home page, which contains very little text, has a title tag built around the phrase 'tax attorneys and preparers.' We are getting warnings from our SEO software that individual bio pages for practitioners are cannibalizing the homepage for the keyword 'tax attorney.' Should I be concerned? The head of this firm is a very well-known 'tax attorney.' It's kind of hard to describe him differently, but we keep getting told his page competes with the firm's homepage for this search string. Thanks in advance.
| Daaveey1 -
Having a keyword in the H1 is not that important in 2018, do you agree?
Earlier, having a keyword in the H1 was considered an important ranking factor, or at least every SEO guru used to suggest it. But of late, we are noticing that Google is not giving much weight to it. What are your thoughts on this?
| SameerBhatia3 -
Sudden drop in rankings - hoping someone knowledgeable can spot the solution
Dear all, my site previously held top results (first position on the first page) for: 1. "Vastu" (landing page: https://www.subhavaastu.com/vastu.html); 2. "Vaastu" (landing page: https://www.subhavaastu.com/vaastu.html); 3. "Vastu shastra" (landing page: https://www.subhavaastu.com/about-vastu-shastra.html); 4. "Vastu consultant" (landing page: https://www.subhavaastu.com). About 4-5 years back I held first or second position in Google for those keywords; now my site is not visible for them at all. For the keywords below, however, I am still getting very good rankings: 5. "Vasthu" (landing page: https://www.subhavaastu.com/vasthu.html), third position on the first page; 6. "Vaasthu" (landing page: https://www.subhavaastu.com/vaasthu.html), second position on the first page. I have seen good improvements from my own attempts, but I need to get back to the top of the first page for Vastu, Vaastu, Vastu shastra and Vastu consultant. Can anybody help, please?
| SubhaVaastu0 -
How to avoid getting penalized for having the same website in 2 languages
Hi, I have a price comparison website in English on a .com domain. Now that we are expanding, we want to localize the website and target different markets in their local languages. The first market we are targeting is France, and for that purpose we have a separate .fr domain. The French site, however, will have essentially the same content, mostly translated into French. My question is: what is the best way to avoid being penalized by Google for duplicate content? Thanks,
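Translated content on its own ccTLD is generally not what Google's duplicate content filters target, and the usual safeguard is cross-domain hreflang annotations pairing each English page with its French equivalent. A minimal sketch, assuming example.com and example.fr as stand-ins for the real domains and a hypothetical comparison page:

```html
<!-- Sketch only: example.com / example.fr and the paths are placeholders. -->
<!-- In the <head> of the English page: -->
<link rel="alternate" hreflang="en" href="https://www.example.com/compare/laptops" />
<link rel="alternate" hreflang="fr" href="https://www.example.fr/comparer/ordinateurs-portables" />

<!-- The French page must carry the same pair (reciprocal return tags): -->
<link rel="alternate" hreflang="en" href="https://www.example.com/compare/laptops" />
<link rel="alternate" hreflang="fr" href="https://www.example.fr/comparer/ordinateurs-portables" />
```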
| kh-priyam0 -
Could another site copying my content hurt my ranking?
Earlier this week I asked why a page of mine might not be ranking locally (https://moz.com/community/q/what-could-be-stopping-us-from-ranking-locally). Maybe this is part of the answer: another firm has copied huge chunks of my website copy. My company: https://idearocketanimation.com/video-production-company/ The other company: http://studio3dm.com/studio3dm-com/video/ Could this be causing my page not to rank? And is there anything I can do about it, other than huff and puff at the other firm? (Which I am already doing.)
| Wagster0 -
Displaying Vanity URL in Google Search Result
Hi Moz! Not sure if this has been asked before, but is there any way to tell Google to display a vanity URL (that has been 301ed) instead of the actual URL in the SERP? Example: www.domainA.com is a vanity URL (bought specifically for brand identity reasons) that redirects to www.domainB.com. Is it possible to have the domainA URL show up in Google for a branded search query? Thanks in advance! Arjun
| Lauriedechaseaux0 -
How to Handle Spammy Top Referring Domains
We keep getting links from the domain lyricswithoutmelody.org. Currently it's our top referring domain by number of backlinks. I'm not sure what to do with it... is it hurting us? I know I can disavow the domain, but I'm afraid that will hurt, since we have 472 total backlinks from it. Its Trust Flow is 9 and Citation Flow is 11. Another option I was considering is blocking the domain's IP from seeing our website - would that work? Just trying to figure out the best course of action... or whether no action at all is best. I've attached a screenshot of my top referring domains; the ones outlined in red are domains I don't recognize, and I don't know if they're helping or hurting. Moz Fam, HELP!
| LindsayE0 -
If a page ranks in the wrong country and is redirected, does that problem pass to the new page?
Hi guys, I'm having a weird problem. A new multilingual site was launched about 2 months ago. It has correct hreflang tags and geo-targeting in GSC for every language version. We redirected some relevant pages (with good PA) from another website of our client's. It turned out that those pages were not ranking in the correct country markets (for example, the en-gb page ranking in the USA), and the pages on our new site seem to have the same problem. Do you think they inherited it through the redirects? Is it possible that Google will sort things out over time, given that the new pages have correct hreflang tags? Is there anything we can do to help them rank in the correct country markets?
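One thing worth checking, since it is a common cause of hreflang being ignored: every version has to list the full set of alternates, including itself, and each listed page has to link back (reciprocal return tags), or Google may disregard the annotations. A sketch with placeholder URLs, not the actual markup from the site:

```html
<!-- Placeholder URLs; the identical block belongs on both the en-gb and en-us pages. -->
<link rel="alternate" hreflang="en-gb" href="https://www.example.com/en-gb/pricing/" />
<link rel="alternate" hreflang="en-us" href="https://www.example.com/en-us/pricing/" />
<link rel="alternate" hreflang="x-default" href="https://www.example.com/pricing/" />
```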
| ParisChildress1 -
Change of URLs - Part of Migration
We are looking to change our URLs to this format /SKU/TITLE/COLOUR as part of our SEO migration.
| christwix
e.g. https://example.com.au/ac-rck-b/rolla-crew-knit/berry.html At the moment, our URLs are in the format TITLE/NUMBER
e.g. https://example.com.au/rolla-crew-knit/6562563.html
(Shopify is appending a random number to the end of the URL, which represents a different colour.) Is this fine SEO-wise? Will this affect rankings and user experience?0
Why is a canonicalized URL still in index?
Hi Mozers, We recently canonicalized a few thousand URLs but when I search for these pages using the site: operator I can see that they are all still in Google's index. Why is that? Is it reasonable to expect that they would be taken out of the index? Or should we only expect that they won't rank as high as the canonical URLs? Thanks!
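For context, rel=canonical is treated as a hint rather than a directive, and non-canonical URLs can keep showing up under a site: query for quite a while even when Google is honouring the canonical. A minimal example of the tag, with a placeholder URL:

```html
<!-- On the duplicate/parameter URL, pointing at the preferred version (placeholder URL). -->
<link rel="canonical" href="https://www.example.com/widgets/blue-widget/" />
```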
| yaelslater0 -
Is Google Filling in Search Forms?
Hi Mozers, Is it true that Google will fill in search forms and thus generate potentially thousands of pages by itself? We have a spike in the number of pages indexed, and it corresponds to the time we added an advanced search form to the site... Thanks for the advice! Yael
| yaelslater0 -
Major Drop in Traffic Outside AU After Migration
We have a major drop in traffic after migration, especially in the US and other international countries, but we are most worried about the US. Before Migration:
| christwix
February 3 - March 18, 2018 (44 days)
vs
After Migration:
March 20 - May 2, 2018 (44 days) See the GA traffic comparison (screenshot 1). Also, based on SEMrush traffic data, the United States is down by 46% (see screenshot 2) while Australia has decreased by 12% (screenshot 3). What could be the main cause of this? Would really love to hear an in-depth explanation. Keen to hear your thoughts. Cheers,
Footer nofollow links
Just interested to know: when putting links in the footer of a site, some people use nofollow tags, and I'm thinking here of links to internal pages and social networks. Is this still necessary, or is it an old-fashioned idea?
| seoman100 -
Realistic expectations to increase domain authority
A) What is a realistic timeline to increase a website's domain authority by 20 points? B) What are the most important factors in increasing a website's domain authority?
| WebMarkets0 -
Domain Migration Hell!
5 weeks ago we migrated our site to a new domain. We also installed an SSL certificate on the new domain. The new domain was purchased 5 years ago, but we had only used it as a redirect address; it was more consistent with our brand, so we decided to migrate to it. Great care was taken setting up page-to-page redirects, and a formal domain change request was made to Google. In fact, the move was implemented with only a handful of broken links on a 500-page site, and those links were quickly fixed. Our traffic declined from about 350 visitors a week to as low as 40 visitors in the first full week after the move. Now the number of organic Google visits is up to 80 - still a drop of 75%!!! All except 20 (out of 500) pages are reindexed in Google Search Console. Moz domain authority for the new domain has climbed from 5 to about 12; the old domain had a DA of 23. In Google Search Console, hundreds of "URL Not Allowed" errors point to the sitemap for our previous domain, which redirects to our new domain (see attached image). The sitemap for the new domain appears normal, but about 160 pages are indexed that are not in the sitemap. I wonder if these two issues have somehow contributed to the drop in rankings. I have included images showing GSC for the 2 domains. I posted on Moz a month ago and was told it might just take time. There has been no improvement, and now I wonder if there is some issue with the sitemaps causing havoc. Our traffic is down more than 80%. This does not seem normal. Any advice? Any suggestions as to how to expedite recovery? Thanks,
| Kingalan1
Alan0 -
How to fix Duplicate Content Warnings on Pagination? Indexed Pagination?
Hi all! We have a WordPress blog that has pagination tags rel="prev" and rel="next" properly set up, but we're still getting crawl errors in Moz for duplicate content on all of our pagination pages. All of the paginated pages are being indexed as well - I'm talking pages as deep as page 89 of the home page archive. Is this something I should ignore? Is it potentially hurting my SEO? If so, how can I start tackling a fix? Would "noindex" or "nofollow" be a good idea? Any help would be greatly appreciated!
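For reference, a sketch of the head of a paginated archive page (a hypothetical /blog/page/3/), with the rel prev/next annotations described above plus a self-referencing canonical; the optional noindex line is the judgment call being asked about, not something the question confirms is in place:

```html
<!-- Hypothetical /blog/page/3/ of a WordPress archive. -->
<link rel="canonical" href="https://www.example.com/blog/page/3/" />
<link rel="prev" href="https://www.example.com/blog/page/2/" />
<link rel="next" href="https://www.example.com/blog/page/4/" />
<!-- Optional, only if the decision is to keep deep archive pages out of the index: -->
<!-- <meta name="robots" content="noindex, follow" /> -->
```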
| jampaper0 -
Will "repurposing" a keyword on our website affect rankings gained over time?
Hi team! Thinking of "repurposing" a keyword on our website. Reason: when researching this particular keyword, GMS (monthly search volumes) are quite high; however, the new content we're creating is more up to date, and better in general, than the old content this keyword is attached to. How will this affect the rankings we've gained over time? (I.e., will any "age" benefit, gained because that keyword has been in use on our website for a few years, be lost?) Will Google see the keyword/URL as totally new because it's attached to new content that has gone live recently? Thanks
| MariaPuche-Jimenez_Parker0 -
301 Redirect and Canonical link tag pointing in opposite directions!
I'm working on a site which redirects the non-WWW version to the WWW version, so, for example, https://website.com/page redirects to https://www.website.com/page. However, canonical link tags have been set up pointing back to the non-WWW version - so, for example, the canonical on https://www.website.com/page points to https://website.com/page. Question: is this going to cause issues, and should the canonical be updated to the same version as the redirect?
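A sketch of the aligned setup, reusing the example URLs from the question: once the server 301s the non-WWW version to WWW, the canonical on the WWW page should point at the WWW URL as well, so the two signals agree rather than contradict each other:

```html
<!-- On https://www.website.com/page (the redirect target), the canonical matches the redirect: -->
<link rel="canonical" href="https://www.website.com/page" />
```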
| SWEMII0 -
Internal search pages (and faceted navigation) solutions for 2018! Canonical or meta robots "noindex,follow"?
There seems to be conflicting information on how best to handle internal search results pages. To recap: they are problematic because these pages generally result in lots of query parameters being appended to the URL string for every kind of search, while the title, meta description and general framework of the page remain the same - which gets flagged in Moz Pro Site Crawl as duplicate titles, meta descriptions, H1s etc. The general advice these days is NOT to disallow these pages in robots.txt anymore, because there is still value in their being crawled for all the links that appear on the page. But in order to handle the duplicate issues, the advice splits into two camps on what to do: 1. Add a meta robots tag with "noindex,follow" to the page
| SWEMII
This means the page, with all its myriad queries and parameters, will not be indexed, and so takes care of any duplicate meta/markup issues - but the other links on the page can still be crawled and indexed. Better crawling and indexing of the site, though you lose any value the page itself might bring. (A minimal sketch of this tag appears at the end of this question.)
This is the advice Yoast recommended in 2017: https://yoast.com/blocking-your-sites-search-results/ - they are adamant that Google just doesn't like or want to serve this kind of page anyway... 2. Just add a canonical link tag - this will ensure that the search results page itself is still indexed.
All the different query-string URLs, and the array of results they serve, are 'canonicalised' as the same page.
However, this seems a bit dubious, as the results in the page body could all be very different. Also, all the paginated results pages would be 'canonicalised' to the main search page, which we know Google states is not a correct implementation of the canonical tag:
https://webmasters.googleblog.com/2013/04/5-common-mistakes-with-relcanonical.html This picks up on an older discussion here from 2012:
https://moz.com/community/q/internal-search-rel-canonical-vs-noindex-vs-robots-txt
There the advice was leaning towards using canonicals, because the user was seeing a percentage of inbound traffic into these search result pages - but I wonder if that is still the case? As that discussion is now 6 years old, I'm just wondering if there is any new approach, or how others have chosen to handle internal search. I think a lot of the same issues occur with faceted navigation, as discussed here in 2017:
https://moz.com/blog/large-site-seo-basics-faceted-navigation1
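To make option 1 concrete, here is a minimal sketch of the tag on an internal search results URL (placeholder URL); it keeps the page out of the index while still letting crawlers follow the result links on it:

```html
<!-- In the <head> of e.g. https://www.example.com/search?q=red+shoes&sort=price (placeholder URL): -->
<meta name="robots" content="noindex, follow" />
```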
Duplicate Homepage - How to fix?
Hi Everyone, I've tried using the BeamUsUp SEO crawler and found one warning and two errors on our site. The warning is for a duplicate meta description, and the errors are a duplicate page and a duplicate title. For each problem it shows the same two pages as the source of the error, but one has a trailing slash and one doesn't, and both are the homepage: https://www.url.com/ and https://www.url.com Has anyone seen this before? Is this anything we should worry about?
| rswhtn1 -
Related Keywords: How many separate pages?
We have an attorney website. There is a practice area for which our research shows many different 2-4 word keyword queries. The keywords are all very different, but they all end up in the same kind of legal action. We're wondering whether we should write many different pages, perhaps 10, to cover all the basic keyword categories, or whether we should just write a few pages. In the latter case, many of the target keywords would be mentioned in the text but wouldn't get placement in a URL or title tag. One basic problem is that since the keyword queries are made up of different words but result in the same kind of legal action and applicable law, the content of the pages might be similar, with the only difference being a paragraph that speaks to that specific keyword. The rest of the content would be quite similar among the pages, i.e. "here is the law that applies, contact us." Also, some of the keywords, like the name of the law, would have to be repeated on all the pages.
| RFfed90 -
Moz Pro > Links > Top Pages: many are images, useful?
My site is 10 years old, and has always ranked well for the variety of garden tools it sells. Looking at our Moz Pro > Links > Top Pages report I see that many of the "pages" are actually image URLs. And many of those are images we do not even use anymore (though they are still hosted). Question: As a way of gaining some link juice to deeper pages, what about 301 redirecting some of those old images over to appropriate pages? (example: redirecting old-weeding-hoe.jpg to the page garden-hoes.html) Would it be worthwhile? Would it be safe? Thanks for any and all input!
| GregB1230 -
How many images should I use in structured data for a product?
We have a basic printing website that offers business cards. Each type of business card has a few product images. Should we use structured data for all the images, or just the main image? What is your opinion about this? Thanks in advance.
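For what it's worth, the schema.org Product "image" property accepts an array, so all of the product photos can be listed rather than just the main one. A minimal JSON-LD sketch with placeholder names, URLs and prices:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Standard Business Cards",
  "image": [
    "https://www.example.com/images/business-cards-front.jpg",
    "https://www.example.com/images/business-cards-back.jpg",
    "https://www.example.com/images/business-cards-stack.jpg"
  ],
  "description": "Full-colour printed business cards on 350gsm stock.",
  "offers": {
    "@type": "Offer",
    "priceCurrency": "USD",
    "price": "19.99",
    "availability": "https://schema.org/InStock"
  }
}
</script>
```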
| Choice0 -
Content Strategy/Duplicate Content Issue, rel=canonical question
Hi Mozzers: We have a client who regularly pays to have high-quality content produced for their company blog. When I say 'high quality' I mean 1000 - 2000 word posts written to a technical audience by a lawyer. We recently found out that, prior to the content going on their blog, they're shipping it off to two syndication sites, both of which slap rel=canonical on them. By the time the content makes it to the blog, it has probably appeared in two other places. What are some thoughts about how 'awful' a practice this is? Of course, I'm arguing to them that the ranking of the content on their blog is bound to be suffering and that, at least, they should post to their own site first and, if at all, only post to other sites several weeks out. Does anyone have deeper thinking about this?
| Daaveey0 -
Does Moz Have a Tool to Find a Competitor's Broken Links?
I would like to use Moz to identify a competitor's broken links. Does Moz have such a tool? Where is it? My SEO consultant suggests that we have bloggers write content and then link it to our site. I am concerned about the quality of the links that will be generated; he is confident that they will be of high quality and that the process will be relatively quick and easy. My concern is a future spam penalty or low-quality links. Another consultant suggested replacing a competitor's broken links. How effective is this? Is it more labor-intensive than other link building techniques? Thanks,
| Kingalan1
Alan Rosinsky1 -
How to Optimize With Wordpress SEO Plugin YOAST?
Hi Everyone, I am currently using Moz's page optimization format to improve our website's SEO: https://mathandmovement.com/ According to Moz, these are the areas we need to improve on each page: the URL (up to 3 keywords, e.g. www.mysite.com/my-keyword-phrase), the page title (up to 2 keywords, in the format "Primary Keyword - Secondary Keyword - Brand"), the H1 and H2 headers (up to 2 keywords each), the image alt text, and the focus keyword (1, set in Yoast). We are currently using the free version of Yoast for our SEO. My question is this: will our pages still have good SEO if we use appropriate keywords (high monthly volume, below 40 difficulty, high organic CTR) and put them in the format above? Or, since the free version of Yoast only lets us optimize for 1 focus keyword, will we still rank for the other two or three that we put in our meta descriptions, page titles, H1s, H2s, URLs, and body text? Please also let us know what we can do to improve our SEO! Thanks so much, Emma
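As an illustration of that format, here is a sketch of a page head and opening body markup using made-up keywords and a made-up URL; the Yoast focus keyword is set in the plugin's meta box rather than in the markup itself:

```html
<!-- Made-up example; URL (up to 3 keywords): https://mathandmovement.com/multiplication-math-games/ -->
<head>
  <title>Multiplication Math Games - Active Learning - Math &amp; Movement</title>
  <meta name="description" content="Multiplication math games that get students moving while they learn their times tables." />
</head>
<body>
  <h1>Multiplication Math Games</h1>
  <h2>Active Learning with Times Tables</h2>
  <img src="/images/skip-counting-mat.jpg" alt="multiplication math games floor mat" />
</body>
```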
| emmamathandmovement0 -
The client wants to close the current e-commerce website and open a new one.
The client wants to close the current e-commerce website and open a new one on a completely different engine, without losing income. I have no idea how to approach this. The old site has over 100,000 pages and in terms of SEO is quite good - we hit almost every important keyword in our niche - but thanks to heavy modifications of the source code the site has become unmaintainable. Content on the new shop will be almost 1:1 with the old one, but: the domain will be different (I can't get the client to see that this will damage our core brand), so I'm pushing the idea of going with brandname.com/shop instead of newshop.com, because our main brand is well known to our customers - not as well known as the old shop, but still better than the new shop brand; the engine and design will be different; and we will lose almost 30,000 backlinks. Budget: IT only - no content or SEO tools budget. BONUS: before me, the client hired some "SEO magician" - now the SEO audit score in tools like Ahrefs is around 6-12% for the 100,000 pages on the new shop. Great. Does anyone have an idea how to approach such a task with minimal losses?
| meliegree0 -
Javascript and SEO
I've done a bit of reading and I'm having difficulty grasping it - can someone explain it to me in simple language? What I've gotten so far: JavaScript can block search engine bots from fully rendering your website, and if bots are unable to render your website, they may not be able to see important content and may leave that content out of their index. To know whether bots can render your site, check the following: Google Search Console Fetch and Render; turning off JavaScript in your browser to see which site elements disappear; an online tool such as the Technical SEO Fetch and Render; Screaming Frog's Rendered Page tab; and GTmetrix results (if it recommends "Defer parsing of JavaScript", that supposedly means there are elements being blocked from rendering???). Using our own site as an example, I ran it through all the tests listed above. Results: Google Search Console rendered only the header image and text; anything below wasn't rendered, and the resources Googlebot couldn't reach include Google Ad Services, Facebook, Twitter, our call tracker and Sumo, all "Low" or blank severity. With JavaScript turned off, only the logo and navigation menu show; anything below didn't render/appear. Technical SEO Fetch and Render: our page rendered fully for Googlebot and Googlebot Mobile. Screaming Frog: the Rendered Page tab is blank - it says "No Data". GTmetrix: "Defer parsing of JavaScript" was recommended. From all these results, across all the tools I used, how do I know what needs fixing? Some tests rendered our site fully and some didn't, and with varying results I'm not sure where to go from here.
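For what it's worth, GTmetrix's "Defer parsing of JavaScript" recommendation usually just means loading scripts so they don't block the initial render, typically via the defer or async attribute; a small sketch with made-up script paths:

```html
<!-- Made-up paths. defer downloads in parallel and runs the script after HTML parsing finishes. -->
<script src="/assets/js/app.bundle.js" defer></script>
<!-- async also downloads in parallel but runs as soon as the script arrives, in no guaranteed order. -->
<script src="https://www.example.com/third-party/widget.js" async></script>
```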
| nhhernandez1 -
410 or 301 after URL update?
Hi there, A site I'm working on at the moment has a thousand "not found" errors in Google Search Console (and of course, I'm sure there are thousands more it's not showing us!). The issue is that a lot of them seem to come from a URL change. The damage has been done - the URLs have been changed and I can't undo that - but, as you can imagine, I'm keen to fix as many as humanly possible. I don't want to go mad with 301s, but for external links in, this seems like the best solution? On the other hand, Google is reading internal links that simply aren't there anymore. Is it better to hunt down the new page and 301 to it anyway? Or should I 410 and grit my teeth while Google crawls and recrawls, warning me that the page really doesn't exist? Essentially I guess I'm asking: how many 301s are too many, and will they affect our DA? And what's the best solution for dealing with mass 404 errors, many of which aren't linked from any other pages anymore? Thanks for any insights 🙂
| Fubra0 -
If I disavow bad backlinks to my website and later build links from those websites again, will they count again?
Hi all, please tell me: if I disavow bad backlinks to my website and then later create backlinks on those same websites again, will those links be counted among my backlinks again?
| sourav60 -
Website structure - best tools to analyse and plan, visually
Hi - I am about to analyse and then re-plan the structure of a website, and think it would be best to do it graphically, in the form of a chart. Are there any tools you would recommend for visualising the structure of an existing website (perhaps something that can scan and then represent a website), or for planning out a new/revised one? Thanks in advance, Luke
| McTaggart0 -
Weird site is linking to our site and the links appear to be broken
We have a lot of weird links indexed from this page: http://kzs.uere.info/files/images/dining-table-and-2-upholstered-chairs.html When clicking the link, it returns a 404. Also, the spam score is huge. What do you guys suggest we do with this?
| Miniorek
Could it have been done by somebody to push our rankings down or get the domain penalized? Best Regards
Mike & Alex0 -
Google Indexing Request - Typical Time to Complete?
In Google Search Console, when you request the (re)indexing of a fetched page, what's the average amount of time it takes to re-index? Does it vary much from site to site, or are manual re-index requests put in a queue and served on a first-come, first-served basis regardless of site characteristics like domain/page authority?
| SEO18050 -
New link explorer
I was checking out this new tool - which is really cool, by the way - and was wondering if I can outrank the big guys with just content. I have a domain authority of 28 with a spam score of 28%. Can I outrank, with amazing content, a site that has a domain authority of 50 and a spam score of 1%? Should I ask for all my bad links to be removed so that my spam score goes down, or doesn't that matter anymore these days - is good content what matters, and links just don't count anymore? Thank you,
| seoanalytics1 -
Javascript content not being indexed by Google
I thought Google had gotten better at picking up unique content from JavaScript, but I'm not seeing it with our site. We rate beauty and skincare products using our algorithms. Here is an example of a product: https://www.skinsafeproducts.com/tide-free-gentle-he-liquid-laundry-detergent-100-fl-oz When you look at the cached page (text version) from Google, none of the core ratings (badges like fragrance free, top free and so forth) are being picked up for ranking. Any idea what we could do to have the ratings incorporated in the indexation?
| akih0 -
New Website SEO Implications
Hi Moz Community, A client of mine has launched a new website. The new website is well designed, mobile friendly, fast loading and offers a far better UX than the old site. It has similar content, but is 'less wordy'. The old website was tired, slow and not mobile responsive, but it still ranked well, and the domain has market-leading authority and link metrics. Since the launch, the rankings for virtually every keyword have plummeted - even words that previously ranked #1 have disappeared to page 3 or 4. The new pages have different URLs (301s from the old URLs are working fine) and still score the same 98% (using the Moz page optimization tool). Is it usual to experience some short-term pain, or is this rankings drop an indication that something else is missing? My theory is that the new URLs are being treated like new pages, and that those new pages don't have the engagement data that is used for ranking. So, despite having the same authority as the old pages, as far as user data is concerned they are new pages and therefore not ranking well - yet. That theory would make logical sense, but I'm hoping some experts here can help. Any suggestions welcome. Here's a quick checklist of things I have already done: complete 301 redirect list
| I.AM.Strategist
New sitemap
Submitted to console
Created internal links from within their large blog
Optimised all the new pages (img alts, H1s etc) Extra info: Platform changed from Wordpress to Expression engine
Target pages now on level 3 not level 2 (extra subfolder used)
Fewer words used (average word count per page down from 400+ to 250) Thanks in advance 🙂0
How to use Rich Snippets?
Hi there! I have been hearing a lot about Rich Snippets lately but I don't really know how they work. Are they a very important factor to consider for SEO? I would love to know your thoughts about this. Thanks!
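As background: rich snippets (review stars, breadcrumbs, FAQs and so on) are generated from structured data on the page, most commonly JSON-LD using schema.org vocabulary. A small sketch of breadcrumb markup with placeholder names and URLs:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "BreadcrumbList",
  "itemListElement": [
    { "@type": "ListItem", "position": 1, "name": "Home", "item": "https://www.example.com/" },
    { "@type": "ListItem", "position": 2, "name": "Guides", "item": "https://www.example.com/guides/" },
    { "@type": "ListItem", "position": 3, "name": "Using Rich Snippets" }
  ]
}
</script>
```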
| lucywrites0 -
SEO'ing a sports advice website
Hi Team Moz, Despite being in tech/product development for 10+ years, I'm relatively new to SEO (and completely new to this forum) so was hoping for community advice before I dive in to see how Google likes (or perhaps doesn't) my soon-to-be-built content. I'm building a site (BetSharper, an early-stage work in progress) that will deliver practical, data-oriented predictive advice prior to sporting events commencing. The initial user personas I am targeting would need advice on specific games so, as an example, I would build a specific page for the upcoming Stanley Cup Game 1 between the Capitals and the Tampa Bay Lightning. I'm in the midst of keyword research and believe I have found some easier-to-achieve initial keywords (I'm realistic, building my DA will take time!) that include the team names but don't reference dates or the stage of the tournament. The question is, hypothetically, if I ranked for this page for this sporting event this year, would it make sense to refresh the same page with 2019 matchup content when they meet again next year, or create a new page? I am assuming I would be targeting the same intended keywords, but wondering if I get Google credit for 2018 engagement post the 2019 refresh. Or should I start fresh with a new page and specifically target keywords afresh each time? I read some background info on canonical tags but wasn't sure if it was relevant in my case. I hope I've managed to articulate myself on what feels like an edge case within the wonderful world of SEO. Any advice the community delivers would be much appreciated...... Kind Regards James.
| JB19770 -
Rel=canonical Question
Alright, so let's say we've got an event coming up. The URL is website.com/event. On that page, you can access very small pages with small amounts of information, like website.com/event/register, website.com/event/hotel-info, and website.com/event/schedule. These originally came up as having missing meta descriptions, and I was thinking a rel=canonical might be the best approach, but I'm not sure. What do you think? Is there a better approach? Should I have just added a meta description and moved on?
| MWillner0 -
301 Redirect from Authoritative but Loosely-Related Domain
We acquired a health-related blog about a year ago with good domain authority and a pretty strong link profile (TF ~40). We have been publishing good relevant content in it but it's not really paying dividends and we are considering doing a 301 to our money site, which is focused primarily on senior issues but has a lot of health-related content. The question is - with the two domains only being loosely related in subject matter, do we stand to harm our main site by redirecting from the other domain?
| sa_787040 -
Should you shorten very long URLs?
Hi Moz Community! If the nav architecture URL is long, like this: https://www.savewildlife.org/wildlife-conservtion/endangered-species-act-protections/endangered-species-list/birds/mexican-spotted-owl can I and should I shorten that new destination URL to make it easy for Google to see that the page topic is really the owl, like this: https://savewildlife.org/endangered-species-list/mexican-spotted-owl Thank you! Jane
| CalamityJane771 -
Redirection: Load balancer or CNAME?
We had a bunch of domains which no longer have sites tied to them, but many have decent links pointing to them. In most cases we have other relevant content on live sites that we can redirect these URLs to. We have been given the choice of redirecting through the load balancer or directly as a CNAME on our CDN. I only have experience with 301s - what would be the preferred choice from an SEO perspective? Thanks, Sam
| Samsam00000 -
Redirect closed shop to main shop, or keep the domain and content alive and use it for link building?
Hello, We used to have two shops selling our products: a small shop with a small selection of only our best-quality products (domain smallshop.com), and a big shop with everything (bigshop.com). It used to make sense (without going into full detail), but it's not relevant anymore, so we decided to stop maintaining the small shop, because it was time consuming and not worth it. There are some really good links pointing to smallshop.com, and the content is original (the product descriptions differ between the two shops). So far, we have just switched the "add to cart" buttons on the small shop into links to the same products on the big shop, and added links from the small shop's category pages to the big shop as well. So the question is: in your opinion, is it better to do that - keep the small shop and its content alive and use it to build links to our big shop - or to set up 301 redirects and shut down the small shop completely? Thanks for your opinion!
| Colage0 -
Least Text for Home Page
We are rebranding our web site and intend to create more visual pages with less text, on the assumption that no one wants to read anymore. What is the least amount of text that we can include on a home page without damaging our ability to rank on Google? Google recently increased the permitted length of description tags. Can we shift text to the description tag and place more in ALT tags that are not immediately visible to visitors? Any thoughts, comments, advice? I am adding images of the old home page and the new home page (text to be written, 3 columns of dummy text) so the change in the amount of text is visible. Thanks,
| Kingalan1
Alan
Outranking reasons
For the keyword "Piedmont bike tours": how can this website https://goo.gl/hUnSUF (let's call it number 1) be above this website https://goo.gl/6vG3D4 (let's call it number 2)? I know number 1 is a big company, but a few weeks back it was behind. Could someone tell me why number 1 is above number 2? Doesn't number 2 answer people's questions? What could be missing? Thank you,
| seoanalytics0 -
X Default on hreflang tags
Hi guys, I would like to clarify something about hreflang markups and most importantly, x-default. Sample URLs:
| geekyseotools
http://www.example.com/au/collection/dresses (Australia)
http://www.example.com/us/collection/dresses (United States)
http://www.example.com/uk/collection/dresses (United Kingdom) Sample markups: along the lines of the sketch at the end of this question. Questions:
1. Can I use my AU page as the x-default? I noticed that some sites use their US page as x-default. Note that my biggest market is AU, though.
2. If I indeed use the AU page as x-default, and a user is searching from China, does that mean Google will return my AU page?
3. Can you spot any issues with the markups I made? Anything I need to correct? Keen to hear from you! Cheers,
Chris0 -
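A sketch of hreflang markup along those lines, using the sample URLs above and assuming the AU page ends up as the x-default (the choice raised in question 1); the identical block would need to appear on all three pages:

```html
<!-- Placed identically in the <head> of the AU, US and UK pages. -->
<link rel="alternate" hreflang="en-au" href="http://www.example.com/au/collection/dresses" />
<link rel="alternate" hreflang="en-us" href="http://www.example.com/us/collection/dresses" />
<link rel="alternate" hreflang="en-gb" href="http://www.example.com/uk/collection/dresses" />
<link rel="alternate" hreflang="x-default" href="http://www.example.com/au/collection/dresses" />
```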
International Country URL Structure
Hey Guys, We have a www.site.com (gTLD) site; the primary market is Australia. We want to expand to the US and UK. For the homepage, we are looking to create 3 new subfolders: site.com/au/, site.com/uk/ and site.com/us/. Then, if someone visits site.com, redirect them based on their IP address to the correct location. We are also looking to set up hreflang tags between the 3 subfolders and set geo-location targeting at subfolder level in Google Search Console. Just wondering if this setup sounds OK for international SEO? Cheers.
| pladcarl90 -
Ranking #1 but Bounce Rate is 90%?!
Hi Mozers, We have a page that's ranking #1 for several very high volume queries, but the bounce rate is 90%. It's puzzling that the page is ranking so well even though the bounce rate is exceedingly high. The algorithm takes user engagement metrics into account, so you would think those metrics would push the page down. Having said that, the page does have lots of backlinks, so maybe it's ranking despite the fact that people are clicking back out? Does anyone have an idea? Thanks, Yael
| yaelslater0 -
Is it better to optimise for several keywords/keyword variations on one page, or create sub categories for those specific terms?
I've done a fair bit of research to try to find the answer to this, but different people give very different opinions, and none of the info I could find is recent! I'm working with a company that produces a range of industrial products that fit into 6 main categories; within those categories are types of products, and then the products themselves. Prior to my involvement, most of the content was added to the product pages and very little to the overall category pages. The structure works like this: Electronic devices > type of device > products. The 'type of device' category could be something like a switch, but within that category are 3-4 different switch types, leaving me with 11 or 12 primary keywords/phrases to aim for, as each switch is searched for in more than one way. Should I try to rank for all of those terms using that one category page? Or should I change the structure to something like: Electronic devices > type of device > sub-category/specific variation of device > product. This would mean creating a page for each variation to have a more acute focus on a small number of phrases, but it also means I've added another step between the home page and the products. Any advice is welcome! I'm worried I'm overthinking it!
| Adam_SEO_Learning0 -
Unique content for international SEO?
Hi Guys, We have an e-commerce store on a generic top-level domain with 1000s of products in the US. We are looking to expand to Australia, the UK and Canada using subfolders, and we are going to implement hreflang tags. I was told by our SEO agency that we need to make all the content unique between each page. This should be fine for category/product listing pages, but they said we also need to make the content unique on product pages. If we have 1000 products, that's 4000 pages, which is a big job in terms of creating content. Is this necessary? What is the correct way to approach this - won't the hreflang tags be sufficient to prevent any duplicate content issues on product pages? Cheers.
| geekyseotools0