
Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Intermediate & Advanced SEO

Looking to level up your SEO techniques? Chat through more advanced approaches.


  • Hi All, We use schema.org on most of our eCommerce website, apart from our "latest news/blog" section, which holds all our how-tos and other useful articles. Am I missing a trick here? I have found there is https://schema.org/Article, which I guess we could implement if it's a big help, or is there a better one? Just wondered people's thoughts as to whether it is a must-have from an SEO/ranking point of view. Thanks, Pete

    | PeteC12
    0
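
For the schema question above, a minimal JSON-LD sketch of what https://schema.org/Article markup on a blog/how-to template might look like; the headline, dates, names and URLs are placeholders, not values from the site in question. Google has generally described structured data as making pages eligible for rich results rather than directly boosting rankings.

```html
<!-- Hypothetical Article markup for a blog/how-to page; all values are placeholders -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "How to look after your new widget",
  "datePublished": "2015-05-01",
  "dateModified": "2015-05-10",
  "author": { "@type": "Person", "name": "Pete" },
  "publisher": {
    "@type": "Organization",
    "name": "Example Store",
    "logo": { "@type": "ImageObject", "url": "https://www.example.com/logo.png" }
  },
  "image": "https://www.example.com/images/widget-care-hero.jpg",
  "mainEntityOfPage": "https://www.example.com/blog/how-to-look-after-your-new-widget"
}
</script>
```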

  • Hi Guys! Would like to have your expert opinion on the structure of a big international company. They are active across 27 regions, each with its own local ccTLD website.
    Some of them are translated, but most of them are in English (big duplicate content issue, you know it). Next to that, they have a webshop on 4 subdomains of these 27 local TLDs. In my opinion it would be best to merge them all back to the .com domain and set up a 301 redirect for all local TLDs.
    However, what is your opinion on these 4 webshops? Should I use the following structure: .com/region/shop (for example .com/fr/shop)? Thanks for the feedback! Kind Regards, S.

    | Sie.SAS
    0

  • A friend has relaunched a website, but his web guys (he didn't consult me!) didn't do any 301s and now traffic, unsurprisingly, has tanked. The old site and database no longer exist and there are now 2,000+ 404s. Any ideas how to do the 301s from old URLs to new product URLs WITHOUT it being a massive manual job?

    | AndyMacLean
    0
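
For the bulk-redirect question above, a sketch of the two usual approaches on an Apache server; the paths, the shared-slug assumption and the map file location are all assumptions, not facts about this site. When the old database is gone, an old crawl export or the Wayback Machine is the usual source for the list of old URLs.

```apache
# Pattern-based: if only the directory structure changed, one rule can cover
# every product, e.g. /shop/old-category/blue-widget -> /products/blue-widget
RedirectMatch 301 ^/shop/[^/]+/([^/]+)/?$ /products/$1

# Lookup-based: if the slugs changed too, build a key/value file
# ("old-path new-path", one pair per line) by matching a crawl of the old
# sitemap against the new product feed, then let mod_rewrite consult it.
# Note: RewriteMap only works in the server/vhost config, not in .htaccess.
RewriteEngine On
RewriteMap oldtonew "txt:/etc/apache2/redirect-map.txt"
RewriteCond ${oldtonew:$1|NONE} !=NONE
RewriteRule ^(/.+)$ ${oldtonew:$1} [R=301,L]
```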

  • Hi friends,
    On one of our websites (ecommerce), after implementing structured data we noticed a significant drop in positions in the search results.
    Has anyone had a similar experience? Thanks... 🙂

    | zkouska
    0

  • Hi all. I apologize for any redundancy in advance. I've seen similar queries about branded search but nothing that quite resembles mine. First of all, my site is Withdrawal-ease.com.
    In the last few weeks, I've seen a significant drop in ranking for specific pages on my site that normally rank well in a brand search. I was trying to figure out what the heck was going on, so I looked at a bunch of queries for branded terms and it looks like I've been penalized... but I haven't. For instance, when I google "Withdrawal Ease" I would typically see my home page first and then 3-4 of my most popular pages/posts directly underneath in the organic listings. Some of those popular pages include: http://withdrawal-ease.com/proven-home-remedies-for-opiate-withdrawal/ http://withdrawal-ease.com/how-to-detox-from-opiates-at-home/ http://withdrawal-ease.com/how-to-detox-from-suboxone-at-home/ Although the titles are similar, the content is all original and comprehensive; it's good content that has always ranked very well... and still does. But as of late March these pages don't show up for branded searches. Now, when I google "Withdrawal Ease" I only get my home page listed, and the other pages have all fallen to around #60 according to Moz, WMT, etc. What is strange is that these popular pages still rank highly for my key terms and relevant searches. I have not noticed any significant drops other than when I do a branded search.
    I have added a picture from WMT as an example. The image displays a search for my brand name and my most popular page, which would typically have ranked #2 below my homepage. As you can see, the impressions and clicks fall, then jump, and then totally disappear for my branded query. Looking at the chart, one might observe that the mobile update seems to roughly correspond to the rankings drop, but we've been very diligent about being mobile friendly and all of the checks for mobile friendliness pass with flying colors. I've looked at the robots.txt file and everything is fine there; all of our diagnostic tests have turned up nada as far as I can tell. Again, I have not been informed of any manual penalty and my backlink profile is not raising any alarms on Moz; I get a 1 on the spam score. We do not actively seek out links because I DID have a manual penalty in 2010 due to a horrible experience with an SEO firm. It took me over a year and 15k to clear out all of the spammy links that this guy got, and I was finally cleared of the penalty in 2013.
    So I'm totally flummoxed. My organic search is now down significantly across ALL browsers... but again, just for branded searches. I have also attached an image of my crawl stats in case this may shed any light. Thank you all so much for any help that you can provide; it's been extremely stressful and frustrating, so I'm hoping someone can point me in the right direction. (Attached: Whoa.jpg, crawl%20stats.jpg)

    | gcat333
    0

  • Hi All, We created 'EN' and 'FR' versions of a website and translated all labels and messages from English to French with the help of Google Translate. Let's take an example: English version URL - https://www.sitegeek.com/softlayer; French version URL - https://fr.sitegeek.com/softlayer. The French version also contains the same reviews available on the English version page, so the review content and language are the same on both pages. To eliminate the duplicate content issue we put the following meta tags on both the 'EN' and 'FR' version pages. So my questions are: (1) Is this the correct implementation of a multilingual version of a website? (2) Do the added meta tags work for both the Google and Bing search engines? (Bing is not indexing all pages.) (3) We translated labels and messages with Google Translate; is this why pages are not being indexed in Bing? (4) Finally, what would be the correct SEO approach if we translate our site into other languages? Rajiv

    | gamesecure
    0
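
For the EN/FR question above, the standard implementation is reciprocal hreflang annotations (link elements in the head, HTTP headers, or sitemap entries) rather than ordinary meta tags; a minimal sketch using the two URLs from the post, assuming English and French are the only variants. Bing has historically relied more on the content-language meta tag than on hreflang, which may account for some of the Google/Bing differences.

```html
<!-- These three tags go on BOTH https://www.sitegeek.com/softlayer
     and https://fr.sitegeek.com/softlayer so the annotations are reciprocal -->
<link rel="alternate" hreflang="en" href="https://www.sitegeek.com/softlayer" />
<link rel="alternate" hreflang="fr" href="https://fr.sitegeek.com/softlayer" />
<link rel="alternate" hreflang="x-default" href="https://www.sitegeek.com/softlayer" />
```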

  • Google search is creating 4 sitelinks for a website. One of these is the blog (blog.website.com). Webmaster Tools allows sitelinks to be demoted, but there is no way to demote the blog, which we don't want to show up, since it is not a "/" extension. Is there a way to remove it as a sitelink?

    | EugeneF
    0

  • Hi All, I read an interesting answer from Tom Roberts on another question, and one of the things he mentioned was that there are conflicting reports about "click to expand" content being discounted, with a link to this article: https://www.seroundtable.com/google-hidden-tab-content-seo-19489.html. My eCommerce site uses this method on every landing page and in tabs on our product pages, so I am wondering if others have first-hand knowledge of whether using these methods has affected their sites or not. I may try an experiment on a few pages to remove this and show all the content at the bottom, but it does kind of ruin the pages a bit, so I wanted to know people's thoughts before I screw up my pages and start thinking about page redesign etc. Also, if I were to try it, how long should I experiment for? Would a couple of weeks be sufficient, providing Google crawls them regularly? Thanks, Pete

    | PeteC12
    0

  • I'm torn. Many of our 'niche' ecommerce products rank well; however, I'm concerned that duplicate content is negatively affecting our overall rankings via the Panda algorithm. Here is an example that can be found across quite a few products on the site. This sub-category page (http://www.ledsupply.com/buckblock-constant-current-led-drivers) in our 'led drivers' --> 'luxdrive drivers' section has three products that are virtually identical, with much of the same content on each page, except for their 'output current' - sort of like a shirt selling in different size attributes: S, M, L and XL. I could realistically condense 44 product pages (similar to the example above) down to 13 within this sub-category section alone (http://www.ledsupply.com/luxdrive-constant-current-led-drivers). Again, we sell many of these products and rank OK for them, but given the outline of how Panda works, I believe this structure could be compromising our overall Panda 'quality score', consequently keeping our traffic from increasing. Has anyone had similar issues and found that it's worth the risk to condense product pages by adding attributes? If so, do I make the new pages and just 301 all the old URLs, or is there a better way?

    | saultienut
    0

  • Hello. In a WordPress blog (or the part of an ecommerce site that runs under WordPress), it is good to show recent posts in the sidebar on most pages. Obviously the posts aren't going to be relevant to every post, so my questions are: Is having these on the page hurting SEO for the page? Is there a good metadata structure to put in there (like rel="nofollow" or similar)? Thoughts?
    Thanks for your time,
    Marty

    | s_EOgi_Bear
    0

  • Hey there, So, we are "redesigning" our website; it will have a new user journey and overall layout, use, and feel. Situation: previously, most of our keywords ranked over time organically, though all of them pulled up our domain.com as the landing page. Now that we are redesigning the site, most of the keywords pointing to the home page will have their own page. Keywords are properly grouped, and content will now be on-topic and focused per page. Q: What are the things that we need to do so we won't lose those keywords? Appreciate your help. Also, if you can cite a specific SEO checklist for redesigning a site, that'll be a great help! Thanks! Jac

    | jac.reyes
    0

  • Hello all! A nice interesting one for you on this fine Friday... I have some pages which are accessible via 2 different URLs. This is for user experience, allowing the user to get to these pages in two different ways. To keep Google happy we have a rel canonical so that Google only sees one of these URLs, to avoid duplicates. After some SEO work I need to change both of these URLs (on around 1,000 pages). Is the best way to do this: to 301 every old URL to every new URL, or to not worry, as I will just point the indexed pages to the new rel canonical? Any ideas or suggestions would be brilliant. Thanks!

    | HB17
    0

  • I'm thinking of adding a Discourse discussion forum to one of my websites.  I'm not sure if it's going to be something that works well or not for the site.  So I'm thinking ahead and wondering what Google issues I could have if after a few months of having the forum, I decide to remove it.  What would Google think about all the then non-existent pages it might have indexed?  Would there be a simple wildcard redirect I could do in htaccess that would satisfy that? Or some other thing I should do?

    | bizzer
    0
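
For the forum-removal question above, a sketch of the kind of .htaccess rules that would cover it, assuming Apache and assuming the forum would live under a /forum/ path (the post doesn't say where Discourse would be mounted). A blanket redirect of every forum URL to the homepage is often treated as a soft 404, so redirecting to one genuinely relevant page, or returning 410, is the usual choice.

```apache
# Option A: send every former forum URL to one relevant page
RedirectMatch 301 ^/forum(/.*)?$ /community/

# Option B: if there is no sensible destination, tell crawlers the pages are
# gone for good (use this instead of, not as well as, the rule above)
# RedirectMatch 410 ^/forum(/.*)?$
```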

  • Hello, I want some help regarding bad links. I uploaded a disavow file to Webmaster Tools 4-5 months ago, but the links are still showing in the backlinks to my site and have not been disavowed. Can anyone help with this? Why do they still appear in the backlinks to my site, and why have they still not been removed? Thanks in advance, Falguni

    | Sanjayth
    0

  • Currently we are performing very poorly in organic clicks. We are an e-commerce site with over 2,000 products. Issues we thought plagued us:
    - Copied images from competitors
    - Site-wide duplicate content
    - Duplicate content from competitor sites
    - Number of internal links on a page (300+)
    - Bad backlinks (2.3k from 22 domains and IPs), being linked to from sites like m.biz
    - URLs are abbreviated and over 50% lack our keywords
    - Missing meta descriptions, or meta descriptions that are too long
    Current state of fixing these issues:
    - 50% of images are now our own
    - Site-wide duplicate content nearly 100% resolved
    - Internal links have been dealt with
    - Rewrote content for every product
    - 90% of meta descriptions are fixed
    From all of these changes we have yet to see an increase in traffic; a 10% increase at best in organic clicks. We think we have penalties on certain URLs. My question for the Moz community is: what is the best way to attack the lack of organic clicks? Our main competition is getting 900% more clicks than us. If you need any more information on the topic, let me know and I will get back to you.

    | TITOJAX
    0

  • Hello folks, We are restructuring some URLs which form a fair chunk of the content of the domain. This content is auto-generated rather than manually created, unlike other parts of the website. The same content is currently accessible from two URLs:
    /used-books/autobiography-a-long-walk-to-freedom-isbn
    /autobiography/used-books/a-long-walk-to-freedom-isbn
    URL 1 uses URL 2 as the canonical URL, and it has worked all right, since Moz does not show the two as duplicates of each other. Google has also indexed the canonical URL, although there are still a few 'URL 1s' which were indexed before the canonical was implemented. The updated URL structure will look something like this:
    /used-books/autobiography-a-long-walk-to-freedom-author-name-isbn
    /autobiography/used-books/a-long-walk-to-freedom-author-name-isbn
    It would be great to have just a single URL, but a few business requirements prevent us from having only the canonical URL, even with the new structure. Since we will still have two URLs to access the same content, we were wondering whether we will need to do a 1:1 301 redirect on the current URLs, or whether, since there will be a canonical URL (/autobiography/used-books/a-long-walk-to-freedom-author-name-isbn), we won't need to worry about doing the 1:1 redirect on the indexed content. Please note that the content will still be accessible from the OLD URLs (unless 301ed, of course). If it is advisable to do a 1:1 301 redirect, this is what we intend to do:
    /used-books/autobiography-a-long-walk-to-freedom-isbn 301 to /used-books/autobiography-a-long-walk-to-freedom-author-name-isbn
    /autobiography/used-books/a-long-walk-to-freedom-isbn 301 to /autobiography/used-books/a-long-walk-to-freedom-author-name-isbn
    Any advice/suggestions would be greatly appreciated. Thank you.

    | HB17
    0

  • Hey there, we have some ranking issues with our international website. It would be great if any of you could share your thoughts on this. The website uses subfolders for country and language (i.e. .com/uk/en for the website of the UK branch in English). As the company has branches all over the world and also offers its content in many languages, the URL structure is quite complex. A recent problem we have seen is that in certain markets the website is not ranking with the correct country. Especially in the UK and the US, Google prefers the country subfolder for Ghana (.com/gh/en) over the .com/us/en and .com/uk/en versions. We have hreflang set up and should also have some local backlinks pointing to the correct subfolders, as we switched from many ccTLDs to one gTLD. What confuses me is that when I check for incoming links (Links to your site) in GWT, the subfolder (.com/gh/en) is listed quite high in the "Your most linked content" column. However, the listed linking domains are not linking to this folder at all as far as I am aware. If I check them with a redirect checker, they all link to different subfolders. So I have no idea why Google gives such high authority to this subfolder over the specific country subfolders. The content is pretty much identical at this stage. Have any of you experienced similar behaviour and could point me in a promising direction? Thanks a lot. Regards, Jochen

    | Online-Marketing-Guy
    0

  • Hi everyone, My client is a chain of franchised restaurants with a local-domain website named after the franchise. The franchise exited the market while the client stayed and built its own brand with a separate website. The franchise website (which is extremely popular) will be shut down soon, but the client will not be able to redirect the franchise website to the new website for legal reasons. What can I do to ensure that we start ranking immediately for the franchise keyphrase as soon as the franchise website is shut down? We currently have the new website and access to the old website (which we can't redirect). Thanks, T

    | Tarek_Lel
    0

  • We're not currently displaying the last modified date in our sitemap, e.g.: <url><loc>http://www.soschildrensvillages.org.uk/about-our-charity</loc></url> Are there any advantages to including this data? One benefit that occurred to us is that it would enable Google to determine which pages have fresh content and are therefore worth crawling, helping Google index beneficial changes quicker. Thanks!

    | SOS_Children
    1
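
For the sitemap question above, the same entry with a <lastmod> element added (the date is a placeholder). Google has said it may use lastmod as a crawl hint when the values are reliable, so the main obligation is keeping the dates truthful rather than stamping every URL with today's date.

```xml
<url>
  <loc>http://www.soschildrensvillages.org.uk/about-our-charity</loc>
  <lastmod>2015-05-12</lastmod>
</url>
```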

  • Our rankings are up and down, but our domain is clean, DA and PA are good, and there is really in-depth content which is all original. We are at a bit of a loss. The site is http://www.fightstorepro.com. I can use the phrase "boxing gloves" as an example: http://www.fightstorepro.com/gear/gloves/boxing-gloves.html has PA 26, DA 29, good original content, video content, and an on-page grade of A, yet it is not ranking in the top 50 places. The competitor in position 4 is not matching our page on these measures. Can anyone shed any light on this?

    | AlexSTUDIO18
    0

  • I recently (http://moz.com/community/q/less-tags-better-for-seo) started reviewing my category and tag policy, and things have been going very well. I thought I would share what I have done:
    - Removed all tags from the site.
    - Added unique descriptions for each post for the category excerpt.
    - Only had the category description on the first page, and used the description like a post to summarise and interlink to sub-categories or posts. This keeps pages from slipping too many clicks deep, improving link juice distribution. I also reduced the number of posts showing to 5, to allow more focus on the description (the main part) of the category page.
    To add the category description on the first category page only in WordPress, you need to go to category.php or archive.php and change the template so the description is only output on page one (see the sketch after this post). The overall aim was to have a hierarchical resource contained in the category page description. Whilst this is still a work in progress, you can see an example of what I am trying to achieve here: https://www.besthostnews.com/web-hosting-tutorials/cpanel/ https://www.besthostnews.com/web-hosting-tutorials/cpanel/mail/ If you have any further tips and advice as I continue to implement this (with good results so far), please feel free to share. Also, you can use the Visual Term Description Editor plugin to enable the WYSIWYG editor for category descriptions.

    | TheWebMastercom
    1
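
For the approach described above, a minimal sketch of the kind of conditional commonly used in category.php/archive.php so the term description is only printed on the first page of the archive; is_paged() and term_description() are standard WordPress template functions, while the wrapping markup is a placeholder.

```php
<?php
// Print the long category/term description only on page one of the archive,
// keeping /page/2/ and later pages lean and focused on the post listing.
if ( ! is_paged() ) {
    echo '<div class="term-description">';
    echo term_description(); // description of the currently queried term
    echo '</div>';
}
?>
```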

  • Our site has several hundred toxic links. We would prefer that the webmasters remove them rather than submitting a disavow file to Google. Are we better off writing to webmasters over and over again to get the links removed? If someone is monitoring the removal and keeps writing to the webmasters, will this ultimately get better results than using an automated program like Link Detox to process the requests? Or is this the type of request that will be ignored no matter what we do and how we ask? I am willing to invest in the manual labor, but only if there is some chance of a favorable outcome. Does anyone have experience with this? Basically, how do you get the highest compliance rate for link removal requests? Thanks, Alan

    | Kingalan1
    1

  • Hi Mozzers, On my website I have an FAQ page (with the questions and answers for all the themes of my website: prices, products, ...) and I would like to add some thematic FAQs to the relevant pages of my website. For example, adding the FAQ about pricing to my pricing page. Is this duplicate content? Thank you for your help. Regards, Jonathan

    | JonathanLeplang
    0

  • Greetings: In April of 2014 an SEO firm ran a link removal campaign (identified spammy links and uploaded a disavow). The overall campaign was ineffective: our Moz domain rank has fallen to 24 from about 30 in the last year and traffic is 20% lower. I purchased a basic package for Link Detox and ran a report today (see enclosed) to see if toxic links could be contributing to our mediocre rankings. As a novice I have a few questions for you regarding the use of Link Detox:
    - We scored a domain-wide detox risk of 1,723. The site has referring root domains with 7,113 links to our site. 121 links were classified as high audit priority and 56 as medium audit priority. 221 links were previously disavowed, and we uploaded a spreadsheet containing the names of the previously disavowed links. We had Link Detox include an analysis of nofollow links, as they recommend this. Is our score really bad? If we remove the questionable links, should we see some benefit in ranking?
    - Some of the links we disavowed last year are still linking to our site. Is it worthwhile to include those links again in our new disavow file?
    - Prior to filing a disavow we will request that webmasters remove the offending links. Link Detox offers a package called Superhero for $469.00 that automates the process. Does this package effectively help with the entire process of writing and tracking the removal requests? Do you know of any other good alternatives?
    - A feature called "Boost" is included in the Link Detox Superhero package. It is supposed to expedite Google's processing of the disavow file. I was told by the staff at Link Detox that with Boost, Google will process the disavow within a week. Do you have any idea if this claim is valid? It would be great if it were true.
    - We never experienced any manual penalty from Google. Will uploading a disavow help us under the circumstances?
    Thanks for your feedback, I really appreciate it! Alan

    | Kingalan1
    0

  • Is it possible to get a pretty good idea of my site's link profile by merging the link data from Google Webmaster Tools, Moz and SEMrush (backlinks)? I would think that combining the linking domains from these three packages would create a pretty good profile. Once I create a list of domains that link to my site, is it possible to run them through Moz so as to evaluate their quality? Last year I paid a reputable SEO firm to run a link analysis, process link removal requests and finally a disavow, only to see my domain authority decline from 33 to 24, so I am leery of the process. That being said, I have reviewed the disavow file that was submitted last year and still see about a third of the low-quality domains still linking to our site. Alternatively, is it worthwhile to run a Link Detox report? Maybe it is worth biting the bullet and spending the $175.00 to run a report. Our site (www.nyc-officespace-leader.com) does not have too many links, so maybe I can research this manually. Thoughts?

    | Kingalan1
    0

  • Hi there, I'm currently targeting Australia and the US for one of my web pages. One of my web pages begins with a subdomain (au.site.com) and the other one is just the root domain (site.com). After searching for the website on Australian Google and checking the description and title, it keeps showing the US ones (i.e. the root domain), and after checking the cached copy, it was cached earlier today but it displays exactly the same content as the American website when it is supposed to be the Australian one. In the URL for the cache it appears as au.site.com while displaying the American page's content. Any ideas why? Thanks, Oliver

    | oliverkuchies
    0

  • Hi there, we are moving a website from Shoptrader to Magento, which has 45,000 pages indexed. Yes, Shoptrader made a bit of a mess; trying to clean it up now. There is a 301 redirect list of all old URLs pointing to the new ones. A product can exist in multiple categories, and we want to solve this with canonical URLs. For instance:
    - shoptrader.nl/categorieA/product has a 301 redirect towards magento.nl/nl/categorieA/product
    - shoptrader.nl/categorieA/product-5531 has a 301 redirect towards magento.nl/nl/categorieA/product
    - shoptrader.nl/categorieA/product&currency=GBP has a 301 redirect towards magento.nl/nl/categorieA/product
    - shoptrader.nl/categorieB/product has a 301 redirect towards magento.nl/nl/categorieB/product, which has a canonical tag towards magento.nl/nl/categorieA/product
    - shoptrader.nl/categorieB/product?language=nl has a 301 redirect towards magento.nl/nl/categorieB/product, which has a canonical tag towards magento.nl/nl/categorieA/product
    Here comes the problem: the new developer insists on using /productname as the canonical instead of /category/category/productname, since Magento says so. The idea is now to redirect to /category/category/productname, and there will be a canonical URL on these pages pointing to /productname, losing some link juice twice. So in the end indexation will take place on /productname, if Google picks up the 301 + canonical. It would be more advisable to redirect straight to /productname (http://moz.com/community/q/is-link-juice-passed-through-a-301-and-a-canonical-tag), but I prefer to point to one URL with the categories attached. Which has more advantages: a clear menu structure, being able to use subfolders in mobile search results, a missing breadcrumb? What would you say?

    | onlinetrend
    0

  • Hello there, If a quality blog in our specific niche writes an article about us which is clearly labelled a "sponsored post", as we have either paid them or given them a product, will Google discount that link back to our website? Should we request that the link be made "nofollow"? Thanks, Robert

    | roberthseo
    0
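
For the sponsored-post question above, Google's guidelines ask that paid links not pass PageRank, so the usual request is a nofollow attribute on the link; a one-line sketch with a placeholder URL and anchor text.

```html
<a href="https://www.example.com/" rel="nofollow">Example product name</a>
```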

  • Hello! Our company has been growing in terms of traffic and ranking well for a couple of years, but we are now kind of stagnating because we just don't know what to do next. We have a good blog, and with our blog posts we have been targeting all major keywords, with their related keywords as a bucket ("keyword theme / page"), for a long time. But it seems we now don't have any major keyword theme to write about. What is worse is that we haven't seen any traffic growth since September 2014 (although we added many good posts). So what do you do when you run out of keywords, or keyword themes? Would you just keep pumping out more posts and hope that you get more clicks? Or at some point do you just stop caring about keywords and write whatever is relevant to your site? Wouldn't it hurt our site if we create similarly keyword-themed pages (like regurgitating our keywords), or even pages targeting the same keywords? You must have had a similar experience if you are the owner of a niche site. Can you please share your experience with this kind of headache? Thank you, and I look forward to your comments.

    | joony
    3

  • I have an e-commerce site with quite a large (subdirectory) blog attached. The blog is very successful, having attracted about 2 million visitors last year, almost 4 times that of our actual e-commerce pages. Although all the content is tangentially relevant, the blog does not convert well directly (mostly because it attracts people at the wrong point in the funnel). Our average bounce rate on e-commerce pages is around 40%, while the blog's is about 90% (it answers questions directly, with some outbound links); and average page visits on e-commerce pages are 4, compared to 1.3 on the blog. I am concerned that this 80% of my traffic, which does not often convert and leaves the site quickly, is costing me in rankings on the pages that do perform well. We recently re-released the e-commerce section of the site and, despite cleaning up our structure and content, fixing bad URL structure etc., we saw little benefit. I am therefore considering taking the blog OFF our site and moving it elsewhere, linking back to the e-commerce site and allowing it to stand on its own two feet. Is this a bad idea? Thoughts?

    | redtalons1
    0

  • Hello, We're giving our website a bit of a spring clean in terms of SEO. The site is doing OK, but after the time invested in SEO, content and last year's migration of multiple sites into one, we're not seeing the increase in traffic we had hoped for. Our current URLs look something like this: /a-cake-company/cup-cakes/strawberry. We have the company name as the first level because with the migration we merged many companies into one site. What we're considering is testing some pages with a structure like this: /cup-cakes/cup-cake-company-strawberry. So we'll lose a level and focus more on the category of the product rather than the brand. What are your thoughts on this? We weren't going to do a mass change yet, just a test, but is this something we should be focusing on? In terms of organisation our current URL structure is perfect, but what about from an SEO point of view? In terms of keywords, customers are searching for both options. Thanks!

    | HB17
    0

  • After migrating 8 sites into one last year, which went quite successfully, we're now looking into SEO much more deeply and how we can improve overall. Something I have noticed is that the deeper the pages and the longer the URL, the lower the page authority; it almost halves for each level the page gets deeper. Is this true? And if so, how can we combat this? I know content is key, but is there anything else we can do? Many thanks

    | HB17
    0

  • Just a quick question really. Say I have a Promotions page where I list all current promotions for a product, and update it regularly to reflect the latest offer codes etc. On top of that I have offer announcement posts for specific promotions for that product, highlighting the promotion very briefly but also linking back to the main product promotion page, which has the promotion duplicated. So the main page is 1,000+ words with half a dozen promotions, while the small post might be 200 words and quickly becomes irrelevant as it is a limited-time news article. Now, I don't want the announcement post indexed (unless it has a larger news story attached to the promotion, but for this purpose presume it doesn't). Initially the core essence of the post will be duplicated on the main promotion page, but later, as the offer expires, it wouldn't be. Therefore, would you rel canonical or simply noindex?

    | TheWebMastercom
    0
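
For the question above, the two options look like this in the head of the short announcement post (URLs are placeholders). A canonical is only a sensible hint while the post really is a duplicate of the main page; once the offer expires and the two diverge, noindex is the cleaner fit, which is why noindex is often the simpler choice for short-lived announcements.

```html
<!-- Option A: consolidate the announcement into the main promotions page -->
<link rel="canonical" href="https://www.example.com/product/promotions/" />

<!-- Option B: keep the announcement out of the index but let its links be crawled -->
<meta name="robots" content="noindex,follow" />
```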

  • Hi All, I'm trying to figure out whether or not my developer is properly implementing itemprop elements in the code. Here is an example of where my confusion lies: in a sentence such as "If you're taking an antiepileptic drug", the phrase "antiepileptic drug" is wrapped in a span with itemprop="name drugClass" itemtype="http://schema.org/DrugClass". When the span opens, both recommended itemprops, "name" and "drugClass", are listed together. Does this allow both to be properly read, or is it effectively creating an itemprop that does not exist? Thanks!

    | CMIMedia
    1
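
For the itemprop question above, a sketch of how that sentence is usually marked up in microdata. itemprop may legally contain several space-separated property names, but both names are then read as properties of the same parent item, and itemtype only has an effect on an element that also carries itemscope; the more explicit pattern nests a DrugClass item inside the drug. The drug name and surrounding markup are placeholders.

```html
<!-- Hypothetical markup; the outer item is the drug being described -->
<div itemscope itemtype="http://schema.org/Drug">
  <span itemprop="name">ExampleDrug</span> is an
  <span itemprop="drugClass" itemscope itemtype="http://schema.org/DrugClass">
    <span itemprop="name">antiepileptic drug</span>
  </span>.
</div>
```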

  • We lost our Google organic ranking (position 1-3) for our highest-converting key phrase (cotton tees) in February. The ranking was for our homepage (brandname.com), which is very image-heavy and doesn't have much readable content. We noticed that all of our competitors are ranking above us with their category page, not their homepage. The difference between us and our competitors is that we specialize in this key phrase and they just offer it as one category. For example, we only sell cotton tees and they sell cotton tees, handbags and shoes. When we dropped, we noticed that Google began showing our homepage AND category page in the results, so we pointed brandname.com to brandname.com/cotton-tees canonically. The idea was that this would ensure the homepage and category page were not competing with each other. The homepage was not really optimized for cotton tees, so we thought this might help. 1. Is there any harm in removing the canonical and allowing both pages to rank? (We're also working on redesigning the homepage to add more readable text & optimize it for cotton tees.) 2. Our homepage URL used to be "brandname.com/cotton-tees" and we consistently ranked between 1 and 3 for cotton tees during that time. We modified the homepage URL because it seemed spammy, and it is now just "brandname.com". Does it make sense to go back to the URL with the key phrase in it if that is our main product and we want to rank for it?

    | EileenCleary
    0

  • Hi All, I have a blog with a lot of content (news and PR messages) and I want to move it to a new domain. What is your recommendation?
    1. Keep it as is: old articles -> 301 -> same article, different URL.
    2. Remove all the duplicate content and 301 the old URLs to my homepage.
    3. Keep it as is, but add a noindex meta tag to the duplicate articles.
    Thanks!

    | JohnPalmer
    0

  • My site's domain authority is only 23. The home page has a page authority of 32. My site consists of about 400 pages. The topic of the site is commercial real estate (I am a real estate broker). A number of the sites we compete against have a domain authority of 30-40. Would our overall domain authority be improved if we re-wrote the content for the several hundred pages that have the lowest page authority (say 12-15)? Is the overall domain authority derived from an average of the page authority of each page on a domain? Alternatively, could we increase domain authority by setting the pages with the lowest page authority to "noindex"? By the way, our domain is www.nyc-officespace-leader.com. Thanks, Alan

    | Kingalan1
    0

  • Hi, Is it OK to fetch a section of a page using AJAX? Will it be crawlable by Google? I have already seen Google's directions for getting a complete AJAX-fetched page crawled by Google. Is there a way to get a particular section of a page that is fetched through AJAX indexed by Google? Regards

    | vivekrathore
    0

  • Hello. I opened my website last month and after that did some SEO and a press release. The SEO guy was good and didn't over-optimize any particular keyword, but now I am seeing that a particular phrase is hovering around 10% of my anchor text. That is rather a money keyword and I am worried that I am going to get a penalty for it. That keyword came from the press release, which I can see 7 very reputable websites have posted. Now what are the chances of getting a penalty for this?

    | some1cool
    0

  • I have a client who is insisting that I add a list of approximately 50 cities and 80 ZIP codes that their business serves to the keywords meta tag. Based on what I have been reading, this will do absolutely nothing to help improve their search ranking. What would be the proper way today to inform search engines of the geolocations a business serves?

    | mmurphy
    0
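
For the service-area question above: Google has said it ignores the keywords meta tag, so the city/ZIP list there adds nothing. The usual alternatives are dedicated location or service-area pages plus LocalBusiness structured data; a minimal JSON-LD sketch where the business details and areas are all placeholders.

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Plumbing Co.",
  "telephone": "+1-555-555-0100",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Main St",
    "addressLocality": "Springfield",
    "addressRegion": "IL",
    "postalCode": "62701"
  },
  "areaServed": [
    { "@type": "City", "name": "Springfield" },
    { "@type": "City", "name": "Chatham" },
    "62629"
  ]
}
</script>
```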

  • For a long time, we had terrible on-page SEO: no keyword targeting, no meta titles or descriptions, just a brief 2-4 sentence product description and shipping information. Strangely, we weren't ranking too badly. For one product, we were ranking on page 1 of Google for a certain keyword. My goal of reaching the top of page 1 would be easy (or so I thought). I have now optimized this page to rank better for the same keyword. I have a 276-word description with detailed specifications and shipping information. I have a strong title and meta description with keywords and modifiers. I have also included a video demonstration, additional photos and a PDF of the owner's manual. In my eyes, the page is 100% better than it ever was. In the eyes of Moz, it's better also; I've got an A in the On-Page Grader. Why is this page now ranking on page 8 of Google? What have I done wrong? What can I do to correct it?

    | dkeipper
    0

  • Instead of blocking an entire subdomain (fr.sitegeek.com) with robots.txt, we would like to block one directory (fr.sitegeek.com/blog). 'fr.sitegeek.com/blog' and 'www.sitegeek.com/blog' contain the same articles in one language; only the labels are changed for the 'fr' version, and we assume that the duplicate content causes a problem for SEO. We would like the 'www.sitegeek.com/blog' articles to be crawled and indexed, not 'fr.sitegeek.com/blog'. So, please suggest how to block a single subdomain directory (fr.sitegeek.com/blog) with robots.txt. This is only for the blog directory of the 'fr' version; all other directories and pages of the 'fr' version should still be crawled and indexed. Thanks,
    Rajiv

    | gamesecure
    0
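
For the question above: robots.txt is per-host, so the file served at fr.sitegeek.com/robots.txt is the one that governs the fr subdomain, and it can disallow just the /blog/ directory while leaving everything else crawlable. A minimal sketch follows; note that Disallow stops crawling but does not remove already-indexed URLs, so for a duplicate-language section, hreflang or noindex may be the better long-term fit.

```
# robots.txt served at fr.sitegeek.com/robots.txt (only affects fr.sitegeek.com)
User-agent: *
Disallow: /blog/
```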

  • What is the best way to create dynamic pages with static URLs in an eCommerce website? And what are other ways to create more pages on a website?

    | Obbserv
    0

  • Hi there, Recently I applied the hreflang tags like so: Unfortunately, the Australian site uses the same description and title as the US site (which was the root directory initially). Am I doing something wrong? Would appreciate any response, thanks!

    | oliverkuchies
    0

  • Thanks in advance for your time and expertise. I am having issues with duplicate page content and titles on a client's Shopify subdomain. Examples below. Two questions: #1 How can I solve this issue? Do I block the duplicate pages from being crawled? With meta NoIndex? Establish the main page as the canonical version and stop obsessing? Other... #2 Is it a big concern or am I needlessly obsessing? Feels like a concern that needs to be addressed, but maybe not? Duplicate Page Content Examples: #1 URL: http://shop.shopvandevort.com #1 Duplicate URLs: http://shop.shopvandevort.com/collections/all; http://shop.shopvandevort.com/collections/all?page=1 #2 URL: http://shop.shopvandevort.com/collections/accessories #2 Duplicate URLs: http://shop.shopvandevort.com/collections/accessories; http://shop.shopvandevort.com/collections/types?q=Accessories Duplicate Page Title Examples: http://shop.shopvandevort.com/collections/vendors?q=For%20Love%20And%20Lemons http://shop.shopvandevort.com/collections/for-love-lemons http://shopvandevort.com/blog/tag/for-love-and-lemons/ http://shop.shopvandevort.com/collections/for-love-lemons?page=1 Thanks again for taking a look here, very much appreciated.

    | AaronHurst
    0

  • Our site URL structure used to be (example site) frogsforsale.com/cute-frogs-for-sale/blue-frogs, where frogsforsale.com/cute-frogs-for-sale/ was in front of every URL on the site. We changed it by removing the for-sale part of the URL, to become frogsforsale.com/cute-frogs/blue-frogs. Would that have hurt our rankings and traffic by removing the for-sale? Or was having for-sale in the URL twice (once in the domain, again in the URL path) hurting our site? The business wants to change the URLs again to put for-sale back in, but in a new spot, such as frogsforsale.com/cute-frogs/blue-frogs-for-sale, as they are convinced that is the cause of the rankings and traffic drop. However, the entire site was redesigned at the same time and the site architecture is very different, so it is very hard to say whether the traffic drop is due to this or not.

    | CFSSEO
    0

  • I run an e-commerce site and we have many product tags. These product tags come up as "Duplicate Page Content" when Moz does its crawl. I was wondering if I should use noindex or canonical? The tags all lead to the same product when used, so I figure I would just noindex them, but I was wondering which is best for SEO?

    | EmmettButler
    1

  • Hi all, We experienced a strange phenomenon after a Facebook push: it appears Google organic traffic was all but dead for the five days after. Totally not sure why! It has since returned to about 80% of previous levels. http://postimg.org/image/3n1b7m7hf/

    | ScottOlson
    0

  • Hi guys - I'm looking at a website which uses hashtags to reveal the relevant content. So there's page intro text which stays the same... then you can click a button and the text below that changes. So www.blablabla.com/packages is the main page, www.blablabla.com/packages#firstpackage reveals the first package's text on this page, www.blablabla.com/packages#secondpackage reveals the second package's text on the same page, and so on. What's the best way to deal with this? My understanding is that the URLs after the # will not be indexed very easily/at all by Google - what is best practice in this situation?

    | McTaggart
    0
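
For the fragment question above: everything after the # never reaches the server, so Google treats all of those variants as the single URL /packages (the old #! escaped-fragment scheme being the historical exception). If each package needs to rank on its own, the usual pattern is to give each one a real, server-rendered path and let JavaScript enhance the tab behaviour; a rough sketch of the idea, with placeholder paths and element IDs, assuming the server can return the panel content at those paths.

```html
<!-- Each package gets a crawlable URL that also works with JavaScript disabled -->
<a href="/packages/first-package" class="package-tab">First package</a>
<a href="/packages/second-package" class="package-tab">Second package</a>
<div id="package-panel"><!-- server renders the default package here --></div>

<script>
// Progressive enhancement: swap the panel in place and update the address bar
// with the real path instead of a #fragment, so indexed/shared URLs stay real.
document.querySelectorAll('.package-tab').forEach(function (link) {
  link.addEventListener('click', function (event) {
    event.preventDefault();
    fetch(link.href) // assumes the server returns the panel markup for this path
      .then(function (response) { return response.text(); })
      .then(function (html) {
        document.getElementById('package-panel').innerHTML = html;
        history.pushState({}, '', link.href);
      });
  });
});
</script>
```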

  • Hi All, We have approx 6,000 404 pages. These are for categories etc. that we don't do anymore, and there is no near replacement, so basically no reason or benefit to keep them at all. I can see in GWT that these are still being crawled/found and are therefore taking up crawl bandwidth. Our SEO agency said we should 410 these pages. I am wondering what the difference is and how Google treats them differently. Does anyone know when you should 410 pages instead of 404? Thanks, Pete

    | PeteC12
    0
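
For the 404-vs-410 question above: both codes eventually drop a URL from the index, but 410 is an explicit "gone permanently", and Google has indicated it tends to act on 410s slightly faster and re-crawl them less persistently than 404s. If the retired categories share URL patterns, Apache can return 410 for all of them in a couple of rules; the paths below are placeholders.

```apache
# Return "410 Gone" for retired sections (placeholder paths)
RedirectMatch 410 ^/old-category-a/
RedirectMatch 410 ^/discontinued/
```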
