
Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Intermediate & Advanced SEO

Looking to level up your SEO techniques? Chat through more advanced approaches.


  • I see that Gap uses gap.com, oldnavy.gap.com and bananarepublic.gap.com.  Wouldn't a better approach for SEO be to have oldnavy.com, bananarepublic.com and gap.com all as separate domains?  Is there any benefit to the store1.parentcompany.com, store2.parentcompany.com approach?  What are the pros and cons of each?

    | kcb8178
    0

  • Is there any harm to SEO in having a homepage URL that is not a clean www.domain.com?  For example, Citi uses https://online.citi.com/US/login.do.  Does that matter in any way?  Would a company like Citi benefit from changing to www.citi.com as their homepage?

    | kcb8178
    1

  • We had a strange thing happen to our website. We have a website that has ranked for top keywords for some years. Last week we lost all of our rankings for one day. The strange thing was that the rankings were only lost for the homepage; the homepage didn't rank anymore. Even when I googled 'homepage.com', Google showed 'homepage.com/page' and not the homepage. After some reading I checked the following: noindex directives in the source code (none found), the HTTP response code (200 OK), and downtime in Pingdom (no downtime). With no rankings on the homepage I was losing 90% of the traffic, so I went to GWT and did a Fetch as Google request; the website looked okay. After that I submitted a new indexation request, and the website was back in Google with all the old rankings. But what happened is still a question for me. Could this be a hack, a WordPress problem, or a real-time Penguin hit? I also read about Google flux, but it had never happened before and it was just the homepage, with no indexing at all. What happened?

    | remkoallertz
    0

  • Hi Everyone, We are about to take down a number of websites in favour of a new singular B2B hub and would be looking to redirect all of these sites to the new home. For SEO purposes, what would be the best way to do this? Due to the difference in setups and the scale of the sites, it would be difficult to correctly match each page to its counterpart between the sites for individual 301 redirects. Could someone advise on the best plan of action? Thanks.

    | chbiz
    0
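When per-page mapping isn't feasible, a common middle ground is to 301 each old site's main sections to the closest section of the hub and send everything else to one relevant landing page, rather than redirecting every URL to the hub's homepage. A sketch assuming Apache, with hypothetical hostnames and paths:

```apache
# In each old site's .htaccess / vhost (mod_alias rules apply in order,
# so list the more specific section mappings before the catch-all):
RedirectMatch 301 ^/products(/.*)?$  https://hub.example.com/solutions/
RedirectMatch 301 ^/blog(/.*)?$      https://hub.example.com/insights/
# Catch-all for anything unmapped:
RedirectMatch 301 ^/(.*)$            https://hub.example.com/
```

This preserves at least section-level relevance for the redirected equity without requiring a full page-to-page map.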

  • Hi Experts, I am doing AdWords conversion tracking via GTM, and it is all working fine. Now I want to track a few products which I sell as sample products, whose product SKU is "Sample", so I want to see conversions for sample products and regular products separately. Also, when anyone purchases a sample product and a regular product together, both tags should fire separately. How can I create this tag? Thanks!

    | dsouzac
    0

  • Hi, I'm using the software snip.ly, which allows me to add a call to action to content I publish through social media. It's really powerful, but I'm wondering how it can affect my SEO. Snip.ly now appears in my link report and its spam score is only 2, which is good. However, I'm afraid that in the long term it could be bad: the links are created manually by the webmarketer, the topics of this website are infinite, and the anchor is the same. Your thoughts?

    | 2MSens
    0

  • Mozzers, http://itsgr82bme.com: the old domain homepage had a DA of 24 and a PA of 36. It is currently redirected to http://thekidstime.com, whose homepage shows a DA of 6 and a PA of 1. That is a significant loss of authority. I thought a 301 was supposed to be better than that. What gives? What are the next steps? Asking the old backlinks to update their links? Thanks for your help, Matt

    | matt.nails
    0

  • If a website did an HTTP-to-HTTPS migration using 302 redirects that were corrected to 301s about 4 months later, what is the expected impact? Will the website see a full recovery, or has the damage been done? Thanks to anyone who can shed some light on this...

    | yaelslater
    0

  • Hello 🙂 Does anybody know about any ecommerce sites that have done PIM-enrichment projects with great success, resulting in significant increase in organic traffic and rankings? Our agency is doing a presentation about leveraging better and more product information for long-tail SEO, and we'd like some success cases to point to. Thanks 🙂
    Sigurd Bjurbeck, SEO-consultant

    | Inevo
    0

  • Hopefully a yes-or-no answer to this one... If I have a link pointing to my site as below, is the SEO value stripped away because of the query string in the URL? https://mysite.co.uk/?WT.mc_id=Test The above-mentioned page also has a canonical tag on it.

    | Marketing_Today
    0
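A side note on the question above: query-string variants can be normalized before comparing against the canonical target. A minimal standard-library sketch that strips known tracking parameters (the parameter names here, taken from the question plus common UTM tags, are just examples):

```python
# Normalize a URL by removing tracking parameters so that
# https://mysite.co.uk/?WT.mc_id=Test compares equal to the canonical
# https://mysite.co.uk/ target.
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

TRACKING_PARAMS = {"WT.mc_id", "utm_source", "utm_medium", "utm_campaign"}

def strip_tracking(url: str) -> str:
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query)
            if k not in TRACKING_PARAMS]
    return urlunsplit(parts._replace(query=urlencode(kept)))

print(strip_tracking("https://mysite.co.uk/?WT.mc_id=Test"))
# → https://mysite.co.uk/
```

This mirrors what a correct canonical tag tells search engines: the tracked and untracked URLs are the same page.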

  • Our website CMS is WordPress. Due to the Genesis Framework, the four phrases below are turned into h2 tags: Skip Links, Header Right, Main Navigation and Footer. How do we remove these?

    | vtmoz
    0

  • Hello, I've got a new ecommerce site I'm jumpstarting. It's one of those sites that takes a while to rank. Here's what we're doing: 1. Creating a beautiful, mobile-friendly site. 2. Adding a long, detailed home page answering all the questions that people bring to our industry keyword results. 3. Adding detailed, beautiful category pages. 4. Adding detailed, beautiful product pages. 5. Adding beautiful, long About Us & resource-list pages. 6. Offering straight-up obvious free shipping and no tax, even though that means taking a hit in our industry. 7. Going after the 2 main informational terms (from Keyword Explorer) in the industry with a vengeance: 20X as good as the competition for the main term. 8. Adding 20-30 pages of articles to help our customers and hit major keyword search terms, although there's not much in our industry. What else would you recommend doing to jumpstart a new ecommerce site that has difficulty being in the top 50? Thanks.

    | BobGW
    0

  • Hi All, One of our subdomains has a lot of content created by different users, mostly outgoing links from landing pages. Moreover, the top-ranking content is about "cigarettes", which is nowhere related to our niche. Will this hurt our domain's rankings?

    | vtmoz
    0

  • Can we have Organization markup schema for a subdomain? For example, if my main domain is xyz.com and the subdomain is sub.xyz.com, and I plan to add Organization markup schema to the subdomain, how should it look? Should the markup schema have the main domain URL or the subdomain URL in it? Should it be like this?

    | NortonSupportSEO
    0
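For reference on the question above: Organization markup on a subdomain normally uses that subdomain's own URL, and the relationship to the parent can be expressed with schema.org's `parentOrganization` property. A minimal JSON-LD sketch (names and URLs are placeholders, not a definitive recommendation):

```json
{
  "@context": "https://schema.org",
  "@type": "Organization",
  "name": "XYZ Support",
  "url": "https://sub.xyz.com/",
  "parentOrganization": {
    "@type": "Organization",
    "name": "XYZ",
    "url": "https://xyz.com/"
  }
}
```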

  • Hi there. I am looking at a site that currently has 2 <title> elements present on every page throughout the site. They have a main title with appropriate optimization. Then they have another <title> that appears to be empty. It looks like this: <title></title> I know having 2 <title> elements is not ideal, but if the second one is empty, can it still have a negative effect? Thanks

    | Ciceron_main
    0
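For anyone auditing this at scale, duplicate <title> elements are easy to count with the standard library alone. A small sketch, assuming the page HTML is already available as a string:

```python
# Count <title> start tags in a page's markup; more than one is the
# situation described in the question above.
from html.parser import HTMLParser

class TitleCounter(HTMLParser):
    def __init__(self):
        super().__init__()
        self.titles = 0

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self.titles += 1

def count_titles(html: str) -> int:
    parser = TitleCounter()
    parser.feed(html)
    return parser.titles

page = "<head><title>Main Title</title><title></title></head>"
print(count_titles(page))  # → 2
```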

  • Hi all, We always mention "brand & keyword" in every page title along with the topic, like "Topic | vertigo tiles". Let's say there is a sub-directory with hundreds of pages: what is the best page-title practice for mentioning "brand & keyword" across all pages of the sub-directory to benefit in terms of SEO? Can we add "vertigo tiles" to all pages of the sub-directory, or must we not repeat the same phrase? Thanks,

    | vtmoz
    0

  • I have a site with a few hundred thousand backlinks, and many of these are legitimate, but the source site of a backlink often has low authority and is broken, causing a spiral of bad backlinks from pagination, comment replies, etc. For example, one site may have 4 legitimate backlinks with a spiral of 400+ bad backlinks, and this is happening across dozens of domains. My site is more authoritative than these broken sources and regularly receives highly authoritative backlinks. Given this, would it be best to disavow these spiralling low-authority domains, attempt to contact the webmasters to add a nofollow, or is there another solution?

    | FPD_NYC
    0

  • Hi All, As we can see from the below statement from Google about internal linking: "The number of internal links pointing to a page is a signal to search engines about the relative importance of that page."
    https://support.google.com/webmasters/answer/138752?hl=en So if we interlink one page more heavily than other pages, will it rank in search results instead of the homepage? And what if the page also has the "keyword" in its URL slug, like www.website.com/keyword? Thanks

    | vtmoz
    0

  • I have a Magento site and just realized we have about 800 review pages indexed. The /review directory is disallowed in robots.txt, but the pages are still indexed. From my understanding, robots.txt means Google will not crawl the pages, but the pages can still be indexed if they are linked from somewhere else. I can add the noindex tag to the review pages, but then they won't be crawled. https://www.seroundtable.com/google-do-not-use-noindex-in-robots-txt-20873.html Should I remove the robots.txt rule and add the noindex, or just add the noindex to what I already have?

    | Tylerj
    0
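As the article linked in the question above notes, noindex belongs on the page, not in robots.txt, and a crawler can only see a noindex tag on pages it is allowed to crawl. A sketch of the usual sequence for the /review pages (assuming Magento's default paths; re-block later only if desired):

```
# 1. robots.txt: remove or comment out the block for now, so the
#    pages can be crawled again:
#      Disallow: /review/

# 2. Each /review/ page: add a robots meta tag in the <head>:
#      <meta name="robots" content="noindex, follow">

# 3. Once the pages drop out of the index, the noindex tag alone keeps
#    them out; re-adding the Disallow would hide the tag from crawlers.
```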

  • Hi there, I have a website, www.footballshirtcollective.com, that has been live since July.  It contains both content and ecommerce. I am now separating out the content so that: 1. the master domain www.footballshirtcollective.com (content) points to a new site, and 2. the subdomain store.footballshirtcollective.com (ecommerce) points to the existing site. What do you advise I do to minimise the impact on my search? Many thanks Mike

    | mjmaxwell
    0

  • Hi guys, I have a massive list of URLs and want to check whether the primary keyword for each URL has been optimised. I'm looking for something similar to Moz's on-page grader, which grades the URL and primary keyword with a single metric, e.g. grade A, B, C. However, Moz doesn't offer an API to pull this score automatically. Does anyone know of any tools whose API you can access to do something like this? Cheers.

    | jayoliverwright
    0
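Absent such an API, a rough DIY grade can be scripted. A minimal sketch, assuming each page's HTML has already been fetched to a string; the scoring weights and letter-grade mapping here are arbitrary illustrations, not Moz's methodology:

```python
# Award points for the primary keyword appearing in <title>, <h1>,
# and the visible body text, then map the score to a letter grade.
import re

def grade(html: str, keyword: str) -> str:
    kw = keyword.lower()
    score = 0
    title = re.search(r"<title>(.*?)</title>", html, re.I | re.S)
    h1 = re.search(r"<h1[^>]*>(.*?)</h1>", html, re.I | re.S)
    if title and kw in title.group(1).lower():
        score += 2
    if h1 and kw in h1.group(1).lower():
        score += 2
    # Strip tags crudely to approximate visible body text.
    if kw in re.sub(r"<[^>]+>", " ", html).lower():
        score += 1
    return {5: "A", 4: "B", 3: "C", 2: "D"}.get(score, "F")

page = "<title>Blue Widgets</title><h1>Blue widgets</h1><p>blue widgets for sale</p>"
print(grade(page, "blue widgets"))  # → A
```

Run over the URL list in a loop, this gives a single sortable metric per URL, which is the shape of output the question is after.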

  • I have a site with hundreds of pages indexed on Google. But there is a page that I put in the footer section that Google seems not to like and is not indexing. I've tried submitting it to the index through Google Webmaster Tools, and it will appear in Google's index, but after a few days it's gone again. That page used to have a canonical meta tag pointing to another page, but it has been removed now.

    | odihost
    0

  • I'm clocking up nearly 1 second of time to first byte on a Joomla 2.5 site. It is on shared hosting, so I don't think there's much option for improvement from the hardware or core-configuration angle. There isn't much in the way of plug-ins and it is using SSD; the problem seems to be with the CMS itself. Any ideas how I could reduce the TTFB? Thanks

    | seoman10
    0

  • Hi Mozzers - I was just looking at website speed. I know the Google guidelines on average page load time, but I'm not sure whether Google issues guidelines on any of the other four metrics below. Do you know of any guidance on domain lookup, server response, server connection or page download times? Page Load Time (sec) - I tend to aim for 2 seconds max: http://www.hobo-web.co.uk/your-website-design-should-load-in-4-seconds/ 
    Server Response Time: [Google recommends 200ms]: https://developers.google.com/speed/docs/insights/Server Redirection Time (sec) [dependent on number of redirects so probably no guide figure]
    Domain Lookup Time (sec)
    Server Connection Time (sec)
    Page Download Time (sec) Thanks, Luke

    | McTaggart
    0

  • Hi there! We are currently evaluating data visualization / charting tools for rich content. Are there any open-source solutions that work best in your opinion? Why? Some specific questions: Are static image / SVG-rendered charts better than a dynamic JavaScript chart (canvas/HTML5)? Which gets indexed better? Is there any proven or perceived SEO benefit to using the Google Charts API? Are there tools for progressively enhancing raw HTML data tables to generate charts? Looking at a couple of solutions: Google Charts API C3.js Chartjs Thanks for your feedback!

    | insurifyusa
    0

  • So this is a bit of a strange one. My latest website was built on a different domain, then transferred over (as opposed to being built on a subdomain). I was told that the domain my site was built on wasn't indexed by Google, but looking at Google Search Console I can see the old domain showing up as the most-linked-to domain of my current site, meaning it was indexed. The old domain (and all of its pages) does have a 301 redirect to the new website's home page (as opposed to the individual pages), but could this be causing me a problem with SEO? Additionally, my website has a sister site (UK and US versions); both link to each other in the footer (which appears on every page). Could this be pulling my SEO efforts down if it is a do-follow link?

    | moon-boots
    0

  • We have a client with an old domain which was spammy (bad links).  Until two months ago, it was forwarding to his current domain and (I believe) causing a penalty. Two months ago we transferred ownership of the spammy URL to a third party and set up an unrelated blog for Google to pick up on, which Google did.  After two months, Google Webmaster Tools is still showing 200 links from the old spammy domain to the new domain.  Also, when you search the company name, the spammy domain still appears in the results (page two). Is there a faster way to disassociate the old domain entirely from the business?  I.e., just delete the domain, forward the domain to another website, etc.? If you have experience with this, I'd love to hear from you. Thanks!

    | mgordon
    0

  • Hi all, I'm working on a dentist's website and want some advice on the best way to lay out the navigation. I would like to know which structure will help the site rank naturally. I feel the second example would be better, as it would focus the 'power' around the type of treatment and help that rank better. First option: .com/assessment/whitening
    .com/assessment/straightening
    .com/treatment/whitening
    .com/treatment/straightening Second option: .com/whitening/assessment
    .com/straightening/assessment
    .com/whitening/treatment
    .com/straightening/treatment Please advise, thanks.

    | Bee159
    0

  • I have a website for a freelancer who is using a one-page template which includes the following sections: About Him, Portfolio, Resume. I also have 5 separate pages which are related to the keywords he wants to rank for.  Will this be sufficient, or should I suggest he go for a multi-page website template?

    | iamgaurav1290
    0

  • My website (www.kamagrauk.com) is showing www.likeyoursaytoday.com in Google's cache; that domain further redirects to http://kamagrauknow.com/. The problem: 1) info:kamagrauk.com shows www.likeyoursaytoday.com 2) cache:kamagrauk.com shows www.likeyoursaytoday.com. www.likeyoursaytoday.com copied content from kamagraoraljelly.me. I have already checked/done the following: 1) changed website hosting (new virtual private server) 2) uploaded a fresh backup of the website 3) checked the header response (DNS perfect) 4) checked the language meta tag (no error) 5) the fetch function worked fine 6) tried to remove the URL and re-added it 7) no errors in the sitemap 8) SSL all OK 9) no crawl errors. Nothing worked. I have been trying to contact www.likeyoursaytoday.com but get no response. Today (23rd Feb) www.likeyoursaytoday.com went down, but our cache has been replaced by http://www.bagnak.com/, so it seems Google is not able to read our page. I am attaching screenshots showing that Google sees everything OK: blocked%20resources.png cache.png crawlerror.png robots%20test.png

    | Gauravbb
    1

  • My sitemap was submitted to Google well over 6 months ago and is updated frequently; a total of 979 URLs have been submitted but only 145 are indexed. What can I do to get Google to index them all?

    | moon-boots
    0

  • I'm getting a "Your page is not mobile-friendly." notice in the SERPs for all of our PDFs.  I checked the PDFs on a phone and they appear just fine.

    | johnnybgunn
    0

  • Hey Guys, We’re in the process of transitioning our key traffic-generating pages from m. to responsive. Today, our site uses Google’s ‘separate URLs’ method: rel alternate on desktop pages pointing to m. pages, 302 redirects pushing mobile visitors to m. pages, and canonicals on m. pages back to the desktop pages. As we make the transition to responsive we’ll be taking the following steps: removal of the 302 redirects pushing mobile visitors to m. pages, and 301 redirects from m. pages to desktop pages. With those changes in mind, I’d love to get the community’s opinion on how best to handle the rel alternate attribute on desktop pages. I'm considering leaving the rel alternate attribute in place on desktop pages for 30-90 days so that search engines continue to see the alternate version without the 302 redirects in place, crawl it, and as a result discover the 301 redirects more readily. If we remove the 302 redirects as well as the rel alternate, then my feeling is that search engines would just index the responsive page accordingly and be less likely to catch the 301 redirects pointing from the m. pages, making the transition of mobile pages in search indices take longer than necessary. Ultimately, I'm probably splitting hairs and getting a bit nuanced, because I believe things will work themselves out whether we leave the rel alternate or remove it, but I thought it would be great to get opinions or thoughts from community members that have made a similar transition. Thanks in advance for stopping by and providing your thoughts. All the best,
    Jon PS - for your reference, the only mentions of a move from m. to responsive that I was able to dig up in Q&A are the following: Redirecting M Dot Mobile Website to Responsive Design Website Questions SEO Concerns From Moving Mobile M Dot site to Responsive Version?

    | TakeLessons
    0

  • Hi Mozzers, My first Moz post! Yay! I'm excited to join the squad 🙂 My client is a full service entertainment company serving the Washington DC Metro area (DC, MD & VA) and offers a host of services for those wishing to throw events/parties. Think DJs for weddings, cool photo booths, ballroom lighting etc. I'm wondering what the right URL structure should be. I've noticed that some of our competitors do put DC area keywords in their URLs, but with the moves of SERPs to focus a lot more on quality over keyword density, I'm wondering if we should focus on location based keywords in traditional areas on page (e.g. title tags, headers, metas, content etc) instead of having keywords in the URLs alongside the traditional areas I just mentioned. So, on every product related page should we do something like: example.com/weddings/planners-washington-dc-md-va
    example.com/weddings/djs-washington-dc-md-va
    example.com/weddings/ballroom-lighting-washington-dc-md-va OR example.com/weddings/planners
    example.com/weddings/djs
    example.com/weddings/ballroom-lighting In both cases, we'd put the necessary location based keywords in the proper places on-page. If we follow the location-in-URL tactic, we'd use DC area terms in all subsequent product page URLs as well. Essentially, every page outside of the home page would have a location in it. Thoughts? Thank you!!

    | pdrama231
    0

  • I found page duplicate content when using Moz crawl tool, see below. http://www.example.com
    Page Authority 40
    Linking Root Domains 31
    External Link Count 138
    Internal Link Count 18
    Status Code 200
    1 duplicate http://www.example.com/index.htm
    Page Authority 19
    Linking Root Domains 1
    External Link Count 0
    Internal Link Count 15
    Status Code 200
    1 duplicate I have recently transferred my old HTML site to WordPress.
    To keep the URLs the same I am using a plugin which appends .htm to the end of each page. My old site's home page was index.htm. I have created index.htm in WordPress as well, but now there is a duplicate-content conflict. I am using the latest posts as my home page, which is index.php. Question 1:
    Should I also use a 301 redirect in the .htaccess file to transfer index.htm's page authority (19) to www.example.com? If yes, do I use
    Redirect 301 /index.htm http://www.example.com/index.php
    or
    Redirect 301 /index.htm http://www.example.com Question 2:
    Should I change my "Home" menu link to http://www.example.com instead of http://www.example.com/index.htm? That would fix the duplicate content, as index.htm does not exist anymore. Is there a better option? Thanks

    | gozmoz
    0
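For what it's worth, a sketch of the redirect being asked about, assuming Apache's mod_alias: pointing the old home page at the bare root (rather than at index.php) keeps a single canonical homepage URL, which also resolves the duplicate reported by the crawl.

```apache
# .htaccess: send the legacy home page to the clean root URL.
# RedirectMatch with anchors matches exactly /index.htm, nothing else.
RedirectMatch 301 ^/index\.htm$ http://www.example.com/
```

Updating the "Home" menu link to point at the root as well then keeps internal links and the redirect target consistent.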

  • I have been working on a site where all the tools I've used (Screaming Frog & Moz Bar) recognize the canonical, but does Google? This is the only site I've worked on that uses apostrophes: rel='canonical' href='https://www.example.com'/> It's apostrophes vs quotes. Could this difference in syntax be causing the canonical not to be recognized? rel="canonical"href="https://www.example.com"/>

    | ccox1
    0
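On the question above: single and double quotes are both valid HTML attribute delimiters, so the apostrophes alone should not break the canonical. The missing space between attributes in the second snippet (rel="canonical"href="…") is technically malformed, though many parsers tolerate it. A quick standard-library sketch for checking what a forgiving parser extracts:

```python
# Extract the canonical URL from page markup, regardless of quote style.
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_startendtag(self, tag, attrs):
        # Self-closing <link ... /> tags arrive here; treat them the same.
        self.handle_starttag(tag, attrs)

    def handle_starttag(self, tag, attrs):
        d = dict(attrs)
        if tag == "link" and d.get("rel") == "canonical":
            self.canonical = d.get("href")

def find_canonical(html: str):
    p = CanonicalFinder()
    p.feed(html)
    return p.canonical

print(find_canonical("<link rel='canonical' href='https://www.example.com/'/>"))
# → https://www.example.com/
```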

  • Hi Guys, I am currently working on a website where one of the keyword targets is fluctuating between page 2 and page 5. What makes this strange is that we are not experiencing the issue with any other keyword targets; they are all ranking fine. It is only one keyword, and it happens to be the main homepage keyword target (not sure if this makes a difference). The homepage targets 2 keywords, e.g. "Business Offices" & "Business Accessories". The homepage ranks perfectly fine for e.g. "Business Accessories" but is fluctuating for e.g. "Business Offices"! Very strange. Stranger still, the variations of the fluctuating keyword, e.g. "offices for business", are all fine and not fluctuating. It's only the one keyword. If anyone has any ideas or feedback that would be great! Thanks, Duncan

    | CayenneRed89
    0

  • Hello all, The company that develops our website recently contacted me and asked if we could remove a large number of URL rewrites. I've described a few factors and my main questions below. Some information: one year ago we did a large migration, going from 27 websites to one main website. We have about 2000 rewrites in the .htaccess file, and the file is 208 KB. A lot of links from our old domains still receive incoming traffic, which is handled by the rewrite rules mentioned above. Questions:
    The company that develops our website said that the .htaccess file is too large and is causing, or could be causing, website performance issues. They have asked us to remove URL rewrites.
    My questions are:
    a) How many rewrites is too many?
    b) Is the file size of the .htaccess of any importance, or is it just the number of rewrites in the file?
    c) Could we solve any potential server/website performance issues due to a large .htaccess file in another way, e.g. by increasing values like 'post_max_size' or with other server-side solutions? I do not have a lot of knowledge of .htaccess rules, but I've seen websites that handled over a million rewrite rules. This is why I doubt that removing URL rewrites is the only solution, and possibly not the best one for us. Hopefully you can help me further with the best way to proceed without losing traffic or causing 404 pages. Thanks in advance!
    Iordache Voicu

    | DPA
    0
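One server-side alternative worth raising with the developers, assuming Apache: move the ~2000 one-to-one redirects out of .htaccess into a hashed RewriteMap, so the server does a keyed lookup instead of scanning every rule on every request. Note RewriteMap must live in the main server or vhost config, not in .htaccess; paths below are hypothetical.

```apache
# httpd.conf / vhost: one lookup replaces thousands of Redirect rules.
RewriteEngine On
RewriteMap oldurls "dbm:/etc/apache2/redirects.map"
RewriteCond ${oldurls:$1} !=""
RewriteRule ^/?(.*)$ ${oldurls:$1} [R=301,L]
```

The map file itself is a plain two-column "old-path new-url" text file, compiled to dbm with Apache's `httxt2dbm` utility, which sidesteps the file-size question entirely.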

  • For an ecommerce website I am required to create two pages: 1) one that displays the "Deal of the day", which is basically a summary of the product on sale, and 2) a product page where the actual product deal resides. "Deal of the day" page: fixed URL, e.g. homepage.com/deal-of-the-day; product description summary; a "Go to product deal" & "Buy now" button; content changes every day. Product deal page: similar to other products, sometimes a group of products, coupons etc.; product deals will be stored for later re-use; not visible from the main product catalogue; these products are mostly the same products from the catalogue but with different copy. Recommendations? Thanks!

    | raulreyes
    0

  • 1. OK, so I coded my ecommerce site myself first. 2. Then I switched to Shopify and redirected all my URLs (terrible mistake). 3. Then Shopify didn't deliver in terms of SEO, so I switched to BigCommerce and redirected all my URLs again (yes, I know, but at least this platform is much better). I started getting 404 errors in Webmaster Tools after switching from Shopify, 504 of them. Why so many? Because Shopify creates many URLs for the same pages. One by one I redirected them to their new destinations. As you can imagine, my rankings dropped. My site speed is now 5.5s at GTmetrix, mobile 47 / desktop 80 on Google's site speed tool. Looking at the links now, some of the 404s don't make sense to redirect. How should I approach this? Should I remove some of them, if they were not used anywhere on the web and no sites link to those pages, and let them die in time? Or should I keep them all? I am giving some examples below; there are many more of each. Thank you all! /account/login/ /blog/?page=7 blog/tagged/recipe /blogs/news /blogs/news?page=6 /collections/all/category-name /collections/frontpage/category-name /collections/frontpage/products/product-name /collections/shop/category-name /collections/shop/products/product-name /product-name/ /pages/terms-privacy /pages/frontpage /products/product-name /shop/products/product-name

    | mounayoga
    0

  • Hi, Is there any tool to check how a website's internal linking is structured? Sometimes a few important pages are not linked very well, while other pages are over-linked. Can this influence rankings, e.g. if more internal links point to one page? Is there any tool to check this?

    | vtmoz
    0

  • Hi Moz community, Our website's rankings were good but have dropped over the past couple of months. We have around 10 sub-domains, and I suspect they may be hurting us. It is often said in the SEO industry that sub-domains are treated as completely separate websites; will they hurt us if they are not well optimised? We also have many links from our sub-domains to the website's top pages; is this wrong in Google's eyes? How do we maintain the sub-domains well? Do I need to worry about them? Thanks

    | vtmoz
    0

  • One of our clients today has sent over a list of keywords which he hopes to be ranked on page one for; please check these out and try not to laugh. All the existing Birmingham xxxx searches Hosted Voice Cloud Communications Cloud Solutions Cloud Services Pure Cloud VoIP Telephony Communications Unified Communications Fixed line SIP & SIP Trunks Broadsoft Yealink Contact Centre & Hosted Contact Centre Cyber Security Ransomware Open DNS Secure device management IoT – Internet of Things CISCO Meraki partner System manager Routers Switches Virtual stacking SOPHOS UTM partner SOPHOS Silver partner General Data Protection Regulation Business Mobile Mobile / Mobility M2M – Mobile 2 Mobile EE Vodafone O2 Managed print Photocopier / Printer Ethernet Leased Line EoFTTC FTTC ADSL2+ Broadband Connectivity WiFi CMX location analytics High capacity 802.11ac Automatic RF optimisation Security radio Identity-based firewall AC Dual Band Cloud managed wifi MDM – mobile device management Critical data Insurance Critical data Storage Collaboration I'm not sure he understood why I wanted to gather this information, but he's definitely not got the right end of the stick!

    | chrissmithps
    0

  • I see this error in Search Console: "International Targeting | Language > 'fa-ir' - no return tags. URLs for your site and alternate URLs in 'fa-ir' that do not have return tags." And the count keeps increasing. I do not know what the problem is or what I have done wrong. Originating URL / Crawl date / Alternate URL: 1 /abadan/%D8%A2%D8%A8%D8%A7%D8%AF%D8%A7%D9%86/browse/vehicles/?place=8,541&v01=0,1&saveLoc=1 11/16/16 http://divar.ir/

    | divar
    0
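For context on the error above: "no return tags" means a page declares an 'fa-ir' alternate that does not link back. Every URL in an hreflang cluster has to reference all the others (and itself). A markup sketch with hypothetical paths, since the site's real alternate URLs aren't shown in the question:

```html
<!-- The SAME set of tags must appear on the originating page AND on
     the page it names as an alternate, e.g. on both of these URLs: -->
<link rel="alternate" hreflang="fa-ir" href="http://divar.ir/" />
<link rel="alternate" hreflang="en" href="http://divar.ir/en/" />
```

If the alternate page omits the tag pointing back at the originating URL, Search Console reports exactly this error for that pair.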

  • We're working on a project for a retail client who has multiple (5+) brick and mortar store locations in a given geographical area. They're regional, so they have locations in multiple states. We're optimizing their content (coupons, events, products, etc) across their site, but we're running into the issue of ranking well for specific products in one location, but not as well (or not at all) in others. The keywords we would like to rank for generally aren't super competitive, we're dealing with commodity products in local retail markets, so in most cases, good on page optimization is enough to rank in the top couple results. Our current situation: (specific examples are fictitious but representative) Title: My Company | Dogwood Trees - Fredericksburg, VA, Rocky Mt, NC, Rock Hill, SC…
    Url: http://mycompany.com/catalog/product/dogwood-trees The content on the page is generally well optimized. We've claimed all the locations in Google Places and we've deployed schema.org markup for each location that carries the item on the product page. We have specific location pages that rank well for Company Name or Company Name Location, but the actual goal is to have the product page come up in each location. In the example above, we would rank #1 for "Dogwood Trees Fredericksburg VA" or just "Dogwood Trees" if the searcher is in or around Fredericksburg, on the first page for "Dogwood Trees Rocky Mt, NC", but not at all for any other locations. As these aren't heavily linked-to pages, this indicates the title tag + on-page content is probably our primary ranking factor, so as Google cuts the keyword relevance at the tail of the title tag, the location keywords stop helping us. What is the proper way to do this? A proposed solution we're discussing is subfolder-ing all the locations for location-specific content. For example: My Company | Dogwood Trees - Fredericksburg, VA, Rocky Mt, NC, Rock Hill, SC… http://mycompany.com/catalog/product/dogwood-trees Becomes: My Company | Dogwood Trees - Fredericksburg, VA
    http://mycompany.com/fredericksburg-va/product/dogwood-trees My Company | Dogwood Trees - Rocky Mt, NC
    http://mycompany.com/rocky-mt-nc/product/dogwood-trees My Company | Dogwood Trees - Rock Hill, SC
    http://mycompany.com/rock-hill-sc/product/dogwood-trees Of course, this is the definition of duplicate content, which concerns me, is there a "Google approved" way to actually do this? It's the same exact tree being sold from the same company in multiple locations. Google is essentially allowing us to rank well for whichever location we put first in the title tag, but not the others. Logically, it makes complete sense that a consumer in Rock Hill, SC should have the same opportunity to find the product as one in Fredericksburg, VA. In these markets, the client is probably one of maybe three possible merchants for this product within 20 miles. As I said, it's not highly competitive, they just need to show up. Any thoughts or best practices on this would be much appreciated!

    | cballinger
    2

  • Hi guys, Has anyone seen cases where a site has been impacted negatively by internal linking from blog content to commercial pages (e.g. category pages)? The anchor text is natural and the links improve user experience (i.e. it makes sense to add them; they're not forced). Cheers.

    | jayoliverwright
    0

  • Hi all, there seems to have been an algorithm update on February 7. One of my big sites, www.poussette.com, lost about 25% of its organic traffic afterwards and has not recovered yet.  What are the best steps to take right now? It is 7 years old and we have continuously done conservative SEO (technical, link building, adding content). Thanks in advance. Dieter

    | Storesco
    0

  • Hi All, I've been working for a website/publisher that produces good content and has been around for a long time, but has recently been burdened by a high level of repetitious production and a high volume in general, with pages that don't gather as much traffic as desired. One fear of mine is that each piece published doesn't have any links pointing to it at publication time, outside of the homepage or syndicated referrals. They do, however, have a lot (perhaps too many) of internal links pointing away from them. Would it be good practice, especially for new content with a longer shelf life, to go back to older content and place links pointing to the new piece? I would hope this would boost traffic via internal recirculation and Page Authority, with the added benefit of anchor-text relevance.

    | ajranzato9
    1

  • Hello the great Moz Community! Gev here from BetConstruct, a leading gaming and betting software provider. Our company website is performing great on the SERPs. We have 20+ dedicated pages for our 20+ software products, an events section, and different landing pages for different purposes. We also run a blog section, a press section, and more. Our website's default language is EN. 4 months ago we opened the /ru and /es versions of the website. I set the correct hreflang tags, redirects, etc., generated correct sitemaps, and the translated versions started to rank normally. Now our marketing team is requesting different changes to the website and I would love to discuss them with you before implementing. For example: they have created a landing page under the URL betconstruct.com/usa-home and want me to set that page as the default page (i.e. homepage) if the user visits our website from a US-based IP. This can be done in 2 different ways: 1) I can set the /usa-home page as the default in my CMS when the visitor is from the US, and the address will be just betconstruct.com (without /usa-home). In this case the same URL (betconstruct.com) will serve different homepage content depending on IP. 2) I can check the visitor's IP and, if they are from the US, redirect them to betconstruct.com/usa-home. In this case the user can click on the logo, go back to betconstruct.com, and see the original homepage. Both cases seem risky: in the 1st case I am not sure what Google will think when it sees a different homepage from different IPs, and in the 2nd case I am not sure what that redirect should be (301, 302, 303, etc.), because Google may think I don't have a homepage and that my homepage redirects to a secondary page like /usa-home. After digging a lot I realised that my team is requesting a strange case:
they want both language targeting (/es, /ru) and country targeting (which should ideally be something like /us), but instead of creating /us, they want it to replace /en (only for the USA). Please let me know the best way to implement this. Should we create a separate version of our website for the USA under /us/* URLs? In that case, is it ok to have /en as a language version and /us as country targeting? What hreflangs should we use? I know this is a rare case and it may be difficult to understand, but any help will be much appreciated! Thank you! Best,
    Gev

    | betconstruct
    0
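
    For what it's worth, hreflang supports language-plus-country codes, so one way to sketch the setup being asked about (URLs taken from the post; the exact mapping is an assumption, not a confirmed answer) is to keep the EN homepage as the language default and annotate the US landing page as the en-us alternate, rather than relying on IP-based swapping:

    ```html
    <!-- Hreflang annotations in the <head> of every variant (hypothetical mapping): -->
    <link rel="alternate" hreflang="en" href="https://www.betconstruct.com/" />
    <link rel="alternate" hreflang="ru" href="https://www.betconstruct.com/ru/" />
    <link rel="alternate" hreflang="es" href="https://www.betconstruct.com/es/" />
    <!-- Country-specific variant for US English visitors: -->
    <link rel="alternate" hreflang="en-us" href="https://www.betconstruct.com/usa-home" />
    <!-- Fallback for visitors matching no listed locale: -->
    <link rel="alternate" hreflang="x-default" href="https://www.betconstruct.com/" />
    ```

    If an IP-based redirect is kept anyway, a temporary (302) redirect is the usual suggestion, since the destination depends on the visitor rather than being a permanent move of the URL.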

  • Hi, I have some questions related to duplicate content on e-commerce websites. 1) If a single product goes into multiple categories (e.g. a black elegant dress could be listed in both "black dresses" and "elegant dresses"), is it considered duplicate content even if the product URL is unique? e.g. www.website.com/black-dresses/black-elegant-dress (duplicated: same content from two different paths), www.website.com/elegant-dresses/black-elegant-dress (duplicated: same content from two different paths), www.website.com/black-elegant-dress (unique URL: this is the way my product URLs look). Does Google perceive this as duplicated content? There is only one path to the content, so it shouldn't be seen as duplicated, even though the product is repeated in different categories. This is my most important concern: it is a small thing, but if I set it up wrong the whole website would be affected and thus penalised, so I need to know how to handle it. 2) I am using WordPress + WooCommerce. The website is built with categories and subcategories. When I create a product in the product-page backend, is it advisable to select just the lowest subcategory, or is it better to select both the main category and the subcategory the product belongs to? I usually select the subcategory alone. Looking forward to your reply and suggestions. Thanks

    | cinzia09
    1
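
    On question 1: when the product itself lives at a single category-free URL, a self-referencing canonical on that page is the usual safeguard, so that any category-prefixed paths that happen to render the same product resolve to the one true URL. A minimal sketch, using the hypothetical URL from the question:

    ```html
    <!-- In the <head> of www.website.com/black-elegant-dress
         (the single product URL, regardless of which category listed it): -->
    <link rel="canonical" href="https://www.website.com/black-elegant-dress" />
    ```

    WordPress typically emits a self-referencing rel=canonical on singular pages automatically, so it is often enough to verify it is present in the rendered source rather than add it by hand.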


