
Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Intermediate & Advanced SEO

Looking to level up your SEO techniques? Chat through more advanced approaches.


  • Hi guys, just wondering: can anyone recommend any tools or good ways to check whether rel=“next” and rel=“prev” attributes have been implemented properly across a large ecommerce site? (A minimal checker script is sketched below.) Cheers.

    | jayoliverwright
    0
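
    A quick way to spot-check this at scale, short of a full crawler, is a small script that fetches each paginated URL and reports what its rel="next"/rel="prev" link tags point to. A minimal sketch in Python; the URL list and library choice (requests + BeautifulSoup) are illustrative assumptions:

        # Sketch: report rel="next"/"prev" targets for a list of paginated URLs.
        import requests
        from bs4 import BeautifulSoup

        def pagination_links(url):
            html = requests.get(url, timeout=10).text
            soup = BeautifulSoup(html, "html.parser")
            nxt = soup.select_one('link[rel="next"]')
            prv = soup.select_one('link[rel="prev"]')
            return (nxt.get("href") if nxt else None,
                    prv.get("href") if prv else None)

        # Hypothetical category pages to audit:
        for url in ["https://example.com/shoes?page=1",
                    "https://example.com/shoes?page=2"]:
            nxt, prv = pagination_links(url)
            print(url, "| next:", nxt, "| prev:", prv)

    The check itself: page N's rel="next" should point at page N+1 and its rel="prev" at page N-1, with no redirects or conflicting canonicals in between.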

  • We have two pages on our site with similar content. One was originally a landing page for a marketing campaign, somewhat of a micro-site feel with a lot of content. We recently optimized another page on the site with much of the same content from the original landing page/micro-site. In order to avoid duplicate content, and to let Google know our authority page is the new page, we're wondering what is best practice: Should we 301 redirect the old page? Noindex the old page? Keep both pages and use a canonical to tell Google the new page is the authority? Or something else? (A redirect sketch follows below.)

    | seo_1234b
    0
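
    If the 301 route wins out, it's a one-line rule. A sketch for Apache's .htaccess with hypothetical paths:

        # Permanently redirect the old campaign page to the new authority page.
        Redirect 301 /old-landing-page/ https://www.example.com/new-authority-page/

    A 301 passes most link equity and drops the old URL out of the index over time, which is why it's usually preferred over noindex when the old page no longer needs to exist on its own.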

  • Hi there, I'm experiencing a problem where Google is picking and choosing different link structures to rank my WordPress site for my main keywords. The site held pretty good #1 rankings for a long time, but recently I noticed Google is choosing to rank the page in one of two ways. The original way, where it held good rankings, looked like this (just an example, it is not my site): flowers.com/the-most-beautiful-wedding-bouquets/. When Google decides to switch it up, it uses this structure instead: flowers.com > weddings (which still points to flowers.com/the-most-beautiful-wedding-bouquets when I hover my mouse over it). This structure never appeared before, and it usually comes with much lower rankings. Please note it's not both link structures being ranked at the same time for the keywords; it's one or the other, and Google is currently alternating between them, which I believe is hurting the site's position.
    I'm not sure if this is a WordPress setting that's gone wrong or what the problem is, but I do know that when Google shows the expanded, descriptive structure (flowers.com/the-most-beautiful-wedding-bouquets) the rankings are higher, in 2nd place. I'm hoping that by rectifying this I can regain my position. I'm very grateful for any insight you could offer on why this is happening and how I could fix it. Thank you. PS: the WordPress site has several SEO plugins.

    | z8YX9F80
    0

  • Recently, I made one of my site's pages responsive. It was ranking in 13th position, and after I made the page responsive it dropped to 24 in desktop and 35 in mobile rankings. I ran the mobile-friendly test and it said "awesome, your page is mobile friendly". Not sure what went wrong here. The page in question is: https://www.itcontractorsuk.com/ Can someone please advise? Please note: my site is partially responsive.

    | ThinkWebUK
    0

  • We manage content for many clients in the same industry, and many of them wish to keep their customers on their individualized websites (understandably). In order to do this, we have duplicated content in part from the manufacturers' pages for several "models" on the clients' sites. We have put in a canonical reference at the start of the content directing back to the manufacturer's page where we duplicated some of the content (example tag below). We have only done a handful of pages while we figure out the potential canonical reference issue. So, my questions are: Is this necessary? Does this hurt, help, or not do anything SEO-wise for our ranking of the site? Thanks!

    | moz1admin
    1
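
    For reference, the cross-domain canonical described would be a single link element in the head of each duplicated page; the URL here is a placeholder:

        <link rel="canonical" href="https://www.manufacturer-example.com/model-page/" />

    Worth knowing: a cross-domain canonical asks Google to consolidate ranking signals onto the manufacturer's page, so the client's copy will generally not rank on its own for that content.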

  • Hi guys, I'm working on a project (premium-hookahs.nl) where I've stumbled upon a situation I can't address. Attached is a screenshot of the crawled pages in Search Console. History: due to technical difficulties this webshop didn't always noindex filter pages, resulting in thousands of duplicated pages. In reality this webshop has fewer than 1,000 individual pages. At this point we took the following steps to resolve this: noindex the filter pages; exclude those filter pages in Search Console and robots.txt; canonical the filter pages to the relevant category pages. This however didn't result in Google crawling fewer pages. Although the implementation wasn't always sound (technical problems during updates), I'm sure this setup has been the same for the last two weeks. Personally I expected a drop in crawled pages, but they are still sky high. I can't imagine Google visits this site 40 times a day. To complicate the situation: we're running an experiment to gain positions on around 250 long-tail searches. A few filters will be indexed (size, color, number of hoses and flavors) and three of them can be combined, which results in around 250 extra pages. Meta titles, descriptions, h1s and texts are unique as well. Questions: Excluding in robots.txt should result in Google not crawling those pages, right? Is this number of crawled pages normal for a website with around 1,000 unique pages? What am I missing? (See the note and robots.txt sketch below.)

    | Bob_van_Biezen
    0
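
    One detail worth flagging in this setup: robots.txt, noindex and canonical work against each other here. A page blocked in robots.txt is never fetched, so Google can't see the noindex or the canonical placed on it, and already-indexed URLs can linger. If the goal is to get the filter pages out of the index, let them be crawled with a noindex first, and only block crawling once they've dropped out. The blocking rule itself would be a wildcard pattern, sketched here with an assumed parameter name:

        User-agent: *
        # Hypothetical filter parameter; substitute the real one.
        Disallow: /*?filter=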

  • Hi all, I have a website that is set up to target different countries by using subfolders, for example /aus/, /us/, /nz/. The homepage itself is just a landing page that redirects to whichever country the user belongs to. For example, somebody accesses https://domain/ and will be redirected to one of the country-specific subfolders. The default subfolder is /us/, so all users will be redirected to it if their country has not been set up on the website. The content is mostly the same on each country site apart from localisation and, in some cases, content specific to that country. I have set up each country subfolder as a separate site in Search Console and targeted /aus/ to AU users and /nz/ to NZ users. I've also left the /us/ version un-targeted to any specific geographical region. In addition, I've set up hreflang tags for each page on the site which link to the same content on the other country subfolders. I've targeted /aus/ and /nz/ to en-au and en-nz respectively and targeted /us/ to en-us and x-default, as per various articles around the web (see the sketch below). We generally advertise our links without a country code prefix, and the system will automatically redirect the user to the correct country when they hit that URL. For example, somebody accesses https://domain/blog/my-post/ and a 302 will be issued for https://domain/aus/blog/my-post/ or https://domain/us/blog/my-post/ etc. The country-less links are advertised on Facebook and in all our marketing campaigns. Overall, I feel our website is ranking quite poorly and I'm wondering if poor social signals are a part of it? We have a decent social following on Facebook (65k) and post regular blog posts to our Facebook page that tend to pique quite a bit of interest. I would have expected that this would contribute to our ranking at least somewhat? I am wondering whether the country-less link we advertise on Facebook would be causing Googlebot to ignore it as a social signal for the country-specific pages on our website. For example, Googlebot indexes https://domain/us/blog/my-post/ and looks for social signals for https://domain/us/blog/my-post/ specifically; however, it doesn't pick up anything because the campaign URL we use is https://domain/blog/my-post/. If that is the case, I am wondering how I would fix that, to receive the appropriate social signals for /us/blog/my-post/, /aus/blog/my-post/ and /nz/blog/my-post/. I am wondering if changing the canonical URL to the country-less URL of each page would improve my social signals and performance in the search engines overall. I would be interested to hear your feedback. Thanks

    | destinyrescue
    0
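
    For reference, the hreflang set described would look like this in the head of every version of a page, each page listing all alternates including itself (the domain is a placeholder):

        <link rel="alternate" hreflang="en-au" href="https://domain/aus/blog/my-post/" />
        <link rel="alternate" hreflang="en-nz" href="https://domain/nz/blog/my-post/" />
        <link rel="alternate" hreflang="en-us" href="https://domain/us/blog/my-post/" />
        <link rel="alternate" hreflang="x-default" href="https://domain/us/blog/my-post/" />

    One caution on the canonical idea: pointing each country version's canonical at the country-less URL would send Google a conflicting signal, since that URL just 302s back to a country page.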

  • Hi, a mobile phone accessory client of ours has a retail site (B2C) and a trade site (B2B). The retail site does pretty well and ranks highly for a number of terms. The trade site doesn't really rank for anything, as they don't optimise it. They would like to merge the two sites and allow trade customers to log in and purchase goods in bulk for their business. If they were to merge the trade site into the already successful consumer site, what would be the best way of doing this, and what implications, if any, would it have on the organic visibility of the B2C site? Would it be possible to target retail and trade customers on one website? Cheers, Lewis

    | PeaSoupDigital
    0

  • Hi, I was doing some research on huge new sites, for example carstory.com, that have over a million pages, and I noticed that many new sites show strong growth in the number of keywords and then at some point everything starts going down (image of the traffic drop attached). There were no major updates at that time, but you can clearly see, even in recent keyword changes, that the site starts losing keywords every day, so the number of new keywords is much smaller than the number of lost keywords. How would you explain it? Is it that at some point, when a site has more than X indexed pages, the power of the domain is not enough to keep all of them at the top, and those keywords start dropping? Please share your opinion, and any experience you have yourself with huge sites. Thank you, much appreciated.

    | logoderivv
    0

  • Hi, I have some old pages with more link equity, and I'm planning to link to some bestsellers in the main content. My question is on the best use of anchor text. For example, the product name is Chloride Exide Safepower CS 7-12 12V Sealed Battery, and I want to use the keyword "12v 7ah battery" as anchor text, or "buy 12v 7ah battery". Will Google consider this spam? Please suggest.

    | Rahim119
    0

  • We're an e-commerce company with two domains. One is our original company name/domain, one is a newer top-level domain. The older domain doesn't receive as much traffic but is still searched and used by long-time customers who are loyal to that brand, who we don't want to alienate. The sites are both identical in products and content, which creates a duplicate content issue. I have come across two options so far: 1. a 301 redirect from the old domain to the new one. 2. Optimize the content on the newer domain (the strongest of the two) and leave the older domain content as is. Does anyone know of a solution better than the two I listed above or have experience resolving a similar problem in the past?

    | ilewis
    0

  • Hi, we need some expert advice here. We sell batteries and tyres online at the national level in India, and we have dealers registered with us in local places... We don't show the dealer details on the product page; once an order is received, it is passed on to a dealer locally... We want to rank well in local keyword searches, as users often search for "buy car battery in bangalore", where Bangalore is one location. We want to develop one URL like this: www.abc.com/car-battery/bangalore/productname.html, and I will also add a breadcrumb like this: car-battery > Bangalore > Productname, which will be canonical to www.abc.com/productname.html (where the product details are). Please suggest: is this the way I should go, or??

    | Rahim119
    0

  • I have a pharmaceutical brand that treats two diseases, but wants to primarily promote one. We want searches for "brand dosing" to go to Side A, but currently "brand dosing" goes to Side B. BUT, I want "brand dosing Side B" to still show up in organic search, so a noindex on Side B, or canonicalization of Side A, won't work. Essentially, I want any searches that are not specific to a disease treatment to go to Side A, and then specific Side B related searches, go to Side B. Because this is a client paying me to optimize their site, I obviously want to optimize their whole site, so only optimizing Side A, or unoptimizing Side B, aren't solutions I want to employ. I don't think a solution exists, but I figured my fellow Mozers would know best. Thanks in advance,

    | GTO_Pharma_SEO
    0

  • There are websites that have linked to my site. Whenever I hover over a link I see my direct website URL, and I am not seeing "nofollow" when viewing the source code, so I assume these are passing link juice. However, when I click on a link it directs briefly to shareasale (affiliate account) in the address bar, but then quickly redirects back to my website URL as directed. I was curious whether these good links I am acquiring truly pass juice, or whether briefly passing through an affiliate site cancels or dilutes the link juice. Also, I am noticing when inspecting the element that after the href it says class="external-link". I am just not sure if my link-building efforts are being ruined by having an affiliate account running. I did not tell them I had one. I guess they are searching to see that I have one and trying to make a few extra commission dollars.

    | nchachula
    0

  • Hello everyone, I have recently seen a Google result for "vps hosting" showing service page details in the Answer Box. I would really like to know: what can cause a service page to appear in the Answer Box? I have attached a screenshot of the result page.

    | eukmark
    0

  • Hey guys, we've seen an alarming drop in our main keyword for our website. Our biggest driver of traffic has always been the search term 'gifts for men', for which we commanded the top spot for a while, and have always been in the top 4. Recently (in the last 3-4 months) we dropped to 6, and as of last night we dropped down to 9th. We still rank number 2 for 'gift ideas for men'. Both search terms point to this page: Gifts For Men. Nothing onsite or technical has changed, and there is consistently new content in the form of products being added almost daily. We were hit by a manual action back in October of last year, and I'm concerned that the toxic links (that we didn't create, mind you) we disavowed may have been unnaturally boosting this page, and now we're dropping significantly because they're gone. Any ideas on how we can curb this concerning trend? Thanks a lot

    | TheGreatestGoat
    0

  • We have a huge .htaccess file, several MB, which seems to be the cause of slow server response times. There are lots of 301 redirects related to a site migration 9 months ago, where all old URLs were redirected to new URLs, plus lots of 301 redirects from URL changes accumulated over the last 15 years. Is it safe to delete all 301 redirects which did not receive any traffic in the last 2 months? Or would you apply another criterion for identifying the 301s that can be safely deleted? Is there any way to get, in Google Analytics or Webmaster Tools, all the 301s that received traffic in the last 2 months, or any other easy way to identify those apart from checking the Apache log files? (A log-parsing sketch follows below.)

    | lcourse
    0
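
    Google Analytics won't capture redirect hits (no tracking code runs on a 301 response), so the access log really is the source of truth here. A minimal sketch for counting which URLs returned a 301, assuming Apache's common/combined log format:

        # Count hits per URL that returned a 301 in an Apache access log.
        from collections import Counter

        hits = Counter()
        with open("access.log") as log:
            for line in log:
                parts = line.split('"')
                if len(parts) < 3:
                    continue
                request = parts[1].split()   # e.g. ['GET', '/old-page', 'HTTP/1.1']
                status = parts[2].split()    # status code follows the request
                if len(request) < 2 or not status:
                    continue
                if status[0] == "301":
                    hits[request[1]] += 1

        for url, count in hits.most_common():
            print(count, url)

    Any redirect rule whose source URL never shows up over a couple of months of logs is a reasonable removal candidate, though keeping the migration-era redirects longer is the cautious choice, since external links may still point at them.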

  • Hi there, apols for the basic question, but is it considered good practice to nofollow one of one's own URLs? Basically our 'print page' command produces an identical URL in the same window but with .../?print=1 at the end. As far as I've been reading, the nofollow attribute is, broadly speaking, only for links to external websites you don't want to vouch for, or for internal links to login/register pages that, together with noindex, you're asking Google not to waste crawl budget on. (The print page is already noindexed, so we're good there.) Can anyone confirm the above from their own experience? Thanks so much!

    | Daft.ie
    0

  • Hi, our website of 3K+ pages currently has more links coming to internal pages (2nd and 3rd level) than links to the homepage. Just wanted to know if this is bad for rankings? Please share your thoughts. Thanks.

    | Umesh-Chandra
    0

  • I run a website that revolves around a niche rugged computer market. There are several "main" models for each computer, which also have several (300-400) "sub" models that vary only by specification. My problem is I can't really consolidate each model to one product page to avoid duplicate content. Something like a drop-down list would be massive and confusing to the customer when they could just search for the model they need. Also, I would say 80-90% of the market searches for a specific model when they go to purchase or search Google. A lot of our customers are city governments, fire departments, police departments etc.; they get a list of approved models and purchase off that. They don't really search by specs or "configure" a model, so each model number having a chance to rank is important. Currently we have all models in each sub-category rel=canonical back to the main category page for that model. Is there a better way to go about this? On the example page you can see how there are several models; all product descriptions are the same and only vary by model, and writing a unique description for each one is an unrealistic possibility for us. Any suggestions on this would be appreciated; I keep going back and forth on what the correct solution would be.

    | The_Rugged_Store
    0

  • I have observed that Google is showing some sitemap URLs in the results for our website. What is wrong? Any idea?

    | Rahim119
    0

  • Hi Moz community, I have a colleague who's working to rank this site: www.devsar.com. The selected keywords are:
    Mobile development
    Web development
    Django development
    Python development
    I've checked the site: it's fast and clean, has good PA and DA, and it's responsive and good looking. Meta description, title, hreflang... everything is in order. The link profile is a little strange (checked with ahrefs.com); that's because someone made a mistake redirecting an expired domain. Can you help me to help my mate out?
    Thanks
    GR.

    | Gaston Riera
    0

  • Hi, please help. I have some spammy backlinks and I think they're hurting my site. Can somebody help me figure out how to fix this? My site is www.bassilimousine.com. Can somebody review it and help me improve my ranking? Thanks.

    | GarySahota
    1

  • There are some pages my competitor is ranking well for, and we have done page optimization: it scores 100% for the page title keywords, since I'm going to use the same title as the competitor. Will this affect me? Please suggest what I should do.

    | Rahim119
    0

  • On my main site we link to PDFs that are located on another one of our domains. The only thing on this other domain is the PDFs. It was set up really poorly, so I am going to redesign everything and probably move it. Is it worthwhile trying to add these PDFs to our sitemap and to try to get them indexed? They are all connected to a current item, but the content is original.

    | EcommerceSite
    0

  • Hi, I have seen a few URLs with duplicate titles in the HTML Improvements report. Can I disallow one of the below URLs in robots.txt? /store/Solar-Home-UPS-1KV-System/75652
    /store/solar-home-ups-1kv-system/75652 If I add Disallow: /store/Solar-Home-UPS-1KV-System/75652, will search engines still scan /store/solar-home-ups-1kv-system/75652? I'm a little confused about case sensitivity. Please suggest whether or not to go ahead in robots.txt. (See the note below.)

    | Rahim119
    0
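
    On the case question: robots.txt path matching is case-sensitive, so disallowing the capitalised URL would leave the lowercase one crawlable. Blocking both would take two rules (sketch):

        User-agent: *
        Disallow: /store/Solar-Home-UPS-1KV-System/75652
        Disallow: /store/solar-home-ups-1kv-system/75652

    That said, for duplicate-title pairs like this, a 301 from one casing to the other (or a canonical tag) is usually the cleaner fix, since robots.txt only hides the crawl problem without consolidating the URLs.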

  • Howdy Moz fans! Okay, so there's a mountain of information out there on the webernet about internal search results... but I'm finding some contradiction and a lot of pre-2014 stuff. I'd like to hear some 2016 opinion, specifically around a couple of thoughts of my own, as well as some I've deduced from other sources. For clarity, I work on a large retail site with over 4 million products (product pages), and my predicament is this: I want Google to be able to find and rank my product pages. Yes, I can link to a number of the best ones by creating well-planned links via categorisation, silos, efficient menus etc. (done), but can I utilise site search for this purpose? It was my understanding that Google's bots don't/can't/won't use a search function... how could they? It's like expecting them to find your members-only area; they can't log in! How can they find and index the millions of combinations of search results without typing in "XXXXL underpants" and all the other search combinations? Do I really need to robots.txt my search query parameter? How/why/when would Googlebot generate that query parameter? Site search is B.A.D. - I read this everywhere I go, but is it really? I've read: "it eats up all your crawl quota", "search results have no content and are classed as spam", "results pages have no value". I want to find a positive SEO outcome to having a search function on my website, not just try to stifle Mr. Googlebot. What I am trying to learn here is what the options are, and what their outcomes are. So far I have: Robots.txt - remove the search pages from Google. Noindex - allow the crawl but don't index the search pages. Nofollow - I'm not sure this is even a valid idea, but I picked it up somewhere out there. Just leave it alone - some of your search results might get ranked and bring traffic in. It appears that each and every option has its positive and negative connotations. It'd be great to hear from this here community on their experiences in this practice. (The first two options are sketched below.)

    | Mark_Elton
    0
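
    For reference, the first two options look like this. The /search path and q parameter are assumptions, so match them to whatever the site's search form actually emits. In robots.txt:

        User-agent: *
        Disallow: /search
        Disallow: /*?q=

    Or, on each search-results page, crawlable but kept out of the index:

        <meta name="robots" content="noindex, follow">

    The practical difference: robots.txt saves crawl budget but Google can't see anything on a blocked page, so already-discovered result URLs can linger in the index; noindex costs the crawl but reliably keeps results pages out.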

  • The reason for this might be obvious to the right observer, but somehow I'm not able to spot it. The situation:
    I'm doing an SEO audit for a client. When I check whether the rel=canonical tag is in place, it seems like it is: view-source:http://quickplay.no/fotball-mal.html?limit=15 (line 15). Does anyone see something wrong with this canonical? When I perform a site:http://quickplay.no/ search, I find that there are many URLs indexed that ought to have been picked up by the canonical tag (see picture), this one for example: view-source:http://quickplay.no/fotball-mal.html?limit=15. I really can't see why this page is getting indexed when the canonical tag is in place. Anybody who can? Sincerely 🙂

    | Inevo
    0

  • Hi, does anyone know the correct hreflang setup for this UK Google Webmaster error? International Targeting | Language > 'en-GB' - no return tags (sitemaps): "Sitemap provided URLs and alternate URLs in 'en-GB' that do not have return tags." (See the sketch below.) Thank you all

    | Taiger
    0
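
    The "no return tags" error means page A lists page B as an alternate, but B's sitemap entry doesn't list A back. In a sitemap, every URL entry has to carry the full set of alternates, including itself. A sketch with placeholder URLs (the urlset element needs xmlns:xhtml="http://www.w3.org/1999/xhtml" declared):

        <url>
          <loc>https://example.com/en-gb/page/</loc>
          <xhtml:link rel="alternate" hreflang="en-GB" href="https://example.com/en-gb/page/"/>
          <xhtml:link rel="alternate" hreflang="en-US" href="https://example.com/en-us/page/"/>
        </url>
        <url>
          <loc>https://example.com/en-us/page/</loc>
          <xhtml:link rel="alternate" hreflang="en-GB" href="https://example.com/en-gb/page/"/>
          <xhtml:link rel="alternate" hreflang="en-US" href="https://example.com/en-us/page/"/>
        </url>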

  • I have two chained 301s from the http://www version to the https non-www version of my site. I wonder how I can get rid of one so it will look like 301-200 instead of 301-301-200. All other combinations work fine and give me 301-200 status codes. (See the sketch below.) Thank you very much!

    | lovemozforever
    0
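
    A 301-301-200 chain usually means one rule handles the protocol and a separate rule handles the host, each firing in turn. Collapsing them into a single hop takes one combined rule; a sketch for Apache's .htaccess with a placeholder domain:

        RewriteEngine On
        # Send any http request, and any www request, straight to
        # https://example.com in a single 301.
        RewriteCond %{HTTPS} off [OR]
        RewriteCond %{HTTP_HOST} ^www\. [NC]
        RewriteRule ^(.*)$ https://example.com/$1 [R=301,L]

    The combined rule has to come before (or replace) the separate http-to-https and www-to-non-www rules, otherwise the first matching rule still fires alone.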

  • About 2 months ago we relaunched our ecommerce store on Shopify Plus and have since seen a massive drop in traffic and sales, and our most valuable pages are nowhere to be found. Also, GWT is showing that Google is indexing about half of our pages, and none of the images are being indexed. We did extensive keyword research, created and implemented a keyword framework, wrote brand new category/product page content, implemented schema markup, optimized our blog content and even did link building where we got some 90+ DA links. We are literally at a loss for what is causing this. Our experience with Shopify Plus has been very poor because it doesn't even do basic SEO stuff, so we've had to do a lot of workarounds to make it "SEO friendly". Has anyone else ever switched to Shopify Plus and had similar issues? Is there a silver bullet you can think of that we are missing that could get the site indexed and ranking again?

    | Aquatell
    0

  • Hi, please look at the page below: http://www.powerwale.com/store/exide-xplore-xltz4-3ah-battery/76933 Should the questions and reviews be on a separate page? I think that in the future the comments will amount to keyword stuffing on the product page. Please suggest. If yes, suggest the best URL as well. Thanks

    | Rahim119
    1

  • Hi guys, we had an SEO agency submit a disavow request on one of our sites a while back. They have no trace of the disavow .txt file or of the links they disavowed. Does anyone know if there is a way to recover this file in Google Webmaster Tools, or any way to find which links were disavowed? Cheers.

    | jayoliverwright
    0

  • I have a client who is a major player in the continuing education vertical. They have recently noticed that the Google Knowledge Graph is displaying erroneous information whenever somebody searches for their brand name (no matter the location/IP address). The brand Knowledge Graph is pulling information from a permanently closed location. Our client has multiple locations; why Google decided to pull data from this particular (closed) location is beyond us. Our client has reclaimed the permanently closed location's Google+ page and they are going to permanently delete it. However, we are wondering if there is any way to expedite the process of updating the Knowledge Graph. Is there any way to submit feedback to Google about the KG? Is there any way to request a Knowledge Graph correction? The erroneous "permanently closed" data is very embarrassing for our client.

    | RosemaryB
    0

  • I have a client that has multiple websites serving different countries. For instance, they have a .com website for the US (abccompany.com), a .co.uk website for the UK (abccompany.co.uk), a .de website for Germany (abccompany.de), and so on. They have websites for the Netherlands, France, and even China. These all act as separate websites. They have their own addresses, their own content (some duplicated but translated), their own pricing, their own Domain Authority, backlinks, etc. Right now, I write content for the US site. The goal is to write content for long- and medium-tail keywords. However, the UK site is interested in having me write content for them as well. The issue I'm having is how can I differentiate the content? And what is the best way to target content for each country? Does it make sense to write separate content for each website to target results in that country? The .com site will still show up fairly high in UK web results. Does it make sense to just duplicate the content, but in a different language or for the specific audience in that country? I guess the biggest question I'm asking is: what is the best way of creating content for multiple countries' search results? I don't want the different websites to compete with each other, nor do I want to spend extra time trying to rank content for multiple sites when I could just focus on trying to rank one for all countries. Any help is appreciated!

    | cody109
    0

  • Hi, can anyone advise on the best internal linking practice for an ecommerce website? Should the introduction copy on each category page contain naturally placed links down to sub-categories and products, and should each sub-category link back up to the main category page? Is there a 'best practice' method of linking categories, sub-categories and products? In terms of internally linking product pages, I presume the best practice would be to link relevant products to each other? Thanks

    | SmiffysUK
    0

  • Hi guys. For the past 2.5-3 weeks we've been experiencing lots of changes in our rankings, unfortunately mostly going down. Our website is regexseo.com. I know there was a January 12 "core update" in Google's algorithms, but there isn't any specific information I can find. Now, here is what doesn't make sense to me, and it's the same reason I'm worried we are doing something wrong (or not doing something right 🙂): we'd been ranking for "web design houston" in positions 2-4 for years, and recently, after months of work, we finally got to positions 4-6 for "seo houston". But during the last two weeks we dropped to the beginning of the second page 😢 Our backlink profile is growing, traffic to the website is growing overall and is stable to the main landing pages (+/- 5%), new content is being released or updated every week, and nothing stupid or drastic is being done. All the metric tools are saying we are "supposed" to be doing really well. Meanwhile, while we are going down, our competitors show some fluctuation (+/- 2 positions, but nothing close to what we are experiencing). Any ideas, thoughts, suggestions?

    | DmitriiK
    2

  • I checked my site's links and top pages by Page Authority. What I found I don't understand: the first 5-10 pages do not exist!! You should know that we launched a new site and rebuilt the static pages, so there are a lot of new pages, and of course we deleted some old ones. I refreshed the sitemap.xml (these pages are not in there) and uploaded it in GWT. Why do those old pages appear under the links menu in top pages by Page Authority? How can I get rid of them? Thx, Endre

    | Neckermann
    0

  • I have ownership of the .co.uk and the .com for my brand name, and I was wondering which URL I should use. Most of my sales come from within the UK (with only a small amount from the US). Currently I'm using the .co.uk and redirecting the .com URL to it, but should I be doing this the other way around? Thanks,

    | the-gate-films
    0

  • Hello, I am sure I am not the first to go through this, but I did not find the answers in the previous Q&As, so sorry if this question is a bit redundant for some folks. We have a site in English with readers in multiple English-speaking countries. How do we make www.mysite.com accepted in Google News for the US, UK, India, Australia and Canada at the same time? It is currently accepted in the US but, as I stated above, we do have a strong audience in other countries. I have read about sub-domains, but wouldn't it be considered duplicate content if I had the exact same article on different sub-domains? We are talking about creating 4 copies of mysite.com just to be added to Google News in those specific countries: www.mysite.com/same-article/
    uk.mysite.com/same-article/
    au.mysite.com/same-article/
    in.mysite.com/same-article/
    ca.mysite.com/same-article/ Isn't there a better way of having mysite.com included in Google News for all English-speaking countries?

    | Koki.Mourao
    0

  • Webmaster Tools is giving errors for duplicate meta descriptions and duplicate title tags after I changed the permalink structure in WordPress. Is there a quick fix for this, and how damaging is the above for SEO? Thanks, T

    | Taiger
    0

  • Hi, I want to check the cached and indexed pages of a mobile site. I am checking on a mobile phone, but it is showing the cached version of the desktop site. Can anybody tell me a way (a tool, online tool, etc.) to check a mobile site's indexed and cached pages?

    | vivekrathore
    0

  • I'm sure this has probably been asked somewhere before... We're implementing a URL rewrite rule to convert non-www pages to the www subdomain and also removing all trailing slashes as part of a basic canonicalisation exercise. The question is: as well as doing the URL rewrites within .htaccess, should I also 301 those duplicate pages, or does the URL rewrite do the job on its own? (See the sketch below.) Thanks, mozzers.

    | Ultramod
    0
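
    The distinction that matters here: a plain rewrite is internal (the server silently serves the content, and both URLs keep resolving), so for canonicalisation the rules need the R=301 flag to issue real redirects. A sketch for .htaccess with a placeholder domain:

        RewriteEngine On
        # Force the www host with an external 301, not a silent rewrite.
        RewriteCond %{HTTP_HOST} !^www\. [NC]
        RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]
        # Strip trailing slashes, except on real directories.
        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteRule ^(.+)/$ /$1 [R=301,L]

    With R=301 in place there's nothing extra to add: the rewrite rule and the 301 are the same thing.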

  • To optimize for Google's PageSpeed, our developer has moved the 72KB of CSS code directly into the page header (not in an external CSS file). This way the above-the-fold loading time was reduced. But could this affect indexing of the page, or have any other negative side effects on rankings? I made a quick test and the Google cache seems to have our full pages cached, but could it somehow negatively affect our rankings, or cause Google to index fewer of our pages? (We already have some problems with Google ignoring about 30% of the pages in our sitemap.)

    | lcourse
    0

  • Hello everyone, I have a page on my directory, for example:
    https://ose.directory/topics/breathing-apparatus The title on this page is short yet a bit unspecific:
    Breathing Apparatus Companies, Suppliers and Manufacturers. In Webmaster Tools these terms hold different values for each category, so "topic name companies" sometimes has a lot more searches than "topic name suppliers". I was thinking: if I split the page into three separate pages, would that be better? https://ose.directory/topics/breathing-apparatus (main - Title: Breathing Apparatus)
    https://ose.directory/topics/breathing-apparatus/companies (Title: Breathing Apparatus Companies)
    https://ose.directory/topics/breathing-apparatus/manufacturers (Title: Breathing Apparatus Manufacturers)
    https://ose.directory/topics/breathing-apparatus/suppliers (Title: Breathing Apparatus Suppliers) Two questions: Would this be more beneficial from an SEO perspective? Would Google penalise me for doing this? If so, is there a way to do it properly? PS. The list of companies may be the same, but the page content will be ever so slightly different. I know this would not affect my users much, because the terms I am using all mean pretty much the same thing; the companies do all three.

    | SamBayPublishing
    0

  • Here's a specific question about title tags for an ecommerce website... We've got lists of products (category list pages) that stretch across many pages... is there any benefit to adding something to make each title tag unique? For example: Page 1: <title>Category List Page Example</title> Page 2: <title>Category List Page Example - Page 2</title> Page 3: <title>Category List Page Example - Page 3</title> FWIW, we've got the pagination and canonicalization nailed down tight. A Moz crawl actually flagged a dupe content issue based on the title tags.

    | 19prince
    0

  • Hi all. I saw the following text on support.google.com: "Create and save the Sitemap and lists of links: a Sitemap file containing the new URL mapping; a Sitemap file containing the old URLs to map; a list of sites with links to your current content." I would like to better understand the "list of sites with links to your current content". Question 1: Do I need three sitemaps simultaneously?
    Question 2: If yes, should I put these sitemaps in the Search Console of the new website?
    Question 3: Or is Google just giving context on how to do the migration, and will I really only need sitemaps for the new site? What is Google talking about? Thanks for any advice.

    | mobic
    0

  • Hi all, please check the below URL: http://www.powerwale.com/inverter-battery for the keyword "inverter battery" in google.co.in. I'm scoring 100% in page optimization; what else do I need to do? I still rank between 7 and 12 in the search results. How can I be in the top 3 search results? Please suggest. Thanks

    | Rahim119
    1

  • Hello Moz World, I am stuck on a problem and wanted to get some insight. When I attempt to use Screaming Frog's SEO Spider or SEO PowerSuite, the software only crawls the homepage of my website. I have 17 pages associated with the main domain, i.e. example.com/home, example.com/services, etc. I've done a bit of investigating, and I have found that my client's website does not have a robots.txt file or a sitemap. However, under Google Search Console, all of my client's website pages have been indexed. My questions: Why is my software not crawling all of the pages associated with the website? If I integrate a robots.txt file and sitemap, will that resolve the issue? (See the note below.) Thanks ahead of time for all of the great responses. B/R Will H.

    | MarketingChimp10
    0
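
    Worth noting: the absence of a robots.txt doesn't stop a crawler (a missing file is treated as "allow everything"), so that's unlikely to be the cause. Meta robots nofollow tags or navigation rendered only by JavaScript are more common culprits for a one-page crawl. Adding robots.txt and a sitemap is still good hygiene; a minimal version with a placeholder domain:

        User-agent: *
        Disallow:

        Sitemap: https://example.com/sitemap.xml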

  • My company is looking at expanding internationally; we have subdomains in the UK and Canada currently. I'm making recommendations on improving SEO, and one of the parts I'm struggling with is the benefits of ccTLDs vs. using folders. I know the basic argument that Google recognizes ccTLDs as being geo-specific, so they get priority. But I'd like to know HOW much priority they get. We have unique keywords and a pretty strong domain; is having a ccTLD so much better that it'd be worth going that route rather than creating folders within our current domain? Thanks, Jacob

    | jacob.young.cricut
    0
