Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Intermediate & Advanced SEO

Looking to level up your SEO techniques? Chat through more advanced approaches.


  • Google Search Console said my site failed due to malicious software being hosted. The files creating the problems are behind a username and login gate and are user submitted. The host URL is also a subdomain of the main site. Unfortunately, catching all malicious files is an unwinnable cat-and-mouse game. What is the best SEO strategy for this situation?

    | woolbert
    0

  • I just completed a long post that reviews 16 landing page tools. I want to add 256 new pages that compare each tool against each other. For example: Leadpages vs. Instapage, Leadpages vs. Unbounce, Instapage vs. Unbounce, etc. Each page will have one product's information on the left and the other on the right. So each page will be a unique combination, BUT the same product information will be found on several other pages (its other comparisons vs. the other 15 tools). This is because the Leadpages comparison information (a table) will be the same no matter which tool it is being compared against. If my math is correct, this will create 256 new pages - one for each combination of the 16 tools against each other! My site is new and only has 6 posts/pages, if that matters. Want to make sure I don't create a problem early on... Any thoughts?

    | martechwiz
    0
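A quick sanity check on the combination count (a sketch with placeholder tool names): 16 tools compared pairwise give 16 × 15 / 2 = 120 unordered pairs, or 240 pages if "A vs. B" and "B vs. A" each get their own URL. 256 is 16², which would include each tool compared against itself.

```python
from itertools import combinations, permutations

# Placeholder names standing in for the 16 landing page tools
tools = [f"tool-{i}" for i in range(1, 17)]

# One page per unordered pair ("Leadpages vs. Instapage" covers both directions)
unordered_pages = [f"{a}-vs-{b}" for a, b in combinations(tools, 2)]

# One page per ordered pair, if each direction gets its own URL
ordered_pages = [f"{a}-vs-{b}" for a, b in permutations(tools, 2)]

print(len(unordered_pages), len(ordered_pages))  # 120 240
```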

  • Just taken over a new client who recently moved from A.N. Other platform to Shopify. I just found a reference to their old website IP address and it appears not to be redirecting. Can I simply use something like Traffic Control (a Shopify app) to redirect to the new domain?

    | muzzmoz
    0

  • Hi there, I am soon to launch a new platform/directory website; however, I have a concern over doorway pages. I have read many articles on the difference between doorway and landing pages and do have a good understanding, but I am still very anxious that what I intend to do will risk Google penalties. I have looked at other directory/platform websites and have noticed that a lot of them are still using doorway pages and are not getting penalised. So I was wondering if someone wouldn't mind kindly letting me know their opinion on which of the following examples are doorway pages and which are not, so I can better understand what I can and cannot do?
    Example 1: When I Google 'piano lessons new york' and 'trumpet lessons new york' I get the following 'landing pages' in search: https://takelessons.com/new-york/piano-lessons https://takelessons.com/new-york/trumpet-lessons To me, the above pages are definitely doorway pages, as they are very similar in content and text and are simply an intermediary step between the Google search and their listings pages for piano/trumpet teachers in New York. Is this correct?
    Example 2: When I Google 'piano lessons Sydney' I get presented with the following web page in search: http://www.musicteacher.com.au/directory/sydney-nsw/lessons/piano/ I would think that this is NOT a doorway page, as the user has been taken directly to the search results page in the directory and the page doesn't seem to have been set up for the sole purpose of listing in search results for 'Piano Lessons in Sydney'.
    Example 3: When I Google 'pet minding Sydney' I get presented with the following two pages in search: https://www.madpaws.com.au/petsitters/Sydney-New-South-Wales?type=night&service=1&from=0&to=99&city=Sydney&state=New-South-Wales https://www.pawshake.com.au/petsitters/Sydney%252C%2520New%2520South%2520Wales%252C%2520Australia Like Example 2, I don't think these pages would be classified as doorway pages, as they too direct to the search results page in the site directory instead of an intermediary page. What do you think?
    Thanks so much in advance for your expertise and help! Kind Regards, Adrian

    | Amor2005
    0

  • I've heard a few people mention this now. I have seen hosting packages range from £5 to £1000 per month, and I understand that each comes with its own amount of storage space, bandwidth and so on. Now, I understand that page speed is important to SEO and the type of hosting will dictate your page speed, but other than this, why is hosting important to SEO?

    | moon-boots
    0

  • I want to structure my website's internal links better. Can you advise me on what architecture model for the menu, navigation, links and hub content would work well for both audiences and search engines? Thanks for your advice.

    | dunghv36
    0

  • Hi all, I KNOW the hard and true answer to this, but I'm looking for deeper insights regarding links like those contained on this page. I understand the by-the-book answer to this would be to only pursue a paid link if it is "nofollowed" OR if it has the potential to bring in new business and traffic. My question is... does a link like this actually pass SEO value? I see businesses killing it from an SEO standpoint with link profiles full of paid directory links like this. I also think this conversation is more interesting now that Google appears to be devaluing links like this instead of issuing penalties. Thoughts??

    | RickyShockley
    0

  • Hi I am getting a couple of issues flagged with my hreflang tags, but when I manually check the pages I can't see the issues. Issue 1: No self-referencing hreflang tag. Example URL - http://www.key.co.uk/en/key/300kg-capacity-manutan-mobile-lift-table-lift-height-860mm-125h204 (these are SKU pages with duplicate content, so we have canonicals pointing to the main product page). Issue 2: Conflicting hreflang and rel=canonical - http://www.key.co.uk/en/key/500kg-capacity-manutan-mobile-lift-table-lift-height-945mm-127h204 I have checked the source code of the pages with errors against the pages which don't have errors and they look the same, so I am unsure what's wrong.

    | BeckyKey
    0
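For anyone debugging issue 1 above, a minimal standard-library sketch of what a "self-referencing hreflang" check actually tests: whether the page's own URL appears among its `<link rel="alternate" hreflang="...">` tags. The sample HTML snippet below is made up for illustration.

```python
from html.parser import HTMLParser

class HreflangCollector(HTMLParser):
    """Collects href values of <link rel="alternate" hreflang="..."> tags."""
    def __init__(self):
        super().__init__()
        self.alternates = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "link" and a.get("rel") == "alternate" and "hreflang" in a:
            self.alternates.append(a.get("href"))

def has_self_referencing_hreflang(page_url, html):
    parser = HreflangCollector()
    parser.feed(html)
    return page_url in parser.alternates

# Hypothetical snippet from a page's <head>
sample = '<link rel="alternate" hreflang="en-gb" href="http://www.key.co.uk/en/key/lockers" />'
print(has_self_referencing_hreflang("http://www.key.co.uk/en/key/lockers", sample))  # True
```

If a crawler reports the tag missing but it appears in "view source", the tag may be injected by JavaScript after load, which is one common reason manual checks and audit tools disagree.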

  • When searching for "Madrid hotels" in Google I see that the top organic search results have one row of sitelinks.
    What can I do so that my site also shows sitelinks if I am among the top organic search results?
    Is there anything on-page that I can do to increase the probability that Google will show sitelinks? Strangely, the text which shows as a sitelink in SERPs for booking.com and tripadvisor does not, for most of the sitelinks, actually appear on the landing page (I also checked the source code).

    | lcourse
    0

  • Hello everyone I have a problem here. My website has been hit by Panda several times in the past, the first time back in 2011 (the first Panda ever) and then another couple of times since then, most recently in June 2016 (either Panda or Phantom, not clear yet). In other words, it looks like my website is very prone to "quality" updates by big G: http://www.virtualsheetmusic.com/ I am still trying to understand how to get rid of Panda-related issues once and for all after so many years of tweaking and cleaning my website of possible duplicate or thin content (301 redirects, noindexed pages, canonicals, etc.), and I have tried everything, believe me. You name it. We recovered several times, but once in a while we are still hit by that damn animal. It really looks like we are in the so-called "grey" area of Panda, where we are "randomly" hit by it once in a while. Interestingly enough, some of our competitors live joyful lives at the top of the rankings without caring at all about Panda and such, and I can't really make sense of it. Take for example this competitor of ours: http://8notes.com They have a much smaller catalog than ours, worse quality of offered music, thousands of duplicate pages, ads everywhere, and yet... they are able to rank 1st on the 1st page of Google for most of our keywords. And for most, I mean 99.99% of them. Take for example "violin sheet music", "piano sheet music", "classical sheet music", "free sheet music", etc... they are always first. As I said, they have a much smaller website than ours, with a much smaller offering than ours, their content quality is questionable (not curated by professional musicians, and sloppily done content as well as design), and yet they have over 480,000 pages indexed on Google, mostly duplicate pages. They don't care about canonicals to avoid duplicate content, 301s, noindex, robots tags, etc., nor about adding text or user reviews to avoid "thin content" penalties...
    they really don't care about any of that, and yet they rank 1st. So... to all the experts out there, my question is: Why's that? What's the sense or the logic behind that? And please, don't tell me they have a stronger domain authority, linking root domains, etc., because according to the duplicate and thin issues I see on that site, nothing can justify their positions in my opinion and, mostly, I can't find a reason why we are so penalized by Panda and similar "quality" updates when they are released, whereas websites like that one (8notes.com) rank 1st, making fun of the mighty Panda all year around. Thoughts???!!!

    | fablau
    0

  • Dear Moz community, I noticed that several groups of websites, after an HTTP -> HTTPS migration, update their schema markup from, for example:
    {
    "@context": "http://schema.org",
    "@type": "WebSite",
    "name": "Your WebSite Name",
    "alternateName": "An alternative name for your WebSite",
    "url": "http://www.your-site.com"
    }
    to:
    {
    "@context": "https://schema.org",
    "@type": "WebSite",
    "name": "Your WebSite Name",
    "alternateName": "An alternative name for your WebSite",
    "url": "https://www.example.com"
    }
    Interesting to know, because the Moz website is on the https protocol but uses the http version of the markup. Looking forward to your answers 🙂

    | admiral99
    0
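On the mechanics of the change itself, a small sketch of normalizing the protocol in a WebSite JSON-LD block after a migration (the field values are the placeholders from the question; whether the http `@context` needs changing at all is exactly the open question here):

```python
import json

# The pre-migration markup from the question (placeholder values)
markup = json.loads("""
{
  "@context": "http://schema.org",
  "@type": "WebSite",
  "name": "Your WebSite Name",
  "alternateName": "An alternative name for your WebSite",
  "url": "http://www.your-site.com"
}
""")

# After an HTTP -> HTTPS migration, point both the context and the url at https
for key in ("@context", "url"):
    if markup[key].startswith("http://"):
        markup[key] = "https://" + markup[key][len("http://"):]

print(json.dumps(markup, indent=2))
```

The `url` should certainly match the live canonical protocol; both protocol variants of the schema.org `@context` resolve, which may be why sites get away with leaving it alone.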

  • We own a large wiki site which allows people to make articles about their business and other things that Wikipedia would prohibit. To make our site richer and expand the pages people can link to on their pages, we scraped between 1-2 million pages from the English Wikipedia, pages such as “Los Angeles, CA” and “United States” etc. We’ve been getting a steady supply of organic backlinks from users who create their own pages and cite their wikis on their website, in news, etc. However, starting 2 months ago our organic traffic has started slowly decaying, as if we have received some kind of algorithmic penalty. What could it be? Could it be dupe content from the Wikipedia pages we imported and indexed? Could it be some kind of algo from the Penguin update? We are just very confused why our organic search traffic would begin to drop at all, since every day we have organic users making quality pages, some of whom organically backlink to their articles on their own website, and these obviously add up over time.

    | teddef
    1

  • Hi How important are internal anchor text links to rankings? I'm researching competitors and am seeing a lot of internal anchor text links with keywords helping them rank - but they have these links in their menu, which at the moment isn't possible for us. We can include our top level 1 categories, but nothing below this in the top navigation. Thanks!

    | BeckyKey
    1

  • Hi Does anyone know if Google prefers paragraphs over content in a table, or doesn't it make much difference?

    | BeckyKey
    0

  • As the recent Penguin update acts on backlinks quickly, with immediate impact, does the Disavow tool also result in changes within a few days rather than weeks, like earlier? How long does it take now to see the impact of a disavow? And I think we must still disavow some links, even though Google claims it'll take care of bad backlinks without passing value from them?

    | vtmoz
    0

  • Hi Guys, Looking for some advice regarding duplicate content across different domains. I have reviewed some previous Q&A on this topic, e.g. https://moz.rankious.com/_moz/community/q/two-different-domains-exact-same-content, but just want to confirm if I'm missing anything. Basically, we have a client which has 1 site (call this site A) which has solid rankings. They have decided to build a new site (site B), which contains 50% duplicate pages and content from site A. Our recommendation to them was to make the content on site B as unique as possible, but they want to launch asap, so there's not enough time. They will eventually transfer over to unique content on the website, but in the short term it will be duplicate content. John Mueller from Google has said several times that there is no duplicate content penalty. So assuming this is correct, site A should be fine - no ranking losses. Does anyone disagree with this? Assuming we don't want to leave this to chance or assume John Mueller is correct, would the next best thing be to set up rel canonical tags between site A and site B on the pages with duplicate content? Then once we have unique content ready, execute that content on the site and remove the canonical tags. Any suggestions or advice would be very much appreciated! Cheers, Chris

    | jayoliverwright
    0

  • 0% is of course the best case and 100% would be the worst case, but what is considered average? How do you address this subject with your clients?

    | jjgonza
    0

  • Hi If my facets are being crawled, how can I stop this? Or set them up so they are SEO-friendly? This is new to me, as I haven't had to deal with lots of facets in the past. Here's an example of a page on the site - https://www.key.co.uk/en/key/lift-tables Here's an example of a facet URL - https://www.key.co.uk/en/key/lift-tables#facet:-1002779711011711697110,-700000000000001001651484832107103,-700000000000001057452564832109109&productBeginIndex:0&orderBy:5&pageView:list& I've been trying to read up on URL parameters etc. I'm new to it, so it's taking some time to understand. Any advice would be great!

    | BeckyKey
    0
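One thing worth knowing about that facet URL: everything after the `#` is a URL fragment, which browsers never send to the server and which search engines generally ignore when crawling, so the address a crawler actually requests can be seen by stripping it (a standard-library sketch using the URL from the question):

```python
from urllib.parse import urldefrag

facet_url = ("https://www.key.co.uk/en/key/lift-tables"
             "#facet:-1002779711011711697110&productBeginIndex:0&orderBy:5&pageView:list&")

# Split the URL into the part a crawler requests and the client-side fragment
crawlable, fragment = urldefrag(facet_url)
print(crawlable)  # https://www.key.co.uk/en/key/lift-tables
```

If the crawled variants instead carry real query parameters (`?foo=...`), that is when rel=canonical or Search Console parameter handling comes into play; fragment-only facets are usually invisible to crawlers in the first place.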

  • Hello Moz community I know this question has been asked before but it seems there is no real answer other than putting a summary of the PDF on the HTML page. My problem is other websites are using my PDFs, I have some PDFs with very high authority links and I would like to either pass the link juice on to my product/category page or do rel=canonical somehow. I'm using bigcommerce as my platform. My website is cwwltd.com. Any help would be greatly appreciated. Thank you

    | Neverstop123
    1

  • We have a situation where a vendor, who manages a great deal of our websites, is migrating their platform to HTTPS. The problem is that the HTTP & new HTTPS versions will be live simultaneously (in order to give clients time to audit both sites before the hard switch). I know this isn't the way that it should be done, but this is the problem we are facing. My concern was that we would have two websites in the index, so I suggested that they noindex the new HTTPS website until we are ready for the switch. They told me that they would just add canonicals to the HTTPS that point to the HTTP and, when it's time for the switch, reverse the canonicals. Is this a viable approach?

    | AMSI-SEO
    0

  • Hi Having weighed up all the angles, it's time to bite the bullet and move our blog from a subdomain to a subfolder on our ecommerce store. But as someone new to SEO I am struggling to find the correct process for doing this properly for our situation. Can anyone help? I have outlined what I have learned so far in 10 steps below to hopefully help you understand my situation, where I am at and what I am struggling with. Advice, tips and suggested further reading on all/any of the 10 points would be great.
    Some quick background: The blog is on Wordpress, and on a subdomain of our store (blog.store.com). It is four years old, with 80 original posts we want to move to a subfolder of the store (store.com/blog). The store has been built using BigCommerce, and has also been active for four years. Both the blog and the store exist as properties within our Google Search Console.
    The 10 steps required for the move, based on research so far, and the associated questions:
    1. Prepare new site: which I am guessing means reproducing all of the content over at the new subfolder location (store.com/blog)?
    2. Set up errors for any pages not being transferred: I have no idea how to do this!
    3. Make sure analytics is working for the new pages: it should be, as the site the pages are moving to a subfolder of is already running with analytics and has been for years - is this a safe assumption?
    4. Map all URLs being moved to their new counterparts: is this just record keeping? In a spreadsheet? Or is it a process I don't yet understand?
    5. Add rel=canonical tags: while I understand the concept of these, I have no idea how to implement them properly here!
    6. Create and save new sitemaps: as both blog.store.com and store.com exist in Google Search Console already, can I just refresh the sitemap for store.com/blog once the subfolder is created to achieve this?
    7. Set up and test 301 redirects: these can be created in BigCommerce for the new pages in the store.com/blog subfolder, and will refer back to the blog.store.com URLs the pages came from - is this the right way to do this? I am still learning here and know enough to know how much this can matter, but not enough to fully grasp the intricacies of the process.
    8. Move URLs simultaneously: I have no idea what this means or how to achieve it! Is this just for big site moves? Does it still apply to 80 blog posts shifting from a subdomain to a subfolder on the same root? If so, how?
    9. Submit a change of address in Google Search Console: this looks simple enough, although Google ominously warns: 'Don't use this tool unless you are moving your primary website presence to a new address', which makes me wonder how simple it really is - my primary website in this case is the store, which is not moving. But does 'primary' here simply mean the individual property within Search Console? I am going in circles on this one!
    10. Configure the old blog on the subdomain to redirect people and engines to the new pages: I thought the 301 redirects and rel=canonical stuff did that already? What did I miss?
    For anyone still here, thanks for making it this far and if you still have the energy left, any advice would be great! Thanks

    | Warren_33
    1
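For the URL-mapping step of a move like the one above, the map really can be as simple as a spreadsheet or a small script. A sketch of building and sanity-checking a one-to-one 301 map for the posts (the slugs and domains are hypothetical, echoing the question's blog.store.com example):

```python
OLD_ROOT = "https://blog.store.com"   # hypothetical old subdomain
NEW_ROOT = "https://store.com/blog"   # hypothetical new subfolder

# In practice these would be all 80 post slugs, exported from Wordpress
slugs = ["first-post", "holiday-gift-guide", "how-we-pack-orders"]

redirect_map = {f"{OLD_ROOT}/{slug}": f"{NEW_ROOT}/{slug}" for slug in slugs}

# Sanity checks before handing the list to whoever configures the 301s
assert len(redirect_map) == len(slugs)                       # no slug collisions
assert all(v.startswith(NEW_ROOT) for v in redirect_map.values())

for old, new in redirect_map.items():
    print(f"301: {old} -> {new}")
```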

  • I've been wondering for a while now how Google treats internal duplicate content within classified sites. It's quite a big issue, with customers creating their ads twice - I'd guess to avoid the price of renewing, or perhaps to put themselves back at the top of the results. Out of 10,000 pages crawled and tested, 250 (2.5%) were duplicate adverts. It's similar for the search results pages, where the site structure allows the same advert(s) to appear under several unique URLs. A prime example would be this page. Notice, on this page we have already filtered down to 1 result, but the left-hand side filters all return that same 1 advert. Using tools like Siteliner and Moz Analytics just highlights these as urgent high-priority issues, but I've always been sceptical. On a large scale, would this count as Panda food in your opinion, or does Google understand that the nature of classifieds is different, and treat it as such? Appreciate thoughts. Thanks.

    | Sayers
    1

  • Hi Our facets are, from what I can see, crawled by search engines - I think they use javascript - see here: http://www.key.co.uk/en/key/lockers I want to get this fixed for SEO with an ajax solution. I'm not sure how big this job is for developers, but they will want to know the positive impact this could have & whether it's worth doing. Does anyone have any opinions on this? I haven't encountered this before, so any help is welcome 🙂

    | BeckyKey
    0

  • I have an English wordpress site that I'd like to duplicate and translate in Spanish...does anyone recommend using a subdomain (espanol.xyz.com) or something like a .com/es/ format? I'm also interested in any SEO issues that might arise and want to keep SEO in mind while building this segment of the site. Any other tips or things to think of?

    | SteveZero12
    0

  • Hello Mozzers, This is a bit of an open-ended question and I don't think any one person's answer is going to be the same. I have recently seen the light in my link building practices and I am trying to get a feel for what to expect in terms of natural link acquisition in an effective content marketing strategy. My question is: how many natural links do you generally find yourself earning after the first 12 months of content creation/placement with a new website/industry? I know this is going to be a question with a multitude of different answers. I look forward to your valuable insight as always!

    | ChoChauRice
    1

  • I'm working on a large-scale publishing site. I can increase search rankings almost immediately by improving internal linking to targeted pages, sometimes by 40 positions, but after a day or two these same rankings drop down again - not always as low as before, but significantly lower than their highest position. My theory is that the uplift generated by the internal linking is subsequently mitigated by other algorithmic factors relating to content quality or site performance. Or is this unlikely? Does anyone else have experience of this phenomenon or any theories?

    | hjsand
    1

  • Friday, December 16 is Free Shipping Day. However, Google says it's actually 2 days later. Does anybody know how we can get that changed? See: https://www.google.com/webhp?sourceid=chrome-instant&ion=1&espv=2&ie=UTF-8#q=free shipping day

    | lk99
    0

  • Hi Moz Community! I've researched Moz to find the answer to this question but found nothing for my situation. I'm hoping some experienced SEOs can help me out. Here's the situation: I'm up against some fairly stiff competition for my main keyword - the front page is dominated by major manufacturers with high brand recognition and loads of money, whereas my client is a much smaller manufacturer trying to compete. However, their DA is only 37-53, so not impossible to outrank... just many links and a significant advantage. We've honed in on a keyword that still drives good traffic, that's a great term to drive paying customers, and that we can get competitive with. My strategy was to attempt to rank my client's homepage for this term, rather than a specific product page, as I knew that they'd have many more links and social shares of their main site. (I've been successful with this strategy before.) We've risen 60+ positions for the keyword in the past 3 months, to position 12, but we seem to have plateaued for the past month. We're ranking in top 5 positions for a number of our other keywords, so I know we're trending well. However, I'm concerned that despite our quick rise to #12, I may have made a seemingly fatal decision to rank their homepage for our target keyword term. After we had plateaued for a while, I did a more thorough side-by-side comparison and found that 8 out of 10 competitors on the front page have 2 main things we don't (and can't, because we're ranking the homepage)...
    1 - The keyword in the URL (they're ranking product pages, i.e. homepage.com/keyword-here/)
    2 - Their keyword comes first, or early, in the meta title. Ours is supposed to, but as you know, Google can do what it likes with your homepage title as it's your brand, so they've put our company name - then the keyword we added in the title, e.g.
    Our Company | The Term We're Ranking For
    We've done a lot of work, and gained many reputable, high quality links, and we did see a significant rank increase across all our pages. My question is: did I shoot myself in the foot? Or is ranking the homepage still viable in this situation? If ultimately this is going to be impossible to get in the top 5 spots, what can I do to fix it? We've already gained a PA of 38 on the homepage from our work. Or would you let it go and just keep working at it, expecting that eventually we'll break onto the front page? Thanks in advance! Let me know if you need more info. I tried to be general with terms/site for my client's sake.

    | TheatreSolutionsInc
    0

  • Hi guys I have a critical problem with the Google crawler. It's my website: https://1stquest.com I can't create a sitemap with online sitemap creator tools such as XML-simemap.org. The Fetch as Google tool usually marks pages as partial. The Moz crawler test found both HTTP and HTTPS versions of the site, and Google can't index several pages on the site. Is the problem related to an "unsafe URL", or something else?

    | Okesta
    0

  • I am reaching out to see if I correctly set up schema markup on our website for multiple locations. Our company has its HQ in Little Chute, WI, but we have 6 other office locations, and we have a separate page for each office within our website. I inserted ProfessionalService schema markup using GTM. I used the address of our HQ Little Chute office on all pages besides the 6 pages that have different office locations. I then individually set up unique ProfessionalService schema markup for the 6 locations with their unique location addresses. Did I do this correctly? Thanks

    | CharityHBS
    0

  • We are in a standard wide niche; my website consists so far of one 7000-word article on spiritual awakening. I have 14 more articles sketched out, on various related topics, each 5000-15000 words per article, targeting the main keywords in the niche. I am qualified to tie the whole niche together and draw new meaning, but it takes in-depth length. The question is how many 5K-15K-word articles you would recommend polishing before launching the thing. Everything is launched but hidden from the search engines, and I need to know how many articles to publish before I let the search engines look at my site. Looking for the best thing for long-term growth. Thanks.

    | BobGW
    0

  • Hello everyone. I am trying to understand why most of my website category pages don't show up in the first 50 organic results on Google, despite my high website DA and the high PA of those pages. We used to rank high a few years ago; it's not clear why most of those pages have almost completely disappeared. So, just to take one as an example, please help me understand why this page doesn't show up in the first 50 organic search results for the keyword "cello sheet music": http://www.virtualsheetmusic.com/downloads/Indici/Cello.html I really can't explain why, unless we are under some sort of "penalization" or similar (a curse?!)... I have analyzed any possible metric, and can't find a logical explanation. Looking forward to your thoughts guys! All the best, Fab.

    | fablau
    0

  • I have a question about how Mobile First could affect websites with separate (and smaller) mobile vs desktop sites.   Referencing this SE Roundtable article (seorountable dot com /google-mobile-first-index-22953.html), "If you have less content on your mobile version than on your desktop version - Google will probably see the less content mobile version. Google said they are indexing the mobile version first." But Google/ Gary Illyes are also on the record stating the switch to mobile-first should be minimally disruptive. Does "Mobile First" mean that they'll consider desktop URLs "second", or will they actually just completely discount the desktop site in lieu of the mobile one?  In other words: will content on your desktop site that does not appear in mobile count in desktop searches? I can't find clear answer anywhere (see also: /jlh-marketing dot com/mobile-first-unanswered-questions/). Obviously the writing is on the wall (and has been for years) that responsive is the way to go moving forward - but just looking for any other viewpoints/feedback here since it can be really expensive for some people to upgrade.  I'm basically torn between "okay we gotta upgrade to responsive now" and "well, this may not be as critical as it seems".  Sigh... Thanks in advance for any feedback and thoughts.  LOL - I selected "there may not be a right answer to this question" when submitting this to the Moz community. 🙂

    | mirabile
    0

  • Hello, What are the top 3 concepts in modern SEO in your honest opinion, and what are your best sources for learning about them? For example: #1 10X Content - this Whiteboard Friday.

    | BobGW
    0

  • I am trying to generate a sitemap for my site nationalcurrencyvalues.com, but all the tools I have tried don't get all of my 70,000 HTML pages... I have found that the one at check-domains.com crawls all my pages, but when it writes the XML file most of them are gone... seemingly at random. I have used this same site before and it worked without a problem. Can anyone help me understand why this is, or point me to a utility that will map all of the pages? Kindly, Greg

    | Banknotes
    0
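One thing to rule out first: the sitemaps protocol caps a single file at 50,000 URLs (and 50 MB uncompressed), so a 70,000-page site needs at least two sitemap files referenced from a sitemap index, and some online generators may silently truncate rather than split. A standard-library sketch of chunked generation (the URLs are placeholders):

```python
import xml.etree.ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
MAX_URLS = 50_000  # per-file cap in the sitemaps protocol

def write_sitemaps(urls, prefix="sitemap"):
    """Write urls into numbered files of at most MAX_URLS each; return the filenames."""
    files = []
    for start in range(0, len(urls), MAX_URLS):
        urlset = ET.Element("urlset", xmlns=NS)
        for u in urls[start:start + MAX_URLS]:
            ET.SubElement(ET.SubElement(urlset, "url"), "loc").text = u
        name = f"{prefix}-{len(files) + 1}.xml"
        ET.ElementTree(urlset).write(name, encoding="utf-8", xml_declaration=True)
        files.append(name)
    return files

pages = [f"https://www.nationalcurrencyvalues.com/note-{n}.html" for n in range(70_000)]
print(write_sitemaps(pages))  # ['sitemap-1.xml', 'sitemap-2.xml']
```

The resulting filenames would then be listed in a `<sitemapindex>` file and that index submitted in Search Console.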

  • Hi I've been reviewing our internal linking structure & have found that the facets/filter buttons on a category are crawled and have anchor text for each link. For example, the anchor text to filter the product listing results by those under £50 would be: "Facet Value Less than £50.00 (15)". This also has the source URL & destination URL of http://www.key.co.uk/en/key/lockers I haven't come across this before - is this an issue?

    | BeckyKey
    0

  • Hi, I have a call with a potential client tomorrow where all I know is that they are wigged-out about canonicalization, indexing and architecture for their three sites: m.ExampleSite.com, mobile.ExampleSite.com and ExampleSite.com. The sites are pretty large... 350k pages for the mobiles and 5 million for the main site. They're a retailer with endless products. Their main site is not mobile-responsive, which is evidently why they have the m and mobile sites. Why two, I don't know. This is how they currently handle this: What would you suggest they do about this? The most comprehensive fix would be making the main site mobile-responsive and 301ing the old mobile subdomains to the main site. That's probably too much work for them. So, what more would you suggest and why? Your thoughts? Best... Mike P.S., Beneath my hand-drawn portrait avatar above it says "Staff" at this moment, which I am not. Some kind of bug I guess.

    | 94501
    0

  • Struggling through trying to resolve a complicated search issue - would appreciate any community input or suggestions. The Background Info We have several brand sites and each one has both a .ca and .com domain. For some reason, our website platform was created in a way that hundreds of pages on the .com domain have an equivalent page on the .ca domain, which are all 301'ed to the appropriate .com pages. Example below for clarity: www.domain.ca/gadget/brand - 301 Redirected to: www.domain.com/gadget/brand www.domain.ca/gadget/en/brandcanada = Proper .ca Canadian URL (where en is the language - fr exists as well) The Problem Because these .com pages exist under the .ca domain as well, they have started to outrank the correct .ca pages on Google. This has led to Canadian customers finding incorrect information, pricing, and reviews for these products - causing all sorts of customer service issues and therefore affecting our sales. I am being told that to properly fix the issue, and remove the incorrect URLs under the .ca domain would be prohibitively expensive in terms of resources, so I'm left trying to fix this via means available to me (i.e. anything but a change to how the platform is currently setup). The Attempted Fix I've submitted proper sitemaps for the .ca brand sites, and we have also created a robots.txt file to be accessed only when the site is crawled through the .ca domain. In that robots.txt, we have Disallowed crawling of any /gadget/brand/ URLs for the .ca domain. This was done a week ago and I am still seeing the .com URL show up in search results. The Question Should I be submitting any www.brand.ca/gadget/brand/ URLs to be temporarily removed from Google? Because of the 301 redirect in place from www.brand.ca/gadget/brand to www.brand.com/gadget/brand, I am hesitant to do so, as I do not want the .com URL removed. Will Google simply remove the .ca URL and not follow the 301 redirect to remove that URL as well? 
Any additional insight or feedback would be awesome as well.
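For clarity, the setup described above amounts to serving something like this robots.txt only on the .ca domain (path hypothetical, based on the question):

```
User-agent: *
Disallow: /gadget/brand/
```

One caveat worth knowing: a Disallow only stops crawling. It does not remove already-indexed URLs, and Googlebot can no longer see the 301 behind a blocked path, so the redirect's signals stop being processed for those URLs.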

    | Trevor-O
    0

  • Hi, I can see in my Search Console that a website is reported as giving thousands of links to my site, when in reality there is only one backlink from one of their pages to one of ours. Why is this happening? Here is a screenshot: http://imgur.com/a/VleUf

    | vtmoz
    0

  • Hi all, I launched a new website in Aug 2015, and have had some success with ranking organically on Google (position 2 - 5 for all of my target terms). However I'm still not getting any traction on Bing. I know that they use completely different algorithms, so it's not unusual to rank well on one but not the other, but the ranking behaviour that I see seems quite odd. We've been bouncing in and out of the top 50 for quite some time, with shifts of 30+ positions, often on a daily basis (see attached). This seems to be the case for our full range of target terms, and not just the most competitive ones. I'm hoping someone can advise on whether this is normal behaviour for a relatively young website, or if it more likely points to an issue with how Bing is crawling my site. I'm using Bing Webmaster Tools and there aren't any crawl or sitemap issues, or significant SEO flags. Thanks

    | Tinhat
    0

  • My latest post on the Moz Blog, "Featured Snippets: A Dead-Simple Tactic for Making...", explores how to keep Featured Snippets once you have them. I'm curious to know how many brands are actively working to get into the answer box, and for those who are, what have the results been?

    | ronell-smith
    2

  • Hi Moz Community, Recently I've been seeing multiple pages from my eCommerce site pop up in the SERPs for a couple of queries. Usually I would count this as a good thing, but since both pages that generally pop up are so similar, I'm starting to wonder if we would rank better with just one page. My example is the query "birthday gifts". Both of the URLs below show up in the search results, one after the other, on the first page. The URL on top is our family page and the one below it is our subcat page; you can find both in the top nav of our site. www.uncommongoods.com/gifts/birthday-gifts/birthday-gifts (family) www.uncommongoods.com/gifts/birthday-gifts (subcat) Both of these pages have different PAs, and the subcat page that currently lives in our site nav is actually www.uncommongoods.com/gifts/birthday-gifts?view=all. This URL doesn't show up in the SERPs and is rel=canonical'd to the subcat page without the parameter listed above. We use this page in the nav because we think it's a better user experience than the actual subcat page. If we were to condense all three pages into one, would we rank higher? Any thoughts here would be appreciated. Thanks
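If the pages were consolidated, the usual mechanics would be a canonical (or a 301) from the retired URL to the surviving one; a sketch, assuming the subcat URL is the one kept (which page to keep is exactly the open question here):

```html
<!-- Hypothetical: placed on /gifts/birthday-gifts/birthday-gifts (family page) -->
<link rel="canonical" href="https://www.uncommongoods.com/gifts/birthday-gifts">
```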

    | znotes
    0

  • Starting November 1st, organic web traffic from Google dropped from an average of about 60 visits a day to about 5 per day, so we are more than 90% off!!!! At the end of September, we modified the header of the site to simplify it. We also added a snippet of code to each page to enable Zoho "Sales IQ" to work. Sales IQ enables us to track visitors and engage in chat sessions with them. Apart from that, no changes have been made to the site. Any ideas as to what could have caused this drop in traffic? Was there a Google update at that time that could have caused the drop? Or could the recent site changes have caused this? I have attached a Google Webmaster Tools report showing the drop in traffic. I would very much appreciate some insight into this, as all organic traffic to our site has ceased. Thanks,
    Alan

    | Kingalan1
    0

  • Hi Moz community, Let's say I have two domains: www.domain1.com www.domain2.com Domain1 is my main website. Domain2 was a peripheral side project I was working on, and I recently decided to shut it down. So I hooked up the proper 301s and filed a change of address request with Google Webmaster Tools. I have had an offer from someone to purchase domain2 - I have absolutely no use for it and would like to sell it. I first just want to figure out that: (1) I can do this without losing any ranking on my main site, and (2) I can disassociate this domain from myself and my company completely - I don't want any of the work we put into it to transfer to the new owner. How can I do this? Thanks!

    | Shop-Sq
    0

  • Hi I am reviewing our robots.txt file. I wondered if search results pages should be blocked from crawling? We currently have this in the file /searchterm* Is it a good thing for SEO?
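For context, the rule quoted above would normally appear in robots.txt like this (assuming /searchterm is the prefix of the internal search-result URLs):

```
User-agent: *
Disallow: /searchterm
```

The trailing * in the original is redundant, since robots.txt Disallow rules already match by URL prefix.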

    | BeckyKey
    0

  • Our commercial real estate site (www.nyc-officespace-leader.com) contains about 800 URLs. Since 2012 the domain authority has dropped from 35 to about 20, and ranking and traffic have dropped significantly since then. The site has about 791 URLs, many of which are set to noindex, and a large percentage of these pages have a Moz page authority of only 1. It is puzzling that some pages with content similar to these PA 1 pages rank much better, in some cases at PA 15. If we remove or consolidate the poorly ranked pages, will the overall page authority and ranking of the site improve? Would taking the following steps help? 1. Remove or consolidate poorly ranking, unnecessary URLs?
    2. Update content on poorly ranking URLs that are important?
    3. Create internal text links (as opposed to links from menus) to critical pages? A Moz crawl of our site's URLs is visible at the link below. I am wondering if the structure of the site is just not optimized for ranking and what can be done to improve it. https://www.dropbox.com/s/oqchfqveelm1q11/CRAWL www.nyc-officespace-leader.com (1).csv?dl=0 Thanks,
    Alan

    | Kingalan1
    0

  • I am in process of pruning my sites for low quality/thin content. The issue is that I have multiple sites with 40k + pages and need a more efficient way of finding the low quality content than looking at each page individually. Is there an ideal way to find the pages that are worth no indexing that will speed up the process but not potentially harm any valuable pages? Current plan of action is to pull data from analytics and if the url hasn't brought any traffic in the last 12 months then it is safe to assume it is a page that is not beneficial to the site. My concern is that some of these pages might have links pointing to them and I want to make sure we don't lose that link juice. But, assuming we just no index the pages we should still have the authority pass along...and in theory, the pages that haven't brought any traffic to the site in a year probably don't have much authority to begin with. Recommendations on best way to prune content on sites with hundreds of thousands of pages efficiently?  Also, is there a benefit to no indexing the pages vs deleting them? What is the preferred method, and why?
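The plan described above (flag URLs with no organic traffic in the last 12 months, but protect anything with backlinks pointing at it) can be sketched as a simple filter. The column names and sample data here are hypothetical, not from any specific analytics export:

```python
def find_prune_candidates(analytics_rows, linked_urls, min_sessions=1):
    """Return URLs with fewer than min_sessions in the period,
    skipping any URL that has known external links pointing at it."""
    candidates = []
    for row in analytics_rows:
        url = row["url"]
        sessions = int(row["sessions"])
        if sessions < min_sessions and url not in linked_urls:
            candidates.append(url)
    return candidates

# Hypothetical sample data (in practice: an analytics export plus a
# backlink report from your tool of choice)
rows = [
    {"url": "/post-a", "sessions": "0"},
    {"url": "/post-b", "sessions": "42"},
    {"url": "/post-c", "sessions": "0"},
]
linked = {"/post-c"}  # /post-c has backlinks, so keep it indexed

print(find_prune_candidates(rows, linked))  # ['/post-a']
```

At 40k+ pages per site, a batch pass like this at least narrows the manual review down to the zero-traffic, zero-backlink set.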

    | atomiconline
    0

  • We operate www.metro-manhattan.com, a commercial real estate website. There are about 550 pages. About 300 pages are for individual listings and about 150 are for buildings. Most of the listings pages have 180-240 words. Would it be better from an SEO perspective to have multiple listings on a single page, say all Chelsea listings on the Chelsea neighborhood page? Are we shooting ourselves in the foot by having separate URLs for each listing? Are we at risk for a thin content Google penalty? Would the same apply to building pages (about 150)? Sample listing: http://www.nyc-officespace-leader.com/listings/364-madison-ave-office-lease-1802sf Sample building: http://www.nyc-officespace-leader.com/for-a-new-york-office-space-rental-consider-one-worldwide-plaza-825-eighth-avenue My concern is that the existing site architecture may result in some form of Google penalty. If we have to consolidate these pages, what would be the best way of doing so? Thanks,
    Alan

    | Kingalan1
    0

  • Hello everyone, Maybe it is a stupid question, but I ask the experts... What's the best way to noindex pages but still keep backlink equity from those noindexed pages? For example, let's say I have many pages that look similar to a "main" page which I solely want to appear on Google, so I want to noindex all pages with the exception of that "main" page... but what if I also want to transfer any possible link equity present on the noindexed pages to the main page? The only solution I have thought of is to add a canonical tag pointing to the main page on those noindexed pages... but will that work, or wreak havoc in some way?
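The combination being described would look like this on each duplicate page (example.com is a placeholder):

```html
<link rel="canonical" href="https://www.example.com/main-page">
<meta name="robots" content="noindex, follow">
```

Worth noting that Google representatives have cautioned against pairing noindex with a canonical pointing at another URL, since the two tags send conflicting signals about which page should consolidate the link equity.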

    | fablau
    3

  • I am in the process of a large rebuild and redesign of my real estate website. It is Wordpress based, with custom post types organized by category archives for neighborhood pages. Through this process, I am expanding my content and number of internal pages by a large amount. I am at a decision point with permalink structure for my site and can choose two different directions to go, and I am not sure which is best or if there is any impact on SEO for the two options. Most of it comes down to the location of keyword search terms within my permalink structures. When I first built my site, exact match domains were still a thing for SEO power, and nowadays I hear it doesn't hold much weight… so my domain name of http://dwellarizona.com still remains the same either way, which does contain an important keyword, but might be redundant if using the keyword "Arizona" elsewhere in a permalink. Is there any difference in SEO power for the following keywords whether they are placed all at the end of the permalink, versus being distributed more throughout the entire link address when there is a hierarchy to the address and some in the domain name, partly in a parent page address, and partly in the child page link address? Target keywords: grayhawk, scottsdale, arizona
    Option 1 (hierarchy): http://dwellarizona.com/luxury/scottsdale/grayhawk vs.
    Option 2 (parent-child relationship): http://dwellarizona.com/luxury/grayhawk-scottsdale-arizona

    | shawnbeaird
    0
