
Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Technical SEO

Discuss site health, structure, and other technical SEO strategies.


  • With this new update coming out from Raven marketing, will SEOmoz have to do the same update? Or is this something that Raven does and SEOmoz doesn't do with getting 3rd-party info?

    | MarketMotiveStudents
    0

  • Want to optimize referral traffic while at the same time keeping search engines happy and the ads posted. Have a client who advertises on several classified ad sites around the globe. Which is better (post-Panda): having multiple identical URLs using canonicals to pass juice to the original URL? For example: www.bluewidgets.com is the original; www.bluewidgetsusa.com; www.blue-widgets-galore.com. Or should the duplicate pages be directed to the original using a 301? Currently using duplicate URLs, and not currently using "nofollow" tags on those pages.

    | AllIsWell
    0

  • I have removed lots of links and contacted lots of webmasters to clean up my link profile. I have a large XLS file to send to Google so they can see that we have done a lot to clean up the bad links. How would I show this file to Google? Is there a place where I can post it, or an email address? Thank you, Nick

    | orion68
    0

  • We work with a number of clients that need a small website. Is Adobe Muse good software for building solid, optimized websites? Is Muse better than Dreamweaver? Input please.

    | Lael
    0

  • I have an X-Cart site and it is showing only 1 page being crawled. I'm a newbie; is this common? Can it be changed? If so, how? Thanks.

    | SteveLMCG
    0

  • For instance, if the keyword I'm targeting on a specific page is "New Orleans", the keyword is everywhere it's supposed to be: title, meta, content, internal links, etc. Yet when I check my most relevant keywords with different tools, they always break the phrase up like: new - 12 times, 2.3%; orleans - 12 times, 2.3%. Should I try to fix this, or is this normal? And does Google view this as one keyword when evaluating my site?

    | Nola504
    0

  • I am working on a project where we are re-branding lots (100+) of existing local businesses under one national brand. I am wondering what we should do with their existing websites; they are generally fairly poor and will need re-designing to match the new brand, but may have some residual links. Should we 301 redirect each URL to the national site, e.g. nationalsite.com/localbusinessA? If so, what should I look out for? Do I need to specifically redirect any pages that have links to them to the same pages on the new site? Or should we give each a new standalone website that links back to the national brand site? More than likely this would be hosted on the same server and CMS as the main site; just the URL would remain. Do I need to make sure that any old URLs that had links to them are 301'd to the new pages? Many thanks for your advice.

    | BadgerToo
    0

  • We found msnbot is making lots of requests to one URL at the same time; even though we have caching, it triggers many simultaneous requests, so caching does not help at the moment. For sure we can use a mutex to make sure the URL waits for the cache to generate, but we are looking for a solution for msnbot itself:
    123.253.27.53 [11/Dec/2012:14:15:10 -0600] "GET //Fun-Stuff HTTP/1.1" 200 0 "-" "msnbot/2.0b (+http://search.msn.com/msnbot.htm)"
    1.253.27.53 [11/Dec/2012:14:15:10 -0600] "GET //Type-of-Resource/Fun-Stuff HTTP/1.1" 200 0 "-" "msnbot/2.0b (+http://search.msn.com/msnbot.htm)"
    1.253.27.53 [11/Dec/2012:14:15:10 -0600] "GET /Browse//Fun-Stuff HTTP/1.1" 200 6708 "-" "msnbot/2.0b (+http://search.msn.com/msnbot.htm)"
    We found the following solution: http://www.bing.com/community/site_blogs/b/webmaster/archive/2009/08/10/crawl-delay-and-the-bing-crawler-msnbot.aspx - Bing offers webmasters the ability to slow down the crawl rate to accommodate web server load issues: User-Agent: * Crawl-Delay: 10. Need to know if it's safe to apply that, or any other advice. PS: msnbot gets so bad at times that it could trigger a DOS attack alone! (http://www.semwisdom.com/blog/msnbot-stupid-plain-evil#axzz2EqmJM3er)

    | tpt.com
    0
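For what it's worth, the Bing blog post linked above describes exactly this directive. A robots.txt that throttles only msnbot, rather than every crawler as the `User-Agent: *` version in the question would, might look like this sketch:

```
# robots.txt - throttle only Bing's crawler, leave others untouched
User-agent: msnbot
Crawl-delay: 10
```

Crawl-delay is honored by Bing but ignored by Googlebot, so scoping the rule to msnbot avoids accidentally demoting crawlers that would not respect it anyway.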

  • On Saturday I was correcting some duplicate titles, including nofollowing tags, etc. (my main problem was duplicate titles due to tags and categories being indexed). Now this morning I see that one of my pages refuses to load, citing a 301 redirect loop: http://www.incredibleinfant.com/feeding/switching-baby-formula/ Originally, the page was posted under the wrong category: http://www.incredibleinfant.com/uncategorized/switching-baby-formula I re-saved it under the correct category (feeding) and now it won't load. Can someone help me figure out how to correct this mess? Thanks so much, Heather

    | Gotmoxie
    0

  • Hi, I'm finding our blog's RSS feed won't be recognized by various websites. The feed is on a https url. Is this the most likely problem? Thanks!

    | OTSEO
    0

  • All of my competitors have high linking root domains from YouTube, but ours isn't showing up even though we have 1.5 million views on YouTube. I tried adding our URL to the videos, but it hasn't been recognized as a linking root domain. What should I do? There's a ton of SEO juice here I want to tap into! watch?v=GTXFRTY4CCA&list=UUOcfF9LAHKedNSyk-gk5xDw&index=28

    | tonymartin
    0

  • Crazy in a bad way! I am hoping that perhaps some of you have experienced this scenario before and can shed some light on what might be happening. Here is what happened: we recently fixed a meta refresh that was on our site's homepage. It was completely fragmenting our link profile: all of our external links were being counted towards one URL, and our internal links were counting for the other URL. In addition to that, our most authoritative URL, because it was subject to a meta refresh, was not passing any of its authority to our other pages. Here is what happened to our link profile:
    Total External Links:  Before - 2,757    After - 4,311
    Total Internal Links:  Before - 125      After - 3,221
    Total Links:           Before - 2,882    After - 7,532
    Yeah... huge change. Great, right? Well, I have been tracking a set of keywords that were ranking in spots 10-30 in Google. There are about 66 keywords in the set. I started tracking them because at MozCon last July Fabio Riccotta suggested that targeting keywords showing up on page 2 or 3 of the results might be easier to improve than terms at the bottom of page 1. So, take a look at this. The first column shows where a particular keyword ranked on 11/8, the second column shows where it is ranking today, and the third column shows the change. For obvious reasons I haven't included the keywords.
    11/8   11/14    Change
    10        44       -34
    10        26       -16
    10        28       -18
    10        34       -24
    10        25       -15
    15        29       -14
    16        33       -17
    16        32       -16
    17        24       -7
    17        53       -36
    17        41       -24
    18        27       -9
    19        42       -23
    19        35       -16
    19        -         Not in top 200
    19        30       -11
    19        25       -6
    19        43       -24
    20        33       -13
    20        41       -21
    20        34       -14
    21        46       -25
    21         -        Not in top 200
    21        33       -12
    21        40       -19
    21        61       -40
    22        46       -24
    22        35       -13
    22        46       -24
    23        51       -28
    23        49       -26
    24        43       -19
    24        47       -23
    24        45       -21
    24        39       -15
    25        45       -20
    25        50       -25
    26        39       -13
    26        118     - 92
    26        30       -4
    26        139     -113
    26        57       -31
    27        48       -21
    27        47       -20
    27        47       -20
    27        45       -18
    27        48       -21
    27        59       -32
    27        55       -28
    27        40       -13
    27        48       -21
    27        51       -24
    27        43       -16
    28        66       -38
    28        49       -21
    28        51       -23
    28        58       -30
    29        58       -29
    29        43       -14
    29        41       -12
    29        49       -20
    29        60       -31
    30        42       -12
    31        -          Not in top 200
    31        59       -28
    31        68       -37
    31        53       -22
    Needless to say, this is exactly the opposite of what I expected to see after fixing the meta refresh problem. I wouldn't think anything of normal fluctuation, but every single one of these keywords moved down, almost consistently 20-25 spots, and the further down a keyword was to begin with, the further it seems to have dropped. What do you make of this? Could Google be penalizing us because our link profile changed so dramatically in a short period of time? I should say that we have never taken part in spammy link-building schemes, nor have we ever been contacted by Google with any kind of suspicious link warnings. We've been online since 1996 and are an e-commerce site doing #RCS. Thanks all!

    | danatanseo
    0

  • Hello, We have a temporary redirect (302?) from http url to https url. Seomoz has suggested that we change the 302 temporary redirect to a permanent 301 redirect. Sounds good.... but how do we do that?! Appreciate any and all feedback!

    | OTSEO
    0
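How you do it depends on what is serving the redirect. Assuming an Apache server with mod_rewrite (a sketch, not specific to this site's actual setup), the usual pattern is:

```apacheconf
# .htaccess: permanent (301) redirect from http to https
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
```

On IIS or nginx the equivalent is a site-level redirect rule; the key in every case is returning status 301 instead of the temporary 302.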

  • I recently started out on SEOmoz and am trying to do some cleanup according to the campaign report I received. One of my biggest gripes is the "Duplicate Page Content" item; right now I have over 200 pages with duplicate page content. This is triggered because SEOmoz has snagged auto-generated links from my site. My site has a "send to friend" feature, and every time someone wants to send an article or a product to a friend via email, a pop-up appears. It seems the pop-up pages have been snagged by the SEOmoz spider; however, these pages are something I would never want indexed in Google, so I just want to get rid of them. Now to my question: I guess the best solution is to make a general rule via robots.txt so that these pages are not indexed or considered by Google at all. But how do I do this? What should my syntax be? A lot of the links look like this, but have different ID numbers according to the product being sent: http://mywebshop.dk/index.php?option=com_redshop&view=send_friend&pid=39&tmpl=component&Itemid=167 I guess I need a rule that makes Google ignore links that contain: view=send_friend

    | teleman
    0

  • Hi, I'm trying to help someone fix the following situation: they had a website, www.domain.com, that was generating a steady amount of traffic for three years. They then redesigned the website a couple of months ago, and the website developer redirected the site to domain.com but did not set up analytics on domain.com. We noticed that there was a drop in traffic to www.domain.com but have no idea if domain.com is generating any traffic since analytics wasn't installed. To fix this situation, I was going to find out from the developer if there was a good reason to redirect the site. What would have prompted the developer to do this if www.domain.com had been used already for three years? Then, unless there was a good reason, I would change the redirect back to what it was before - domain.com redirecting to www.domain.com. Presumably this would allow us to regain the traffic to the site www.domain.com that was lost when the redirect was put in place. Does this sound like a reasonable course of action? Is there anything that I'm missing, or anything else that I should do in this situation? Thanks in advance! Carolina

    | csmm
    0

  • My home page for my site isn't really a home page; not sure how to describe that. We have additional stand-alone pages which we work on and add content to, just not the main page. So I have put my 300 words in a widget on the front page (which, being a widget, actually shows up on all the pages). Is that good for SEO, or should it be in the body of a page? Thanks!

    | greenhornet77
    0

  • Here's a puzzler... Our main domain (www.ides.com) doesn't appear in Google (but does on Bing and other engines). We think this is due to duplicate content, which we're fixing. However, our website's subdomains continue to appear prominently in SERPs, even on Google. Here are some examples: IDES Prospector = prospector.ides.com; IDES = support.ides.com; Cycolac FR15 = catalog.ides.com. Why would Google penalize a main domain but not its subdomains?

    | Prospector-Plastics
    0

  • I have a website which uses individual landing pages to greet traffic for each suburb in our service area. For a long time we had one landing page which covered a region of suburbs, but we've since added individual pages for those suburbs as well. However, Google will always display results for the old regional page, even though the new pages are optimized for searches in that area. Is there anything I can do to get Google to display the pages I want?

    | mark.schad
    0

  • Currently, I have an error on my Moz dashboard indicating there are too many links on one of my pages. That page is the sitemap. It was my understanding that all internal pages should be linked to from the sitemap. Can any mozzers help clarify the best practice here? Thanks, Clayton

    | JorgeUmana
    0

  • I'm working with a gutter installation company, and we're ranking for all the top keywords in Google. The only thing we're not ranking for is the map results for the keyword "gutter ma". Since we're located in Springfield, MA, I think Google considers certain areas around Boston because it's more the center of Massachusetts. What can I do to improve my rankings in Maps for this keyword? I know it won't work with a PO box, since I need to confirm an address. Thanks

    | vladraush99
    0

  • A client has an application area of the site (a directory) that has a form and needs to be secured with SSL. The vast majority of the site is static and does not need to be secured. We have experienced situations where a visitor navigates the site as https, which then throws security errors. We want to keep static visitors (and crawlers) on http, and only have visits to the secure area display as SSL. How is this best accomplished? Our developer wants to add a rule to the global configuration file in PHP that uses a 301 redirect to ensure static pages are accessed as http and the secure directory is accessed as https. Is this the proper protocol? Are there any SEO considerations we should make? Thanks.

    | seagreen
    0
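If the site runs on Apache, the developer's PHP rule could equally live in the server config; this sketch assumes a hypothetical secure directory named /application (substitute the real one):

```apacheconf
RewriteEngine On
# Hypothetical secure directory: force HTTPS for /application only
RewriteCond %{HTTPS} off
RewriteRule ^application(/.*)?$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
# Send every other path back to plain HTTP so static pages
# (and crawlers) stay on a single, consistent scheme
RewriteCond %{HTTPS} on
RewriteCond %{REQUEST_URI} !^/application
RewriteRule ^(.*)$ http://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
```

Using 301s here consolidates signals onto one scheme per URL, which addresses the SEO concern about the same page being indexed under both http and https.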

  • Hi, our domain has expired, and it could take up to 48h to recover our website. Apart from the obvious image damage, it worries me that Google will just think we have vanished. Any recommendations? Maybe update something in Webmaster Tools? Not having the domain, we cannot even do a temporary redirect, etc... Thanks! Jaime

    | BaseKit
    0

  • We are having an issue with getting Google to rank the page we want. We want this page http://www.jakewilson.com/c/52/-/346/Cruiser-Motorcycle-Tires to rank for the keyword "cruiser motorcycle tires"; however, this page http://www.jakewilson.com/t/52/-/343/752/Cruiser-Motorcycle-Tires is ranking instead, even though it has fewer links and less page authority according to Site Explorer and is farther down in the hierarchy. I am wondering if Google just likes pages that have actual products on them instead of a page leading to the page with all the products. Thoughts?

    | DoRM
    0

  • Hi everyone, I have a question about multilingual blogs and site structure. Right now, we have the typical subfolder localization structure, e.g. domain.com/page (English site) and domain.com/ja/page (Japanese site). However, the blog is slightly more complicated. We'd like to have English posts available in other languages (as many of our users are bilingual). The current structure suggests we use a typical domain.com/blog or domain.com/ja/blog format, but we have issues if a Japanese (logged-in) user wants to view an English page: domain.com/blog/article would redirect them to domain.com/ja/blog/article, thus 404-ing the user if the post doesn't exist in the alternate language. One suggestion (that I have seen on sites such as Etsy/Spotify) is to add an /en/ to the blog area, e.g. domain.com/en/blog and domain.com/ja/blog. Would this be the correct way to avoid this issue? I know we could technically work around the 404 issue, but I don't want to create duplicate posts in /ja/ that are in English or vice versa. Would it affect the rest of the site if we use an /en/ subfolder just for the blog? Another option is to use domain.com/blog/en and domain.com/blog/ja, but I'm not sure if this alternative is better. Any help would be appreciated!

    | Seiyav
    0

  • Hi all, I am looking for websites with keywords in the domain, and I am using: inurl:keyword/s The results that come back include sub-pages, not only domains with the keywords in the root domain. Example of what I mean: www.website.com/keyword/ What I want displayed only: www.keyword/s.com Does anyone know of a search command I can use to display URLs with keywords in the root domain only? Thanks in advance, Greg

    | AndreVanKets
    0

  • What is the latest on what Google is looking for? Keyword one, Keyword two? Sentences with the Keyword in them?

    | netviper
    0

  • Hi, I want to optimize my job portal for maximum search traffic. Problems:
    Duplicate content - The portal takes jobs from other portals/blogs and posts them on our site. Sometimes employers provide the same job posting to multiple portals and we are not allowed to change it, resulting in duplicate content.
    Empty content pages - We have a lot of pages which can be reached via filtering for multiple options, like IT jobs in New York. If there are no IT jobs posted in New York, then it's a blank page with little or no content.
    Repeated content - When we have job postings, we have the about-the-company information on each job listing page. If a company has 1000 jobs listed with us, that means 1000 pages have the exact same about-the-company wording.
    Solutions implemented:
    Rel=prev and next - We have implemented this for pagination. We also have self-referencing canonical tags on each page. Even if pages are filtered with additional parameters, our system strips off the parameters and always shows the correct URL, for both rel=prev/next and the self-referencing canonical tags.
    For duplicate content - Due to the volume of job listings that come in each day, it's impossible to create unique content for each. We try to make the initial paragraph (at least 130 characters) unique. However, we use a template system for each job, so a similar pattern can be detected after even 10 or 15 jobs. Sometimes we also take the wordy job descriptions and convert them into bullet points; if bullet points are already available, we take only a few and try to re-shuffle them.
    Can anyone provide additional pointers to improve my site in terms of on-page/technical SEO? Any help would be much appreciated. We are also thinking of no-indexing or deleting old jobs once they cross X number of days. Do you think this would be a smart strategy? Should I no-index empty listing pages as well? Thank you.

    | jombay
    3
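On the empty-listing question: a common approach (a sketch, not specific to this portal's platform) is to emit a robots meta tag on any filter combination that returns zero jobs, so the thin page stays out of the index while its links are still crawled:

```html
<!-- Rendered only when a filter page (e.g. "IT jobs in New York")
     has zero results; removed again once matching jobs exist -->
<meta name="robots" content="noindex, follow">
```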

  • Hi, I am just checking the errors on my site and it is telling me about duplicate pagination results, so I am wondering if pagination is bad for SEO. For example: http://www.in2town.co.uk/benidorm/benidorm-news/Page-2 I also have page 3 and page 4. Should I stop my site from having this, to help with SEO?

    | ClaireH-184886
    0

  • Hi, I receive around ten emails a day claiming they can help get your site into the top ten in Google. Now, I know most are a load of rubbish, but I am just wondering if anyone has used any of these companies for a new site or an old site. I am about to launch a new site after Xmas and am wondering if any of these companies are worth looking at to help promote the new site, instead of doing all the groundwork myself. Would love to know your thoughts.

    | ClaireH-184886
    0

  • I recently started working here and I have noticed that Google is ranking some pages over others for the main keyword. Example: we are ranking on page one for "ATV tires" with this URL http://www.rockymountainatvmc.com/t/43/81/165/723/ATV-Tires-All I thought Google would pick http://www.rockymountainatvmc.com/c/43/81/165/ATV-Tires since it is higher up in the folder structure. I have a couple of guesses why they are picking the other one, mostly link signals from one other site and a footer link. Any other thoughts? If we want Google to rank the second URL instead, what would you suggest?

    | DoRM
    0

  • I have a question regarding ampersands, we are needing to redirect to a URL w/ an ampersand in the URL: http://local.sfgate.com/b18915250/Sam-&-Associates-Insurance-Agency Will Google pass page authority/juice despite the fact that there is an ampersand in the URL, if we were to 301 redirect or cross-domain canonical to the url? Should we 301 redirect to http://local.sfgate.com/b18915250/Sam-%26-Associates-Insurance-Agency instead of http://local.sfgate.com/b18915250/Sam-&-Associates-Insurance-Agency? I don't have the option of removing the ampersand Thank you for your time!

    | Gatelist
    0
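Google treats `&` and `%26` in a path segment as the same character once decoded, so either target should carry the redirect's value. A quick sanity check of the encoding (a sketch in Python's standard library, nothing from the question's actual stack):

```python
from urllib.parse import quote

# Percent-encode the ampersand in the path segment from the question;
# all other characters in the slug are already URL-safe.
raw = "Sam-&-Associates-Insurance-Agency"
encoded = quote(raw, safe="-")
print(encoded)  # Sam-%26-Associates-Insurance-Agency
```

Redirecting to the pre-encoded form simply avoids relying on every intermediary to normalize the raw `&` the same way.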

  • Hello everyone, I have the following link: http://mywebshop.dk/index.php?option=com_redshop&view=send_friend&pid=39&tmpl=component&Itemid=167 I want to prevent Google from indexing everything that is related to "view=send_friend". The problem is that it's giving me duplicate content, and the content of the links has no SEO value of any sort. My problem is how to disallow it correctly via robots.txt. I tried this syntax: Disallow: /view=send_friend/ However, after a crawl on request, the 200+ duplicate links that contain view=send_friend are still present in the CSV crawl report. What is the correct syntax if I want to prevent Google from indexing everything related to this kind of link?

    | teleman
    0
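The attempted pattern fails because robots.txt rules match from the start of the URL path, and view=send_friend sits in the query string, not in a directory. A wildcard rule (supported by Google and Bing, though not part of the original robots.txt standard) would be:

```
User-agent: *
# * matches the path and query string before the parameter, so this
# blocks index.php?option=...&view=send_friend&... for any product id
Disallow: /*view=send_friend
```

Note that this blocks crawling; URLs already indexed may additionally need a noindex or a removal request to drop out of the index.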

  • Thanks in advance for any responses; we really appreciate the expertise of the SEOmoz community! My question: since the file extensions are different, can a site have both a /sitemap.xml and a /sitemap.html sitting at the root domain? For example, we've already put the HTML sitemap in place here: https://www.pioneermilitaryloans.com/sitemap Now we're considering adding an XML sitemap. I know standard practice is to load it at the root (www.example.com/sitemap.xml), but am wondering if this will cause conflicts. I've been unable to find this topic addressed anywhere, or any real-life examples of sites currently doing this. What do you think?

    | PioneerServices
    0

  • Firstly, this is for an .asp site, and all my usual ways of fixing this (e.g. via htaccess) don't seem to work. I'm working on a site which has www.home.com and www.home.com/index.html; both URLs resolve to the same page/content. If I simply drop a rel canonical into the page, will this solve my dupe content woes? The canonical tag would then appear in both the www.home.com and www.home.com/index.html cases. If the above is OK, which version should I be going with? - or - Thanks in advance folks,
    James @ Creatomatic

    | Creatomatic
    0
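For reference, the tag itself is a single line in the head of both versions; which href to choose is the open question, but the extensionless root is the common pick (a sketch using the question's example domain):

```html
<!-- Same tag in the <head> of both www.home.com and
     www.home.com/index.html, pointing at the chosen canonical -->
<link rel="canonical" href="http://www.home.com/">
```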


  • Hi, my site www.in2town.co.uk is currently number five in Google for the search term "lifestyle magazine"; sometimes it moves to four, but for over a year it has not got past four. Before we had to rebuild the site from scratch due to a major problem upgrading, we were number one in the search engines and our traffic was around 30% higher than it is now. For the keyword "lifestyle news", we are on the fifth page of Google and would really like to improve this. I would like to know what I need to do on our home page to try and improve our rankings for these two terms; the most important one for us is "lifestyle news". Any help with my goal to improve our rankings would be great. We have improved our design, which we are still working on, and we have upgraded to a bigger dedicated server to improve speed.

    | ClaireH-184886
    0

  • According to Google Webmaster Tools, my landing pages just dropped from 1,300 impressions two days ago to zero for the past two days. I have attached a snippet of the graph; the URL of the website is http://www.cheapcentralheating.co.uk - I have no idea what's happened here, and if anyone can advise or help I would be extremely grateful. landing_pages.jpg

    | nicklemonpromotions
    0

  • The website is medicare.md. If you search for the term "medicare doctors PG county maryland", it is #1 in Bing and Yahoo but not even showing in Google's first TEN pages, although not banned. Interestingly, if you do that search on google.co.pk it is #4. Quite puzzling! Would appreciate any help or advice. Sherif Hassan

    | sherohass
    0

  • This is just a little frustrating question, nothing important, but I'm sure somebody will know the answer. In the Whiteboard Friday this week, Rand suggested at one point that when you're searching for links to your website, if you add a - before site, like -site:yourwebsite.com, you get results of pages with links on other websites but excluding your own webpages. But it just doesn't work: I get no results, just an error message. Any idea why? If I remove the -, I get tons of results, but they're on my own webpages.

    | whitbycottages
    0

  • I just signed up for SEOmoz Pro for my site. The initial report came back with 700+ duplicate content pages. My problem is that while I can see why some of the content is duplicated on some of the pages, I have no idea why the rest is coming back as duplicated. Is there a tutorial for a novice on how to read the duplicate content report and what steps to take? It's an e-commerce website and there is some repetitive content on all the product pages, like our "satisfaction guaranteed" text and the fabric material, and not much other text. There's not a unique product description, because an image speaks for itself. Could this be causing the problem? I have lots of URLs with 50+ duplicates. Thanks for any help.

    | Santaur
    0

  • I am working on a site that will be published in the original English, with localized versions in French, Spanish, Japanese and Chinese. All the versions will use the English information architecture. As part of the process, we will be translating the page titles and page descriptions. Translation quality will be outstanding; the client is a translation company. Each version will get at least four pairs of eyes, including expert translators, editors, QA experts and proofreaders. My question is what special SEO instructions should be issued to translators re: the page titles and page descriptions. (We have to presume the translators know nothing about SEO.) I was thinking of: stick to the character counts for titles and descriptions; make sure the title and description work together; avoid over-repetition of keywords in page titles (over-optimization peril); think of the descriptions as marketing copy; try to repeat some title phrases in the description (to get the bolding and promote click-through). That's the micro stuff. The macro stuff: we haven't done extensive keyword research for the other languages. Most of the clients are in the US; the other language versions are more a demo of translation ability than a play for clients elsewhere. Are we missing something big here?

    | DanielFreedman
    0

  • Hello everyone! Newbie to SEO here, and I have been trying to keep everything nice and ethical, but I've seen on a couple of blogs today "incoming search terms" at the bottom of the posts, with a bullet-pointed list of search terms beneath. So I had a quick search about its use and noticed WordPress has a plugin that automatically generates these "incoming search terms". I ask: is this a legitimate plugin, or will it harm my blog? I assume it generally will, as I can't see it being much use to the audience; rather, it would be 100% for trying to lure in search engines.

    | acecream
    0

  • I would like to know what off-page SEO and on-page SEO improvements can be made to one of our client websites http://www.nd-center.com Best regards,

    | fkdpl242
    0

  • Hello, this is the first time I have posted on this forum, but I have been a Pro member for about 11 months. I'm going crazy: the more I do, the more it drops positions. My problem is that one of the sites I'm working on has not been in the top 50 for any of its keywords. There were many issues, but I have reduced the number. I'm not sure if I can post the link here or via PM. My market is very competitive and I'm using WordPress. One of my target keywords is "web design miami". I would like a member to give me an opinion of my site and perhaps tell me what I'm doing wrong. Thanks in advance.

    | ogdcorp
    0

  • OK so yesterday a website agreed to publish my RSS feed and I just wanted to check something. The site in question is far more established than mine and I am worrying that with my content appearing on their website pretty much at the same time as mine, will Google index theirs first and therefore consider mine to be dupe? They are linking back to each of my articles with the text "original post" and I'm not sure whether this will help. Thanks in advance for any responses!

    | marcoose81
    0

  • We have a .com domain into which we are 301-ing the .co.uk site before shutting it down - the client no longer has an office in the UK and wants to focus on the .com. The .com is a nice domain with good trust indicators. I've just redesigned the site, added a wad of healthy structured markup, and had the duplicate content mostly rewritten - still finishing off this job, but I think we caught most of it with Copyscape. The site doesn't have many backlinks yet, but we're working on this too, and the ones it does have are natural, varied and from trustworthy sites. We also have a little feature on the redesign coming up in .Net magazine early next year, so that will help. The .co.uk, on the other hand, has a fair few backlinks - 1,489 showing in Open Site Explorer - and I spent a good amount of time matching the .co.uk pages to similar content on the .com so that the redirects would hopefully pass some PageRank. However, approximately a year later, we are struggling to grow organic traffic to the .com site. It feels like we are driving with the handbrake on. I did some research into the backlink profile of the .co.uk, and it is mostly made up of article submissions: a few on 'quality' (not in my opinion) article sites such as ezine, and the majority on godawful, broken, spammy article sites and old blogs bought for SEO purposes. So my question is: given that the SEO company that 'built' these shoddy links will not reply to my questions about whether they received a penalty notification or noticed a Penguin penalty, and given that they have also deleted the Google Analytics profiles for the site, how should I proceed? **To my mind I have 3 options:**
    1. Ignore the bad majority in the .co.uk backlink profile, keep up the change of address and 301s, and hope we can drown out the shoddy links by building new quality ones to the .com, letting the crufty links fade into insignificance over time. I'm not too keen on this course of action.
    2. Use the disavow tool for every suspect link pointing to the .co.uk site (there's no way I'll be able to get the links removed manually). The advice I've seen also suggests submitting a reconsideration request afterwards, but that seems pointless considering we are just 301-ing to the new (.com) site.
    3. Disassociate ourselves completely from the .co.uk site: forget about the few quality links to it, cut our losses, remove the change of address request in GWT, and possibly take the site down altogether and return 410 headers for it just to force the issue. Clean slate in the post.
    What say you, mozzers? Please help - I'm working myself blue in the face to fix the organic traffic issues for this client and not getting very far as yet.
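    For reference, the 301 and 410 options described above might look roughly like this in an Apache .htaccess file on the .co.uk host (a sketch only, assuming Apache with mod_rewrite; example.co.uk and example.com are stand-in domains, not the client's actual ones):

    ```apache
    # Options 1/2: keep 301-ing every .co.uk URL to its .com equivalent
    RewriteEngine On
    RewriteCond %{HTTP_HOST} ^(www\.)?example\.co\.uk$ [NC]
    RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]

    # Option 3: cut the .co.uk loose instead by answering 410 Gone
    # for everything (the [G] flag sends 410):
    # RewriteRule ^ - [G,L]
    ```

    Note the two rules are mutually exclusive - you either forward the old site's equity with 301s or disown it with 410s, not both.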

    | LukeHardiman
    0

  • Hi, We are currently redesigning our gaming website (www.totallygn.com), and one of our main goals is to get listed by Google News in future. Looking at the Google News URL requirements: "The URL for each article must contain a unique number consisting of at least three digits." How does this affect URL structure for SEO? I was planning on using a format such as www.totallygn.com/xbox-360/360-reviews/fifa-12-review - how would this compare to something like www.totallygn.com/xbox-360/360-reviews/fifa-12-review234? Thanks in advance for your help
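    To make the requirement concrete, here is a minimal sketch of a URL builder that appends a numeric article ID to the slug (the function name, ID value, and hyphen separator are illustrative assumptions, not anything Google or the site mandates):

    ```python
    def google_news_url(base, category, slug, article_id):
        """Build an article URL whose path contains a unique number of
        at least three digits, per the Google News URL requirement."""
        if article_id < 100:
            raise ValueError("Google News expects a number of at least three digits")
        return f"{base}/{category}/{slug}-{article_id}"

    # Hypothetical article ID for illustration:
    url = google_news_url("http://www.totallygn.com", "xbox-360/360-reviews",
                          "fifa-12-review", 234)
    print(url)  # http://www.totallygn.com/xbox-360/360-reviews/fifa-12-review-234
    ```

    A hyphen before the number keeps the slug readable while still satisfying the three-digit rule.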

    | WalesDragon
    0

  • Hi All, I have a page that ranks well for the keyword "refurbished Xbox 360".  The ranking page is an eCommerce product details page for a particular XBOX 360 system that we do not currently have in stock (currently, we do not remove a product details page from the website, even if it sells out - as we bring similar items into inventory, e.g. more XBOX 360s, new additional pages are created for them).  Long story short, given this way of doing things, we have now accumulated 79 "refurbished XBOX 360" product details pages across the website that currently, or at some point in time, reflected some version of a refurbished XBOX 360 in our inventory. From an SEO standpoint, it's clear that we have a serious duplicate content problem with all of these nearly identical XBOX 360 product pages.  Management is beginning to question why our latest, in-stock, XBOX 360 product pages aren't ranking and why this stale, out-of-stock, XBOX 360 product page still is.  We are in obvious need of a better process for retiring old, irrelevant product content and eliminating duplicate content, but the question remains: how exactly is Google choosing to rank this one versus the others, since they are primarily duplicate pages?  Has Google simply determined this one to be the original?  What would be the best practice approach to solving a problem like this from an SEO standpoint - 301 redirect all out-of-stock pages to in-stock pages, or remove the irrelevant pages? Any thoughts or recommendations would be greatly appreciated. Justin
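    The 301 approach floated at the end of the question could be sketched per retired page in Apache config like this (hypothetical paths, assuming mod_alias is available; not the site's actual URLs):

    ```apache
    # Retire an out-of-stock product page by 301-ing it to the
    # current in-stock equivalent, consolidating the duplicates
    Redirect 301 /products/refurbished-xbox-360-old /products/refurbished-xbox-360
    ```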

    | JustinGeeks
    0

  • Hi, I was using the On-Page Report Card Tool here on SEOMOZ for the following page: http://www.priceline.com/eventi-a-kimpton-hotel-new-york-city-new-york-ny-1614979-hd.hotel-reviews-hotel-guides and it claims there is a canonical issue or improper use of the canonical tag. I looked at the element and it seems to be fine: <link rel="canonical" href="http://www.priceline.com/eventi-a-kimpton-hotel-new-york-city-new-york-ny-1614979-hd.hotel-reviews-hotel-guides" /> Can you spot the issue and how to fix it? Thanks. Eddy

    | workathomecareers
    0

  • I have multiple contributors who provide content on our site. I have created an authors page that shows the picture and bio of each author, along with a link to their Google+ profile. Each profile link goes to the author's respective profile, where I have had them verify themselves as contributors. My question is: will Google see each of these authors and attribute the rel=author tag correctly (even though they are all listed on the same authors page), or will Google only take the first person I point to for rel=author?
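    For illustration, an authors page like the one described might mark up each contributor as follows (hypothetical name and profile ID; a sketch, assuming each entry carries its own rel="author" link):

    ```html
    <!-- One entry per contributor; each link carries its own
         rel="author" pointing at that person's Google+ profile -->
    <div class="author-bio">
      <h3>Jane Doe</h3>
      <p>Jane writes our hardware reviews.</p>
      <a rel="author" href="https://plus.google.com/112233445566778899001">
        Jane Doe on Google+
      </a>
    </div>
    ```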

    | PLEsearch
    0
