
Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Intermediate & Advanced SEO

Looking to level up your SEO techniques? Chat through more advanced approaches.


  • I have been hearing that the latest algorithmic changes address "Over-Optimization", which sounds very counter-productive for businesses. Why would Google hurt the companies that have a narrow market or product offering? Regarding over-optimization, does SEOmoz have a tool to help address those issues? We saw a strange decrease in our "LMS" keyword, which is the core keyword for http://interactyx.com.  I am trying to figure out if we are over-optimized. Can anyone provide any suggestions and tools I can use to make sure my site now matches what Google wants? Thanks!

    | TOPYX
    0

  • Hi, I have a restaurant menu directory listing website (for example www.menus.com). Restaurants can have their menu listed on the site along with other details such as opening hours, photos etc. An example of a restaurant URL might be www.menus.com/london/bobs-pizza. A feature I would like to offer is the ability for Bob's Pizza to use the menus.com listing as his own website (let's assume he has no website currently). I would like to purchase www.bobspizza.com and 301 redirect it to www.menus.com/london/bobs-pizza. Why?
    So Bob can then list bobspizza.com on his advertising material (business cards etc.) rather than www.menus.com/london/bobs-pizza. I was considering using a 301 redirect for this (see the sketch below this question), though I have been told that too many domain-level redirects to one single domain can be flagged as spam by Google. Is there any other way to achieve this outcome without being penalised? Rel canonical URL, URL masking? Other things to note: it is fine if www.bobspizza.com is NOT listed in search results. I would ideally like any link juice pointing to www.bobspizza.com to pass on to www.menus.com, though this is a nice-to-have. If it comes at the cost of being penalised I can live without the link juice from this. Thanks

    | blackrails
    0
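    A minimal .htaccess sketch of the redirect described above, using the example domains from the question (bobspizza.com and the menus.com listing URL); it only illustrates the mechanics and does not settle the spam-filter concern:

      # In the .htaccess of the host that answers for bobspizza.com:
      # send every request on the vanity domain to the single listing page.
      RewriteEngine On
      RewriteCond %{HTTP_HOST} ^(www\.)?bobspizza\.com$ [NC]
      RewriteRule ^ http://www.menus.com/london/bobs-pizza [R=301,L]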

  • I already used the validation tool from Google. It tells me the markup is implemented, but I would like to know specifically whether everything is implemented correctly and in a non-spammy way. Here is a product link to test: http://www.suddora.com/pink-sweatbands-wholesale-pink-wristbands.html

    | Hyrule
    1

  • Hello, My company has four websites in the same vertical and we're planning to integrate them all into our main company site. So instead of www.siteone.com, www.sitetwo.com, www.sitethree.com, etc., it would be www.branddomain.com/site-one, www.branddomain.com/site-two, etc. I have a few questions... Should we redirect the old domains to the new directories, or leave the old domains up, stop updating them with new content, and then have the old content, links, etc. 301 to the same content on the new site (see the sketch below this question)? Should we literally move all of the content to the new directories? Any tips are appreciated. It's probably pretty obvious that I don't have a ton of technical skills... my development team will be doing the heavy lifting. I just want to be sure we do this correctly from an SEO perspective! Thanks for the help, please let me know if I can clarify anything. E

    | essdee
    0
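    A hedged .htaccess sketch of the domain-to-directory option, assuming Apache and that old paths can map one-to-one under the new directory (siteone.com and branddomain.com are the placeholder names from the question):

      # .htaccess on the old siteone.com host: 301 every old URL to the matching
      # path under the new /site-one/ directory, preserving the request path.
      RewriteEngine On
      RewriteCond %{HTTP_HOST} ^(www\.)?siteone\.com$ [NC]
      RewriteRule ^(.*)$ http://www.branddomain.com/site-one/$1 [R=301,L]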

  • I manage two sites that share some content. Currently we do not use a cross-domain canonical URL and allow both sites to be fully indexed. For business reasons, we want both sites to appear in results and need both to accumulate PR and other SEO/Social metrics. How can I manage the threat of duplicate content and still make sure business needs are met?

    | BostonWright
    0

  • I was looking at which websites rank in the TOP 3 for the keyword "computers"... I noticed that first is Wikipedia and then there are Dell and Apple... I then did an on-page report card and I noticed that Wikipedia has a grade A (which is great). However, Apple has an F (which sucks!!) but they still rank up there. My question is: why is Apple ranking for the keyword "computers" with no title, no URL, no H1, no body, no B/Strong... when Wikipedia has all of that and the term "computers" occurs 290 times on its page? Is it due to the fact that Apple has millions of external links, and is that enough to rank even with an "irrelevant" page? By the way, I have noticed the same on other keywords such as "bicycle": Wikipedia is ranking 1st and then sites like www.trekbikes.com are up there, but they shouldn't be based on their homepage "optimization". I know there are other factors, but I am just trying to figure out why such sites (like Apple or Trek Bikes) rank up there. Thank you,

    | seoanalytics
    0

  • See the question I've posted here: https://productforums.google.com/forum/#!msg/webmasters/Wz_pAz7_lk8/jR8DvSyn5T4JHi
    We've submitted 2 reconsideration requests in a month and Google is not replying to us. They've caused a huge loss in business for us over links that are now against Google's guidelines. I checked our links with the tools provided in Webmaster Tools and I can see some that are against Google's guidelines and have recently been spidered; however, these links were built in 2008-09, long before the updates where this type of link would be classed as spammy. Is there anyone I can contact to speak with, as clearly Google are too big to care any more or to respond to such requests? http://www.cyberhostpro.com The response I've got is:
    On Tuesday, October 9, 2012 10:41:33 AM UTC-3, cyberhostpro wrote: "however these links were built in 2008-09, long before the panda updates". When we all rode horses to work there was no need for speed limits on the highways and byways... but then things change. In 2008-09 those links may have been worth something; why do you believe they deserve that value today? Get rid of all those 2008-09 links and you should be OK!
    If you could offer any help or advice with this it would be most appreciated. Regards, Daniel

    | seogeek11
    0

  • I have a client that has multiple apartment complexes in different states and metro areas. They get good traffic and pretty good conversions, but the site needs a lot of updating, including the architecture, to implement SEO standards. Right now they rank for "<brand name> apartments" everywhere but not "<city name> apartments". Their current architecture displays their URLs like: http://www.<client_apartments>.com/index.php?mainLevelCurrent=communities&communityID=28&secLevelCurrent=overview and http://www.<client_apartments>.com/index.php?mainLevelCurrent=communities&communityID=28&secLevelCurrent=floorplans&floorPlanID=121 I know it is said to never change the URL structure, but what about this site? I see this URL structure being bad for SEO, bad for users, and it basically forces us to keep the current architecture. They don't have many links built to their community pages, so will creating a new URL structure and doing 301 redirects to the new URLs drastically drop rankings? Is this something we should bite the bullet on now for future rankings, traffic, and a better architecture?

    | JaredDetroit
    0

  • Hi there, I have some good links pointing to one of my web pages at the moment; however, we are just about to launch a new design with a new URL structure, and I am clear that I need to do a 301 redirect from the old URL to the new URL (see the sketch below this question). However, do I keep the old URL live forever, or can I remove it after a while? Kind regards

    | Paul78
    0
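    For reference, a single-URL 301 in Apache is one line; the paths below are hypothetical. It is the redirect rule, rather than the old page itself, that needs to stay in place for as long as links still point at the old address:

      # mod_alias: permanently send the old address to its replacement.
      Redirect 301 /old-page.html http://www.example.com/new-structure/new-page/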

  • My question is about paywalls made with cookies and local storage. We are changing a website with free content to an open paywall with a 5-article weekly view limit. The paywall is made to work with cookies and local storage. The article views are stored in local storage, but you have to have cookies enabled to read the free articles. If you don't have cookies enabled, we would serve an error page (otherwise the paywall would be easy to bypass). Can you say how this affects SEO? We would still like Google to index all the article pages that it does now. Would it be cloaking if we treated Googlebot differently, so that even though it does not have cookies enabled it would still be able to index the page?

    | OPU
    1

  • I have a page that is ranking quite well (page 1) for the plural of a keyword, but it is only ranking on page 3 for the singular keyword. For more than one year I have been working on on-page and off-page optimization to improve the ranking for the singular term, without success. Google is treating the two terms almost the same: when you search for term one, term two is also marked in bold and the results are very similar. The big difference between the two terms, in my opinion, is that one is more of an informational search and the other is more of a transactional search. Now I would be curious to know which factors Google could use to understand whether a search and a website are more transactional or informational. Apart from mentioning "Buy now", "Shop", "Special offer", etc. Any ideas?

    | SimCaffe
    0

  • Hi everyone, I am currently working on the website of a friend who owns a French spa treatment company. I have been working on it for the past 6 months, mostly on optimizing the page titles and on link building. So far the results are great in terms of normal results: if you type most of the keywords plus the city name, the website is very well positioned, if not top positioned. My only problem is that in the local results (Google Maps), nothing has improved at all. For most of the keywords where the website is ranking 1st in normal results, the website doesn't appear at all in the local results. This is confusing, as you would think Google considers the website relevant to the subject according to the normal results, yet it doesn't show it anywhere good in a local sense. The website is clearly located in the city (thanks to the page titles, and there's a Google Map on a specific page dedicated to its location). The company has a Google Places page and it has had positive customer reviews on different trusted websites for more than a year now (the website is 2 years old). I have focused my link-building work on local websites (directories and specialized websites) for the past 2 months. The results kept improving in the normal results, but still no improvement at all in the local ones. As far as I know, there are no mistakes such as multiple addresses for the same business, etc. Everything seems to be done by the rules. I am not sure at all what more I can do. The competitors do not seem to be working on their SEO much, and in terms of linking (according to the -pretty good- Seomoz tools), they have up to 10 times fewer (good) links than us. Maybe you guys have some advice on how I can manage this situation? I'm kind of lost here 😞 Thanks a lot for your help, appreciate it. Cheers,
    Raphael

    | Pureshore
    0

  • Working on SEO for a client who is at a new address. Luckily(?) there are no Google reviews attached to the old addresses (there are 2). Should I try to update one of the old listings, or should I just go ahead and create a new listing? What are the pros and cons of each decision?

    | greenhornet77
    0

  • Dear fellow Mozzers, for one of my clients I get different local results in Google. My client is a real-estate broker, and when I search on "real-estate agent" + the city name we are on top. So "whoohoo" you would say, BUT when Firefox has determined the exact city name as the location I am in and I only use "real-estate agent", I also get the local results but we are listed as number 8?? Hope anyone can give me insights, as I have no idea what's causing this. Thanks in advance for your help!

    | newtraffic
    0

  • I'd appreciate feedback on a situation. We're going through a major overhaul in how we globally manage our websites. Regional servers were part of our original plan (one in Chicago, the UK, and APAC), but we've identified a number of issues with this approach. Although it's considered a best practice by many, the challenges we'd face doing it are considerable (added complexity, and added steps and delays to updating sites, among others). So we shifted our plan and are now looking at hosting here in the US, but using Akamai to deliver images and other heavier data pieces from their local servers (in the UK, etc.). This is how many of the larger companies like Amazon deliver their global websites. We hope that using Akamai will allow us to have good performance while simplifying our process. Any warning signs we should be aware of? Is anyone doing it this way who has had a good or bad experience?

    | josh-riley
    0

  • Hey MozFolk, I was wondering what the best and SAFEST way to handle this situation is. I am doing a redesign of our current website, but the new site will have different content. Should we just forward the entire root domain in the .htaccess file, or redirect each and every URL using a 301 (see the sketch below this question)? I know the terms but have never actually done it myself, and I cannot risk losing the SEO weight of this website. Also, how do I handle a group of pages that we don't want to continue to use? Do I just leave those URLs be, or do I forward all of them to one new page (or the homepage) on the new site? Please help me look like a rockstar and save the ship from sinking itself!

    | DerekM88
    0
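    A rough .htaccess sketch of the two options raised above, with hypothetical paths; mapping page to page where an equivalent exists is generally considered preferable to a blanket redirect to the homepage:

      # Option 1: page-to-page 301s for URLs that have a direct equivalent on the new site.
      Redirect 301 /old-services.html /services/
      Redirect 301 /old-about-us.html /about/

      # Option 2: a retired section with no replacement, sent somewhere sensible
      # (a relevant category page, or failing that the homepage).
      RedirectMatch 301 ^/discontinued-section/ /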

  • Hi, I was speaking to a client today and got asked how damaging two-way links are, i.e. domaina.com links to domainb.com and domainb.com links back to domaina.com. I need a nice, simple layman's explanation of if/how damaging they are compared to one-way links. And please don't answer with "you lose link juice", as I have enough of a job explaining link juice... I am explaining things to a non-techie! Thank you!!

    | JohnW-UK
    0

  • Would re-skinning and duplicating an existing ecommerce website with a new domain name cause any ranking issues? The plan would be that all product data, pricing info etc. would be identical; the site would have a minor redesign to change colours, logos etc., and all duplicate content would be rel=canonicaled to the original site. In case you are wondering, the reason for this is that a customer with an existing site wants to try out a new brand without incurring massive development costs. The majority of traffic would be driven through Google Shopping, a bit of PPC, social etc. Is this site duplication likely to harm the original site, or is setting up rel=canonical to point to the original site sufficient to prevent this happening? Is there anything else I should consider? Many thanks for your help

    | JustinTaylor88
    0

  • My site was penalized for specific pages in the UK on July 28 (corresponding with a Panda update). I cleaned up my website and wrote to Google, and they responded that "no manual spam actions had been taken". The only other thing I can think of is that we suffered an automatic penalty. I am having problems with my sitemap and it is indexing many error pages, empty pages, etc... According to our index status we have 2,679,794 not-selected pages and 36,168 total indexed. Could this have been what caused the problem? (If you have any articles to back up your answers, that would be greatly appreciated.) Thanks!

    | theLotter
    0

  • Hi guys, After moving from an old static .htm site to WordPress, I 301'd all the old .htm URLs fine to the new trailing-slash, folder-style /wordpress-urls/ in .htaccess, no problem. But Google Webmaster Tools tells me I still have hundreds of external links pointing to a similar version of the old URLs (but without the .htm), giving lots of not-founds and 403s. Example of the URLs linked to that 403/not found: http://www.mydomain.com/filename So I'm wondering how I do a 301 redirect from a non-existing URL that also has no file extension as above and is not like a folder (see the sketch below this question)? This seems like a lot of possible external link juice to lose. Thanks!

    | emerald
    0
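    One way to catch those extensionless legacy URLs is an explicit mapping in .htaccess, placed above the standard WordPress rewrite block; a sketch with hypothetical path names, assuming each old URL has a known new permalink:

      RewriteEngine On
      # Explicit one-to-one mappings for old links that have no file extension.
      RewriteRule ^filename$ /new-section/filename/ [R=301,L]
      RewriteRule ^another-old-name$ /another-section/another-old-name/ [R=301,L]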

  • When it's detected that a mobile device is accessing the site, it has the ability to redirect from www.example.com to m.example.com. Does it make more sense to employ a 301 or a 302 redirect here (see the sketch below this question)? Google says a 301 but does not explain why (although usually I stick to "when in doubt, 301"). It seems like a 302 would prevent passing link juice to the mobile site and prevent mobile-optimized results also showing up in Google's index. What is the preference here?

    | SEOTGT
    0
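    Mechanically, either status code is the same one-line change in Apache. A hedged sketch with a deliberately crude User-Agent test (real mobile-detection patterns are much longer); it illustrates the redirect itself, not which status code is the right policy:

      RewriteEngine On
      # Very rough mobile detection by User-Agent; swap R=301 for R=302 to make it temporary.
      RewriteCond %{HTTP_HOST} ^www\.example\.com$ [NC]
      RewriteCond %{HTTP_USER_AGENT} (iphone|ipod|android|blackberry|mobile) [NC]
      RewriteRule ^(.*)$ http://m.example.com/$1 [R=301,L]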

  • I was wondering: what is the easiest way to find all of a website's existing SERPs?

    | McTaggart
    0

  • Wow, after being hit with Panda I'm having a real tough time with this issue. Maybe I'm going about it the wrong way... How can I possibly write unique content for all of these different colors of the same product? http://www.suddora.com/green-sweatbands-wholesale-green-wristbands.html http://www.suddora.com/pink-sweatbands-wholesale-pink-wristbands.html http://www.suddora.com/black-sweatbands-wholesale-black-wristbands.html http://www.suddora.com/green-headbands-wholesale-pricing-available.html http://www.suddora.com/pink-headbands-wholesale-pricing-available.html http://www.suddora.com/black-headbands-wholesale-pricing-available.html Should I be going about this a different way? Thanks, Paul

    | Hyrule
    0

  • Recently noticed that the domain authority for Twitter is shown as 8 instead of the 100 it used to be. Is this a glitch or an actual change?

    | casper434
    0

  • I have several clients who have old addresses scattered around the search engines. Correcting them all individually is extremely time-consuming, and I'm looking into services like Yext and Localeze, but they tend to be fairly pricey. Does Localeze actually end up correcting most of the random listings? Are there any other services I should be aware of? Thanks! Tom

    | TomBristol
    0

  • I am making a new site for a company that services many cities. I was thinking of a URL structure like this: website.com/keyword1-keyword2-keyword3/cityname1-cityname2-cityname3-cityname4-cityname5. Will this be the best approach to optimize the site for the keyword plus 5 different cities, as long as I keep the total URL under the SEOmoz-recommended 115 characters? Or would it be better to build separate pages for each city, rewording the main services to try to avoid duplicate content?

    | jlane9
    0

  • Hi SEOmoz moderators and staff, My web developer and I are having a world of trouble setting up the best way to 301 redirect from www.tisbest.org/default.aspx to www.tisbest.org, since we're using sessions very heavily for our ASP.NET site using MasterPages. We're hoping for some help, since our homepage has dropped 50+ positions for all of our search terms since our first attempt at setting this up 10 days ago. = ( A very bad result. We've rolled back the redirects after realizing that our session system was redirecting www.tisbest.org back to www.tisbest.org/default.aspx?AutoDetectCookieSupport=1, which would redirect to a URL with the session ID like this one: http://www.tisbest.org/(S(whukyd45tf5atk55dmcqae45))/Default.aspx which would then redirect again and throw the spider into an unending redirect loop. The Google gods got angry, stopped indexing the page, and we are now missing from our previous rankings, though thankfully several of our other pages do still exist on Google. So, has anyone dealt with this issue? Could it be solved by simply setting up the 301 redirects again and also configuring ASP.NET to recognize Google's spider as supporting cookies, and thus not serving it the session ID that has caused issues for us in the past? Any help (even just commiserating!) would be great. Thanks! Chad

    | TisBest
    0

  • We have just taken steps to start building links to www.towelsrus.co.uk. I am concerned about the state of external links to the site created by previous companies, i.e. are they OK or doing us harm? Could I get more out of what's already in place, or should we focus purely on getting new links? We also have about 25 pages that have 302 errors and contain external links to the site. What should I do with these: try and get them redirected to our site with appropriate anchor text, or simply put a 301 redirect in place (see the sketch below this question)? In essence, where do I start? We want to build and increase traffic, in particular for towels, bathrobes and dressing gowns, and bolster our position, as our rankings are fluctuating a little but are steadier than they have been. Any help appreciated.

    | Towelsrus
    0
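    If any of those 302s are redirects you control at the server level, turning them into 301s is usually a one-word change; a sketch with hypothetical paths on the towelsrus.co.uk domain mentioned in the question:

      # Before: a temporary redirect
      # Redirect 302 /old-bathrobes-page.html http://www.towelsrus.co.uk/bathrobes.html

      # After: the same mapping made permanent
      Redirect 301 /old-bathrobes-page.html http://www.towelsrus.co.uk/bathrobes.html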

  • A friend of mine runs a website over at http://www.web-design-herefordshire.co.uk/ The keywords he is targeting are "web design hereford" and "web design herefordshire". When you search these terms (he's found on page 3 on google.co.uk), his title tag is shortened to "web design hereford". Does Google shorten these when the keyword being searched is the keyword in the domain? I've seen it on a few others.

    | jasonwdexter
    0

  • Hi, I just 301 redirected my 3-year-old domain to a new domain which was created yesterday. Now I want to start link building to my new domain. Should I start slowly by publishing 4-5 articles on article directories and one press release a week? Can someone suggest some ideas on how to handle a new domain? Will be waiting for replies.

    | Dex378378378
    0

  • I get daily Google Alerts for our site and a competitor's site. I have noticed that I am getting multiple alerts a day from Google about products and product categories on the competitor's site. Every now and then there's an actual alert for a linking blog post or something else. How is Google noticing new products on this site when it has never done the same for ours? Is there some kind of strategy involved here that I don't know about? The site is http://bit.ly/Q0o2ob

    | IanTheScot
    0

  • SEOmoz returns a Duplicate Page Content error for a website's index page, with both domain.com and domain.com/index.html listed separately. We had a rewrite in the .htaccess file, but for some reason this has not had an impact and we have since removed it. What's the best way (on an HTML website) to ensure all index.html links are automatically redirected to the root domain so these aren't seen as two separate pages (see the sketch below this question)?

    | ContentWriterMicky
    0
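    A common .htaccess pattern for this checks the literal client request so the rule doesn't loop when the server internally serves index.html for the root URL; a sketch, assuming Apache with mod_rewrite:

      RewriteEngine On
      # Only act on external requests that literally contain /index.html,
      # then 301 them to the bare directory / root URL.
      RewriteCond %{THE_REQUEST} ^[A-Z]{3,9}\s/+(.*/)?index\.html[\s?] [NC]
      RewriteRule ^(.*/)?index\.html$ /$1 [R=301,L]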

  • In my recent learning about local SEO, I keep reading about the importance of NAP (name, address, phone number). But what if you are only using different phone numbers because you are tracking pay-per-call? How would I set up my local SEO strategy? The newest phone numbers are NOT going to match all the websites, social media and previous listings in directories, etc. Is this a bad move? Should I suggest that we do one or the other going forward with other clients, but not both? Thanks a lot...

    | greenhornet77
    0

  • I have been researching this for some time, and I have seen some videos get indexed and some not, depending on their Flash player. I need to use a Flash player that passes a video ID instead of the movie URL. Do you know of any player features that help or hurt SEO?

    | braines
    0

  • I manage a site on the 3dCart e-commerce platform. I recently updated the SSL certificate. Today, when I tried to log in via FTP, I couldn't connect. The reason I couldn't connect was because my IP had changed. Last week the site experienced almost across-the-board ranking drops on almost every important keyword. Not gigantic drops, a lot just lost 2-4 positions, but that's a lot when you were #2 and you drop to #4 or #6. Initially I thought it was because I was attempting to mark up my product pages using structured data following guidelines from schema.org. I am not a coder, so it was a real struggle, especially trying to navigate 3dCart's listing templates. I thought the ranking drops were Google slapping me for bad code, but now I wonder... could I really have dropped down because of that IP address change? Does anyone have a take on this? Thanks!

    | danatanseo
    0

  • One of my new clients has an eCommerce site selling chair mats, that ranks first page for their primary keyword ("chair mats"), but the serp is not showing their home page... It is an odd "resources" page that mostly just has outbound links. I've never seen this. Also, there are no external inbound links to this particular page. How can this be ranking higher than the home page... weeeeird. Update: I haven't done any optimization yet... and yes, I know the site is outdated 🙂 My location is set to San Diego. If I change it (New York City), the Resources2.htm page goes away from the page 1 ranking, and I get a page 2 home page ranking. This is so strange 😕

    | Joes_Ideas
    0

  • Hi guys, Following on from my previous posts (http://www.seomoz.org/q/301-redirect-have-no-ranking), I have still not got my rankings back and I am beginning to think that I do have an underlying issue in the site which is restricting me. My old site www.economyleasinguk.co.uk was moved to www.economy-car-leasing.co.uk; as mentioned, the 301 seemed to go really well and all pages updated within 48 hours, however over 5 months on the juice from the old site has still not been passed over and I hardly rank at all for anything. Here is a list of things I have tried: 1: Swapped the original 301, which was PHP, for an .htaccess one (see the sketch below this question). 2: Added canonical tags to all pages. 3: Turned on internal links as per this post by Everett Sizemore: http://www.seomoz.org/blog/uncrawled-301s-a-quick-fix-for-when-relaunches-go-too-well Number 3 was only done 5 days ago; initially bot traffic was immense, and it may need a bit more time to show any results. I still think I have another underlying issue, for the reasons below: 1: PageRank on the home page is 1, but inner pages are a mixture of 1, 2 and 3 sporadically. 2: If I copy text from the home page, no results. 3: Open Site Explorer still has the old site with a PA of 60 compared to 42 for the new site. 4: Checked server logs and Google is visiting the old site. 5: Header responses are all correct for the canonicals and I see no chaining of the 301s. 6: All pages are do-follow and there are no robots restrictions. 7: A site: search has only in the last few days removed the old site from the index. Naturally it could be that it's just a matter of time, however 5 months for a 301 is a very long time and an 80% traffic loss is immense. I would really appreciate it if someone could give the site a once-over and see if I have missed anything obvious. Thanks in advance

    | kellymandingo
    0
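    For completeness, the .htaccess-level catch-all from step 1 usually looks something like the sketch below, using the two domains from the question and assuming old and new paths match one-to-one; if this is already in place and answering with a single 301 hop, the redirect mechanics themselves are probably not the issue:

      # .htaccess on the old economyleasinguk.co.uk host
      RewriteEngine On
      RewriteCond %{HTTP_HOST} ^(www\.)?economyleasinguk\.co\.uk$ [NC]
      RewriteRule ^(.*)$ http://www.economy-car-leasing.co.uk/$1 [R=301,L]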

  • I redesigned my company's website, and I am first and foremost an SEO person, so I know the importance of a well laid out website. Furthermore, I know realistically you should NEVER hide text, whether it's with WH or BH intentions, but here is my problem. For every page I have all the details taken care of except proper placement of H1 tags. Some context: my website is responsive-designed; it's a VERY competitive industry; I have to make sure it is properly developed both design-wise and SEO-wise; and it's an INC 5000 company, so NO BH intentions. On phones and tablet devices I have the header images hidden, and in place of the header images I show the information (location, service, etc.) for whatever that page may be. This makes it look good on desktops and serves up information quickly to people using phones and tablets. My question is: would it be bad to turn that text seen on tablets and phones into an H1 tag, given it's hidden on desktops with CSS but available on mobile devices? My problem is making the H1 tags work visually with the desktop versions, as the placement doesn't make sense. Any opinions are appreciated. Thanks, Ballanrk

    | ballanrk
    0

  • When I send the SEO bot to crawl my website, it only crawls the homepage. Does anyone know why this is happening? Thanks, Andy

    | ChrisCurdDesign
    0

  • I have an ecommerce site (www.brick-anew.com) focused on Fireplace products and we also have a separate blog (fireplacedecorating.com) focused on fireplace decorating. My ecommerce site needs new content, pages, internal links, etc... for more Google love, attention, and rankings. My question is this: Should I add a blog to the ecommerce site for creating new content or should I just add and create new pages? I have lots of ideas for relevant new content related to fireplaces. Are there any SEO benefits to a blog over new static pages? Thanks! SAM

    | SammyT
    0

  • I am doing SEO for a client in Canada. A few of his keywords are: palliative care
    home health care
    home care
    respite care
    senior care
    He wants to attract visitors only from Calgary specifically, not Canada-wide. What should I do to optimize his website only for the Calgary region? Should I add the word 'Calgary' to all his keywords?

    | KS__
    0

  • What are the benefits/disadvantages of geo-targeting content based on IP address? A client is interested in serving up different content on their homepage based on what area the user is coming from. This seems like an SEO nightmare to me, as search engine spiders could potentially see different content depending on when they visit. Are there best practices here? Or is it looked down upon with regard to SEO? Any information would be helpful.

    | MichaelWeisbaum
    0

  • Case: webshop with over 2000 products. I want to make a logical sitemap for Google to follow. What is best practice in this area? Should I remove the on-page HTML sitemap with links (shown as a footer link called "sitemap") and only have domain.com/sitemap.xml? Links to great articles about making sitemaps are appreciated too. The system is Magento, if that changes anything.

    | Mickelp
    0

  • Hi, I'm looking at reversing a 301 I did 5 months ago. I originally changed the name of the site because it was going to be split into two, where I would run part of the site and my business partner would run the other. However, since the 301 I have lost 80% of traffic and cannot find the reason, especially as everything looks perfect except the rankings. To put it into context, of the 150 keywords I was tracking all were page 5 and below, with 40% being page 1; now only 7 of that same 150 are in the top 10 pages (first 100 results). The issue I have must be very rare, as I have posted for help with little or no response, which tells me that the kind people who have looked could not see any issues, as this is normally a very helpful community (below is the last thread where I asked for help): http://www.seomoz.org/q/301-redirect-how-to-get-those-juices-flowing So 5 months on I am considering removing the 301 and hoping that some kind of normality returns by reinstating the old URL. So my question is: is what I'm about to do wise? Do I have any options? Is this something you have done in the past, and if so, how did you get on? Thanks in advance

    | kellymandingo
    0

  • A client (Website A) has allowed one of their franchisees to use some of the content from their site on the franchisee site (Website B). This franchisee lifted the content word for word, so - my question is how to best establish that Website A is the original author? Since there is a business relationship between the two sites, I'm thinking of requiring Website B to add a rel=canonical tag to each page using the duplicated content and referencing the original URL on site A. Will that work, or is there a better solution? This content is primarily informational product content (not blog posts or articles), so I'm thinking rel=author may not be appropriate.

    | Allie_Williams
    0

  • Hi everyone... When getting our weekly crawl of our site from SEOmoz, we are getting errors for duplicate content. We generate pages dynamically based on variables we carry through the URLs, like: http://www.example123.com/fun/life/1084.php
    http://www.example123.com/fun/life/1084.php?top=true i.e., ?top=true is the variable being passed through. We are a large site (approx. 7000 pages), so obviously we are getting many of these duplicate content errors in the SEOmoz report. Question: are the search engines also penalizing for duplicate content based on variables being passed through? Thanks!

    | CTSupp
    0

  • Hello Mozzers, I am currently building an HTML5 site and I've run into a couple of issues. While implementing segmentation within each of my main menu items, I am able to plug in meta data for only one segment (i.e. the page). I am unable to insert meta data for each of the segments. For example: I have (main menu) Services ----> Submenu (Teaching, Upgrading, Dancing). I can implement meta data for Services but not for Teaching, Upgrading and Dancing, as they are segments on the same page. What's the best logic to get around this?

    | waspmobile
    0

  • Hiya guys... Just a quick one. My forum, talknightlife.co.uk, is currently 10th on Google for "nightlife forum". I have about 15 backlinks and a page authority of 26. Now what I'm trying to do, which everyone else is doing, is move it up a couple of spots, maybe to 5th or something. What would your tactics be? I'm disregarding all the crap I read in the forums etc.; you guys on here tend to have the best explanations. Let it rip 🙂 Cheers guys, Luke.

    | Lukescotty
    0

  • My website dropped from ~15 to 500+ for our main keywords in target markets (but we are still doing okay in other countries and for other keywords). I cleaned my site up and contacted Google, who told me no manual spam actions were taken on my site. The only thing I can think is that we suffered from an automatic penalty (the drop corresponded with a small Panda update). If that is in fact what happened, how do we recover? Also, feel free to contribute other ideas about what may have occurred.

    | theLotter
    0

  • This question seems to come up a lot. A 70-page flat site. For ease of navigation, I want to link every page to one another: a pure CSS dropdown menu with categories, each expanding to its subpages. Made, implemented, remade smartphone-friendly. Hurray. I thought this was an SEO principle: ensuring good site navigation and good internal linking, not forcing your users to hit "back" or jump through hoops. But unless I've misread http://www.seomoz.org/blog/how-many-links-is-too-many, this is something that's indirectly penalised by Google, because a site with 70 links from its homepage only lets each sub-page inherit 1/80th of its PageRank. Good site navigation vs. your subpages being invisible on Google.

    | JamesFx
    0
