
Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Technical SEO

Discuss site health, structure, and other technical SEO strategies.


  • Hello All,
    In light of Wil Reynolds' closing keynote at Portland's SearchFest, I thought I'd post here for some advice. We run a family business on the side and are looking at using volusion.com for our e-commerce solution. The catch is that we currently have a WordPress site, summitmining.com, running on Thesis with great SEO, ranking #1 and #2 for our highest-trafficked terms. Ideally, I'd like summitmining.com to point to the Volusion store and summitmining.com/blog to go to our WordPress installation. But since the Volusion site will be hosted with the company, and they will not host our WordPress installation, we'd have to use a subdomain instead of a subdirectory, which I understand will be bad for SEO. Does anyone have a recommendation on how to set this up without totally screwing up our rankings, or a recommendation for an easy-to-use shopping cart (I've worked on a Magento site before and it's too complex for us) that wouldn't require a separate domain or subdomain? Thank you so much!
    -Cherie Prochaska
    503-816-3557
    [email protected]
    @cherieprochaska

    | CherieP
    0

  • When I view my campaign report I'm seeing duplicate content/meta warnings for mydomain.com and mydomain.com/ (with a trailing slash). I already applied a 301 redirect as follows: redirect 301 /index.php/ /index.php. Where am I messing up here? (See the sketch after this post.)

    | cgman
    0
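    For the trailing-slash question above, a minimal .htaccess sketch, assuming Apache with mod_rewrite and assuming the duplicate is really /index.php vs. the root (the rule is illustrative, not the poster's actual config):

        RewriteEngine On
        # Send /index.php, with or without a trailing slash, to the root with a 301
        RewriteRule ^index\.php/?$ / [R=301,L]

    Note that for the bare hostname, domain.com and domain.com/ are the same request, so a crawler report listing both usually points at some other duplicated URL, such as the index page above.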

  • Hi people, I keep on reading and reading, but I just don't get it... 😉 I mean this page: http://support.google.com/webmasters/bin/answer.py?hl=en&answer=189077&topic=2370587&ctx=topic At the bottom of the page they say: Step 2: Use rel="alternate" hreflang="x". Update the HTML of each URL in the set by adding a set of rel="alternate" hreflang="x" link elements, one for every URL in the set. This markup tells Google's algorithm to consider all of these pages as alternate versions of each other. OK! Each URL needs this markup. BUT: do I need it exactly as written there, or do I have to put in the complete URL of the page? (See the sketch after this post.) The next question is, what happens exactly in the SERPs when I do it like this (and also with Step 1, which I haven't copied here)? Google will display the "canonical" version of the page, but when a user from the US clicks, will he land on http://en-us.example.com/page.htm? I tried to find other sites which use this method, but I haven't found one. Can someone give me an example website? Thank you, thank you very much! André

    | waynestock
    0
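    For the hreflang question above, a sketch of what the link elements might look like with complete, absolute URLs (the hostnames follow the en-us.example.com pattern from the post; Google's help page linked above is the authority here):

        <link rel="alternate" hreflang="en-us" href="http://en-us.example.com/page.htm" />
        <link rel="alternate" hreflang="en-gb" href="http://en-gb.example.com/page.htm" />
        <link rel="alternate" hreflang="de" href="http://de.example.com/page.htm" />

    Each page in the set would carry the full list, including a reference to itself.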

  • So I'm a beginner/intermediate SEO, and up until about 3 weeks ago I enjoyed top 3 rankings for all my keywords (Virtual Assistant, Virtual Assistants, Virtual Personal Assistant, Virtual Personal Assistants and so on) for my site www.247VirtualAssistant.com. All of a sudden I dropped in rankings and can't figure out why. I ran a link analysis and nothing looks like it changed; in fact, I still command much higher domain authority than my competition, but I'm stuck at the bottom of the 2nd page. I can't tell if I'm being penalized, if the other sites all of a sudden just outperformed me, or if something else is happening here. I've also noticed a lot of "dancing" in my SERPs: I've been in second-to-last position on the 2nd page, then 1st on the third page, then last on the 2nd page, and so on. Can someone please help me make sense of this?? Thanks! Thomas, a very confused and desperate website owner

    | Shajan
    0

  • I was just checking Google Webmaster Tools for one of the first times (I know this should have been a regular habit). I noticed that on Feb 8th we had almost 80K errors of type 503.  This is obviously very alarming because as far as I know our site was up and available that whole day.  This makes me wonder if there is a firewall issue or something else that I'm not aware of. Any ideas for the best way to determine what's causing this? Thanks, Chris

    | osports
    0

  • Is it better for SEO to delete web pages that I don't want anymore, or should I 301 redirect all of the pages I delete to the homepage or another live page? (See the sketch after this post.)

    | CustomOnlineMarketing
    0
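    If the redirect route is chosen, a one-line Apache sketch per retired page (both paths are hypothetical); redirecting to the closest related live page generally preserves more relevance than dumping everything on the homepage:

        Redirect 301 /old-page.html http://www.example.com/closest-related-page/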

  • My on-page optimization grade is an "A" with the following factors:
    Factor Overview
    Critical Factors: 4 / 4
    High Importance Factors: 7 / 7
    Moderate Importance Factors: 8 / 9
    Low Importance Factors: 11 / 11
    Optional Factors: 5 / 5
    The main thing I appear to be missing is keywords in my URL. How important is that in today's SEO world, and how much ranking would be lost, given that I have no control over changing the external links to my website, if I decided to migrate to a keyword-relevant URL?

    | classa
    0

  • Anyone else having the problem of pages loading once logged into their SEOmoz account??

    | james100
    0

  • What is best practice regarding domain names? Assuming I would target the keyword "example", should I use examp.le or example.com? I guess the latter is preferable; what could be the issues with the first option? /Lars Eriksson

    | LarsEriksson
    0

  • A real estate site has a landing page for a particular zip code: site.com/zip/99999. On this page, there are links which add arguments to the URL, resulting in structures like this: site.com/zip/99999?maxprice=1000000&maxbeds=3. My question is on using a canonical URL for the pages with arguments. These pages may have lots of duplicate content, so should I direct search engines back to the base URL for the search (site.com/zip/99999)? (See the sketch after this post.) A side note is that these pages with arguments could return no listings ("no listings found") or could come back with listings (in which case they wouldn't be duplicates), and that can change from day to day.

    | SteveCastaneda
    0
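    A sketch of the canonical element the filtered pages could carry in their <head>, pointing back to the base zip-code URL (site.com is the placeholder from the question):

        <link rel="canonical" href="http://site.com/zip/99999" />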

  • This is going to sound like such a newbie question. I don't know how I've been doing SEO all this time and haven't come across the answer. It's usually true that a homepage will have the most backlinks, but what if it doesn't? What if an internal page has more links? Is it possible for an internal page to have a higher PR than a homepage? Additionally, if I do on-page SEO for many internal pages of my site, will it only help those particular pages or will it give the whole site a boost, thus making the homepage rank higher in the SERPS?

    | UnderRugSwept
    0

  • We have hundreds of pages that are getting categorized as duplicate content because they are so similar, even though the content is actually different. The background is that they are name pages, and each name has its own URL. What should we do? We can't canonicalize any of the pages to each other because they are different names. Thank you!

    | bonnierSEO
    0

  • I have been wondering about this for a while now with regards to several of my sites. I am getting a list of crawl errors for pages that I have blocked in the robots.txt file. If I restrict Google from crawling them, then how can they consider their existence an error? In one case, I have even removed the URLs from the index. Do you have any idea of the negative impact associated with these errors, and how do you suggest I remedy the situation? Thanks for the help.

    | phogan
    0

  • Is it bad practice to use a text indent through CSS for H1 text on a homepage (basically hiding the H1 text)? I'm just trying to compensate for the fact that some text that should really be in the H1 tag is actually an image. (See the sketch after this post.)

    | inc.com
    1
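    For reference, the text-indent pattern being described usually looks something like this (the class name, image, and dimensions are illustrative):

        h1.site-title {
          width: 400px;                          /* match the image dimensions */
          height: 80px;
          background: url(header-logo.png) no-repeat; /* image that visually replaces the text */
          text-indent: -9999px;                  /* pushes the real H1 text off-screen */
          overflow: hidden;
        }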

  • Good morning from ten degrees C, mostly cloudy Wetherby, UK 🙂 I've got a 70% bounce rate on this home page: http://www.goldsboroughestates.co.uk/Home.aspx. While there is a telephone number on the home page, that still feels too high. I've drilled down and identified the following:
    1. Medium (none) is causing 80% of the bounces.
    2. No PPC ads are running.
    I get a further explanation from the Google gods:
    • Auto-tagging is on but cost data is not applied (learn more)
    • There is a redirect in the URL
    • The gclid parameter is altered or dropped from the ad
    • Auto and manual tagging are being used at the same time
    • Manually tagged URLs are missing a value
    My feeling is that emails with links to the home page are causing such a high medium (none) count. (See the tagging sketch after this post.) Would be grateful if any SEO mozzer could offer their insights. Thanks, David

    | Nightwing
    0
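    One common way to keep email clicks out of medium (none) is to tag the links inside the emails with Google Analytics campaign parameters; a hypothetical example for the home page above:

        http://www.goldsboroughestates.co.uk/Home.aspx?utm_source=newsletter&utm_medium=email&utm_campaign=march-update

    Untagged clicks from desktop mail clients arrive with no referrer, which is exactly what Analytics buckets as direct/(none).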

  • Hi All, I am currently working on a DotNetNuke site. I have enabled friendly URLs, which changed the URL structure from the default setting of TabId=x to whatever the page name is set as. I will use the following page as an example: www.notarealdomain./graphicdesign.aspx. Now I would like to know if it would be worth my time to change this to /graphic-design.aspx through the use of a 301 redirect and/or a rel=canonical. Any help would be much appreciated. Thanks

    | masterpete
    0

  • Magento pages have been giving me a lot of trouble with canonical tags. In some cases duplicate pages are showing up, so I need to add the canonical tag. In other cases I'm getting an error that there are multiple canonical tags per page. How can I get my pages canonicalized without duplicate tags? It seems like it's either too much or not enough, no matter what I do. Note: this only applies to category and product pages.

    | GravitateOnline
    0

  • Hey folks, this one is over my head. I'm helping out a friend's dental office website (www.capitolperiodontal.com), and their home page source code apparently pulls all of its content from the .net TLD through a frameset:

        <title>http://www.capitolperiodontal.com/</title>
        <meta http-equiv="content-type" content="text/html" />
        <frameset rows="100%" id="dd_frameset_0001">
          <frame src="http://www.capitolperiodontal.net/" name="dd_content_0001" framespacing="0" frameborder="0" noresize="noresize" title="capitolperiodontal.com" />
          <noframes></noframes>
        </frameset>

    My idea was to load all the content from the .net onto the .com, then redirect the .net to the .com, as it has better domain authority and is, well, a .com. Any insights into what this iframe business is all about and whether my strategy is OK? Many thanks folks! john

    | juanzo007
    0

  • Hi, we recently underwent a platform change, and unfortunately our updated ecom site was coded using JavaScript. The top navigation is uncrawlable, the pertinent product copy is undetectable and duplicated throughout the code, etc.; it needs a lot of work to make it (even somewhat) SEO-friendly. We're in the process of implementing AJAX #! on our site, and I've been tasked with creating a document of items I will test to see if this solution helps our rankings, indexing, etc. (on Google; I've read about the issues with Bing). I have 2 questions: 1. Do I need to notify our content team, who work on our linking strategy, about the new URLs? Would we use the #! URL (for SEO) or would we continue to use the clean URL (without the #!) for inbound links? (See the sketch after this post.) 2. When our site transferred over, we used meta refreshes on all of the pages instead of 301s for some reason, and instead of going to a clean URL, the meta refresh points at a browsererrorview page. Would I update it to have the #! in the URL? Should I try to clean up the meta refresh so it goes to an actual www. URL and not this browsererrorview page? Or just push for the 301? I have read a ton of articles, including GWT docs, but I can't seem to find any solid information on these specific questions, so any help would be greatly appreciated. Thanks!

    | Improvements
    0
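    For context, under Google's AJAX crawling scheme the crawler rewrites #! URLs into _escaped_fragment_ requests and expects the server to return an HTML snapshot; roughly (URLs hypothetical):

        Pretty URL users link to:   http://www.example.com/#!/products/blue-widget
        URL Googlebot fetches:      http://www.example.com/?_escaped_fragment_=/products/blue-widget

    Inbound links would normally point at the pretty #! URL; the _escaped_fragment_ form exists only for the crawler.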

  • This question follows on from my earlier question http://www.seomoz.org/q/how-to-replace-my-co-uk-site-with-my-com-site-in-the-us-google-results. My client owns www.blindbolt.co.uk for the UK site and www.blindboltusa.com for their US site. They will shortly be having a new site for Australia. They have just acquired www.blindbolt.com and have expressed an interest in using this as the main hub for all of their sites, i.e. http://uk.blindbolt.com, http://aus.blindbolt.com. The current, existing sites (e.g. www.blindbolt.co.uk) could be 301'd to the new locations. Could I have your thoughts please on whether to go down this route of having international subdomains, vs. keeping the sites on separate top-level domains? What should I take into consideration? Is Google smart enough to return different subdomain results in different countries? Many thanks!

    | OffSightIT
    0

  • Here's the setup: we have a website, www.laptopmd.com, and we're ranking quite well in our geographic target area. The site is chock-full of local keywords and has the address properly marked up (HTML5 and schema.org compliant, near the top of the page, etc.). It's all working quite well, but we're looking to expand to two more locations, and we're terrified that adding more addresses and playing with our current set-up will wreak havoc with our local search results, which we quite frankly currently rock. My questions: 1) When it comes time to do sub-pages for the new locations, should we strip the location information from the main site and put up local pages for each location in subfolders? 2) Should we use subdomains instead of subfolders to keep Google from becoming confused? 3) Should we consider simply starting identically branded pages for the individual locations and hope that exact-match location-based URLs will make up for the duplicate-content hit and overcome the difficulty of building a brand from multiple pages? I've tried to look for examples of businesses that have done what we're doing, but all the advice has been about organic search, which I already have the answer to. I haven't been able to find a good example of a small business with multiple locations AND good rankings for each location. Should this serve as a warning to me?

    | LMDNYC
    0

  • Hi everyone, I have been going through a site recently and I am noticing certain SEO errors that are being caused by the blog. Nothing too harmful, but nonetheless I am hoping to correct them. The SEOmoz software has identified duplicate title tags on the following:
    http://www.altman.co.uk/blog
    http://www.altman.co.uk/blog?page=1
    http://www.altman.co.uk/blog?page=2
    etc., etc. Now am I right that I need to mark up those types of URLs with rel="next" and rel="prev"? (See the sketch after this post.)

    | AITLtd
    0
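    A sketch of the pagination elements for the blog URLs above, as they might appear in the <head> of http://www.altman.co.uk/blog?page=2 (assuming ?page=1 directly follows the root /blog page):

        <link rel="prev" href="http://www.altman.co.uk/blog?page=1" />
        <link rel="next" href="http://www.altman.co.uk/blog?page=3" />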

  • Hi guys, we have done some work to try to remove pages from the Google index. We have done the following: 1. Added a noindex tag. 2. Made the pages return a 404 response. Is there any way to notify Google about these changes so we can speed up the process of removing these pages from the index? Also, regarding the URL removal tool, Google says it's used to remove URLs from search results; does that mean the URLs are removed from the index too? Many thanks guys, David

    | sssrpm
    0

  • Hi, I've acquired a vast number of domains related to my industry over the past 2-3 years. The domains themselves are keyword-rich and likely to be highly searched for their respective terms. Most of the domains are virgin names; some are expired and re-registered names. I can appreciate that re-registered names likely retain little value, but I'm wondering: if one were to set up each of the virgin vanity domains as a 301 redirect, and submit the redirected domains to Google as new, would there be any keyword relevance, or would this likely be a wasted effort or result in a penalty? I initially registered the domains to protect intellectual property, or to prevent others from benefiting from the competitive terms (evil, I know), but I'd like to benefit from them rather than renew each year and have them sit there and do nothing. Thanks!

    | ispone
    0

  • The analysis of my site is showing that I have a problem with too many on-page links. Most of this is due to our menu, and wanting users to be able to quickly get to the shopping category they are looking for. We end up with over 200 links in order to get the menu we want. How are other people dealing with a robust menu, but avoiding getting dinged for too many links? One of our pages in question is: http://www.milosport.com/category/2176-snowboards.aspx

    | dantheriver
    0

  • Currently we are using "previous 1 2 3 next" links to our other inventory pages, with some variation of this JavaScript code: javascript:__doPostBack('ctl00$phMain$dlPagesTop$ctl01$lnkPageTop',''). Can search engines even index the other pages with this JavaScript? Is there a better way to do this? (See the sketch after this post.)

    | CFSSEO
    0
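    Search engines generally can't follow __doPostBack links, so a common alternative is plain crawlable anchors for the pagination (URLs hypothetical), which can still be progressively enhanced with JavaScript:

        <a href="/inventory.aspx?page=1">previous</a>
        <a href="/inventory.aspx?page=1">1</a>
        <a href="/inventory.aspx?page=2">2</a>
        <a href="/inventory.aspx?page=3">3</a>
        <a href="/inventory.aspx?page=3">next</a>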

  • Hello, the CMS that I use makes 3 versions of the homepage:
    www.homepage.com/home.aspx
    homepage.com
    homepage.com/default.aspx
    By default the CMS is set to rel=canonical all versions to the www.homepage.com/home.aspx version. If someone were to link to the website, they most likely aren't going to link to www.homepage.com/home.aspx; they'll link to www.homepage.com, which makes that link juice flow through the canonical to www.homepage.com/home.aspx, right? Why make that extra loop at all? Wouldn't that be splitting the juice? I know 301s lose 1-5% of the juice, but I'm not sure about canonicals. I assume they work the same way? Thanks!

    | EvolveCreative
    0

  • Is there any negative to cloud hosting? I believe they share the same IP addresses, but is it true you can still get banned if someone else on a shared IP or server does some spam?

    | iAnalyst.com
    0

  • Hi, I am in the process of having a site created which will focus on the Xbox 360, PS3, Wii and PS Vita. I would appreciate some advice when it comes to the URL structure. Each category mentioned above will have the following subsections:
    News
    Reviews
    Screenshots
    Trailers
    Would the best URL structure be?
    www.domain.com/xbox-360/news/news-story-headline
    www.domain.com/ps3/reviews/ps3-game-name
    Thanks in advance for your help and suggestions.

    | WalesDragon
    0

  • Hi everyone, I recently installed a VeriSign SSL certificate. The idea is to have the page at https://example.com; the redirect from non-HTTPS to HTTPS works properly, but in IE, whenever somebody types https://www.example.com it shows the red screen with an invalid certificate warning. If you click "proceed", everything goes to the normal page, and on the server the www to non-www redirect seems to work fine. Is there a way to get rid of the warning? Is it a server or certificate issue? Here is the piece of code from .htaccess. Please, advice needed!

        RewriteEngine On
        RewriteBase /
        RewriteCond %{HTTPS} !=on
        RewriteRule ^(.*) https://%{SERVER_NAME}/$1 [R,L]
        RewriteRule ^index.php$ - [L]
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteRule . /index.php [L]

    Thanks in advance

    | Kotkov
    0

  • Hi, I run a job board that enables employers to post job vacancies and information about their organisations. These are 'paid for' pages (advertising) on our site, and they link out to the employers' own websites. My question is: would it be better for these links out to their sites to be no-follow? From my site's perspective, I cannot necessarily vouch for the quality of their websites (although the majority are leading firms) as I would with article and feature content, where we do happily link out and refer to other quality sites with information that gives readers further detail. I know that many large job boards do this where they run listings from other sites' feeds, but should we also do this at the page level, where the link out is effectively paid for? What would be the pros and cons if I do or don't use no-follow? I hope this makes sense, and I look forward to some replies. Many thanks

    | CelestialChook
    0

  • I'm working on a real estate site that has multiple listing pages, e.g. http://www.hhcrealestate.com/manhattan-beach-mls-real-estate-listings. I'm trying to get the main results page to rank for that particular geo-keyword, i.e. "manhattan beach homes for sale". I want to make sure all of the individual listings on the paginated pages (2, 3, 4, etc.) still get indexed. Is it better to add a canonical tag pointing back to the main page on all of the paginated pages, i.e. manhattan-beach-mls-real-estate-listings-2, manhattan-beach-mls-real-estate-listings-3, manhattan-beach-mls-real-estate-listings-4, etc., or is it better to add noindex,follow to those pages? (See the sketch after this post.)

    | fthead9
    1
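    If the noindex,follow route is taken, pages 2+ would carry this meta element, which keeps the listing links crawlable while leaving the near-duplicate pages out of the index:

        <meta name="robots" content="noindex,follow" />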

  • Hi everyone, I'm looking to redirect all HTTP requests to HTTPS for a site's homepage. It only needs to be for the homepage, not site-wide. What's the best method of doing this without losing PageRank or rankings? I'm using IIS 7.5, so I've been looking at a URL Rewrite rule or possibly this ASP.NET solution: http://www.xdevsoftware.com/blog/post/Redirect-from-Http-to-Https-in-ASPNET.aspx. Or is a simple 301 or 302 redirect from the HTTP version to the HTTPS version the best method? (For some reason Microsoft's site says to do a 302 redirect, though I'm not sure that's great from an SEO perspective.) Also, if the solution retained the URL query string, that would be even better! (See the sketch after this post.) Any help appreciated! Thanks

    | PeterAlexLeigh
    0
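    A hedged web.config sketch for the IIS 7.5 URL Rewrite module that 301s only the homepage to HTTPS (the rule name is illustrative; the query string is appended to the redirect target by default):

        <system.webServer>
          <rewrite>
            <rules>
              <rule name="Homepage to HTTPS" stopProcessing="true">
                <match url="^$" />
                <conditions>
                  <add input="{HTTPS}" pattern="off" />
                </conditions>
                <!-- Permanent = 301; only the root path matches url="^$" -->
                <action type="Redirect" url="https://{HTTP_HOST}/" redirectType="Permanent" />
              </rule>
            </rules>
          </rewrite>
        </system.webServer>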

  • Say I've got a site that can be accessed using either protocol (i.e. HTTP and HTTPS), but most (if not all) of the links are pointing to the HTTP versions. Will it cause a problem if I start link building to the HTTPS versions? In other words, does Google see http://mysite.com as the same page as https://mysite.com? Thanks

    | PeterAlexLeigh
    0

  • Hi, I am having problems with my URL rewriting to create SEO-friendly / user-friendly URLs. I hope you follow me as I try to explain what is happening... Since the creation of my rewrite rule I am getting lots of errors in my SEOmoz report and Google WMT reports due to duplicate content, titles, descriptions, etc. For example, for a product detail page it takes the page and, instead of a URL parameter, creates a user-friendly URL of mydomain.com/games-playstation-vita-psp/B0054QAS. However, in the Google index there is also the following friendly URL, which is the same page and which I would like to remove: domain.com/games-playstation-vita/B0054QAS. The key to the rewrite on the above URLs is the /B0054QAS appended at the end; this tells the script which product to load. The details preceding it could in effect be rubbish, i.e. domain.com/a-load-of-rubbish/B0054QAS, and it would still bring back the same page as above. What is the best way of resolving the duplicate URLs that are currently in the Google index and causing problems? The same issue is causing a quite serious 5XX error on one of the generated URLs, http://www.mydomain.com/retailersname/1; if I click on the link, the link does work (it takes you to the retailer's site), but again it is the number appended at the end that is the key; the retailersname is just there for user-friendly search reasons. How can I block this or remove it from the results? Hope you are still with me and can shed some light on these issues please. Many Thanks

    | ocelot
    0

  • I read over this post on the blog tonight: http://www.seomoz.org/blog/lessons-learned-by-an-over-optimizer-14730, and it's got me concerned that we might have a similar issue on our site. Back in March and April of last year, we ranked fairly well for a number of long-tail keywords; here is one in particular, 'Mio Drink', for this page: http://www.discountqueens.com/free-mio-drink-from-kraft-facebook-offer. The page is still indexed, but now appears back on page #3 for the search term. During this time we had made a number of different updates to our site, and I can't seem to put an exact finger on what might have caused the problem. Can anyone see any issues that might have caused this to drop? Thanks, BJ

    | seointern
    0

  • Hi Mozzers, I have a serious topic to discuss and want help from the experts here. Our website is PR 6 and we have been consistently staying at the top for very competitive terms in our niche. Since last Friday (24th February, 2012) we have been facing massive fluctuation in the rankings for most of the keywords we are focusing on. After this fall, we checked the following details but didn't find any serious/critical issue that might be contributing to these fluctuations: We analyzed Google Webmaster Tools; there's no update/warning from Google regarding any negative activity, and other things seem to be normal. We checked our website through a site search (site:www.domain.com) and found that we haven't lost any indexed pages; things appear as they used to, so we are sure we haven't been banned or penalized. We also cross-verified our link building and other promotional activities, and we didn't find anything suspicious that could lead to such a big fluctuation. The drop is really big: some keywords went from a top 3 position to the 5th or 6th page, and some keywords that usually stayed between 5th and 10th position are not in the top 200 or 300 spots. We have analyzed a lot but haven't found the reason for this fluctuation. Our website is 4 years old and this kind of fluctuation has happened for the first time. Has anyone faced this kind of issue before? I'm looking forward to your support in identifying this trouble. Thanks

    | ValSmith
    0

  • Hello all, is anybody else seeing blank pages in Google's 'Cached' view? I'm seeing just the page background and none of the content for a couple of my pages, but when I click 'View Text Only' all of the content is there. Strange! I'd love to hear if anyone else is experiencing the same. Perhaps this is something to do with the roll-out of Google's updates last week?! Thanks,
    Elias

    | A_Q
    0

  • Hi, I have two websites: one legacy site built on WordPress, the other in PHP. I would like to merge the two together and remove the WordPress site. However, it has a good link profile and its pages rank well. What is the best approach: a 301 redirect from the old site, with all its pages pointing to the homepage of the new site? If so, what's the best way to do this in WordPress? (See the sketch after this post.) Many thanks

    | ocelot
    0
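    A minimal .htaccess sketch for the old site that 301s every request to the new site's homepage, assuming Apache and placing the rule before WordPress's own rewrite rules (the domain is a placeholder):

        RewriteEngine On
        # Catch-all: send every old URL to the new site's homepage
        RewriteRule ^ http://www.new-site.com/ [R=301,L]

    Where old pages have close equivalents on the new site, page-to-page redirects generally preserve more relevance than a blanket redirect to the homepage.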

  • In Webmaster Tools, under Labs / Site Performance, Google provides your average page load time. When Google grades a page, does it use how long that specific page takes to load, or does it use the overall average page load time for the domain, as provided in Labs / Site Performance?

    | Bucky
    0

  • Hi All, recently our SERPs in Google have changed to show product prices from our pages rather than the meta description. This just started to happen in November with no change (that we know of) on our side. I have attached before-and-after SERP images if that helps. Does anyone have any ideas, as it's starting to affect our rankings? Thanks, Tony.

    | tstauntonwri
    0

  • Hello, How do I redirect domain.com to domain.com/ Thanks! Tyler

    | tylerfraser
    1

  • Our blog is part of Google News and is syndicated for use by several of our partners, such as the Chicago Tribune. Lately, we see the syndicator's version of a post appearing in Google News instead of our original version; ours generally ranks in the regular index. ChiTrib does have canonical URL tags and syndication-source tags pointing to our original (they are meta tags, not link tags). We do have a News-specific sitemap that is being reported in WMT as error-free; however, it shows no URLs indexed in the News module, even when I can find those specific URLs (our version) in the News. For example, here is a ChiTrib post currently ranking in Google News:
    http://www.chicagotribune.com/classified/automotive/sns-school-carpool-lanes-are-a-danger-zone-20120301,0,3514283.story
    The original version is here:
    http://blogs.cars.com/kickingtires/2012/03/school-carpool-lanes-are-a-danger-zone.html
    The News sitemap URL is:
    http://blogs.cars.com/kickingtires/kickingtires_newsmap.xml
    One of our front-end producers speculates that the Facebook sharing code on ChiTrib is having an effect. Given that FB is FB and Google is Google, that sounds wrong to me when we're talking specifically about Google News. Any suggestions? Thanks.

    | CarsProduction
    0

  • Hi, I created a video sitemap and now I'm getting an error in Webmaster Tools because the location for some of the videos is the same. It says: "Duplicate URL - This URL is a duplicate of another URL in the sitemap. Please remove it and resubmit." What can I do if several of my videos are located at the same URL? (See the sketch after this post.) Thanks

    | Tug-Agency
    0
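    If several videos live on one page, the video sitemap format allows multiple video:video entries under a single url/loc rather than repeating the loc; a hedged sketch with hypothetical URLs (title, description, thumbnail, and content location are the usual required fields):

        <!-- inside a urlset that declares xmlns:video="http://www.google.com/schemas/sitemap-video/1.1" -->
        <url>
          <loc>http://www.example.com/videos/</loc>
          <video:video>
            <video:title>First video</video:title>
            <video:description>Short description of the first video.</video:description>
            <video:thumbnail_loc>http://www.example.com/thumbs/first.jpg</video:thumbnail_loc>
            <video:content_loc>http://www.example.com/media/first.mp4</video:content_loc>
          </video:video>
          <video:video>
            <video:title>Second video</video:title>
            <video:description>Short description of the second video.</video:description>
            <video:thumbnail_loc>http://www.example.com/thumbs/second.jpg</video:thumbnail_loc>
            <video:content_loc>http://www.example.com/media/second.mp4</video:content_loc>
          </video:video>
        </url>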

  • I have a new domain (www.newdomain.com) and an old domain (www.olddomain.com). Currently both domains are pointing (via DNS nameservers) at the new site. I want to 301 everything that comes from www.oldsite.com to www.newsite.com. I've used this .htaccess code:

        RewriteEngine On
        RewriteCond %{HTTP_HOST} !^www.newsite.com$
        RewriteRule (.*) http://www.newsite.com/$1 [R=301,L]

    This works fine and redirects if someone visits www.olddomain.com, but I want it to cover everything from the old domain, such as www.olddomain.com/archives/article1/, so that if any subpages are visited on the old domain they are redirected to the new domain. Could someone point me in the right direction? (See the variant after this post.) Thanks

    | EclipseLegal
    0
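    A slightly more targeted variant that only fires for the old hostname, leaving any other hostname pointed at the server alone (the domains are the placeholders from the question); the (.*) capture already carries subpage paths across to the new domain:

        RewriteEngine On
        RewriteCond %{HTTP_HOST} ^(www\.)?olddomain\.com$ [NC]
        RewriteRule (.*) http://www.newsite.com/$1 [R=301,L]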

  • I have several website in campaigns and I consistently get flagged for duplicate content and duplicate page titles from the domain and the domain/ versions of the sites even though they are properly redirected. How can I fix this?

    | RyanKelly
    0

  • Hi, I am wondering if I should include a description tag for an article page where the news on that page will change around five times a day. I am not sure whether to fill in a description for local news, or to leave it blank so the search engines pick up the latest local news each day instead of showing a static description of the news page. Any help would be great.

    | ClaireH-184886
    0

  • Hi guys, 4 weeks ago we launched a site, www.adsl-test.it. We just did some article marketing and developed a lot of functionality to test and share the results of the speed tests run through the site. We had been on Google's 9th SERP page for weeks, then suddenly, for one day (the 29th of February), on the second page; the next day the website home page disappeared, even for brand searches like "adsl-test". The actual situation is: it looks like we are not banned (site:www.adsl-test.it is still listed); GWT doesn't show any suggestions and everything looks good there; and we rank quite high on bing.it and yahoo.it (4th place on the first page) for "adsl test". Can anybody help us understand? Another thing I thought of: we create a single ID for each test that we run, and these tests are indexed by Google, e.g. www.adsl-test.it/speedtest/w08ZMPKl3R or www.adsl-test.it/speedtest/P87t7Z7cd9. The content of these URLs is actually somewhat different (because the speed measured is different) but, apart from the badge, the other content on the page is pretty much the same. Could this be a possible reason? I mean, does Google just think we are creating duplicate content, even if the pages are not effectively duplicated content but just the result of a speed test?

    | codicemigrazione
    0

  • Hi, my website is www.theprinterdepo.com and I have been on SEOmoz Pro for 2 months. When it started, it crawled 10,000 pages; then I modified robots.txt to disallow some specific parameters in the pages to be crawled. We have about 3,500 products, so the number of crawled pages should be close to that number. In the last crawl, it shows only 1,700. What should I do?

    | levalencia1
    0

  • Hello, the SEOmoz report is showing that I have a lot of duplicate content, and then proceeds to list almost every page on my site as appearing both at a URL with a trailing "/" and without. I checked my sitemap and only one version is there, the one with the "/". I have a WordPress site. Any recommendations? Thanks.

    | dpaq2011
    0
