
Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Technical SEO

Discuss site health, structure, and other technical SEO strategies.


  • Hi! We have a multilingual site, and in Google Webmaster Tools I have set the language preference to none for the root domain, Spanish for .com/es, English for .com/en, and German for .com/de. The title and description show in the right language in Google Germany and Google UK, but in google.es (Spain) the title and description appear in English instead of Spanish. Does anybody know why this could be happening and how to fix it?

    | inmonova
    0

  • Hi, the HTTPS pages of our booking section are being indexed by Google. We added … but the pages are still being indexed. What can I do to exclude these URLs from the Google index? (See the sketch below.) Thank you very much in advance! Kind regards, Dennis Overbeek ACSI Publishing | [email protected]

    | SEO_ACSI
    0
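
    A minimal sketch of one common fix, assuming the goal is simply to get the HTTPS booking pages out of the index (the question leaves unstated what was already tried): serve a robots meta tag in the head of each HTTPS page. Note that blocking those URLs in robots.txt instead would stop Google from ever seeing the tag, so the pages could stay indexed.

        <meta name="robots" content="noindex, follow">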

  • So I did a site analysis over at Majestic SEO and it told me that I have 100 Referring IP addresses and 92 are Class C subnets. What does this mean? Are they good, bad or irrelevant?

    | AaronParrish
    0

  • In my multilingual Magento store, I want to redirect the homepage URL with an added language code to the base URL. For example, I want to redirect http://www.mysite.com/tw/ to http://www.mysite.com/, which has the exact same content. Using a canonical URL will help with search engines, but I would rather nip the problem in the bud by not showing http://www.mysite.com/tw/ to visitors in the first place. The problem is that I don't want (can't have) all /tw/ removed from URLs due to Magento limitations, so I just want to know how to redirect this single URL. Since rewrites are on, adding Redirect 301 /tw http://www.88kbbq.com would redirect all URLs with the /tw/ language code to ones without. Not an option. Hope folks can lend a hand here (a sketch follows below).

    | kwoolf
    0
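
    A minimal sketch, assuming Apache with mod_rewrite enabled (mysite.com is the placeholder from the question): the anchored pattern matches only the bare /tw/ homepage, so deeper /tw/... URLs are left alone.

        # in .htaccess - redirect only the /tw/ homepage, nothing deeper
        RewriteEngine On
        RewriteRule ^tw/?$ http://www.mysite.com/ [R=301,L]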

  • We have a client who publishes deals that are time sensitive. Links to the deals expire, so Google's crawlers are picking them up and finding a 404. If I nofollow them, will the 404s still get picked up and reported in WMT? The same question applies to SEOmoz Pro.

    | Red_Mud_Rookie
    0

  • I have 2 sites with duplicate content issues. One is a WordPress blog. The other is a store (Pinnacle Cart). I cannot edit the canonical tag on either site. In this case, should I use robots.txt to eliminate the duplicate content?

    | bhsiao
    0

  • I have seen this response http://www.seomoz.org/qa/view/36353/video-sitemaps-for-youtube-embedded-videos which basically answers this question by saying that Google will give value to the site embedding the video. However, I have seen much talk about whether this really adds any value to the actual site. When searching videos on Google, my site will NOT come up in the results for any embedded video. What would the actual value be?

    | zachc_coffeeforless.com
    0

  • A sub page now ranks for my main keyword (the file name is an exact match for the keyword) and has completely replaced my index page in the rankings. Would 301 redirecting the sub page to my index page (which is more informative and has a whole lot more links pointing towards it) be a good idea, or vice versa? Or would optimizing the sub page be the best way to go (the sub page doesn't have a single inbound link pointing to it...)? This happened about a week ago. Thanks!

    | Benj707
    0

  • My site has a content distribution agreement with Yahoo Finance for the daily articles we publish. It's delivered to them via XML, and while we don't have in-line links within the article, we do have 1. a clickable logo image and 2. standard language at the end of the article with a link back to our registration page. We use DART clicktags (http://ad.....) that redirect to our homepage, combined with ?src=YahooFinance&affiliateId=77 query strings generated by these clicks to measure registrations and sources. My question is twofold: 1. Are the DoubleClick clicktags hurting the valuable linkbacks from Yahoo Finance for picking up our content? 2. What should be done with the query string extensions once people land? We still want to see that data in our Google Analytics, so is a rel=canonical the appropriate solution?

    | Yun
    0

  • Apologies in advance for the complexity. My client, company A, has purchased company B in the same industry, with A and B having separate domains. The current hosting arrangement combines registrar and hosting functions in one account so as to allow both domains to point to a common folder, with the result that identical content is displayed for both A & B. The current site is kind of an amalgam of A and B. Company A has decided to rebrand and completely absorb company B. The problem is that link value overwhelmingly favours B over A. The current (only) hosting package is Windows, and I am creating a new site and moving them to Linux with another hosting company. I can use 301s for A, but not for B, as it is a separate domain and currently shares a hosting package with A. How can I best preserve the link juice that domain B has? The only conclusion I can come up with is to set up separate Linux hosting for B, which will allow for the use of 301s. Does anyone have a better idea?

    | waynekolenchuk
    0

  • Does anyone have a recommendation for the best XML sitemap plugin for WordPress sites, or do you steer clear of plugins and use a sitemap generator, then load it up to the root manually?

    | simoncmason
    0

  • Hi, I have a WordPress-based site and overall everything is working well. However, I can't seem to figure out how to get apostrophes and other characters to display normally. Now, the problem isn't that they are displaying as code to normal visitors or up in the title bar; they are displaying as code to Google's bots as well as to SEOmoz. Example: Normal visitor sees: About **** | **** - Metro Vancouver's IT & Web Experts. Google and SEOmoz see: About **** | **** - Metro Vancouver&#8217;s IT & Web Experts. I've played around with different ways of typing the title (not using character codes vs. using character codes) and nothing seems to work. Any help or explanation would be appreciated.

    | Function5
    0

  • I am going to start searching myself, but I have a client on a Windows server needing to do redirects. So besides my usual 'Get a UNIX box', I need to know how to do these redirects on IIS (a sketch follows below). Full site redirect:
        RewriteCond %{HTTP_HOST} ^[site].com [NC]
        RewriteRule (.*) http://www.[site]/$1 [L,R=301]
    as well as page-level redirects:
        redirect 301 /[old-page.php] http://www.[site].com/[new-page]
    And that leads me to the question of RewriteRules also:
        RewriteRule ^[requested-page].html [server-fed-page].php [L]
    Thanks if anyone can [redirect] me to a good URL for these. 🙂 Richard

    | Getz.pro
    0
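
    A hedged sketch for IIS 7 or later with Microsoft's URL Rewrite module installed (IIS does not read .htaccess; example.com, old-page.php and new-page are placeholders): the three rules mirror the Apache snippets above - a full-site host redirect, a page-level 301, and an internal rewrite that serves a .php page behind an .html URL.

        <configuration>
          <system.webServer>
            <rewrite>
              <rules>
                <!-- full-site redirect: non-www host to www -->
                <rule name="Canonical host" stopProcessing="true">
                  <match url="(.*)" />
                  <conditions>
                    <add input="{HTTP_HOST}" pattern="^example\.com$" />
                  </conditions>
                  <action type="Redirect" url="http://www.example.com/{R:1}" redirectType="Permanent" />
                </rule>
                <!-- page-level 301 -->
                <rule name="Old page" stopProcessing="true">
                  <match url="^old-page\.php$" />
                  <action type="Redirect" url="http://www.example.com/new-page" redirectType="Permanent" />
                </rule>
                <!-- internal rewrite, the equivalent of RewriteRule ... [L] -->
                <rule name="Requested page" stopProcessing="true">
                  <match url="^requested-page\.html$" />
                  <action type="Rewrite" url="server-fed-page.php" />
                </rule>
              </rules>
            </rewrite>
          </system.webServer>
        </configuration>

    On IIS 6, where the URL Rewrite module is unavailable, third-party ISAPI filters such as ISAPI_Rewrite are the usual substitute.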

  • Hey guys, I've been a longtime SEOmoz user but am only just getting heavily into SEO now, and this is my first query; apologies if it's simple to answer, but I have been doing my research! My website is http://www.lyricstatus.com - basically it's a lyrics website. Rightly or wrongly, I'm using a Google Custom Search Engine on my website for search, as well as jQuery auto-suggest - please ignore the latter for now. My problem is that when I launched the site I had a complex AJAX browse page, so Google couldn't see static links to all my pages, and thus it only indexed certain pages that did have static links. This made searches on my site using the Google CSE useless, as very few pages were indexed. I've since dropped the complex AJAX links and replaced them with simple static links. However, this was a few weeks ago now and Google still won't fully index my site. Try doing a search for "Justin Timberlake" (don't use the auto-suggest, just click the "Search" button) and it's clear that the site still hasn't been fully indexed! I'm really not too sure what else to do, other than wait and hope, which doesn't seem like a very proactive thing to do! My only other suspicion is that Google sees my site as more duplicate content, but surely it must be OK with indexing multiple lyrics sites, since there are plenty of different ones ranking in Google. Any help or advice greatly appreciated, guys!

    | SEOed
    0

  • Hi, please tell me how to make the mobile website available, either at http://m.mysite.com or on http://www.mysite.com, i.e. rendering different content based on user agent but on the same URL (see the sketch below).

    | IM_Learner
    0
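
    A rough sketch of the same-URL approach, assuming Apache with mod_rewrite and mod_headers (the mobile template path and the user-agent list are illustrative only): detect mobile user agents and internally rewrite to a mobile template, and send Vary: User-Agent so caches and crawlers know the response differs by device.

        RewriteEngine On
        RewriteCond %{HTTP_USER_AGENT} (iphone|ipod|android|blackberry|palm|symbian) [NC]
        RewriteRule ^$ /mobile/index.php [L]
        Header append Vary User-Agent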

  • Does anyone know of a tool that will allow you to track clicks on an email address hyperlink like mailto:[email protected]? (A sketch follows below.)

    | dsonenberg
    0
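
    One possibility, sketched for the asynchronous Google Analytics snippet (assumes the site already loads it; the category/action/label strings are arbitrary): fire an event when the mailto link is clicked, then read the counts under Events in GA.

        <a href="mailto:[email protected]"
           onclick="_gaq.push(['_trackEvent', 'Email', 'Click', 'contact-link']);">
          Email us
        </a>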

  • Hello, we're looking to use a subdomain for a bookings engine that will also host the majority of our site content, as it will house the details of the courses that we'll be selling online. All content is currently available on www.existingdomain.co.uk. A few pages will remain here, but the majority will ultimately be hosted on a different IP address under a subdomain: courses.existingdomain.co.uk. I am a little concerned about search engine reaction to this content separation. Would this approach dilute the rankings of www.existingdomain.co.uk? Is there anything else we need to be mindful of? We have alternative options if this is a real SEO faux pas. Thanks

    | Urbanfox
    0

  • I am seeing a large number of errors in the not found section, linked to old URLs that haven't been used for 4 years. Some of the URLs being linked to are not even in the structure that we used to use for URLs. Nevertheless, Google is saying they are now 404ing, and there are hundreds of them. I know the best way to attack this is to 301 them, but I was wondering why all of these errors would be popping up. I can't find anything in the Google index searching for the link in quotes, and in Webmaster Tools it shows 'unavailable' as where these are being linked from. Any help would be awesome!

    | Gordian
    1

  • Does the order of the robots.txt syntax matter in SEO? For example, are there potential problems with this format (a cleaned-up sketch follows below)?
        User-agent: *
        Sitemap:
        Disallow: /form.htm
        Allow: /
        Disallow: /cgnet_directory

    | RodrigoStockebrand
    0
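
    For what it's worth: within a group, order generally doesn't matter to the major engines (Google applies the most specific matching rule rather than reading top to bottom), and the Sitemap: line is host-wide, so it can sit anywhere in the file, but it must carry a full URL. A cleaner equivalent of the example (example.com standing in for the real host; Allow: / is redundant, since everything not disallowed is crawlable anyway):

        User-agent: *
        Disallow: /form.htm
        Disallow: /cgnet_directory

        Sitemap: http://www.example.com/sitemap.xml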

  • My WordPress site's robots.txt used to be this:
        User-agent: *
        Disallow:
        Sitemap: http://www.domainame.com/sitemap.xml.gz
    I also have All in One SEO installed, and other than posts, tags are also index,follow on my site. My new posts used to appear on Google seconds after publishing. I changed the robots.txt to the following, and now post indexing takes hours. Is there something wrong with this robots.txt?
        User-agent: *
        Disallow: /cgi-bin
        Disallow: /wp-admin
        Disallow: /wp-includes
        Disallow: /wp-content/plugins
        Disallow: /wp-content/cache
        Disallow: /wp-content/themes
        Disallow: /wp-login.php
        Disallow: /trackback
        Disallow: /feed
        Disallow: /comments
        Disallow: /author
        Disallow: /category
        Disallow: */trackback
        Disallow: */feed
        Disallow: */comments
        Disallow: /login/
        Disallow: /wget/
        Disallow: /httpd/
        Disallow: /*.php$
        Disallow: /?
        Disallow: /*.js$
        Disallow: /*.inc$
        Disallow: /*.css$
        Disallow: /*.gz$
        Disallow: /*.wmv$
        Disallow: /*.cgi$
        Disallow: /*.xhtml$
        Disallow: /*?
        Allow: /wp-content/uploads

        User-agent: TechnoratiBot/8.1
        Disallow:

        # disable ia_archiver
        User-agent: ia_archiver
        Disallow: /

        # disable duggmirror
        User-agent: duggmirror
        Disallow: /

        # allow google image bot to search all images
        User-agent: Googlebot-Image
        Disallow: /wp-includes/
        Allow: /*

        # allow adsense bot on entire site
        User-agent: Mediapartners-Google*
        Disallow:
        Allow: /*

        Sitemap: http://www.domainname.com/sitemap.xml.gz

    | ideas123
    0

  • Curious to see if this is a positive or negative thing for SEO...or even perhaps, neutral.

    | RodrigoStockebrand
    0

  • Hi, the designer at the company I work for is re-designing browser pop-ups as well as inline pop-ups and drop-down menus. He needs SEO requirements - how can they be made SEO friendly? Thanks a lot for your help! SL. Please see the details below. Browser pop-ups all include:
    • a browser title,
    • a logo and title in the title bar,
    • a close window button, and
    • a call to action (that closes the pop-up when clicked).
    Usage: use when you'd like to offer additional information to the user but not take the user away from the main page.
    Inline pop-ups and drop-down menus: the inline pop-up and drop-down is used to display additional menu options, functionality or content on the page without dedicating real estate in the page layout. It is part of the page HTML to retain SEO value and thus does not trigger pop-up blockers. A title bar displays when the content of the pop-up or drop-down is not in context of the trigger. When used as a drop-down, it is attached to the bottom of its trigger and left-aligned (unless it would extend beyond the browser chrome, in which case it is right-aligned). When used as a pop-up, it is centered vertically/horizontally in the user's browser window. The inline pop-up/drop-down can be triggered differently per instance (e.g. onclick, onhover with delay). It can be closed by clicking on the link/location that triggered it (a.k.a. the close icon) or by clicking anywhere outside it. There are 5 widths to choose from, based on the needs of the content: 196px (3 columns), 266px (4 columns), 406px (6 columns), 546px (8 columns), 658px (10 columns).

    | charsimona
    0

  • I am seeing major movement in my keyword positions, leading me to believe that Google made a major algorithm update this morning.

    | irvingw
    0

  • Hi all, I'm looking for some feedback regarding a site architecture issue I'm having with a client. They are about to go through a re-design, and as such we're restructuring the site URLs and amending/adding pages. At the moment they have ranked well off the back of original PPC landing pages that were added to the site, such as www.company.com/service1, www.company.com/service2, etc. The developer, from a development point of view, wishes to create a logical site architecture with multiple levels of directories, etc. I've suggested this probably isn't the best way to go, especially as the site isn't that large (200-300 pages), that the key pages we're looking to rank should be as high up the architecture as we can make them, and that this amendment could hurt their current high rankings. It looks like the trade-off may be that the client is willing to let some pages be restructured, so for example www.company.com/category/sub-category/service would become www.company.com/service. However, although on a per-page basis this might be a solution, is there a drawback to having this in place for only a few pages rather than sitewide? I'm just wondering if these pages might stick out like a sore thumb to Google.

    | PerchDigital
    1

  • We have a site that we want to break up into mini sites but keep the old site for the major brands. Empirecovers.com is the major one, and we want to break it off into Empire Truck Covers and Empire Boat Covers. What I am thinking of doing is linking from the home page to Empiretruckcovers.com instead of to a mini page on the site, and 301 redirecting the mini page to empiretruckcovers.com. Then (there won't be duplicate content) we'd make a small page for truck covers on Empire just so people do not get confused. Is this the best way to go, or what do you suggest? We are doing this because I feel there is SEO value in having mini sites, and also the user experience will be cleaner and people will trust it a lot more than inside a big site. The other problem is I have some great rankings on these pages, so I want to do this with as little damage as possible. I guess once I start I will do all the free directories, the Yahoo directory, and try to get links as fast as I can. Any suggestions would be great. I am going to do A/B testing to see if my AdWords convert better on the mini site or on the big site for certain keywords too.

    | goldjake1788
    0

  • A very similar question was asked previously (http://www.seomoz.org/q/why-google-did-not-index-our-domain). We've done everything in that post (and comments) and then some. The domain is http://www.miwaterstewardship.org/ and, so far, we have:
    • put "User-agent: * Allow: /" in the robots.txt (we recently removed the "Allow" line and included a Sitemap: directive instead)
    • built a few hundred links from various pages, including multiple links from .gov domains
    • properly set up everything in Webmaster Tools
    • submitted site maps (multiple times)
    • checked the "fetch as Googlebot" display in Webmaster Tools (everything looks fine)
    • submitted a "request re-consideration" note to Google asking why we're not being indexed
    Webmaster Tools tells us that it's crawling the site normally and is indexing everything correctly. Yahoo! and Bing have both indexed the site with no problems and are returning results. Additionally, many of the pages on the site have PR0, which is unusual for a non-indexed site; typically we've seen those sites have no PR at all. If anyone has any ideas about what we could do, I'm all ears. We've been working on this for about a month and cannot figure this thing out. Thanks in advance for your advice.

    | NetvantageMarketing
    0

  • Hi all, I have a website, www.acrylicimage.com, which provides products in three different currencies: $, £ and Euro. Currently a user can click on a flag to indicate which region they are in, or, if the user has not manually selected, the website looks at the user's locale setting and sets the region for them. The website also has a very simple content management system which provides ever so slightly different content depending on which region the user is in. The difference in content might literally be a few words per page, like contact details or measurements, i.e. imperial vs. metric. I don't believe that Googlebot, or any other bot for that matter, sets a locale, and therefore it will only ever be indexing the content for our default region - the UK. So, my question really is: if I need to be able to index different versions of content on the same page, is the best route to provide alternate URLs, i.e.:
    /en/about-us
    /us/about-us
    /eu/about-us
    (See the sketch below.) The only potential downside I see to this is that there are currently a couple of pages that have exactly the same content regardless of whether you have selected the UK or USA region - could this be considered content duplication? Thanks for your help. Al

    | dotcentric
    0
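
    A hedged sketch of the alternate-URL route (the hreflang values are illustrative; the paths come from the question): marking the regional pages as alternates of each other in the head helps Google serve the right version per locale and defuses the near-duplicate concern for pages that differ only slightly.

        <!-- in the <head> of each regional about-us page -->
        <link rel="alternate" hreflang="en-gb" href="http://www.acrylicimage.com/en/about-us" />
        <link rel="alternate" hreflang="en-us" href="http://www.acrylicimage.com/us/about-us" />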

  • I have been asked to review an old website to identify opportunities for increasing search engine traffic. Whilst reviewing the site I came across a strange loop. On each page there is a link to a printer friendly version: http://www.websitename.co.uk/index.php?pageid=7&printfriendly=yes That page also has a link to a printer friendly version: http://www.websitename.co.uk/index.php?pageid=7&printfriendly=yes&printfriendly=yes and so on, and so on... Some of these pages are being included in Google's index. I appreciate that this can't be a good thing; however, I am not 100% sure as to the extent to which it is a bad thing and the priority that should be given to getting it sorted. Just wondering what views people have on the issues this may cause (a sketch of one fix follows below)?

    | CPLDistribution
    0
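
    Two common patches, sketched under the assumption that the template cannot easily be fixed to stop re-appending the parameter. Either block the print URLs in robots.txt (Google honours the * wildcard):

        User-agent: *
        Disallow: /*printfriendly=yes

    or point each print version at its normal page with a canonical tag in the head:

        <link rel="canonical" href="http://www.websitename.co.uk/index.php?pageid=7" />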

  • If I type site:youtube.com into Google, are the results listed by what Google considers to be the most important pages of the site? If I change my sitemap should this order change? Thanks!

    | Seaward-Group
    0

  • Hi,
    I am creating sitemaps for a site which has more than 500 subdomains. Page counts vary from 20 to 500 across the subdomains, and more pages will keep being added in the coming months. I have seen sites that create a separate sitemap.xml for each subdomain, referenced in a separate robots.txt file, e.g. http://windows7.iyogi.com/robots.txt with the XML sitemap for that subdomain at http://windows7.iyogi.com/sitemap.xml.gz. Currently my website has only one robots.txt file for the main domain and subdomains. Please tell me: should I create a separate robots.txt and XML sitemap for each subdomain, or one file (see the sketch below)? Creating a separate XML sitemap for each subdomain seems infeasible, as we'd have to verify each one in GWT separately. Is there any automatic way, and do I have to ping separately if I add new pages to a subdomain? Please advise me.

    | vaibhav45
    0
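
    A minimal sketch of the per-host option (sub1.example.com is a placeholder): robots.txt is fetched separately for every hostname anyway, so each subdomain serves its own small file, and the Sitemap: line in it lets engines discover that subdomain's sitemap without per-subdomain GWT verification or manual pinging.

        # served at http://sub1.example.com/robots.txt
        User-agent: *
        Disallow:

        Sitemap: http://sub1.example.com/sitemap.xml.gz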

  • Hi, I have a little issue with a client site whose programmer seems kind of unwilling to change things that he has been doing for a long time. He has had this dynamic site set up for a few years, active in Google Webmaster Tools and others, but he is not happy with the traffic it is getting. When I looked at Webmaster Tools I saw that he has a sitemap registered, but it is /sitemap.php. When I said that we should be offering the search engines /sitemap.xml, his response was that sitemap.php checks the site every day and generates /sitemap.xml - but there is no /sitemap.xml registered in Webmaster Tools. My gut is telling me that he should just register /sitemap.xml in Webmaster Tools, but it is a hard sell 🙂 Anyone have any definitive experience of people doing this before and whether it is an issue? My feeling is that it doesn't need to be rocket science... Any input appreciated, Sha

    | ShaMenz
    0

  • Hi SEO pros, how are you doing these days? Hope everything is fine... Let's get down to business: I've got a little question about ecommerce sites with duplicate content (product descriptions). I'm already ranking top #1 for exact keyword matches (I did a lot of backlink work with exact keywords). That's fine. The question is: long tail keywords are still getting lower results than the competitors', because they published the content first. How to beat them? What do I need to do to outrank competitors on long tail keywords? (I really need this because almost all keywords/products in my niche get only 10% exact searches.) Hope someone can shed some light on this! Thanks!

    | azaiats2
    0

  • Hi-- I was using the SEOMoz toolbar, and I went to this page - http://www.hark.com/clips/dxkdbdggyh-hes-totally-lucid-100-percent and it says "zero" links from anywhere. However, it's clearly linked to from the home page - http://www.hark.com/ Why would this be?

    | TheIronYuppie
    1

  • Last year we merged 3 websites into 1 website and launched the new site in February. When developing the new site I created 301 redirects for all the pages from the old sites to the new site. Unfortunately, when the new website was created the URLs were not optimised for search engines. I now need to optimise the page URLs. In theory I need to create new 301 redirects from the existing pages to the new optimised URLs. I am concerned that in a few years I might end up with a string of 301 redirects, and if I break some links I might lose some ranking. How many redirects will link juice work through? I hope I'm clear here (a sketch follows below). Thank you.

    | Seaward-Group
    0
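
    Crawlers generally follow a short chain of 301s, but each extra hop is a risk, so the safer pattern is to edit the oldest redirects to point straight at the final URL rather than letting chains accumulate. A sketch, assuming Apache with mod_alias (the domain and paths are hypothetical):

        # instead of old-site page -> February URL -> optimised URL,
        # point every legacy URL directly at the final destination
        Redirect 301 /old-site-page http://www.example.com/optimised-page
        Redirect 301 /february-page http://www.example.com/optimised-page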

  • Hello everyone, I am currently working on a big site which sells thousands of widgets. However, each widget has ten sub-widgets (1, 2, 3... say). My strategy with this site is to target long tail search, so I'm creating static pages for each possible variation. So I'll have a main product page on widgets in general, and also a page on widget1, a page on widget2, etc. I'm anticipating that because there's so much competition for searches relating to widgets in general, I'll get most of my traffic from people being more specific and searching for widget1 or widget7, etc. Now here's the problem: I am getting a lot of content written for this website - a few hundred words for each widget. However, I can't go to the extreme of writing unique content for each sub-widget - that would mean tens of thousands of articles. So... what do I do with the content? Putting it on the main widget page was the plan, but what do I do about the sub-pages? I could put it there, and it would make perfect sense to a reader and be relevant to people specifically looking for widget1, say, but could there be an issue with it being viewed as duplicate content? One idea was to just put a snippet (the first 100 words) on each sub-page with a link back to the main widget page where the full copy would be. Not sure whether I've made myself clear at all, but hopefully I have - or I can clarify. Thanks so much in advance, David

    | OzDave
    0

  • Hi all, I have been looking at advertising on some fashion blogs for our online store. Both sites have decent traffic, though A is stronger than B with more than double the traffic. Therefore, given equal relevance to our business, sunglasses (www.pretavoir.co.uk), it would be fair to predict that A would result in double the number of conversions. However, another interesting aspect of deciding which sites to advertise on is their Domain Authority and how much link juice they can pass. Therefore my question is this: putting aside any potential click-through traffic, if site A's Domain Authority is 70 (the link would be on the homepage) and site B's Domain Authority is 35, is the value of site A double that of site B, or is there a less linear relationship (just as with PageRank)? Site A is charging $500 per year for an advertising link and site B $100 per year. Would it be better business to take 5 x site B, or is the link juice passed by one DA 70 site worth more? Your thoughts would be most appreciated.

    | seanmccauley
    0

  • Hi there, on my website (http://dealcity.de) I aggregate group shopping deals from Germany. To provide a good user experience there is a large main navigation in the header where users can choose the city they are interested in. The problem is that the main navigation has 220 links to all cities. Of course this is way too much on a fairly new site like this to have the link juice flow to the pages that are important to us. What should I do with the main navigation? Is there any way to remove these links from the link graph but keep the current user experience? Best Regards
    Markus

    | marfert
    0

  • Hi, I'm really new to this and have just set up some campaigns. I set up a campaign for the root domain portaldeldiablo.com.uy, which returned only 2 crawled pages. As this page had a 301 redirect from the non-www to the www version, I deleted that campaign and set up a new one for www.portaldeldiablo.com.uy, which returned only 1 crawled page. I really don't know why my website is not being crawled. Thanks in advance for your help.

    | ceci2710
    0

  • Hi, we have a problem with having too many links on a page. Our website has a menu with a 3-level sub-navigation drop-down for categories, which we want to maintain for easy navigation for the users: http://www.redwrappings.com.au/ After reading this article, http://www.seomoz.org/blog/questions-answers-with-googles-spam-guru, and some other articles, we came up with a solution. We can easily reduce the number of links per page by putting 'nofollow' on our category menu drop-down links and creating a separate 'landing page' that contains links to these categories (and allows robots to follow those links). Is it wise to do this? Or is there any better, easy solution that you can suggest? Thanks

    | Essentia
    1

  • We are redirecting page-level content (about 500 pages) from several subdomains to our main site. With IIS, it's my understanding that file locations must match. For example:
    subdomain/pathA/filename1
    mainsite/pathA/filename1
    Since the subdomain files are not on the main site, this means we'd create up to 500 zero-byte dummy files on the new server and replicate the subdomain directory structure. With IIS, is there a workaround for handling page-level redirects without duplicating the file locations (see the sketch below)? In the case of white papers, videos and case studies, we'll implement directory-level redirection. Thanks in advance.

    | DigitalMkt
    0
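
    A hedged sketch, assuming IIS 7+ with the URL Rewrite module (mainsite.com and the paths are placeholders): rewrite rules match the requested URL, not files on disk, so no zero-byte dummy files are needed. A rewrite map holds the 500 old-to-new pairs; this fragment nests under <system.webServer> in web.config.

        <rewrite>
          <rewriteMaps>
            <rewriteMap name="StaticRedirects">
              <add key="/pathA/filename1" value="http://www.mainsite.com/pathA/filename1" />
              <add key="/pathB/filename2" value="http://www.mainsite.com/pathB/filename2" />
            </rewriteMap>
          </rewriteMaps>
          <rules>
            <rule name="Redirect from map" stopProcessing="true">
              <match url=".*" />
              <conditions>
                <add input="{StaticRedirects:{REQUEST_URI}}" pattern="(.+)" />
              </conditions>
              <action type="Redirect" url="{C:1}" redirectType="Permanent" />
            </rule>
          </rules>
        </rewrite>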

  • Hi, our IS department is bringing down our network for maintenance this weekend, for 24 hours. I am worried about search engine implications. All traffic is being diverted, and the diverted traffic is being sent to another server running IIS 6.0. From all the research I have done, it appears that creating a custom 503 error message in IIS 6 is not possible. Source: http://technet.microsoft.com/en-us/library/bb877968.aspx So my question is: does anyone have any suggestions on how to serve a proper 503 Service Temporarily Unavailable response in IIS 6.0 with a custom error message (a sketch follows below)? Thanks

    | Jinx14678
    0
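
    One commonly suggested workaround, sketched as Classic ASP (the page wording is illustrative): IIS 6 can't attach a custom body to its own 503, but it can route all diverted requests to a script that sets the status line itself. Retry-After is in seconds; 86400 is 24 hours.

        <%@ Language="VBScript" %>
        <%
        Response.Status = "503 Service Unavailable"
        Response.AddHeader "Retry-After", "86400"
        %>
        <html>
          <head><title>Down for maintenance</title></head>
          <body><p>We're down for scheduled maintenance. Please check back within 24 hours.</p></body>
        </html>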

  • In our Webmaster Tools we have 35K(ish) URLs that are restricted by robots.txt, and we also have 1,200(ish) soft 404s. We can't seem to figure out how to properly resolve these URLs so that they no longer show up this way. Our traffic from SEO has taken a major hit over the last 2 weeks because of this. Any help? Thanks, Libby

    | GristMarketing
    0

  • Dear Friend, We represent a major national brand in the auto care industry with locations in both the US and Canada. There is a primary content site at .com that we have duplicated at .ca. We are hosting the .ca site on a separate IP on a server in Canada, but by and large it is the same site (there are some minor changes from US English to Canadian English, though minor). When we search Google.ca we generally see strong search results for the .com site, but rarely, if ever, any evidence of rankings for the .ca site. The .com site was launched several years ago, about 18 months before the .ca site. Why doesn't Google.ca show the .ca site? Is this an issue of duplicate content, with Google.ca simply showing the .com version which it knew about first? Are we wasting our time, money and effort having both? Thanks, Tim P.S. This isn't about location. We use a separate site to locate local shops and have coordinated that well with Google Places, and when looking for local auto care we do well in both the US and Canada. The sites described above are largely content sites.

    | lunavista-comm
    0

  • Hi Mozzers, we have a question about our Twitter app: http://www.arenaflowers.com/flowers-fun/flowers/home This app sends messages out to tweeters for their birthday, to say congratulations, etc. Our question relates to the fact that we are generating thousands of pages of content (like this: http://www.arenaflowers.com/flowers-fun/flowers/message?id=394447) with some unique content, but these orphaned pages aren't really linked to and only get a short-term traffic influx. Our question is whether these small, orphaned pages are likely to be seen as lots of random, low quality, low content pages, and whether that might hurt our Google ranking. Sometimes the virtual cards are linked to by blogs and tweets etc., so we don't want them de-indexed, but equally we don't want our rankings to be damaged by them. Wonder if anyone has any thoughts, opinions or similar experience? Thanks, Arena Flowers

    | ArenaFlowers.com
    0

  • Perhaps I've learned too much about the technical aspects of SEO, but nowhere have I found scientific studies backing up any claims made here, or a useful answer to a discussion I recently started. Maybe it doesn't exist. I do enjoy Whiteboard Fridays; they're fantastic for new ideas. This site is great. But I take it there are no proper studies conducted that examine SEO, rather just the usual spin of "belief from authority". No?

    | stevenheron
    0

  • Hello everyone, we'd like to build a community using either phpFox or SocialEngine. Does anybody know how these platforms perform from an SEO perspective? Are there any technical "traps" with either of them? Thanks for sharing your experience.

    | FMT
    0

  • Hi, our site is beingthere.com.au. We are in the business of video conferencing in Australia. I was wondering if there would be any benefit to purchasing keyword-rich domains such as www.videoconferencing.net.au and www.videostreaming.net.au. What would the benefit(s) be? And how would I go about using these domains to maximise SEO benefit? Thanks, Dan

    | dantmurphy
    0

  • Hi, I'm currently working on www.kupwakacje.pl, which is something like a travel agency. People can search for holidays and buy/reserve them. I know of plenty of problems on my website, and thanks to SEOmoz hopefully I will be able to fix them, but one is crucial and it's kind of hard to fix, I think. The search engine is provided by an external party in the form of a simple API which, in the end, responds with formatted HTML - which is completely stupid and pointless, but that's not the main problem. Let's dive in: for example, a visitor goes to the homepage, selects Egypt and hits the search button. He will be redirected to www.kupwakacje.pl/wczasy-egipt/?ep3[]=%3Fsp%3D3%26a%3D2%26kt%3D%26sd%3D10.06.2011%26drt%3D30%26drf%3D0%26px and this is not a joke 😉 'wczasy-egipt' is my invention, obviously, and it means 'holidays-egypt'. I've tried to at least have 'something' in the URL that makes Google think it is indeed related to Egypt. The rest, the complicated ep3[] thingy, is a bunch of encoded parameters. This thing renders, in the first step, a list of hotels; in the next, a hotel-specific offer; and in the next, the reservation page. The problem is that all the links generated by this so-called API only change sub-parameters within the ep3[] parameter, so for example clicking on a single hotel changes the URL to: www.kupwakacje.pl/wczasy-egipt/?url=wczasy-egipt/&ep3[]=%3Fsid%3Db5onrj4hdnspb5eku4s2iqm1g3lomq91%26lang%3Dpl%26drt%3D30%26sd%3D10.06.2011%26ed%3D30.12.1999%26px%3D99999%26dsr%3D11%253A%26ds%3D11%253A%26sp%3D which obviously looks not very different from the first one. What I would like to know is: shall I make all pages starting with 'wczasy-egipt' rel-canonical to the first one (www.kupwakacje.pl/wczasy-egipt), or shouldn't I? Google recognises the webpage according to Webmaster Central, and recognises the URL, but responds with mass duplicate content. What about positioning my website for the hotel names - i.e. long tail optimisation? I know it's a long and complicated post; thanks for reading, and I would be very happy with any tip or response.

    | macic
    0

  • Does it influence search engine results if we have our domain name without the "www."? (A sketch of the standard fix follows below.)

    | netbuilder
    0
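
    Either form can rank; what matters is choosing one canonical host and 301-redirecting the other to it so links and rankings consolidate on a single version. A minimal sketch, assuming Apache with mod_rewrite (example.com is a placeholder):

        RewriteEngine On
        RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
        RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]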

  • Hi, I'm new to SEOmoz (and SEO in general) and loving it so far. My main domain name is more of a brand name than a search-engine-friendly list of keywords. I rank well for some keywords I optimized for, and less so for the more competitive keywords. I was wondering if making one-page minisites hosted on keyword-rich domain names could help in this respect? What I want to do is just have a single page with a few paragraphs of content and links to the main site. I am not looking for links to boost the main site, just for the minisites to do better for several keywords. Will this help? Is this OK, or against some Google policy? Can this hurt the main site's rankings? Thank you! Edit: I noticed that sites ranking above me on the first page for some keywords have far fewer on-page elements than my page, have about the same domain trust, and also very few inbound links. The only factor I can see is the exact match of keywords in the domain name.

    | Eladla
    1


