Did Google's Farmer Update Positively/Negatively Affect Your Search Traffic?
-
See the attached image, showing a comparison of SEOmoz.org's search traffic from Google over the past 6 days and the prior week.
-
Some positive impact: an increase in CTR. After these updates, all the spam sites dropped down and our website gets more exposure...
-
I have not seen much of a change. Actually, if I had to say anything, it would be that things have improved. Not working for an agency helps me, because I only answer to one person. Reading about people getting hit hard, I'm curious whether their clients have been blowing up their email and phones asking about the condition of the site. Over-optimized sites, I guess, were hit the hardest. My marketing department started having me add more content to the site prior to the updates, so maybe that had some effect on whether we got hit.
-
Our site MeFindCoupon actually advanced from pages 6-15 on average to pages 1-5 for our keywords. We are a new site, but we've also been doing our SEO really well.
-
I have had interesting results during the Farmer updates. I was originally affected by a downturn of about 40% in traffic, but after about 4 weeks the traffic came back. In the Panda 2.2 update my PageRank slipped from 6 to 5, but I did not lose any traffic despite losing the PageRank.
-
All of my affiliate sites held rankings as well as my authority sites. From what I've seen what's worked is:
- Continuously publishing content
- Putting the blog page first (continuously updated)
- Ensuring that Google is aware of my site in Google Webmaster Tools
- So basically building an affiliate site as a "standard" authority site
So no drop in rankings, actually the exact opposite.
-
I was hit pretty hard at lasr.net, a niche travel site focusing on attractions and events. While the site has been up for over a decade, we are currently down 50% from where we normally are. We had the same issue of having placeholder pages where users could submit their own attractions, events, etc. While normally operating around 400k pages indexed, I did notice that we ballooned to over 1 million pages indexed in the weeks prior to the Farmer update.
On the downside for us, the majority of our pages that ranked very well were original content that had been present on the web for a minimum of 5 years.
I am aggressive on the site with AdSense / advertising, as it provides the revenue for site production, and I could see those placeholder pages or pages with low content being targeted (not unfairly so), as their ratio was a bit skewed.
Currently we are focusing on reevaluating both on page and off page SEO with a primary push for unique content generation. In the last month we have seen slow, but steady, growth in search traffic.
-
One client (domain authority 47, ranking for hundreds of tough brand-related keywords) had duplicate content on its internal brand pages (the same few hundred words of brand info on each brand page) and was hit badly.
We have other clients in this vertical (men's designer clothing in the UK); some with duplication have increased in rankings, but they might have a slightly higher domain authority (50).
All the ranking drops happened on April 12th, across all pages which had internal duplication (you can see the traffic drops by dissecting brand pages in GA).
We have rewritten all the content and placed it on all pages so there is no internal duplication. All the new content has been there for a month and each page has been cached. No change to our rankings.
Does anyone who has made changes know whether they have seen any improvements? Surely if this is algorithmic, then once the algorithm picks up the changes it re-assesses you and will change your ranking? Surely you can't now be a 'marked' domain?
-
Is the primary result that a site's rankings fall, or are you also seeing a drop in the number of indexed pages? Or both?
Also, how quickly do you all see sites recovering from this?
-
Hello,
I recently faced an issue with Google's recent algorithm update on my main website. I found my rankings had all dropped to the 5th to 7th page; even searching for the website name without the .com at the end, our business website shows on the 5th page. It was a news portal, and on a subdomain I was running my web hosting website.
1. I removed all news content from my website, since I thought the news agency sending me the same content it sends to others may have caused the issue, so I removed the NEWS AREA.
2. I turned off all of my old subscriptions and memberships of blog networks etc., to make sure I get proper good backlinks with good research.
Is there anybody who can suggest what further action I should take? I will highly appreciate any kind of good SEO suggestion. I know there are many people here who know a lot about this, so I thought I would ask the community.
I am also unsure whether it's a Google penalty or a negative change to my website from the recent Google update.
However, I have already filed a reconsideration request as a possible alternative, explaining to Google that we are no longer a NEWS CONTENT website.
Will wait for responses...
-
My unique visitors dropped by up to 60% when Google Panda rolled out worldwide.
-
The problem is that those pages are all unique (we are a locally-organized directory of classes and courses). Simply changing copy to avoid being labeled duplicative, when it's perfectly rational copy for a human, seems even 'more' SEO-spammy than simply changing what's relevant for humans (i.e. the location-focused words).
I do appreciate the research though -- any ideas on how to alter content for locales? Wouldn't sites like Yelp get hit for this exact same type of thing?
-
We definitely saw an impact at www.tradeking.com. However, the impact (negative) was delayed and appears to be isolated to a few head ("primary") non-branded keywords. We saw traffic begin to decline on these keywords on March 3rd. So, it seems like we were impacted more by post-Panda "tweaks" than by the Panda update.
Moreover, our Google ranking for one of these primary keywords dropped from 5th to 11th (now 12th) on March 3. Even more interesting: said keyword had held steady at 5 or 6 since the Vince update. Prior to the Vince update, our ranking for that keyword had been around 8 or 9 (with some fluctuation). Effectively, our Vince gains have been lost (for this keyword) and we've lost a few more positions on top of that. Could that be a clue?
-
I don't know TeachStreet - but checking my theories/tests, it does fit the model of what has been hit. I took a snippet of exact-match content from a TeachStreet directory listing and put it in Google: http://www.google.com/search?client=safari&rls=en&q="Get+ready+for+the+SATs,+connect+with+a+math+tutor+or+improve+your+LSAT+score"&ie=UTF-8&oe=UTF-8
So what I see is Google taking note of exactly what it would take note of: trying to put the original source of the content at the top and penalizing (ignoring) content that mimics the original.
-
PRNewswire.com was listed in the original Sistrix post and was indeed hit, though not nearly as badly as their data suggested. Overall the site is down ~20%. We're obviously not a content farm, but we do have a ton of content on a vast range of topics, and therefore a very dynamic set of keywords. We've alerted Google and I have some theories as to why we were targeted, but we're still picking through the data so forgive me for not sharing them until I'm more certain.
Cutts' and Singhal's comments re external testers were fascinating. If they've really codified qualitative factors that accurately quantify a user's experience on a site/page, then that is going to get very interesting. SEOs will have to grow natty little goatees and start calling themselves Optimization Experience Designers...
-
Thanks for thinking about it, Stefan. We did try the NOODP meta tag for 2 months (Jan 4th through last week) and Google seemed to ignore it, because they weren't pulling a description from the Open Directory Project -- they were pulling user quotes from the page. We're not listed in the Open Directory...
The approach we've been using since last week where the snippets only show up on rollover has forced Google to accept the meta description but has increased bounce. So we're going to tweak that further by inserting 1 line of text for each review in the onload event.
We've seen similar meta description problems for our doctor Q&A where Google pulls the doctor's name because it's in a div with class = "author". We're going to rename that div to discourage Google from picking it up.
-
Hey RealSelf,
With regard to your meta description problems, have you tried using a NOODP meta tag on affected pages? I've used this as a successful tactic for years, and it looks like others have recently tested its effectiveness; see this post or search for NOODP: http://seogadget.co.uk/the-impact-of-noodp-on-titles-in-serps/
Hope that helps.
-
Hey everyone,
We went through different websites which posted about this Google update and found that a few of our websites saw a positive trend in traffic. Attached is the screen capture [ http://snpr.cm/jYl ] which can help you get a clear picture.
I think one good reason for this is that the websites which were ranking for long-tail key phrases have lost huge traffic and rankings.
-
Virtually none of our client sites have been affected, and in many cases traffic/rankings have improved. We're also noticing continual speed increases in new content being indexed and showing in the SERPs.
-
Rand said it has been added to Webmaster Tools across the board.
-
We have this and we have a ranking drop. But can it mean anything?
-
A few more reference points to add to this thread:
-
Google opened up a thread in their Google Webmaster Central help forum titled "Think you're affected by the recent algorithm change? Post here." There's a huge bunch of responses; you can compare your situation with the hundreds of case studies listed.
-
There definitely were some 'goodies' whacked in this update, hence the Cult of Mac website getting whacked and then getting reinstated: http://www.cultofmac.com/crisis-over-google-has-reinstated-cult-of-mac/84362
-
Australia has not been hit by the update yet, AFAIK. We're bracing ourselves on our network and watching for changes.
I wish Google would release 'patch notes' for each algo change. Any competitive gamer will understand what I mean: most competitive games have balance tweaks, where it is essential to release patch notes.
For example:
Panda Update 1.12 (released 04/03/2011)
- Fixed a bug where high karma websites with low external inbound links were downrated in Panda Update 1.11
- Increased brand authority factor to 20% from 15%
- Google Adwords background changed from pink to yellow
- etc etc
One can dream...
-
Hey Dave,
I would love to know what kinds of errors and warnings were being thrown up in crawl diagnostics on your SEOmoz campaign.
I am responsible for 3 large e-commerce sites that have all the normal duplication issues, too many links, etc. We rank fairly well, but the algorithm change hasn't hit our shores yet. So I'm curious what errors and warnings you have, so that I can make the argument for accelerating our SEO changes in-house.
It would be interesting for us all to know, so that we can attribute a possible correlation between these and the Farmer changes.
-
Woot Woot! I don't see eHow above me anymore!
-
There probably won't be an immediate recovery; there will certainly be a period of reduced flow. But legitimate sites that are not using "farmer" tactics will likely see signs of recovery soon, once Google indexes enough of the new links. Set up 301s on all your old links and point them all to related pages in your new structure... just rattling off ideas here.
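The 301 suggestion above can be sketched in a few lines of JavaScript; the old/new paths and the server wiring below are invented purely for illustration, not anyone's real URL structure:

```javascript
// Hypothetical map of retired URLs to their closest replacements.
const redirectMap = {
  '/old/sewing-classes.html': '/classes/sewing',
  '/old/about-us.html': '/about',
};

// Resolve a request path to a 301 redirect target, or null if unmapped.
function resolveRedirect(path) {
  const target = redirectMap[path];
  return target ? { status: 301, location: target } : null;
}

// Plain Node wiring (commented out so the sketch stays self-contained):
// require('http').createServer((req, res) => {
//   const r = resolveRedirect(req.url);
//   if (r) { res.writeHead(r.status, { Location: r.location }); res.end(); return; }
//   res.writeHead(404); res.end('Not found');
// }).listen(8080);
```

The point of the map is that every old link lands on a *related* page, not a blanket redirect to the homepage, so the link equity flows somewhere relevant.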
-
Has anyone seen a recovery after taking remedial action?
-
Looking at the traffic, our sites have seen an increase; the update also seems to be helping get rid of some of these affiliate farmers for ecommerce.
-
Can someone who didn't see a drop confirm or deny whether the "Requesting reconsideration of your site" link is in the yellow Help box on the left side of your Webmaster Tools? That link just appeared for us, and I wonder if that's because Google has set some site-wide penalty on our site. It seems more likely that Google added it for all users after the update, but it would be significant if someone doesn't have that link in Webmaster Tools. We don't want to further anger the Google Gods, so we're going to wait for the updated algorithm instead of requesting reconsideration at this time.
-
Good ideas, Dave. Your housecleaning ideas prompted us to delete 2,500 old skin care product pages that we had converted to static pages after switching off Drupal. It constituted 1% of our pages and visitors, but the poor metrics from those pages could have acted as a bad signal (2,500 pages is a lot for a human to create) and tarnished our overall site. 1% rotten is still too much.
Good idea about bringing back more content onto the Category/Subject pages. We're going to do the same for our main landing pages. The data I've seen indicates site speed is only used to break ties between pages with a similar ranking.
- Eric (CTO at RealSelf.com)
-
I look forward to seeing the Moz Reply -- I've been refreshing your blog since yesterday!
-
Thanks for your response Stuart (and Tom/RealSelf and others as well) -- some extra color from us:
- Yeah, we have a lot of indexed pages. However, there isn't much we can do about it, as we truly do have more than 500,000 class listings alone (not to mention Teacher and School Profiles, category pages, etc), and these are all unique in some way (price, date, geo-location, etc.). You could argue that we could include that all on one summary page, but then we'd equally frustrate users who are looking for their exact match. We decided to focus on humans in this instance, vs. the needs of bots.
- We're working to reduce some placeholder-like pages. For instance, we've been creating pages for something such as 'Wichita, KS Programming', but it may only have Online Classes. In the next 24 hours, those pages (that don't have any local/in-person classes) will redirect to the online/non-geo versions of Programming pages. Here's an example of one of those pages:
http://www.teachstreet.com/wichita-ks/sewing-fabric-arts/50564-385
After our change, this will redirect to this 'online class' page:
http://www.teachstreet.com/sewing-fabric-arts/classes/385
- We've also seen the impact to be pretty much sitewide. And we can't identify any specific geographies, categories, or page types, that have been specifically impacted.
- As part of our review, we HAVE found some sites that look to be creating some pretty egregious copies of our data (for instance, the family of sites owned by www.hellometro.com, which spawns 1,000s of similar sites like www.helloseattle.com, has our content with no link-backs). So we submitted those types of sites to Google for review.
- We also resubmitted TeachStreet to Google for consideration, in Webmaster Console.
- We're removing some legacy 'seo spammy-type content' that we've had on the site since we launched, that we've never bothered to remove (meta-keywords, top-of-page-category descriptors, some excess footer links)
- We had removed some 'Article' and 'Q&A' type content from our Category/Subject pages (to increase their page-load speed)... we'll be moving some of that back, because the content is unique and high-quality, and also because we think we can do so without impacting page-load times
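The geo-to-online redirect described above can be sketched as a URL rewrite. The pattern below is only inferred from the two example links, so treat it as an assumption about the real URL scheme rather than TeachStreet's actual implementation:

```javascript
// Map a geo-specific listing path to its online equivalent, e.g.
//   /wichita-ks/sewing-fabric-arts/50564-385 -> /sewing-fabric-arts/classes/385
// Pattern (assumed): /{city-slug}/{category-slug}/{localId}-{categoryId}
// Returns null when the path doesn't match, so non-geo pages pass through.
function onlineVersion(geoPath) {
  const m = geoPath.match(/^\/[^/]+\/([^/]+)\/\d+-(\d+)$/);
  return m ? `/${m[1]}/classes/${m[2]}` : null;
}
```

A server would issue a 301 to `onlineVersion(path)` whenever it returns non-null and the geo page has no local classes.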
Any other ideas?
Dave
-
Our Google.com traffic dropped 29% last week, and we get a fair amount of International traffic too (24%), so we'll probably drop further when the change is replicated to .ca, .co.uk, etc.
Looking at our keyword traffic (or our SEOMoz ranking report), it's really clear that our biggest decreases in visits came from stem keyword searches like [botox] or [invisalign] where we were on the first page of results. The ironic thing is that those pages have low bounce (17% and 27%) and no AdSense ads. We removed the AdSense ads from the landing pages a few months ago to simplify the initial visitor's experience.
It looks like Google is treating those pages as a list of blogs instead of a TripAdvisor hotel listing. Many of the sites that were hit in this update were faux blogs targeted to high CPC keywords.
Since last May, we've been frustrated that Google keeps ignoring our meta description in favor of the first review snippet on the page. (Bing doesn't do this.) Even if our rankings are down, we at least want to present an accurate view of the thousands of reviews, photos, and answers that our site has and not be treated as one person's blog.
We've tried a lot of things to get Google to pick up a better meta description, but at best it only works on our site search for a few hours, and then Google takes user text from a freshly updated review. At this point it looks like our only option is to pass the first words of each review in a JSON array and display them using JavaScript, possibly on mouseOver. It's frustrating that we have to do such contortions, but as Fred Wilson says, it "sucks being a Google bitch."
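A minimal sketch of that JSON-plus-JavaScript idea, with invented review data and an arbitrary word limit; only the truncation helper is shown, and the DOM wiring is left as comments:

```javascript
// Reviews would ship as a JSON array rather than static HTML, so the
// crawler can't lift them into the snippet. This data is invented.
const reviewSnippets = [
  'Had this procedure last spring and the recovery was much easier than expected',
  'My doctor walked me through every step and the results speak for themselves',
];

// First n words of a review, with an ellipsis when truncated.
function firstWords(text, n) {
  const words = text.split(/\s+/).filter(Boolean);
  return words.length <= n ? text : words.slice(0, n).join(' ') + '...';
}

// Browser-side wiring (sketch): reveal the snippet only on mouseover,
// so it never appears in the crawled HTML.
// el.addEventListener('mouseover', () => {
//   el.textContent = firstWords(reviewSnippets[0], 10);
// });
```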
-
Just to follow on from my last post, apparently Google is working on a fix.
-
We got hit at StoreCrowd by around ~40%, whilst it's not devastating for our business it does raise some concerns about what we're doing (in comparison to competitors).
I've made a few observations:
- Many of the websites that I've seen hit appear to have a large number of indexed pages (TeachStreet, you have over 1M, for example)
- It appears to have hit sites that don't have a high ratio of unique content to pages. For example, we currently have a lot of blank "placeholder pages" & we also split our merchants into 4 areas (store, coupons, deals, reviews).
- The penalty appears to be sitewide, we have a large number of links & quality on our blog but even these pages have been hit hard.
- The dropoffs in rankings can be a few spots or ~30 spots, I can't explain why this is - for less competitive keywords the dropoff appears to be less. This leads me to think this ain't a penalty but Google is simply reranking based on new factors.
- I don't think this has anything to do with links.
What we've done so far:
- We're working on increasing the unique content to page ratio - we've noindex,nofollowed all pages that have 0 content (or the placeholder pages I spoke about)
- The next step is to further increase the uniqueness of our tag pages & store level pages.
But, we have competitors that have a lot more duplicate content than we do & they seem to be fine. So, we're merely testing & speculating at the mo!
Happy to hear any suggestions.
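The noindex step above can be expressed as a tiny rule; the page shape and the zero-content threshold are assumptions about how one might classify placeholder pages, not StoreCrowd's actual code:

```javascript
// Decide the robots meta value for a page based on how much unique
// content it actually carries. A "placeholder" here is any page with
// no unique words at all (the threshold is an assumption).
function robotsMeta(page) {
  return page.uniqueWordCount > 0 ? 'index,follow' : 'noindex,nofollow';
}

// The page template would then emit:
//   <meta name="robots" content="..."> with the value chosen above.
```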
-
Teachstreet doesn't fit the profile of a lot of other sites that got hit. Were some pages affected and others not? If you could show off a dozen or so of affected vs. not, that could really help sort out the issue (and possibly give us a roadmap to help).
That sucks Dave! Teachstreet has been getting so good lately, too.
-
Dave,
There is a considerable amount of research being done by Moz and the Moz bloggers at the moment on this topic. A blog post is planned once the team reaches a conclusion. I would expect it within the next few days (if not tomorrow).
If the post doesn't answer your question, you might think about posting your own separate question to get feedback from the community.
Mike
-
We (www.TeachStreet.com) have been pretty negatively hit, with a reduction of ~44% week over week (comparing Thurs-Mon vs a comparable prior period). We're trying to be calm, and find out what's driving it -- we think it's because we are a directory of classes/courses, and many of these classes can be found on the sites owned by our customers... but they're all formal relationships (not scraped content, etc.) so we're not sure what to do.
Any ideas welcome / appreciated.
-
On many of our client sites we've actually seen positive impacts from the farm updates.
Previously, some of our clients were being beaten out for the top few positions by content farms for various long-tail search terms.
After the Farm Update we've seen those content-farm page results drop off into oblivion, and our client sites have stepped right up into their positions. Our clients have gained dozens of new 1st, 2nd, and 3rd position SERP results on long-tail keywords.
So in summary - none of the websites we manage/SEO have been hurt by the Farmer Update. We have, instead, been rewarded by it.
-
I haven't seen any impact on my sites or on my day to day searches. I think most people took the line that 12% of queries were affected to mean that 12% of websites were affected. Far from true. I think while 12% of search queries may have seen the rankings visibly change, this does not mean 12% of Google's indexed sites were affected in some way. I think this was a handful of sites that accounted for a lot of long tail search results.
-
Actually, I lost about 10 places on my main keyword; strangely, NO other keywords were affected, and long-tail traffic was not adjusted either.
-
I've not seen any change either. One of my clients has a news site that is pretty ad heavy. While the site does run some non-original content (wire stories for example) at least 60% is original content. Overall Google traffic for that site is up 4% in the last week - with 10% more keywords delivering traffic than the week before. That is right on average with our normal week over week increase.
-
No, and I noticed that MerchantCircle, one of the 'top 25 biggest losers', still ranks quite well for long-tail service-industry queries.
-
No change, pretty much at all, overall at least.
However, some pages on which we had poor-quality links have dropped a bit in ranking, but they weren't huge traffic generators anyway.
I'll certainly be keeping an eye on it.
-
Personally, I have not seen any change. And the reason is simple: the Farmer update is not yet live in the regional Googles, as Matt Cutts advised in his post.
What would be interesting to see:
- how the web farms' Google.com rankings are affected in the regional Googles; this could be especially interesting when examining all the English-based regional Googles (discounting the localization factors);
- how big the difference in traffic will be when the algo update also reaches all the regional Googles, as (I suppose) we will then see its effects on all the translated/international versions of websites, which are seriously vampirized by local farmer sites.
-
Hi Gianluca,
Thanks for the notice. You are right. I just checked our traffic coming from Google.com (and not Google.fr) and it is actually increasing...
Just glad not to be impacted by this Farmer update.
J.
-
If your site, I imagine, is targeting the French market and therefore Google.fr, then I believe you haven't seen any change yet simply because the Farmer update is still not live on the regional Googles (as Google.fr is).
But mine is just an assumption, not knowing the real target (and therefore Google version) of your site.
-
One client site did very well - it made the SEO Clarity chart for the top ten winners. A small mom & pop level ecommerce client lost 4 out of 5 number one rankings. Fortunately most of their revenue is from repeat customers - we've been working on customer retention for ten years. Still studying to come up with ranking recovery plans, if the algo doesn't self adjust soon.
But most of the sites I work with do not seem affected at all.
-
That could very well be it. I rode the line on a lot of things, but ad placement wasn't really one of them. I just have the standard 3 AdSense blocks, and they are well placed and blended into the site. I also intentionally sold some links on the site (like I said, it's a testbed), but they aren't designed to stick out like a sore thumb, so I doubt that would make a difference one way or the other.
P.S. Haha, a Freudian slip of some sort, I'm sure.
-
Kris - awesome that you "rand" some tests. I like to do that myself.
One thing we've been noticing is that sites with very aggressive ads (AdSense, overlays, display, etc.) seem unusually hard hit, while content farms that are less aggressive on that front weren't. Maybe a user/usage-data thing?
-
Strictly no impact.
We (a French real estate company) currently receive around 600K unique visitors per month from search traffic and, as far as I can see, there is strictly no impact on our traffic coming from search engines.
By the way, the new Q&A forum for PRO is just f*cking awesome! Just love it! Keep up the great work guys,
J. from Paris, France.
-
Related Questions
-
The "Fetch As Google" limit has been decreased - what now?
Since Google decreased the "Fetch As Google" limit to ten pages per day, we've been a bit stuck. We're publishing 20-30 content pages per day, targeting a huge range of search queries. Circa 40% of our traffic comes to us through these pages. Since we're now heavily restricted on submitting these to Google, who's got other ideas to get the pages picked up quickly? I'm slightly concerned because although the pages link outwards to other areas of the website, no other areas of the site link to these pages. They're purely top-of-the-funnel. We can't be the only people with this concern. How would you address it?
-
What Google Analytics Data to Share with Potential Website Buyer
Hi Mozzers,
We have contacted our competitors to let them know we would like to sell our website (domain and all content). One of them has asked for Google Analytics data. Which parts of this, and how, is this data best shared in such a case? As this is the opening of offers, I'm assuming some kind of PDF export with a summary of some Analytics data is sufficient to see who is serious. Then, for those who are serious, more data could be shared. Or is it ever OK to share your full Analytics with a competitor? Would love to hear what data and best practices are used to share this kind of information. Thank you.
-
Google Analytics (Not Provided) Count will Increase 100% by Oct 2014 ? - Your Advice ?
What will you Do if you cannnot find your Top Keyword in Google Analytics "not provided"
Check here for more details: http://www.notprovidedcount.com/
-
How do you measure impacts of Google Updates Like Penguin 4?
Having a conversation with a fellow SEO via twitter and we were discussing measuring algorithm updates. In the aftermath of Google Penguin 4 how do you determine the effects it has on your site/sites and your respective verticals?
-
Get Google To Crawl More Pages Faster on my Site
We opened our database of about 10 million businesses to be crawled by Google. Since Wednesday, Google has crawled and indexed about 2,000 pages. Google is crawling us at about 1,000 pages a day now. We need to substantially increase this amount. Is it possible to get Google to crawl our sites at a quicker rate?
-
Not schema, but a new kind of search result?
I came across this search result in Google and I've been racking my brain trying to figure out how they did it. Do a search for Novus CD4 and you'll see a search result where they list additional products from the landing page. I used Google's Rich Snippet tool to analyse the page and found no microdata at play. Any ideas how this was achieved? Have you guys come across anything like this? I was thinking of integrating this with schema to display rating stars and prices on an ecommerce site.
-
Google Trends - what did you do?
So is it me, or did Google make some crazy changes? The "trends" are no longer anchored to appropriate articles etc... Why do you think they would remove something so useful to us? http://www.google.com/trends/ - check it out for yourself.
-
What is the best method for getting pure Javascript/Ajax pages Indeded by Google for SEO?
I am in the process of researching this further, and wanted to share some of what I have found below. Anyone who can confirm or deny these assumptions or add some insight would be appreciated.
Option 1: If you're starting from scratch, a good approach is to build your site's structure and navigation using only HTML. Then, once you have the site's pages, links, and content in place, you can spice up the appearance and interface with AJAX. Googlebot will be happy looking at the HTML, while users with modern browsers can enjoy your AJAX bonuses. You can use Hijax to help AJAX and HTML links coexist. You can use meta nofollow tags etc. to prevent the crawlers from accessing the JavaScript versions of the page. Currently, webmasters create a "parallel universe" of content: users of JavaScript-enabled browsers will see content that is created dynamically, whereas users of non-JavaScript-enabled browsers, as well as crawlers, will see content that is static and created offline. In current practice, "progressive enhancement" in the form of Hijax links is often used.
Option 2:
In order to make your AJAX application crawlable, your site needs to abide by a new agreement. This agreement rests on the following:
- The site adopts the AJAX crawling scheme.
- For each URL that has dynamically produced content, your server provides an HTML snapshot, which is the content a user (with a browser) sees. Often, such URLs will be AJAX URLs, that is, URLs containing a hash fragment, for example www.example.com/index.html#key=value, where #key=value is the hash fragment. An HTML snapshot is all the content that appears on the page after the JavaScript has been executed.
- The search engine indexes the HTML snapshot and serves your original AJAX URLs in search results.
In order to make this work, the application must use a specific syntax in the AJAX URLs (let's call them "pretty URLs"; you'll see why in the following sections). The search engine crawler will temporarily modify these "pretty URLs" into "ugly URLs" and request those from your server. This request of an "ugly URL" indicates to the server that it should not return the regular web page it would give to a browser, but instead an HTML snapshot. When the crawler has obtained the content for the modified ugly URL, it indexes that content, then displays the original pretty URL in the search results. In other words, end users will always see the pretty URL containing a hash fragment.
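The crawler's pretty-to-ugly URL transformation can be sketched as below. Under Google's scheme the pretty URLs use a `#!` hash-bang; the encoding here is simplified to `encodeURIComponent`, which is slightly stricter than the spec's minimal escaping:

```javascript
// Convert a "pretty" AJAX URL (with a #! fragment) into the "ugly"
// _escaped_fragment_ URL the crawler would request from the server.
function uglyUrl(prettyUrl) {
  const i = prettyUrl.indexOf('#!');
  if (i === -1) return prettyUrl; // not an AJAX URL under the scheme
  const base = prettyUrl.slice(0, i);
  const fragment = encodeURIComponent(prettyUrl.slice(i + 2));
  const sep = base.includes('?') ? '&' : '?';
  return base + sep + '_escaped_fragment_=' + fragment;
}
```

A server that sees `_escaped_fragment_` in the query string knows to return the HTML snapshot instead of the normal page.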
See more in the Getting Started Guide. Make sure you avoid this:
http://www.google.com/support/webmasters/bin/answer.py?answer=66355
Here are a few example pages that are mostly JavaScript/AJAX:
http://catchfree.com/listen-to-music#&tab=top-free-apps-tab
https://www.pivotaltracker.com/public_projects
This is what the spiders see: view-source:http://catchfree.com/listen-to-music#&tab=top-free-apps-tab
This is the best resource I have found regarding Google and JavaScript: http://code.google.com/web/ajaxcrawling/ - step-by-step instructions.
http://www.google.com/support/webmasters/bin/answer.py?answer=81766
http://www.seomoz.org/blog/how-to-allow-google-to-crawl-ajax-content
Some additional Resources: http://googlewebmastercentral.blogspot.com/2009/10/proposal-for-making-ajax-crawlable.html
http://www.seomoz.org/blog/how-to-allow-google-to-crawl-ajax-content
http://www.google.com/support/webmasters/bin/answer.py?answer=357690