Job Title: Principal
Company: Managing Greatness
Website Description: Management in the Age of Google
Favorite Thing about SEO: Connecting with the people looking for what my client can provide
Hi Gavo,
You put up robot.txt instead of robots.txt.
http://kaya3976.com.au/robots.txt indeed disallows all pages.
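If it helps, here's a quick way to sanity-check what the live file actually blocks — a minimal sketch using Python's standard urllib.robotparser (it assumes the file is reachable over HTTP):

```python
# Check what a site's robots.txt blocks, using only the standard library.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
rp.set_url("http://kaya3976.com.au/robots.txt")  # robots.txt, not robot.txt
rp.read()  # fetch and parse the live file

# With "User-agent: *" / "Disallow: /" in the file, this prints False
# for any page on the site.
print(rp.can_fetch("*", "http://kaya3976.com.au/"))
```

Once the file is renamed and the Disallow rule relaxed, can_fetch should start returning True.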
Thanks. They're very long-tail terms; everything on this site is. Yes, the GA and GWT top 1,000 queries match each other. But neither matches the Real Time results from that day's Analytics, even after waiting a few days for the regular Analytics keyword reports to catch up with the Real Time data.
I took a screenshot of the keywords from Google Analytics Real Time for Google / Organic. Then I waited a week and filtered Google Webmaster Tools' Search Analytics for queries containing those keywords. None of them showed up.
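For anyone who wants to repeat the check, this is roughly the comparison I ran (the keyword lists here are hypothetical placeholders, not the real data):

```python
# Compare the keywords GA Real Time showed against the queries GWT
# Search Analytics later reported. Both sets below are made up.
realtime_keywords = {"hypothetical long tail phrase one",
                     "hypothetical long tail phrase two"}
gwt_queries = {"some reported query", "another reported query"}

found = realtime_keywords & gwt_queries
missing = realtime_keywords - gwt_queries
print(f"{len(found)} Real Time keywords appear in GWT; {len(missing)} are missing")
```

In my case, every Real Time keyword ended up in the "missing" set.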
Thanks
Thanks Everett, I appreciate it!
Hi Travis,
Thanks for your reply.
As I just wrote to Everett, I can't share too many details for confidentiality reasons. My site is somewhat similar to WhitePages, where http://www.whitepages.com/ind/p-001 has a Moz Page Authority of 45 but http://www.whitepages.com/ind/p-150 has a Moz PA of 1. We have a similar PA distribution among our index pages, yet the pages linked from our PA 1 index pages get just as much organic search traffic as the pages linked from our PA 45 index pages. So I don't know whether my client should spend time fixing the problem.
Thanks
Thanks. I can't share too many details for confidentiality reasons. I realize that makes it hard (maybe impossible) to diagnose correctly, and I'm sorry about that.
These are person pages. The site's link structure naturally gives more link power to the people with the most connections. We could NoIndex (or mask links to) pages that don't have much information (a rough sketch of that idea follows below), but I think such a system would probably be complex and might backfire.
So there's not the kind of taxonomy / directory / long-tail keyword structure that you would expect from a large product directory (for example).
Let's pretend we're discussing WhitePages.com, where http://www.whitepages.com/ind/p-001 has a Moz Page Authority of 45 but http://www.whitepages.com/ind/p-150 has a Moz PA of 1. I could fix the link flow and raise the PA of the back pages, but I can't recommend that my client spend resources on it, since the pages at the back of the index get just as much organic search traffic as the pages at the top.
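For what it's worth, the NoIndex idea above would look something like this — a minimal sketch where the cutoff and the page fields are hypothetical (choosing the right threshold is part of why I suspect it could backfire):

```python
# Emit a robots meta tag for a person page based on how much information
# it actually has. MIN_FIELDS_TO_INDEX is a made-up cutoff.
MIN_FIELDS_TO_INDEX = 3

def robots_meta_tag(person_page: dict) -> str:
    """Return the robots meta tag a page template would emit."""
    filled_fields = sum(1 for value in person_page.values() if value)
    if filled_fields < MIN_FIELDS_TO_INDEX:
        # Thin page: keep it out of the index but let link equity flow.
        return '<meta name="robots" content="noindex, follow">'
    return '<meta name="robots" content="index, follow">'

# A page with only a name filled in would be kept out of the index:
print(robots_meta_tag({"name": "A. Person", "phone": "", "employer": ""}))
```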
Thanks
Thanks. Sorry I wasn't clear: when I say the traffic is pretty evenly distributed among the pages, I'm referring specifically to organic traffic. I'm wondering whether the relatively even distribution of organic traffic is proof that better balancing the link flow won't increase traffic.
I'm working with a site that has millions of pages. The link flow through the index pages is atrocious: for the letter A (for example), the index page A/1.html has a Page Authority of 25, and PA drops with each subsequent page until A/70.html (the last index page listing pages that start with A) has a Page Authority of just 1.

However, the pages linked from the low Page Authority index pages (that is, the pages whose second letter falls at the end of the alphabet) get just as much traffic as the pages linked from A/1.html (the pages whose second letter is A or B). The site gets a lot of traffic and has a lot of pages, so this is not just a statistical blip. The evidence is overwhelming that pages reached from the low authority index pages get just as much traffic as pages reached from the high authority index pages.

Why is this? Should I "fix" the bad link flow if traffic patterns indicate there's no problem? Is this hurting me in some other way? (A toy illustration of the decay pattern is below.) Thanks
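To make the decay concrete, here's a toy model. The pass-through fraction is made up, and this is nothing like how Moz actually computes PA; it just shows how equity starves when each index page mostly links forward to the next one:

```python
# Toy model of authority decay down a pagination chain: each hop forward
# passes only a fraction of the previous page's equity to the next page.
PASS_THROUGH = 0.95  # hypothetical fraction surviving each hop

equity = 25.0  # stand-in for A/1.html's Page Authority
for page in range(1, 71):
    if page in (1, 10, 35, 70):
        print(f"A/{page}.html ~ PA {equity:.1f}")
    equity *= PASS_THROUGH  # A/70.html ends up around 0.7
```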
For weeks, Yahoo consistently sent me just over 80% of the traffic I got from Bing. Then, over the last two weeks, the Yahoo and Bing graphs diverged, with Yahoo traffic dropping to 50% of Bing's. Any ideas? Did Yahoo make deals with certain companies to give them better rankings? Have they started showing more ads above the fold? Thanks
If all of the pages are very valuable and serve the user, you should be OK. But if Google finds that too many people click a search result to your page, quickly bounce back to Google, and click a different result from the listings, it will trust your site less and lower your rankings across the board.
The problem isn't too many pages; it's too many pages that don't satisfy enough of Google's users.
In February 2011, Google launched "Panda," whose main intent was to find sites that were creating many low-quality pages. Since then, the "content farm" strategy of quickly churning out pages has generally backfired.
Instead, focus on pages that will satisfy Google's users, and preferably also attract social shares and links.
1/19/2010 Q&A sites are a great way to get your message across and to build your brand and reputation. How many people use Q&A sites? In a recent Business.com study, 49% of companies that use social media said they ask questions on Q&A sites. Only 29% said they use...
9/18/2009 Google something like SEO Content Writing Tips and you'll get lots of good advice. Most of it boils down to: Write content that's entertaining, engaging, informative, and compelling. Stuff that your readers will love and link to. Do keyword research and optimize your title and your article for those keywords. Which is all good advice, but i...
Husband, father, founding member of Answers.com, and conference speaker