Hey Tom,
I guess this was mostly a regex question. I think I figured it out:
RewriteEngine On
RewriteCond %{HTTP_HOST} ^olddomain\.com$ [NC]
RewriteRule ^oldsubdir/([a-zA-Z.]+)\.aspx$ https://www.newdomain.com/newsubdir/$1 [R=301,L]
Thanks for the tips.
Job Title: CEO
Company: Born To Sell
Website Description: covered call investment tools
Favorite Thing about SEO: free organic traffic
Thanks, Tom. But 'fruit' was a placeholder for a variable page name that can contain letters or a '.' (sorry if that wasn't clear). For example, 'fruit' could be "abc.def.aspx" and I'd like it to become "ABC.DEF" (strip off the .aspx and uppercase it). I need a regex. The '.' within 'fruit' may or may not be present, but the page will always have the '.aspx' suffix that I want to strip off.
I'd also like to do it with an .htaccess statement instead of VBScript. Running on unix.
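For what it's worth, the strip-and-uppercase combination can be done entirely in mod_rewrite, with one caveat: RewriteMap (which provides the built-in toupper function) can only be declared in the server or virtual-host config, not in .htaccess itself. A sketch, assuming the same olddomain/newdomain names used above:

```apache
# In the server or vhost config (RewriteMap is not allowed in .htaccess):
RewriteMap uc int:toupper

# Then, in .htaccess or the same vhost:
RewriteEngine On
RewriteCond %{HTTP_HOST} ^olddomain\.com$ [NC]
# Capture everything before the trailing .aspx and pass it through the uc map
RewriteRule ^oldsubdir/([a-zA-Z.]+)\.aspx$ https://www.newdomain.com/newsubdir/${uc:$1} [R=301,L]
```

With this in place, a request for /oldsubdir/abc.def.aspx should 301 to /newsubdir/ABC.DEF.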
I need some help with a regex for htaccess. I want to 301 redirect this:
to this:
changes:
I think it's something like this (placed in the .htaccess file in the root directory of olddomain):
RedirectMatch 301 ^/oldsubdir/(.*)\.aspx$ https://www.newdomain.com/newsubdir/$1
Thanks.
Thanks, everyone! Based on your suggestions, I'm going to try to keep the competitor's site structure (to keep his ranking pages in G's index) and some of his content (I don't want to tweak his on-page content too much or those pages might lose their rank). I will add a link on each of the competitor's pages pointing to the relevant internal page on my site. For smaller pages (that don't rank) I will 301 them to the closest matching page on my site.
I have looked at the 'landing pages' report in Analytics for the competitor's site and will manually adjust (i.e., explain the merger and add a link to the relevant internal page on my site) any landing page that gets more than a little traffic.
His home page is the trickiest and most important. I want to post an announcement about the merger and get people to my site (the acquiring site) asap. I guess I'll do it with an announcement and a link to my site, as opposed to a 301, so that his visitors know what's going on. I will need to leave some of his content on his home page (and his title tag and meta-description tag) so that it still ranks for keywords I care about.
He and I had competing products and I intend to sunset his products after the merger, but keep his ranking content.
Any other suggestions regarding buying a rival, established site, are welcome.
I've purchased a competitor. They rank well organically for keywords that I target, and I want to optimize the way I get value from their current rankings and traffic (and customers -- we will obviously market to their email/customer list). Which is better:
(1) use a 301 redirect for any access to their domain and point it to my home page. I think this would force Google to de-index all of their pages, right?
(2) put up a stub page as their homepage that announces the site has been bought, and have a do-follow link to my home page (which maybe is auto-redirected after 10 seconds or something)? Maybe this is better to keep their home page in Google's index for a while?
As for option (1), I thought I read somewhere recently that 301'ing a domain to the home page of another domain would no longer pass link juice (?). Maybe I should 301 the newly purchased domain to a sub-page on my site that explains the acquisition and asks them to sign up on my site?
Both sites are legit. No spamming happening here; just industry consolidation as one competitor acquires another. Thanks in advance...!
Thanks, Tommy. That confirms what I thought. I wouldn't mind so much if the bigger site didn't nofollow my author tag but since they do then I'm getting little benefit from them other than exposure to their audience. And that is worth something, to be sure.
Maybe I'll post on their site for a day or two and then delete the post on their site (I have that ability) so that I get some exposure there but then the only copy of the article will be on my site after a couple of days.
Thanks, Egol. For my next few postings I will keep them on my own site and see what kind of rankings and traffic they get for a month or so. Then compare that traffic to the traffic I've seen from articles I've posted on the larger site.
Appreciate the input. I do want to build equity for my own site, but it's a trade off with getting more exposure/customers on the bigger site. I am in this for the long haul, though, so I suppose tons of unique content on my own site will be valuable in the future.
I have a small site and write original blog content for my small audience.
There is a much larger, highly relevant site that is willing to accept guest blogs and they don't require original content. It is one of the largest sites within my niche and many potential customers of mine are there.
When I create a new article I first post to my blog, and then share it with G+, twitter, FB, linkedin.
I wait a day. By this time G has seen the links that point to my article and has indexed it.
Then I post a copy of the article on the much larger site. I have a rel=author tag within the article but the larger site adds "nofollow" to that tag. I have tried putting a link rel=canonical tag in the article but the larger site strips that tag out.
So G sees a copy of my content on this larger site. I'm hoping they realize it was posted a day later than the original version on my blog. But if not will my blog get labeled as a scraper?
Second: when I Google the exact blog title I see my article on the larger site shows up as the #1 search result but (1) there is no rich snippet with my author creds (maybe because the author tag was marked nofollow?), and (2) the original version of the article from my blog is not in the results (I'm guessing it was stripped out as duplicate).
There are benefits for my article being on the larger site, since many of my potential customers are there and the article does include a link back to my site (the link is nofollow). But I'm wondering if (1) I can fix things so my original article shows up in the search results, or (2) am I hurting myself with this strategy (having G possibly label me a scraper)? I do rank for other phrases in G, so I know my site hasn't had a wholesale penalty of some kind.
It's not my choice to nofollow the author credit; it's a policy on another site where I guest blog. I'm just wondering if they're causing me not to get authorship credit by adding the nofollow.
I have seen two forms of rel=author syntax. Are they both valid?
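For reference, the two forms I've seen (both of which Google has documented for authorship markup) are a visible anchor in the byline or body, and an invisible link element in the page head. A sketch, using a placeholder Google+ profile URL:

```html
<!-- Form 1: a visible link in the byline or article body -->
<a href="https://plus.google.com/1234567890/posts" rel="author">John Smith</a>

<!-- Form 2: an invisible link element in the page's <head> -->
<link rel="author" href="https://plus.google.com/1234567890/posts"/>
```

In both cases the Google+ profile also has to link back to the site (via its "Contributor to" section) for authorship to be verified.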
While it used to be possible to keyword-stuff content to game the search engines, that is less true today. I'm a believer in LSI (latent semantic indexing), whereby the search engines understand the meaning of a page even if the keyword isn't exactly present. Search engines today know about synonyms and can extract meaning from text, allowing for creative and interesting use of language. Without tedious keyword repetition, the content you create will be more interesting and read better to a search user, improving the user experience.
I take the viewpoint that you should write for humans first, and then review the content to make sure the keyword is present at least a little bit. I would not go so far as to say you should have 1 occurrence of the keyword per 150 words, or any other hard and fast density metric. But that's just me.
Also, I think the external factors (links with appropriate anchor text, and variations of that anchor text) are more important than on-page factors like keyword density. Just my opinion, although I do rank well for many phrases and I ignore keyword density (other than making sure the keyword is present in some minimal fashion on the page that is optimized for it).
It only updates when Linkscape is updated, about once every 4-6 weeks. The schedule is here: http://apiwiki.seomoz.org/w/page/25141119/Linkscape-Schedule
Last update was Apr 5, so next will probably be mid-May. I believe SEOmoz is working on making the updates more frequent as many of us would love that...
Don't do it. Yes, they will be spammy, low-quality, and pass extremely little (probably zero) PageRank. The directory pages are probably not indexed, and if they are, they are probably devalued. And if there are 65,000 of them, then they're probably all owned by the same guy, built from the same database template, on the same Class C IP range. Google is wise to this and doesn't index, or doesn't value, those kinds of links.
I think that's because it uses the Linkscape database, which is only updated about once every 4 weeks. The next update is supposed to be tomorrow, so you might see a change by tomorrow evening (I'm not sure what time of day the update goes live). There is a calendar of updates here: http://apiwiki.seomoz.org/w/page/25141119/Linkscape-Schedule
URL length per se, no, but URL structure, yes. You should not have your site designed 10 layers deep, for example. There's a discussion of it here: http://www.webmasterworld.com/google/3579234.htm and I like this example they give:
/readme.php?source=direct&page=34&articletype=widgetingforests&loc=uk
should rank better than:
/readme-source-direct-page-34-articletype-widgetingforests-loc-uk.html
or
/readme/source/direct/page-34/articletype/widgetingforests/loc-uk/
Correct, a higher number is more competitive.
Anything that is human reviewed, has editorial standards, and enforces longer word length. The opposite, which would be sites that accept anything, including posts that are less than 300 words, would not be good.
In the 'good' camp I would include
www.buzzle.com (600 word min)
www.selfgrowth.com (500 word min)
I'd be interested in hearing which directories others use that are human-edited or otherwise have stricter submission guidelines than post-anything-you-want.
Given all the recent talk about over-optimization, when was the last time SEOmoz updated the on-page report card tool?
Rand wrote an excellent piece on Perfect On-Page Optimization (which is great, and thanks) in summer 2009. Is that still best practice 3 years later (and post-Penguin/Panda)? If not, has the SEOmoz on-page report card tool been updated to reflect current thinking on on-page best practices?
I know the higher-level concept is "write for humans, not for bots," but if you can do both (and not create an unreadable SEO frankenpage) then why not? Does getting an "A" grade reek of over-optimization now? Should I use the key phrase at the start of the title, h1, and strong (or bold) elements on a page? Should I have an image with file name and alt text equal to (or starting with) the key phrase?
Thanks for asking. A few ideas:
1. Review of link-building services or networks. Rate each on cost, size of network, PageRank of network, spamminess, etc. Are any of them worth the trouble, or is the whole group to be avoided as grey/black-hat spam?
2. How to get .edu or .gov links.
3. Find a small site that ranks higher than the Wikipedia page for the generic keyword category the small site is about, and then offer an analysis of how/why the small site is able to outrank such an authoritative site for that keyword/category.
4. List of top 10 article databases offering do-follow links and their requirements, if any (min word count, links per article, is the directory human reviewed, etc).
5. How to cost effectively have video testimonials made (by actors not working at your company), and then top 5 sites to post them on once you have them.
I know your pain. Been there. My experience was not good. I tried two of them. Even though they promised relevant, grammatically correct English postings, they did not deliver. I believe they were acting as middlemen for low-cost offshore labor who neither wrote solid English nor knew anything about the niche my product was in. Plus, they did some questionable things with links that, if done on a large scale, would probably get my site banned. Fortunately I was able to stop it before it got out of control.
The only way I would use an outsourced service again is to pre-approve every post/article in advance (at least for the first 50 or until I got very comfortable with their work). And that's tedious enough (and time consuming enough) that I'm not sure what value they're adding at that point. I could hire my own low-cost labor and then do the same approval process and then post them (cutting out the middle man).
It's really hard when site owners give testimonials like "my SEO firm is great," because the site owners are often unaware that their outsourced SEO firm is using black/grey-hat tactics (e.g., JC Penney). If you're going to hire someone and pay them hundreds per month, then take a long time to interview them. Ask for customer references. Ask for data on improvements they've made for their customers (what were the rankings before they started, and how did they improve over the months?). They should have multiple examples of month-over-month improvement for multiple clients, and should be able to explain which white-hat techniques they used to achieve those results. If they refuse to share client references (where you can get on the phone and speak to some of their clients) or data, then move along and look elsewhere; there are a million scamsters out there...
1/21/2012 We launched our site in July 2010. By the end of 2011 we ranked on page 1 organic results for 108 relevant phrases. During 2011 we went from 4 phrases in the top 3 results to 44 phrases in the top 3. Here are the SEO tactics we used to get the equivalent of $100K in PPC ads in 2011 for free.
12/16/2011 How would you spend your SEO time and money when given these choices: - Take phrase ABC that ranks #5 organically and try to move it to #3 - Take phrase DEF that ranks #25 and try to move it to page 1 - Create new content for new phrase XYZ where you don't currently rank, and then try to get it ranked on page 1.
Investment banker, venture capitalist, software engineer, and now CEO of Born To Sell, a subscription-based financial services company focused on covered calls.