
Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Technical SEO

Discuss site health, structure, and other technical SEO strategies.


  • Hi all, I'm working on a site redesign and it is possible the new site could issue a lot of 301 redirects, as we may migrate from one forum solution to another. Is there any issue with crawlers getting a lot of 301 redirects from a site? Thanks, Nick

    | nickswan
    0

  • We have made a big mistake.... So what can we do to fix this? A trainee member of staff has used the SEOmoz top 100 directories list and submitted to about 25 sites, from PR10 down to PR6, using keywords where possible instead of the website URL (which I now know was stupid!). Our website rankings have been lowered big time for all keywords used, e.g. from 1st to 10th, and some have even disappeared from the top 100. We are contacting all the directories asking for the title link to be changed to the URL instead of a keyword. Will this help? I understand that Google gives sites a penalty for this, but what can I do to put this right, and how long would the penalty last? Any advice would be highly appreciated... Thanks, Dean

    | deanpallatt
    0

  • Hi, I realize this is a very broad question, but I am going to ask it anyway in the hope that someone might have some insight. I have created a great deal of unique content for the site http://www.healthchoices.ca. You can select a video category from the top dropdown, then click on a video beside the provider box to see it. The articles I've written are accessible via the View Article tab under each video. I have worked hard to make the articles informative, and they are all unique, with quotes from expert physicians. Even for unusual health conditions that don't have a lot of competition, I don't see us appearing. Our search results are quite dismal for the amount of content we have. I guess I'm checking to see if anyone is able to point me in the right direction at all? If anything jumps out... Thanks, Erin

    | erinhealthchoices
    0

  • Hi all, I have a pretty big problem with my site at the moment which I'm worried will have an impact on my rankings. I've just had a crawl test done and for some reason I get a load of URLs returned that don't actually exist... For example, I am getting URLs like this in my crawl test and XML sitemap: www.applicablejobs.com/jobs/add/android-designer/android-designer/android-designer/android-developer/android-developer/ www.applicablejobs.com/jobs/add/android-designer/android-designer/android-designer/android-developer/iphone-designer/ All the URLs seem to start with www.applicablejobs.com/jobs/ and there is an entry for every conceivable combination of slugs. I can only assume that if the crawl test and an XML sitemap generator are picking up these URLs, then Google and other search engines probably are too. Does anyone have any idea what might be causing this issue, and what can I do to remove these URLs from Google's index if they are there? Thanks

    | Benji87
    0

  • Hello, I wonder why that information appears highlighted in the description in the attached file... this is new to me. Thanks. DLvtI.png

    | eder.machado
    0

  • Hi there, I've recently started working on a very large travel website. One of my main duties is to get it ranking for certain terms (which it isn't at the moment, at all!). A large proportion of the website is dynamic, meaning that the pages and URLs are produced using sessions. I've already enquired with the company who provide the website about how I can get unique meta data for each page on our website. They came back and said it can be done for the static pages, but not for the dynamic pages. This leaves me with thousands of pages with duplicated meta data. Not at all ideal. I was just wondering how damaging this is likely to be to the SEO of my site. Am I going to be able to achieve rankings even with this issue? Or do I need to get it sorted ASAP? Thanks

    | neilpagecruise
    0

  • Hi all, I'm very new to SEO and still learning a lot. Is it considered a black-hat tactic to wrap a link in a DIV tag with display set to none (a hidden div), and what can the repercussions be? From what I've learnt so far, this is a very unethical thing to be doing, and the site hosting these links can end up being removed from the Google/Bing/etc. indexes completely. Is this true? The site hosting these links is a group/parent site for a brand, and each hidden link points to one of the child sites (similar sites, but different companies in different areas). Thanks in advance!

    | gemcomp123
    0

  • Hi. I am new to the world of SEO, and SEOmoz has already taught me a lot. I am a newly appointed in-house SEO at www.completeoffice.co.uk. They want to rank for the keyword office interior design, but I noticed that for all the keywords they are trying to rank for, their homepage shows up in search results. So I optimised the on-page SEO of their specific office interior design page: http://www.completeoffice.co.uk/interiors.aspx. I took it from a grade F to a grade A. Currently they are showing up just outside the top 50 for office interior design, and it's the homepage in the Google results. Now that I have made these changes and optimised this page solely for the office interior design keyword, will the search results change so that it is the specific office interior design page showing and ranking, or will it always remain the home page? I have optimised many of the other pages in this way so that a keyword will rank via the relevant page, rather than getting the home page to rank for 30 keywords. Thanks for any help.

    | CompleteOffice
    1

  • Hi, my website has a canonical issue on the home page. I have written the .htaccess file and uploaded it to the root directory, but I still don't see any changes on the home page. I am copying the syntax I have written in the .htaccess file; please review it and let me know what changes are needed.
    Options +FollowSymlinks
    RewriteEngine on
    #RewriteBase /
    ### re-direct index.htm to root / ###
    RewriteCond %{THE_REQUEST} ^.*/index\.htm\ HTTP/
    RewriteRule ^(.*)index\.htm$ /$1 [R=301,L]
    ### re-direct IP address to www ###
    ### re-direct non-www to www ###
    ### re-direct any parked domain to www of main domain ###
    RewriteCond %{http_host} !^www\.metricstream\.com$ [NC]
    RewriteRule ^(.*)$ http://www.metricstream.com/$1 [R=301,NC,L]
    Is there any specific .htaccess file format for an Apache server? Thanks, Karthik

    | karthik-175544
    0

  • We utilize a dedicated server to host roughly 60 sites on. The server is with a company that utilizes a lady who drives race cars.... About 4 months ago we realized, thanks to monitoring alerts, that we had a group of sites down and checked it out. All were on the same IP address, and the sites on the other IP address were still up and functioning well. When we contacted support we were stonewalled at first, but eventually they said there was a problem, and it was resolved within about 2 hours. Up until recently we had no problems. As part of our ongoing SEO we check page load speed for our clients. A few days ago, the site of a client hosted by the same company was running very slow (about 8 seconds to load without cache). We ran every check we could and could not find a reason on our end. The client called the host and was told they needed to be on some other type of server (with the host) at a fee increase of roughly $10 per month. Yesterday, we noticed one group of sites on our server was down and, again, it was one IP address with about 8 sites on it. On chat with support, they kept saying it was our ISP. (We speed tested on multiple computers and were 22MB down and 9MB up, +/-2MB.) We ran a trace on the IP address and it went through without a problem on three occasions over about ten minutes. After about 30 minutes the sites were back up. Here's the twist: we had a couple of people in the building who were on other ISPs try, and the sites came up and loaded on their machines. Does anyone have any idea as to what the issue is?

    | RobertFisher
    0

  • I have a site that I had removed from the Wayback Machine because I didn't want old versions to show. However, I noticed that many SEO tools now always show a domain age of zero instead of the 6 years since I registered it. My question is: what do the actual search engines use to determine age when they factor it into the ranking algorithm? By having the site removed from the Wayback Machine, did I make the search engines think it is brand new? Thanks

    | FastLearner
    0

  • OK, this one's a little confusing, please try to follow along. We recently went through a rebranding where we brought a new domain online for one of our brands (we'll call this domain 'B' -- it's also not the site linked to in my profile, not to confuse things). This brand accounted for 90% of the pages and 90% of the e-comm on the existing domain (we'll call the existing domain 'A'). 'A' was also redesigned and its URL structure has changed. We have 301s in place on A that redirect to B for those 90% of pages, and we also have internal 301s on A for the remaining 10% of pages whose URLs have changed as a result of the A redesign. What I'm wondering is if I should tell Google through Webmaster Tools that 'A' is now 'B' through the 'Change of Address' form. If I do this, will the existing products that remain on A suffer? I suppose I could just 301 the 10% of URLs on B back to A, but I'm wondering if Google would see that as a loop since I just got done telling it that A is now B. I realize there probably isn't a perfect answer here, but I'm looking for the "least worst" solution. I also realize that it's not optimal that we moved 90% of the pages from A to B, but it's the situation we're in.

    | badgerdigital
    0

  • Hi, A client's site was previously built in Joomla and he wants us to reproduce content that was in there, but the Joomla site is no longer live and has come to me as an archive containing all the files and folders that were included. So, I am looking at the files and folders without Joomla installed. Can someone tell me quickly how to find the where the actual page content was stored? I started looking, but there are some folders I cannot open and nothing that looks as I expected. Would appreciate a hint or two from someone who knows Joomla well.. Life is too short! Thanks Sha

    | ShaMenz
    0

  • Debating registering new domain or spending bucks for old domain, both with equivalent keywords. Normally old is better, but is this true even if the old name was just parked? In other words, is it worth spending $ for a domain that is not indexed or not ranked, just to get the aging? Options... [Keyword]Help.com - new, cheap [Keyword]Guide.com - old, not indexed, $ [Keyword]Info.com - old, indexed but not ranked anywhere (i.e., only found with exact match search), $$

    | draymond
    0

  • How do we solve an issue with the printer-friendly version of a page? Let's say there is an actual page (Page A) and a printer-friendly version of it (Page B). Page A is ranking at position 5 and Page B is not ranking. Both pages are indexed by Google, and most of the backlinks point to Page B rather than Page A. One way is to implement a canonical tag on Page B. Are there other ways to solve the issue, and how can we implement them? How can we ensure that all the links going to Page B pass their link value on to Page A?

    | SEOTeam35
    0
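    For reference, the usual fix for a print version is the one mentioned above: a canonical on Page B pointing at Page A. A minimal sketch (the URL is a placeholder, not from the question):

    ```html
    <!-- Placed in the <head> of the printer-friendly page (Page B); the href is
         a placeholder for the canonical article URL (Page A). -->
    <link rel="canonical" href="http://www.example.com/article" />
    ```

    With the canonical in place, links pointing at Page B should consolidate to Page A over time; a 301 from B to A is a stronger signal, but it removes the printable version from direct use, which is why the canonical is generally preferred for print pages.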

  • Hi, I have recently had my comparison website redesigned and developed on WordPress, and the site is now 90% complete. Part of the redesign has meant that there are now dynamic URLs in the format: http://www.mywebsite.com/10-pounds-products/?display=cost&value=10 I have other pages similar to this but with different content for the different price ranges, and these are linked to from the menus: http://www.mywebsite.com/20-pounds-products/?display=cost&value=20 Now my questions are: 1. I am using Joost's All-in-one SEO plugin and this adds a canonical tag to the page pointing to http://www.mywebsite.com/10-pounds-products/ which is the permalink. Is this OK as it is, or should I change it to http://www.mywebsite.com/10-pounds-products/?display=cost&value=10 2. Which URL will get indexed, what gets shown as the display URL in the SERPs, and what page will users land on? I'm a bit confused, so apologies if these seem like silly questions. Thanks

    | bizarro1000
    0

  • We have around 1.3m total pages. Google currently crawls on average 87k a day, and our average page load is 1.7 seconds. Out of those 1.3m pages (1.2m being "spun up"), Google has only indexed around 368k, and our SEO person is telling us that if we speed up the pages they will crawl them more and thus will index more of them. I personally don't believe this. At 87k pages a day Google could have crawled our entire site in 2 weeks, so they should have all of our pages in their DB by now, and I think the pages are not indexed because they are poorly generated, not because of their speed. Am I correct? Would speeding up the pages make Google crawl them faster and thus get more pages indexed?

    | upper2bits
    0

  • We have uniquely created all of our product content on our website (titles, product descriptions, images, etc.). However, we are also a manufacturer of these products and supply a number of trade customers. These customers often wish to set up their own websites to re-sell the products. In the past we have quite happily given them this content in order to assist them selling on their sites. Generally we give them a 'data dump' of our web data and images, but having read about duplicate content, this will lead to the search engines seeing lots of identical content on these customer sites. Whilst we wish to support our customers, we do not want to harm our (and their) sites by issuing lots of duplicate content around the web. Is there a way we can help them with the data without penalizing ourselves? The other issue is that we also take this data feed and use it to sell on both Amazon and Google Base. Will using this identical data also count as duplicate content, as a quick search does show both our website and the Amazon product page? When creating Amazon listings, do these need to vary from the standard website descriptions? Thanks

    | bwfc77
    0

  • Hi, I am seeing a very strange result in Google for my site. When doing a search for the term "london reflexology" my site comes up 18th in the results, but when I click the link or check the URL it shows up as: http://www.reflexologyonline.co.uk/reflexologyonline.php?Action=Webring This is not right at all. It looks like some sort of cloaking, but I am not sure. I am new to SEO and I do not know why Google is showing this URL, which does not exist on my site and of which the content is totally wrong. Can anyone please help with this? See the 2 linked images for more details. It seems to me the site might have been hacked or something to that effect. Please help.... jyJdP.png 71Mf4.png

    | RupDog
    0

  • We have a site using the Joomla CMS, integrated with JReviews and JomSocial, utilizing ACE SEF to generate the dynamic URL structure. Our issue is that we are receiving multiple instances of duplicate URLs and duplicate titles due to the way Joomla is working with JReviews for all our 7,000+ business listings. The site is already ranked for many broad/national keywords, and we are concerned that our state and local rankings are limited by these errors. How can we prevent this from happening without re-writing the entire website?

    | mdmcn
    0

  • Hi mozzers, I was wondering if there's anything out there that would crawl a site and sort your pages by the number of words they have?

    | PeterM22
    0
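    In case it helps, the counting-and-sorting part is easy to script. Here is a minimal standard-library sketch that buckets pages by visible word count; the URLs and markup below are made up for illustration, and you would feed it the HTML your crawler actually fetches:

    ```python
    # Hypothetical sketch: sort pages by visible word count using only the
    # Python standard library. Replace SAMPLE_PAGES with HTML fetched by a
    # crawler (e.g. via urllib); the URLs and markup here are placeholders.
    from html.parser import HTMLParser

    class TextExtractor(HTMLParser):
        """Collects visible text, skipping <script> and <style> blocks."""
        def __init__(self):
            super().__init__()
            self.in_skip = 0       # depth inside script/style tags
            self.chunks = []       # visible text fragments

        def handle_starttag(self, tag, attrs):
            if tag in ("script", "style"):
                self.in_skip += 1

        def handle_endtag(self, tag):
            if tag in ("script", "style") and self.in_skip:
                self.in_skip -= 1

        def handle_data(self, data):
            if not self.in_skip:
                self.chunks.append(data)

    def word_count(html):
        """Return the number of whitespace-separated words of visible text."""
        parser = TextExtractor()
        parser.feed(html)
        return len(" ".join(parser.chunks).split())

    SAMPLE_PAGES = {
        "/thin-page": "<html><body><p>Only five words right here</p></body></html>",
        "/rich-page": "<html><body><p>" + "word " * 300 + "</p></body></html>",
    }

    # Sort URLs from thinnest to richest so thin content is easy to spot.
    by_length = sorted(SAMPLE_PAGES, key=lambda u: word_count(SAMPLE_PAGES[u]))
    for url in by_length:
        print(url, word_count(SAMPLE_PAGES[url]))
    ```

    The same `word_count` function can be pointed at pages exported from any crawl (including a crawl-test CSV of URLs), so the sorting step is independent of which crawler you use.
    
    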

  • I started a web app campaign for a site that I recently finished. It had no errors or warnings, but issued rel=canonical notices for every page on the site. What does this mean?

    | waynekolenchuk
    0

  • Hi all, I'm quite new to the SEO space so I apologise if all the information below isn't technically perfect. I ran the SEOmoz Pro tool for the first time a month ago (fantastic tool). It picked up a wealth of errors on our site that we are now working on. The problem: we use dynamic pages to display job listings pulled from our database, and these have picked up many duplicate page titles and content. For example: Landing page: http://www.arm.co.uk/jobs/it-contract-jobs/sec=itcontractjobs Page 2: http://www.arm.co.uk/jobs/1/-/-/2/itcontractjobs-/9999/2 Page 3: http://www.arm.co.uk/jobs/1/-/-/2/itcontractjobs-/9999/3 Following the results of the Moz tool we have now 'noindexed' and 'nofollowed' the dynamic pages, and the errors have dramatically dropped, great! However, on reflection, we generate quite a lot of traffic to individual jobs listed on our website. By nofollowing the pages we have restricted passing on any 'juice' to these pages, and by noindexing we may be taking them out of Google's index completely. These dynamic pages and individual job listings do generate a lot of traffic to our website via organic search. We do submit the site index to Google, which should index the individual jobs that way. So, the question is (I hope this is making sense): are the gains of reducing errors picked up in the Moz tool (to improve overall site performance) likely to outweigh the loss of the traffic these dynamically generated pages bring in by being indexed and followed by Google? Ultimately we would like the static landing pages to retain a stronger page rank. Any guidance is very much appreciated. Best Regards,
    Sam.

    | ARMofficial
    0

  • Thought I might ask you guys if you have ever seen anything similar, 'cause I sure haven't. 🙂 I have a client who stumbled across a problem with his website links. Google changes them back and forth: one day one of the links will be called "iPhone 4 accessories", then some weeks pass and it changes to "4 accessories". Weeks pass again and then the iPhone is back. First I thought to myself that Google might have expanded the AdWords filter to include website links... but then I remembered that they were ordered by the EU courts to cease that practice, so that can't be it. Plus a lot of his competition doesn't seem to have the same problem. I have checked everything (the links, title tags, page titles, etc.) and I can't really find any reason why this should be happening to him; I must admit I have never seen anything similar. Any hints and pointers would be most welcome 🙂

    | ReneReinholdt
    0

  • To rel-canonical or to 301, that is the question. We're frequently running an A/B split test on our home page to optimize conversion.  As a result about 10,000 backlinks to our homepage point to the B page.  (If we're running a test when a blog or newspaper checks us out, there's a 50% chance they're diverted to the B page. So when they copy our home page URL, they're unknowingly copying the B page link.) We can't contact all of these sites and ask for them to change their links.  A lot of the links are from big organizations that aren't interested in tweaking the links of old articles. So should we rel-canonical or 301 the B page?  We consistently use the same URL for our B page tests, so we'd only have to 'fix' one page. Thanks in advance!

    | JoeNYC
    0

  • If I wanted to get rid of a batch of low-quality pages from the index, is the best practice to let them 404 and remove them from sitemap files? Thanks

    | PeterM22
    0
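    One option besides a plain 404, assuming an Apache host, is to answer 410 Gone for the removed batch, which tends to get URLs dropped from the index a little faster. The path below is a placeholder for wherever the retired pages live:

    ```apache
    # mod_alias: return 410 Gone for everything under a retired section.
    # "/old-stuff/" is a placeholder for the batch of removed URLs.
    RedirectMatch gone ^/old-stuff/
    ```

    Removing the same URLs from the sitemap files at the same time avoids re-submitting dead URLs to the engines.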

  • Do a Google search for: opensiteexplorer . The 2nd (may vary) result is an seoMOZ blog post, "Brand New Open Site Explorer is Here (and Linkscape's Updated, too)". Google is displaying Rand's pic and google profile link in the search result. How? Can't find the Google profile link in the seoMOZ page source.

    | questfore
    0

  • We are thinking of restructuring the URLs on our site and I was wondering if there is a penalty associated with setting up so many 301 re-directs.

    | nicole.healthline
    0

  • Hi guys - I am getting ready to do a complete domain transfer from one domain to another completely different domain for a client due to a branding/name change. Two things - first, I wanted to lay out a summary of my process and see if everyone agrees that it's a good approach, and second, my client is using IIS, so I wanted to see if anyone out there knows a bulk tool that can be used to implement 301s on the hundreds of pages that the site contains? I have found the process to redirect each individual page, but over hundreds it's a daunting task to look at. The nice thing about the domain transfer is that it is going to be a literal 1:1 transfer, with the only things changing being the logo and the name mentions. Everything else is going to stay exactly the same, for the most part. I will use dummy domain names in the explanation to keep things easy to follow: www.old-domain.com and www.new-domain.com. The client's existing home page has a 5/10 GPR, so of course, transferring Mojo is very important. The process: Clean up existing site 404s, duplicate tags and titles, etc. (good time to clean house). Create an identical domain structure tree, changing all URLs (for instance) from www.old-domain.com/freestuff to www.new-domain.com/freestuff. Push several pages to a dev environment to test (dev.new-domain.com). Also, replace all instances of the old brand name (images and text) with the new brand name. Set up 301 redirects (here is where my IIS question comes in below). Each page will be set up to redirect to the new permanent destination with a 301. TEST a few. Choose the lowest-traffic time of week (from analytics data) to make the transfer ALL AT ONCE, including pushing new content live to the server for www.new-domain.com and implementing the 301s. As opposed to moving over parts of the site in chunks, moving the site over in one swoop avoids potential duplicate content issues, since the content on the new domain is essentially exactly the same as the old domain.
Of course, all of the steps so far would apply to the existing sub-domains as well, e.g. video.new-domain.com. Check for errors and problems with resolution issues. Check again. Check again. Write to (as many as possible) link partners and inform them of the new domain, asking for links to be switched (for existing links) and updated (for future links) to the new domain. Even though 301s will redirect link juice, an actual link to the new domain page without the redirect is preferred. Track rank of targeted keywords, overall domain importance and GPR over time to ensure that you re-establish your Mojo quickly. That's it! OK, so everyone, please give me your feedback on that process!! Secondly, as you can see in the middle of that process, the "implement 301s" section seems easier said than done, especially when you are redirecting each page individually (would take days). So, the question here is: does anyone know of a way to implement bulk 301s for each individual page using IIS? From what I understand, in an Apache environment .htaccess can be used, but I really have not been able to find any info regarding how to do this in bulk using IIS. Any help here would be GREATLY APPRECIATED!!

    | Bandicoot
    0
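    On the IIS bulk-301 question: one approach (a sketch, assuming IIS 7 with the URL Rewrite module installed; the paths and domains below are placeholders) is a rewrite map in web.config, so the hundreds of old-to-new pairs live in a single list instead of individual rules:

    ```xml
    <!-- web.config sketch: bulk 301s via a rewrite map (IIS 7 + URL Rewrite
         module). The old/new paths below are placeholders. -->
    <configuration>
      <system.webServer>
        <rewrite>
          <rewriteMaps>
            <rewriteMap name="Redirects">
              <add key="/freestuff/old-page.html" value="http://www.new-domain.com/freestuff/old-page" />
              <add key="/about-old.html" value="http://www.new-domain.com/about" />
            </rewriteMap>
          </rewriteMaps>
          <rules>
            <rule name="Bulk301" stopProcessing="true">
              <match url=".*" />
              <conditions>
                <add input="{Redirects:{REQUEST_URI}}" pattern="(.+)" />
              </conditions>
              <action type="Redirect" url="{C:1}" redirectType="Permanent" />
            </rule>
          </rules>
        </rewrite>
      </system.webServer>
    </configuration>
    ```

    Since the map is plain XML, the full old-to-new URL list can be generated from a spreadsheet rather than entered by hand, which is what makes this workable at hundreds of pages.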

  • Hey all -- I am working with a client, getting ready to make a full domain-level change to a brand new domain. The existing domain has solid domain importance and trust, and the home page has a 5/10 GPR, so the transfer of all existing link juice is very important. Of course, I will be utilizing 301s to permanently redirect all existing pages to their new permanent homes. It will be a 1:1 structure, which I know is also best when possible. My question comes in specific to IIS. There is a wealth of information out there on the net regarding implementing permanent 301s using Apache and .htaccess, but nada when it comes to doing it in IIS 7, which is what the client is using. For instance, today I am seeking to help them redirect 2 single pages to new destinations within the same domain, just different folders. When you open up the IIS 7 Control Panel (yes, with full Admin access), you can navigate to the directory, but the individual pages that I am looking to redirect with 301s do not show in IIS 7, so you can't just right-click on each page and choose "A redirection to a URL," etc. Any help on exactly how to redirect a single page using a permanent 301 in IIS 7 would be huge! Thanks guys!

    | Bandicoot
    0

  • Does anyone have any links to information about external links on a front page? I am advising a client that this is not the best idea and that the links could be put somewhere else, but I can't find any proof of this.

    | marcelo-275398
    0

  • We currently have some 301 redirects set up on our site however sometimes a page will redirect twice before reaching the final location.  Is this OK from an SEO perspective to have a page redirect twice or should we concentrate on reducing it to one?

    | JohnHillman
    0

  • I have a business that sells art. There are 2 distinct types of art on offer. In many cases, someone who is interested in Type A may also be interested in Type B. Most search competitors that sell Type B, only sell Type B. As a result, they have a dedicated site for type B art. Would I be better off separating my business into 2 dedicated websites or keeping them combined? The site currently ranks reasonably well due to content, age and has some decent inbound links. Any help on this would be greatly appreciated. Thanks, Peter.

    | peteraitken
    0

  • We have a specific URL naming convention for 'city landing pages': .com/Burbank-CA, .com/Boston-MA, etc. We use this naming convention almost exclusively as the URLs for links. Our website had a code breakdown and all the URLs within that naming convention led to an error message on the website. Will this impact our links?

    | Storitz
    0

  • Still a bit confused on best practice for /index.php showing up as a duplicate of www.mysite.com. What do I need to do, and how?

    | bozzie311
    0
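    A common fix, assuming an Apache host (www.mysite.com is the placeholder domain from the question), is a 301 from /index.php back to the root so only one version stays in the index:

    ```apache
    # Sketch: 301 direct browser requests for /index.php to the root URL.
    # The THE_REQUEST condition only matches the original client request, so
    # it avoids a loop when the CMS serves the homepage via index.php internally.
    RewriteEngine On
    RewriteCond %{THE_REQUEST} ^[A-Z]+\ /index\.php[?\ ] [NC]
    RewriteRule ^index\.php$ http://www.mysite.com/ [R=301,L]
    ```

    A rel="canonical" tag on the homepage pointing at the root URL is the usual belt-and-braces complement to the redirect.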

  • Ben, I have a follow-up question from our previous discussion at http://www.seomoz.org/qa/discuss/52837/google-analytics To summarize, to implement what we need, we need to do three things: 1) Add GA code to the Darden page: _gaq.push(['_setAccount', 'UA-12345-1']); _gaq.push(['_setAllowLinker', true]); _gaq.push(['_setDomainName', '.darden.virginia.edu']); _gaq.push(['_setAllowHash', false]); _gaq.push(['_trackPageview']); 2) Change links on the Darden page to look like http://www.darden.virginia.edu/web/MBA-for-Executives/ and turn <a href="https://darden-admissions.symplicity.com/applicant">Apply Now</a> into <a href="https://darden-admissions.symplicity.com/applicant" onclick="_gaq.push(['_link', 'https://darden-admissions.symplicity.com/applicant']); return false;">Apply Now</a> 3) Have Symplicity add this code: _gaq.push(['_setAccount', 'UA-12345-1']); _gaq.push(['_setAllowLinker', true]); _gaq.push(['_setDomainName', '.symplicity.com']); _gaq.push(['_setAllowHash', false]); _gaq.push(['_trackPageview']); Due to our CMS system, it does not allow the user to add onclick to the link. So, we CANNOT add part 2). What will be the result if we have only 1) and 3) implemented? Will the data still be fed to GA account 'UA-12345-1'? If not, how can we get cross-domain tracking if we cannot change the link code? Nick

    | Darden
    0

  • Hi guys, I hope somebody can help me figure this out. On one of my sites I set the charset to UTF-8 in the content-type meta tag. The file itself is also UTF-8. If I type German special chars like ä, ö, ß and the like, they get displayed as a tilted square with a question mark inside. If I change the charset to iso-8859-1 they get displayed properly in the browser, but services like Twitter still have the issue and stop "importing" content once they reach one of those special chars. I would like to avoid having to HTML-encode all on-page content, so my preference would be using UTF-8. You can see it in action when you visit this URL for example: http://www.skgbickenbach.de/aktive/1b/artikel/40-minuten-fußball-reichen-nicht_1045?charset=utf-8 Remove the ?charset parameter and the charset is set to iso-8859-1. Hope somebody has an answer or can push me in the right direction. Thanks in advance and have a great day all. Jan

    | jmueller
    0
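    For reference, if the file really is saved as UTF-8, the declaration in the head has to match it, and the server must not send a conflicting Content-Type header; a minimal sketch:

    ```html
    <!-- The declared charset must match the file's actual encoding; a mismatch
         is what produces the "square with a question mark" replacement glyphs. -->
    <meta http-equiv="Content-Type" content="text/html; charset=utf-8" />
    ```

    Note that the HTTP Content-Type header wins over the meta tag, so on Apache a stray `AddDefaultCharset ISO-8859-1` in the server config or .htaccess can silently override the declaration in the page and is worth checking.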

  • Hi, after the latest Google Panda updates in Europe we have lost rankings. What should we do now? Where should we start, and which things should come first and which second? Need help, please tell me how to fix the issues...

    | leadsprofi
    0

  • Hey there.  We've run into a mystifying issue with Google's crawl index of one of our sites.  When we do a "site:www.burlingtonmortgage.biz" search in Google, we're seeing lots of 404 Errors on pages that don't exist on our site or seemingly on the remote server. In the search results, Google is showing nonsensical folders off the root domain and then the actual page is within that non-existent folder. An example: Google shows this in its index of the site (as a 404 Error page):  www.burlingtonmortgage.biz/MQnjO/idaho-mortgage-rates.asp The actual page on the site is: www.burlingtonmortgage.biz/idaho-mortgage-rates.asp Google is showing the folder MQnjO that doesn't exist anywhere on the remote.  Other pages they are showing have different folder names that are just as wacky. We called our hosting company who said the problem isn't coming from them... Has anyone had something like this happen to them? Thanks so much for your insight!
    Megan

    | ILM_Marketing
    0

  • The site is casino.pt. We created the site 7-8 months ago and started to push it with good and natural links (http://www.opensiteexplorer.org/www.casino.pt/a!links!!filter!all!!source!external!!target!page), links on content-rich sites, most of them related to gambling and sport topics. During the first 3-5 months the rankings got better and better; after 6 months the site lost all its rankings. Additional details: http://www.casino.pt/robots.txt http://www.google.pt/#hl=pt-PT&source=hp&biw=1280&bih=805&q=site:http%3A%2F%2Fwww.casino.pt&aq=f&aqi=&aql=&oq=&fp=2651649a33cd228 No critical errors in Google Webmaster Tools. Any idea how I can fix it? Thanks

    | Yaron53
    0

  • All, I operate a Wordpress blog site that focuses on one specific area of the law. Our contributors are attorneys from across the country who write about our niche topic. I've done away with syndicated posts, but we still have numerous articles addressing many of the same issues/topics. In some cases 15 posts might address the same issue. The content isn't duplicate but it is very similar, outlining the same rules of law etc. I've had an SEO I trust tell me I should 301 some of the similar posts to one authoritative post on the subject. Is this a good idea? Would I be better served implementing canonical tags pointing to the "best of breed" on each subject? Or would I be better off being grateful that I receive original content on my niche topic and not doing anything? Would really appreciate some feedback. John

    | JSOC
    0

  • I'm working with a company that has a high PageRank on its main domain and is looking to launch a new business/product offering. They are evaluating either creating a subdomain or launching a brand new domain. In either case, their current site will link contextually to the new site. Is there one method that would be better for SEO than the other? The new business/product is related to the main offering, but may appeal to different/new customers. The new business/product does need its own homepage and will have a different conversion funnel than the existing business.

    | gallantc
    0

  • Say you have CharityName.com. They use a dedicated domain name, CharityNameEvent.com, to advertise their main event. They use this domain on posters, flyers, etc. and want to keep using it because it's easier to remember. CharityNameEvent.com has far, far more inbound links than CharityName.com (about 8 times more). Current problem: their current web developer has put the SAME content on both websites instead of setting up a redirect from CharityNameEvent.com (easy to remember) to CharityName.com/Event, which would have made more sense. My intention is to consolidate the 2 websites and make sure CharityName.com benefits from links to the Event. I plan to move and 301 redirect CharityNameEvent.com to CharityName.com/Event. I know this would keep links and PR intact, but I have a couple of questions: 1. Is it enough to set up the 301 redirect, or would they have to ask websites to ACTUALLY change the links to CharityName.com/Event? 2. They plan and need to keep using CharityNameEvent.com for its ease of use on posters, flyers, etc. The 301 redirect would be in place. Would this cause any problems with search engines, especially when/if some people STILL link to CharityNameEvent.com instead of CharityName.com/Event? Basically, my understanding of 301 redirects is that they're used when a website permanently moves. In this case, the OLD DOMAIN name would still be used for the reasons mentioned above but would be 301 redirected to CharityName.com/Event. Any chance this might not maximise the potential of new/old links? Any other way to go about it? Anything I'm missing with this scenario? Thanks

    | carmenmardiros
    0
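A minimal sketch of the consolidation described above, assuming the event domain is served by Apache with mod_rewrite enabled (the hostnames are the placeholders from the question, not real domains):

```apache
# .htaccess on the host serving CharityNameEvent.com (hypothetical setup).
RewriteEngine On
# Match the event domain, with or without "www".
RewriteCond %{HTTP_HOST} ^(www\.)?charitynameevent\.com$ [NC]
# Permanently redirect every path to the /Event section of the main site.
RewriteRule ^(.*)$ http://www.charityname.com/Event/$1 [R=301,L]
```

With a rule like this in place, the easy-to-remember domain can keep appearing on posters indefinitely; visitors and crawlers alike end up at CharityName.com/Event.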

  • A couple of my sites have recently been hacked, with the hacker managing to overwrite lots of my pages with their own spam products and also adding lots (hundreds) of pages they created themselves. I have rectified this by removing the folders the hacker used to overwrite my pages, so my original pages are now back showing the correct content, and I have also removed all the hundreds of new pages they managed to add. I appreciate that Google will find and re-crawl all my genuine pages, so the correct content will be displayed and indexed. But what is the best method for dealing with the hundreds of extra spam pages that Google managed to crawl but that have now been deleted, leaving loads of 404 "page not found" errors in Google?

    | Wardy
    0
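If the injected pages shared a recognisable URL pattern, one option is to return 410 Gone for them rather than 404, which signals a deliberate, permanent removal. A sketch, assuming Apache; the path patterns here are made up and would need to match the hacker's actual URLs:

```apache
# Hypothetical patterns -- substitute whatever the injected URLs actually looked like.
RedirectMatch gone ^/injected-folder/.*$
RedirectMatch gone ^/.*-spam-product\.html$
```

For the worst offenders, the URL removal tool in Google Webmaster Tools can speed things up; otherwise the 404s/410s simply drop out of the index over time.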

  • When I view source and fetch as Googlebot, it shows as one long page of content = good. However, the developer uses some redirects and dynamic page generation to pull this off. I didn't see any issues from a search perspective but would appreciate a second opinion: Click here Thanks!

    | 540SEO
    0

  • I can't blanket 301 https:// to http:// since there are some form pages that need to stay https://. The site has 20,000+ pages, so individually 301ing each page would be a nightmare. Any suggestions would be greatly appreciated.

    | fthead9
    0
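One blanket approach, sketched for Apache mod_rewrite, is a single rule that excludes the pages that must stay secure (the /forms/ path here is an assumption; substitute the real secure paths):

```apache
RewriteEngine On
# Only rewrite requests that arrived over HTTPS...
RewriteCond %{HTTPS} on
# ...and are not under the secure-form path (placeholder -- adjust to the real one).
RewriteCond %{REQUEST_URI} !^/forms/
# 301 everything else to its http:// equivalent in one rule.
RewriteRule ^(.*)$ http://%{HTTP_HOST}/$1 [R=301,L]
```

This way no per-page redirects are needed, regardless of how many pages the site has.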

  • Hi, I have a super spiffy (not) CMS that tends to create some pages at the root level of the site (not where I want them), i.e. www.site.com/page.htm, as well as at the desired location, i.e. www.site.com/category/keyword/page.htm. Now obviously a canonical tag inserted into the page at the undesired location would be the best option; however, the source code is exactly the same for both pages (can't change this), i.e. if I put in a canonical tag that reads <link rel="canonical" href="http://www.site.com/category/keyword/page.htm" />, it will appear in the head section of both pages, the desired URL and the undesired URL. Will a canonical tag in the head section of the preferred URL, pointing the search engine spiders pretty much to itself, cause more grief than the duplicate-content problem it solves? Marc

    | NRMA
    0
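For reference, a self-referencing canonical of the kind described is generally considered harmless. The identical tag would sit in the head of both copies (URL taken from the question):

```html
<head>
  <!-- On the duplicate at www.site.com/page.htm this consolidates signals to
       the preferred URL; on the preferred page itself it is simply a harmless
       self-reference. -->
  <link rel="canonical" href="http://www.site.com/category/keyword/page.htm" />
</head>
```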

  • Hi, I just got a message from Google Webmaster Tools telling me that there are "artificial or unnatural links" pointing to one of my subdomains, and that I should investigate and submit my site for reconsideration. The subdomain in question has inbound links from 4K linking root domains. We are a certificate authority (we provide SSL certificates), so the majority of those links come from the site seal that customers place on their secure pages. We sell certificates to a full spectrum of site types, from all sizes of ecommerce sites to .edu, .gov, and even adult. That said, our linking root domains have always been a mixed bunch, which tells me that these offending links were recently added. Here are my questions:

    1. Is it possible to slice my link reports with some sort of time element, so that I can narrow the search to only the newest inbound links?
    2. How else might I use OSE to find these "artificial or unnatural links"?
    3. Are there any particular attributes I should be looking for in a linking root domain that might suggest it's seen by Google as "artificial or unnatural"?

    Any help with any aspect of this issue would be greatly appreciated. Thanks, Dennis p.s. I should probably state that I've never bought links or participated in link schemes.

    | dennis.globalsign
    0
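On the first question: if the link report can be exported to CSV with a discovery date per link, filtering by date is straightforward. A minimal Python sketch, where the column names ("URL", "First Seen") and the rows are assumptions, not the actual OSE export format:

```python
import csv
from datetime import date
from io import StringIO

# Hypothetical export: an OSE-style CSV of inbound links. The header names
# here are assumptions -- adjust them to the real export's column headers.
sample = StringIO(
    "URL,First Seen\n"
    "http://old-customer.example/seal,2009-03-14\n"
    "http://spammy.example/links.html,2012-04-02\n"
)

cutoff = date(2012, 1, 1)  # keep only links first seen after this date
recent = [
    row["URL"]
    for row in csv.DictReader(sample)
    if date.fromisoformat(row["First Seen"]) > cutoff
]
print(recent)  # -> ['http://spammy.example/links.html']
```

Sorting the same export by the date column, newest first, gives a quick manual view of what changed recently.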

  • Hi guys, The widely followed SEO best practice is that 301 redirects should be used instead of 302 redirects when a permanent redirect is required. Matt Cutts said last year that 302 redirects should "only" be used for temporary redirects: http://www.seomoz.org/blog/whiteboard-interview-googles-matt-cutts-on-redirects-trust-more For a site that I am looking at, the SEOmoz Crawl Diagnostics tool lists as an issue that the URL / redirects to www.abc.com/Pages/default.aspx with a 302 redirect. On further searching, I found on a Google support forum (http://www.google.com/support/forum/p/Webmasters/thread?tid=276539078ba67f48&hl=en) that a Google employee had said, "For what it's worth, a 302 redirect is the correct redirect from a root URL to a detail page (such as from "/" to "/sites/bursa/"). This is one of the few situations where a 302 redirect is preferred over a 301 redirect." Can anyone confirm that "a 302 redirect is the correct redirect from a root URL to a detail page"? And if so, why? I haven't found an explanation. If it is the correct best practice, then should redirects of this nature stop being flagged as issues in the SEOmoz Crawl Diagnostics tool? Thanks for your help

    | CPU
    0

  • One of the most common warnings on our site www.sta.co.uk is the use of parameters in URL strings (they're crawled OK; it's mainly duplicate-content issues we're trying to avoid). The current traffic manager suggested 'stage 1': remove the unwanted folder structure but don't tailor the dynamic URL. I'd say it is difficult to quantify what result this would have in isolation, and I would rather do this update in tandem with 'stage 2', which adds structure to the dynamic URLs with multiple parameters. (Both stages will involve rewriting the page URL and redirecting the long URL to the short one.) Any thoughts, please? Is there any benefit in removing the subfolders (stage 1) now, or should we wait and do it all in one go? Thanks everyone

    | Diana.varbanescu
    0
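As an illustration of the rewrite-and-redirect step described above, sketched for Apache mod_rewrite with invented names (page.php, cat, id -- the real URLs and parameters on sta.co.uk will differ):

```apache
RewriteEngine On
# 1) 301 old dynamic URLs to the clean form. Matching THE_REQUEST (the raw
#    client request line) avoids a redirect loop with the internal rewrite below.
RewriteCond %{THE_REQUEST} \s/page\.php\?cat=([a-z]+)&id=([0-9]+)\s [NC]
RewriteRule ^page\.php$ /%1/%2/? [R=301,L]
# 2) Internally map the clean URL back onto the script that serves the content.
RewriteRule ^([a-z]+)/([0-9]+)/$ page.php?cat=$1&id=$2 [L]
```

One argument for doing both stages at once: each long URL then hops through a single 301 to its final short form, rather than being redirected twice.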
