Category: Technical SEO
Discuss site health, structure, and other technical SEO strategies.
-
Expired domain 404 crawl error
I recently purchased an expired domain at auction, and after I started my new site on it, I am noticing 500+ "not found" errors in Google Webmaster Tools, which are generated from the previous owner's content. Should I use a redirection plugin to redirect those nonexistent posts to new post(s) on my site? Should I use 301 redirects? Or should I leave them as they are without taking further action? Please advise.
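For reference, if a few of the old URLs have close equivalents on the new site, a handful of 301 rules in .htaccess cover them (a sketch with made-up paths — a redirection plugin does the same thing under the hood):

```apache
# Sketch: map high-value legacy URLs to relevant new pages (paths are hypothetical)
Redirect 301 /old-category/popular-post.html /blog/related-new-post/
Redirect 301 /old-about.html /about/
# Old URLs with no good equivalent can simply be left to 404 (or serve a 410)
```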
| Taswirh1 -
What are some best practices for optimizing alternate versions of a brand name?
What are the best methods for ensuring that the correct spelling/formatting of a brand name ranks in the SERP when an alternate formatting/spelling of the brand name is searched? Take for example the brand name (made up for example purposes) "SuperFry". Many customers search using the term "Super Fry" (with a space). To make things worse, not only does Google not return the brand name SuperFry, but it also auto-corrects to another brand name, "Super-Fri". Is there a common best practice to ensure the customer finds the intended brand name when they simply add a space to the search term? I assume a quick fix would be to create an AdWords campaign for the alternate spellings/formattings. What about an organic solution? Perhaps we could create a special page talking about the alternate ways to spell the brand name? Would this solution send mixed signals to Google and potentially hurt the overall rankings? Thanks much for any advice!
| Vspeed0 -
How much will changing IP addresses impact SEO?
So my company is upgrading its Internet bandwidth. However, apparently the vendor has said that part of the upgrade will involve changing our IP address. I've found two links that indicate some care needs to be taken to make sure our SEO isn't harmed: http://followmattcutts.com/2011/07/21/protect-your-seo-when-changing-ip-address-and-server/ http://www.v7n.com/forums/google-forum/275513-changing-ip-affect-seo.html Assuming we don't use an IP address that has been blacklisted by Google for spamming or other black hat tactics, how problematic is it? (Note: The site hasn't really been aggressively optimized yet - I started with the company less than two weeks ago, and just barely got FTP and CMS access yesterday - so honestly I'm not too worried about really messing up the site's optimization, since there isn't a lot to really break.)
| ufmedia0 -
Webmaster Crawl errors caused by Joomla menu structure.
Webmaster Tools is reporting crawl errors for pages that do not exist due to how my Joomla menu system works. For example, I have a menu item named "Service Area" that stores 3 sub-items but has no actual page for Service Area. This results in a URL like domainDOTcom/service-area/service-page.html. Because the Service Area menu item is constructed in a way that shows the bot it is a link, I am getting a 404 error saying it can't find domainDOTcom/service-area/ (the link is to "javascript:;"). Note, the error doesn't say domainDOTcom/service-area/javascript:; — it just says /service-area/. What is the best way to handle this? Can I do something in robots.txt to tell the bot that /service-area/ should be ignored but any page after /service-area/ is good to go? Should I just mark them as fixed, as it's really not a 404 a human will encounter, or is it best to somehow explain this to the bot? I was advised on the Google forums to try this, but I'm nervous about it:
Disallow: /service-area/*
Allow: /service-area/summerlin-pool-service
Allow: /service-area/north-las-vegas
Allow: /service-area/centennial-hills-pool-service
I tried a 301 redirect of /service-area to the home page, but then it pulls that out of the URL and my landing pages become 404s. http://www.lvpoolcleaners.com/ Thanks for any advice! Derrick
| dwallner0 -
Odd scenario: subdomain not indexed nor cached, reason?
Hi all, hopefully somebody can help me with this issue 🙂 Six months ago a number of pages hosted at the domain level were moved to a subdomain level with 301 redirects, and some others were created from scratch (at the subdomain level too). What happens is that not only are the new URLs at the subdomain level not indexed nor cached, but the old URLs are still indexed in Google, although clicking on them brings you to the new URLs via the 301 redirect. The question is: with 301 redirects to the new URLs and no issues with robots.txt, meta robots, etc., why are the new URLs still de-indexed? I might remind you that a few (100 pages or so) were created from scratch, but they are also not indexed. The only issue found across the pages is the no-cache line of code, set as follows: Cache-Control: no-store, no-cache, must-revalidate, post-check=0, pre-check=0 Pragma: no-cache. I am not familiar with cache-control lines. Could this be preventing correct indexing? Thanks in advance, Dario
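Worth noting: Cache-Control/Pragma headers tell browsers and proxies not to cache, and are unlikely on their own to stop Google indexing a page — but if you want to drop them anyway, something like this in Apache config would do it (a sketch; assumes the site runs Apache with mod_headers available):

```apache
<IfModule mod_headers.c>
  # Remove the aggressive no-cache directives and allow normal caching
  Header unset Pragma
  Header set Cache-Control "public, max-age=3600"
</IfModule>
```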
| Mrlocicero0 -
Redirecting a single page on a separate domain to a new site?
My client started a subdivision of their company, along with a new website. There was already an individual page about the new product/topic on the main site, but recognizing a growth area they wanted to devote an entire site to the product/topic. Can we/should we redirect that page on the old corporate/main site to the new domain, or just place a link or two? Thoughts?
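If you do decide to redirect, a single-page 301 in the main site's .htaccess is enough (hypothetical URLs — a link or two from related pages on the main site can be kept as well):

```apache
# Send the old product/topic page on the corporate site to the new dedicated domain
Redirect 301 /products/new-product-page https://www.newproductsite.com/
```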
| VTDesignWorks0 -
International Config in a WP Multisite environment: GWT, Yoast, Hreflang, SiteMaps etc ?
Hi, if setting up in a WP Multisite environment using the Yoast SEO plugin, how should you: 1) Geotarget in GWT — set up a profile for the different TLDs OR for the network.domain.com subfolders? 2) Set up hreflang sitemaps — can Yoast handle this, or does anything need to be done manually, such as disabling Yoast and creating a bespoke sitemap with hreflang? Would this plugin help: https://wordpress.org/plugins/language-selector-related/ 3) Any other ideas or recommendations for setting up geotargeting correctly using WP Multisite? Thanks, Dan
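For reference, a bespoke hreflang sitemap entry follows the documented sitemap format, with each URL listing all of its alternates, including itself (hreflang values and URLs here are assumptions based on the subfolders mentioned):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>http://network.domain.com/uk/</loc>
    <xhtml:link rel="alternate" hreflang="en-GB" href="http://network.domain.com/uk/"/>
    <xhtml:link rel="alternate" hreflang="tr" href="http://network.domain.com/tr/"/>
  </url>
</urlset>
```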
| Dan-Lawrence0 -
Redirecting Canonical Hostnames
Hi, I want to rewrite all the url pages of "site.com" to "www.site.com". I read the moz redirection article and i concluded that this would be the best approach. RewriteCond %{HTTP_HOST} !^www.seomoz.org [NC]
| bigrat95
RewriteRule (.*) http://www.seomoz.org/$1 [L,R=301]. But i recieved this error: Internal Server Error The server encountered an internal error or misconfiguration and was unable to complete your request. Please contact the server administrator, webmaster@localhost and inform them of the time the error occurred, and anything you might have done that may have caused the error. More information about this error may be available in the server error log. I tried this rewrite too... RewriteCond %{HTTP_HOST} !^www. [NC]
RewriteRule ^(.*)$ http://www.%{HTTP_HOST}/$1 [L,R=301] It worked but it just rewriting my domain** "site.com"** and not all the subs "site.com/fr/example.php" to "www.site.com" Why it doesn't work properly, it seem to be easy... Could it be a hosting problem? Is there another way to do it? <address> </address> <address> </address> <address> </address> <address> </address>0 -
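For comparison, a sketch that canonicalizes the hostname for all paths, including subfolders (replace the domain with your own — the escaped dots and the capture group are the parts that commonly go wrong):

```apache
RewriteEngine On
# Match only the bare hostname, then redirect every path to the www version
RewriteCond %{HTTP_HOST} ^site\.com$ [NC]
RewriteRule ^(.*)$ http://www.site.com/$1 [R=301,L]
```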
Rel= Canonical
Almost every one of my product pages has this message: Rel Canonical (using rel=canonical suggests to search engines which URL should be seen as canonical). What is the best way to correct this?
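For reference, the tag itself is a single link element in the head of each product page, pointing at whichever URL you want treated as the canonical version (hypothetical URL — the notice generally only needs action if the wrong URL is marked canonical, or if duplicate product URLs carry no tag at all):

```html
<head>
  <link rel="canonical" href="https://www.example.com/products/blue-widget/">
</head>
```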
| tiffany11030 -
Domain Mapping - International Config - WP Multisite - Problems/issues experienced ?
Hi, my new client's devs are setting the client up according to the domain-mapping-type config below. Is anyone aware of any problems this could cause, or another way it should be set up? Also, will setting up an MA campaign for the root domain 'catch all' the language subfolders, given the domain mapping aspect? Cheers, Dan

Set up now using WP Multisite. The root domain for the network of websites we are going to roll out is set up as "network.domain.com". This is a "parent" domain from which all language variants will be children of this root domain name, so:
"network.domain.com/uk/" - English Language
"network.domain.com/tr/" - Turkish Language
"network.domain.com/za/" - South African
etc. I then will domain map domain names to each of these children, once I get DNS control of each language's settings. I have already mapped "www.domain.com" to "network.domain.com/uk/", so the English website is set up (and launched - as you know). I fully expect that "www.domain.tr" will be mapped to "network.domain.com/tr/" and so on, but depending on domain availability at the time of purchase. Any domain name can be mapped to each of these children, and the system doesn't care, or mind! I can also map one (or more) domain names to each child, and make ONE of them the primary domain name.
| Dan-Lawrence0 -
One page of the site has disappeared from the SERPs for a month now
I'm working on a client's site and have been promoting a specific page for a keyword. It started to move up the ranks, and exactly a month ago, on 19/5 (the same day as the last update), I updated the main page I'm working on with new content and published some other new pages on related subjects that all link to the main page (without using the same anchor text in the links). On the same day I found out that, because of a technical error, the new content was published on 5 other pages of the site, which obviously created a duplicate content issue; I removed all the duplicates that same day. I assume G caught this and punished the site for the duplicate content issue, but: when I search for the page directly with site:... I can find it. It's been a month since I fixed all the issues that I thought could impact the page: no duplicate content on the site, no KW stuffing, no spammy links to the page. Everything seems fine now. My questions: why is my page not showing? How long should I wait before giving up and creating a new page? How come my site has not lost any organic traffic (apart from that specific page)? Is it possible for only one page to be penalized? Can I recover from this at all? Thanks
| nira0 -
Is there a maximum sitemap size?
Hi all, over the last month we've included all images, videos, etc. in our sitemap and now its loading time is rather high. (http://www.troteclaser.com/sitemap.xml) Is there a maximum sitemap size recommended by Google?
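For reference, the sitemaps protocol does set hard limits — 50,000 URLs per sitemap file and a cap on uncompressed file size (currently 50MB in the spec) — and the usual fix for a heavy sitemap is to split it by content type and list the pieces in a sitemap index, roughly like this (file names here are hypothetical):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap><loc>http://www.troteclaser.com/sitemap-pages.xml</loc></sitemap>
  <sitemap><loc>http://www.troteclaser.com/sitemap-images.xml</loc></sitemap>
  <sitemap><loc>http://www.troteclaser.com/sitemap-videos.xml</loc></sitemap>
</sitemapindex>
```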
| Troteclaser0 -
SEO: open source e-commerce vs. off the shelf
I'm trying to decide on a web development company. Both would build our CMS with WordPress, but for the e-commerce platform one would use Shopify (off the shelf) and the other would use WooCommerce (open source). SEO-wise, is there any benefit to going one way or the other? I am worried the off-the-shelf option (Shopify) would be weaker because it would be hosted on a different server than the CMS, plus I think the URL structure would be less flexible through Shopify (keywords would be further down the URL structure). Thanks, Dan
| dcostigan0 -
Google Publisher status
Hi all, I just wondered what the general opinion was with regard to getting Google Publisher status for medium to large organisations. Lots of our clients write a lot of articles & publications, and it would be interesting to get some thoughts on how others view Authorship and in particular Publisher credentials. Thanks!
| davidmaxwell0 -
Product Documentation Causing 23-40K issues
One of my biggest hurdles at my company is our Product Documentation library, which houses thousands of pages of publicly accessible and indexed content on old and new versions of our product. Every time a product name changes the URL changes, causing a 404, so I typically have 100s of 404s every few months from this site. It's housed off our main domain. We have 23,000+ Duplicate Pages, 40,000 missing meta descriptions, and 38,000 due to this library. It is not built the same as our main content, with page titles and meta descriptions, so everything is defaulted and duplicate. I'm trying to make a case that this is an issue, especially as we migrate our site next year to a new CMS. Does anyone have any suggestions for dealing with this issue in the short term and long term? Is it worth asking the owners of the section of content to develop page titles and meta descriptions on 40,000 pieces of content? They do not see the value of SEO and the issues this can cause. It needs to be publicly accessible, but it's not highly ranked content. It's really for customers who want to know more about the product. But I worry it is hurting other parts of our site, with the absurd amount of duplicate content, meta, and page title issues.
| QAD_ERP0 -
Redirecting pages from a website to another
Hello Moz community, I’ve got a question and hope you can help! I’ve been working to improve my website’s ranking for the keywords “singing lessons London”. My current website url is http://www.sonic-crew-london.com and the page dedicated to the singing lessons is http://www.sonic-crew-london.com/booking/singinglessons.php I’ve recently bought the url http://www.singing-lessons-london.com which I hope will help to climb Google’s ranks a bit more easily for my chosen keywords. I thought I could redirect the old singing page to the new url. Is that something you would recommend me to do? Is there any specific procedure I should follow to make sure the transition runs smoothly? Any help really appreciated! Many thanks
| SonicCrewLondon0 -
Impact of changing title and description.
When a site doesn't rank for its keywords, is it advisable to keep changing the title, description, and other on-page factors of a page (say, the home page) until it ranks? Will that lead to improvement, or could it count against the site?
| Somanathan0 -
To 301 or not to 301?
I have a client that is having a new site built. Their old site (WP) does not use the trailing / at the end of URLs. The new site is using most of the same URL names but IS using the /. For instance, the old site would be www.example.com/products and the new site, also WP, will be www.example.com/products/. WordPress will resolve either way, but my question is whether or not to go in and redirect each matching non-slash page to the new URL that has the /. I don't want to leave any link juice on the table, but if I can keep the juice without doing a few hundred 301s, that certainly wouldn't suck. Any thoughts? Sleepless in KVegas
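For what it's worth, WordPress's own canonical redirect will often add the slash itself once the new permalink structure ends in one; if it doesn't, a single pattern rule in .htaccess avoids a few hundred one-off 301s (a sketch — worth testing on staging first):

```apache
RewriteEngine On
# Skip real files (css, images, etc.); everything else gets a trailing slash
RewriteCond %{REQUEST_FILENAME} !-f
RewriteRule ^(.*[^/])$ /$1/ [R=301,L]
```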
| seorocket0 -
International architecture: Country specific subfolders > domain mapping to tld
Hi, I've got a client's dev saying they are setting up with country/language-specific subfolders (as I recommended), BUT now they are saying they want to set up on network.domain.com (for example), and then each language will have its own subfolder BUT will be domain mapped to the TLD as and when they get them. I have asked them to clarify, since this sounds a bit strange: I thought it best to have domain.com, then /uk and /us etc., and surely it's OK to forward country-specific TLDs to these subfolders. It's this new subdomain (network.) that's concerning me, and mapping rather than forwarding (or is it the same thing?). Does anyone know offhand if the above sounds OK, or also think it's a bit strange, or know of issues with such a setup? Many thanks, Dan
| Dan-Lawrence0 -
How to handle (internal) search result pages?
Hi Mozzers, I'm not quite sure what the best way is to handle internal search pages. In this case it's for an ecommerce website with about 8,000+ products, and search pages currently look like: example.com/search.php?search=QUERY+HERE. I'm leaning towards making them follow, noindex, since pages like this can easily be abused for duplicate content and because I'd rather have the category pages ranked. How would you handle this?
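For reference, follow, noindex is a per-page meta tag emitted by the search results template (search.php here), not a robots.txt rule — robots.txt would stop the pages being crawled, but not necessarily keep already-discovered URLs out of the index:

```html
<meta name="robots" content="noindex, follow">
```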
| Qon0 -
My website is not avaliable, will i lose ranking?
My website was not available for 12 hours, and I think I will lose rankings because of it. What do you think? Will I lose rankings? Some URLs were lost during the server outage — what should I do? Create them again? Delete them in GWT? Thanks so much.
| pompero990 -
How to change the WooCommerce product page permalink
How can I change the product URL structure? Please let me know how to fix the WooCommerce permalinks in WordPress. My current URL is http://www.ayurjeewan.com/product/divya-ashmarihar-kwath and I want it to be (post name only) http://www.ayurjeewan.com/divya-ashmarihar-kwath. Attached is a screenshot of the options available. qa2hZMP.jpg
| JordanBrown0 -
Hello any body abouth google disllowtools
| jivkojelazkov0 -
2 sites versus a subdomain: Which is better?
I have a client that sponsors a couple of events during the year. They currently have pages within a single website for these events but are interested in creating a separate website so they can brand the events differently. I'm not sure this is the most effective way to do it, for fear of losing the "Google juice" already there for these pages. Here's what I'm thinking is a better strategy: 1) Host the content both on the main domain and the subdomain. 2) Make sure there is a canonical tag on each page of the subdomain version that points to the main version. That will give them the branding they are seeking while pushing all juice across to the main domain. What are your thoughts?
| Britewave0 -
Mobile & desktop pages
I have a mobile site (m.example.com) and a desktop site (example.com). I want search engines to know that for every desktop page there is a mobile equivalent. To do this I insert a rel=alternate on the desktop pages pointing to the mobile equivalent. On the mobile pages I insert a rel=canonical to the equivalent desktop page. So far so good, BUT: almost every desktop page has 4 or 5 copies (duplicate content). I get rid of this issue by using rel=canonical to the source page. Still no problem here. But what happens if I insert a rel=alternate to the mobile equivalent on every copy of the source page? I know it sounds stupid, but the system doesn't allow me to insert a rel=alternate on just one page. It's all or nothing! My question: does Google ignore the rel=alternate on the duplicate pages but keep understanding the link between the desktop source page & mobile page? Or should I avoid this scenario? Many thanks, Pieter
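For reference, the documented annotation pair for separate mobile URLs looks like this (hypothetical page paths):

```html
<!-- On the desktop page, http://example.com/page-1 -->
<link rel="alternate"
      media="only screen and (max-width: 640px)"
      href="http://m.example.com/page-1">

<!-- On the mobile page, http://m.example.com/page-1 -->
<link rel="canonical" href="http://example.com/page-1">
```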
| Humix0 -
Google Blogger Integration — any SEO benefit?
I want to use Google Blogger for my website blog. We want to integrate it into the existing website (which is why WordPress isn't ideal, as there is no PHP). If the domain were www.blog.mysite.com, will www.mysite.com still receive the SEO bonuses of having a blog, or does it have to be www.mysite.com/blog? Otherwise I have little use for a blog. What I also want to ask is: how does it integrate? I don't want the blog on my site to be like a 'window' into the Google blog site, as that defeats the purpose — I need the content to actually be on my site to ensure the SEO benefits. Thanks, Dan
| DanBrayfield0 -
How long before I can use a redirected domain without taking back link juice?
We recently moved our website to a new domain that better matched our brand. I want to use the old domain at some point for another aspect of our business. How long after we do the domain redirect will it be safe to use the old domain again--without affecting the seo of the new domain? Thanks! Harriet
| zharriet0 -
Is this duplicate content?
All the pages have the same information, but the content is a little bit different. Is this low quality and considered duplicate content? I am only trying to make service pages for each city — is there any other way of doing this?
http://www.progressivehealthofpa.com/brain-injury-rehabilitation-pennsylvania/
http://www.progressivehealthofpa.com/brain-injury-rehabilitation-new-york/
http://www.progressivehealthofpa.com/brain-injury-rehabilitation-new-jersey/
http://www.progressivehealthofpa.com/brain-injury-rehabilitation-connecticut/
http://www.progressivehealthofpa.com/brain-injury-rehabilitation-maryland/
http://www.progressivehealthofpa.com/brain-injury-rehabilitation-massachusetts/
http://www.progressivehealthofpa.com/brain-injury-rehabilitation-philadelphia/
http://www.progressivehealthofpa.com/brain-injury-rehabilitation-new-york-city/
http://www.progressivehealthofpa.com/brain-injury-rehabilitation-baltimore/
http://www.progressivehealthofpa.com/brain-injury-rehabilitation-boston/
| JordanBrown0 -
Problem with Wordpress RSS feed and Feedburner
Just discovered a problem with my company site's RSS feed. I'm a bit embarrassed to ask, but I thought someone in the community might have encountered this -- and I cannot figure it out for the life of me! We had redirected our Wordpress feed to Feedburner. We publish at least once per week, but no posts after March 18 are in the feed: http://feeds.feedburner.com/TheClineGroup The standard (Wordpress) RSS feed page does not load: http://theclinegroup.com/feed/ Of course, I deactivated all plug-ins to see if one of them was the issue, but the problem(s) still existed. Thanks so much for any assistance!
| SamuelScott0 -
.htaccess redirect question
Hi guys and girls, please forgive me for being an Apache noob, but I've been trawling for a while now and I can't seem to find a definitive guide for my current scenario. I've walked into a bit of a cluster$%*! of a job, to rescue a horribly set up site. One of many, many problems is that they have 132 302 redirects set up. Some of these are identical pages redirected http to https, others are the same but https to http, and some are redirects to different content pages, http to http. A uniform redirect of http to https is not an option, so I'm looking to find out the best practice for reconfiguring these 302s to 301s within .htaccess. Thanks in advance 🙂
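Converting a 302 to a 301 is usually just a flag change; what the before/after looks like depends on how each rule was written, e.g. (hypothetical paths):

```apache
# Before - temporary, passes no permanence signal:
Redirect 302 /old-page https://www.example.com/new-page
# After - permanent:
Redirect 301 /old-page https://www.example.com/new-page

# For rules written with mod_rewrite, change R=302 (or bare R) to R=301:
RewriteRule ^secure/(.*)$ https://www.example.com/secure/$1 [R=301,L]
```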
| craig.gto0 -
How can I fix this home page crawl error ?
My website shows this crawl error => 612 : Home page banned by error response for robots.txt. I also did not get any page data in my account for this website ... I did get keyword rankings and traffic data, I am guessing from the analytics account. url = www.mississaugakids.com Not sure really what to do with this ! Any help is greatly appreciated.
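A 612 typically means the crawler's request for robots.txt came back with an error response (e.g. a 5xx) rather than a 200, so the whole site is treated as off-limits — worth confirming the file loads cleanly in a browser. A minimal permissive robots.txt is just:

```text
User-agent: *
Disallow:
```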
| jlane90 -
One-page templates
Hi, I plan to implement a one-page template (http://shapebootstrap.net/wp-content/plugins/shapebootstrap/demo.php?id=380). Does anyone have experience with whether this type of page has SEO problems? Best regards
| komir20040 -
When to file a Reconsideration Request
Hi all, I don't have any manual penalties from Google, but I do have an unnatural links message from them back in 2012. We have removed some of the spammy links over the last 2 years, but we're now making a further effort and will use the disavow tool once we've done this. Will this be enough once I submit the file, or should I / can I submit a Reconsideration Request as well? Do I have to have a manual penalty item in my Webmaster account to be able to submit a request? Thanks everyone!
| KerryK0 -
Using the Google Remove URL Tool to remove https pages
I have found a way to get a list of 'some' of my 180,000+ garbage URLs now, and I'm going through the tedious task of using the URL removal tool to put them in one at a time. Between that and my robots.txt file and the URL Parameters, I'm hoping to see some change each week. I have noticed that when I put URLs starting with https:// into the removal tool, it adds the http:// main URL at the front. For example, I add to the removal tool: https://www.mydomain.com/blah.html?search_garbage_url_addition On the confirmation page, the URL actually shows as: http://www.mydomain.com/https://www.mydomain.com/blah.html?search_garbage_url_addition I don't want to accidentally remove my main URL or cause problems. Is this the right way this should look? AND PART 2 OF MY QUESTION If you see the search description in Google for a page you want removed that says the following in the SERP results, should I still go to the trouble of putting in the removal request? www.domain.com/url.html?xsearch_... A description for this result is not available because of this site's robots.txt – learn more.
| sparrowdog1 -
Then why is my site not ranking?
My website's DA and PA are good compared with my competitors'. Then why is my site not ranking?
| Somanathan0 -
What does Google PageSpeed measure?
What does the PageSpeed tool actually measure? Does it say whether a webserver is fast or slow? Thanks in advance!
| DanielMulderNL0 -
Google Indexing Development Site Despite Robots.txt Block
Hi, a development site that has been set up has the following robots.txt file: User-agent: * Disallow: / This was an attempt to block Google from indexing the site; however, the development site has since been indexed anyway. Any clues why this is, or what I could do to resolve it? Thanks!
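Worth noting: robots.txt blocks crawling, not indexing — URLs Google discovers via links can still appear in the index without content. For a dev site, password protection or a noindex signal is more reliable; e.g. a server-wide header (a sketch, assuming Apache with mod_headers — and the robots.txt block would need lifting so Googlebot can actually fetch pages and see the header):

```apache
<IfModule mod_headers.c>
  # Tell search engines not to index anything served by this (dev) vhost
  Header set X-Robots-Tag "noindex, nofollow"
</IfModule>
```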
| CarlWint0 -
How is this site ranking so well? Their link profile is awful and website is messy and difficult to use?
Hi folks, This question has been baffling me for some time now and I'm still struggling to get to the bottom of it. www.sterlingbuild.co.uk is the website of choice for Google when it comes to searches relating to roof windows, velux windows, fakro windows etc. I can't understand why? Their link profile is atrocious. I'm struggling to find one 'high quality' link in their profile at all. Most of their links are guest blog posts which Google is apparently now treating as spam, or links from other sites that they own - also spam. The design of the site is incredibly messy and confusing. But one of the biggest flaws of the site (which I am suspicious may also be what is helping them) is they list every single different size of window as a different product. So whereas with most websites in this market, you search for the type of window you want e.g. a VELUX GGL 3050 window, and then choose the size you need from a drop-down menu, Sterlingbuild list every size as a different product. So you have to scroll through reams of product listings to find the window type in the right size before you get to any information about the product itself. Not to mention, their site is riddled with duplicate content because 12 different sizes of product are not different products, they are the same product, just a different size, so they have the identical product description for numerous separate pages basically selling the same product. How on earth has Google decided this is the best website in the marketplace when it comes to roof windows?
| LukeyB301 -
Sitemaps: Good Image And Video Sitemap Generators
Hello, we are trying to update our sitemap. We have currently updated our XML and HTML sitemaps, but we would also like to have an image and a video sitemap. Can anyone recommend a good image and video sitemap generator? Kind regards, Deeana
| SolopressPrint0 -
Ignore these external links reported in GWT?
Taking a long, Ace-Ventura-like breath here. This question is loaded. Here we go: No manual actions are reported against my client's site in GWT. HOWEVER, a link: operator search for external links to my client's website shows NO links in the results. That seems like a very bad omen to me. OSE shows 13 linking domains, but not the one that's listed in the next bullet point. Issue: in Google Webmaster Tools, I noticed 1,000+ external links to my client's website all coming from riddim-donmagazine.com (there is a small handful of other domains listed, but this one stuck out for the large quantity of links coming from it). Those external links all point to two URLs on my client's website. I have no knowledge of any campaigns run by the client that would use this other domain (or any schemes, for that matter). It appears that this website, riddim-donmagazine.com, has been suspended by HostGator. All of the links were first discovered last year (dates vary, but basically August through December 2013). There have not been any newly discovered links from this website reported by GWT since those 2013 dates. All of the external links are /?-based. Example: http://riddim-donmagazine.com/?author=1&paged=31. If I run that link (or any others from riddim-donmagazine.com) through http://www.webconfs.com/http-header-check.php, those external links return a 302 status. My best guess is that at one time the client was running an advertising program and this website may have been on that network. One of the external links points to an ad page on the client's website. (web.archive.org confirms this is a WordPress site and that its coverage of Bronx news could trigger an ad for my client or make it related to my client's website when it comes to demographics.) Believe me, this externally linked domain is only a small problem in comparison with the rest of my client's issues (mainly, they've changed domains, then they changed website vendors, etc., etc.), but I did want to ask about that one externally linked domain. Whew. Thanks in advance for insights/thoughts/advice!
| EEE30 -
Using the same domain for two websites (for different geographical locations)
Hi all, my client has a new e-commerce site coming out in a few months. His requirement is to use the same domain (let's call it www.domain.com for now) for two separate websites: the first site, for users with IP addresses from the USA, will include prices in US dollars; the second site, for users outside of the US, will not include any prices and will have different pages and design. Now, let's say that Googlebot crawls the websites from different IP ranges. How can I make sure a user from France, for example, won't see crawled pages from the US? Sure, once he clicks the result, I can redirect him to a "Sorry, but this content is unavailable in your country" page. The problem is, I don't want a user from France to see in the search results the meta description snippets of pages related only to users in the US (in some cases, the snippets may include the prices in $). Can geotargeting through Webmaster Tools help in this case? I know I can target a part of the website to a specific country (e.g. www.domain.com/us/), but how can I make sure global users won't see the pages targeted only to the US in the search results? Thanks in advance
| skifr0 -
Redirect chains after a site migration
Hi, a client's site was originally canonicalised to the www from the non-www versions. Now it's migrating to an international config of www.domain.com/uk and www.domain.com/us, with the existing pages/URLs (such as www.domain.com/pageA) 301'd to the new www.domain.com/uk/pageA, for example. Will this create a 301 redirect chain due to the existence of the original canonicalised URLs? Or is the way that works 'catch all', so to speak, automatically updating the canonical 301 redirects of the non-www old-architecture URLs to the new international-architecture URLs? I presume so, but just want to check. Cheers, Dan
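The chains don't update themselves: non-www → www → /uk/pageA is two hops unless every legacy rule is repointed at its final destination. One way to audit this is to flatten the old redirect map before the migration; a toy sketch (hypothetical URLs and helper function, not part of any Moz or Apache tooling):

```python
def resolve_chain(redirects, url, max_hops=10):
    """Follow old -> new mappings until a URL has no further redirect.

    Returns the final URL and how many hops it took; more than one hop
    means the legacy rule should be repointed at the final destination.
    """
    hops = 0
    seen = {url}
    while url in redirects and hops < max_hops:
        url = redirects[url]
        hops += 1
        if url in seen:  # guard against redirect loops
            break
        seen.add(url)
    return url, hops

# Example: the non-www canonicalisation plus the new /uk/ migration rule
legacy = {
    "http://domain.com/pageA": "http://www.domain.com/pageA",
    "http://www.domain.com/pageA": "http://www.domain.com/uk/pageA",
}
final, hops = resolve_chain(legacy, "http://domain.com/pageA")
# final is the /uk/ URL, and hops == 2 flags this as a chain to flatten
```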
| Dan-Lawrence0 -
Https and 404 code that goes into htaccess
The 404 error code we put into the .htaccess files for our websites does not work correctly for our https site. We recently changed one of our http sites to https. When we went to create a 404.html page for it by adding the 404 error code to an .htaccess file, once we uploaded the file all of our webpages displayed incorrectly, as if the CSS was not attached. The 404 code we used works successfully for our other 404.html pages on our other sites (www.telfordinc.com/404.html). However, it does not work for the https site. Below is the 404 error code we are using for our https site (currently not uploaded until pages display correctly):
ErrorDocument 404 /404-error.html
RewriteEngine on
RewriteCond %{HTTP_REFERER} !^$
RewriteCond %{HTTP_REFERER} !^http://(www.)?privatemoneyhardmoneyloan.com/.*$ [NC]
RewriteRule .(gif|jpg|js|css)$ - [F]
Options +FollowSymLinks
RewriteEngine On
RewriteBase /
RewriteCond %{HTTP_HOST} !^www.privatemoneyhardmoneyloan.com$ [NC]
RewriteRule ^(.*)$ http://www.privatemoneyhardmoneyloan.com/$1 [R=301,L]
So we want to know: is there a different 404 error code that goes into the .htaccess file for an https vs. an http site? Appreciate your feedback on this issue.
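For comparison, the ErrorDocument directive itself is scheme-agnostic; the likely culprit in the block above is the canonical-host rule forcing http://, which on an https site can break asset loading. A protocol-aware sketch (untested, adjust for your setup):

```apache
# ErrorDocument with a site-relative path works the same for http and https
ErrorDocument 404 /404-error.html

RewriteEngine On
# Canonicalize the hostname while preserving the request's scheme,
# instead of forcing every non-www request back to http://
RewriteCond %{HTTPS} =on
RewriteCond %{HTTP_HOST} !^www\.privatemoneyhardmoneyloan\.com$ [NC]
RewriteRule ^(.*)$ https://www.privatemoneyhardmoneyloan.com/$1 [R=301,L]
RewriteCond %{HTTPS} !=on
RewriteCond %{HTTP_HOST} !^www\.privatemoneyhardmoneyloan\.com$ [NC]
RewriteRule ^(.*)$ http://www.privatemoneyhardmoneyloan.com/$1 [R=301,L]
```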
| Manifestation0 -
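For reference, ErrorDocument itself is protocol-agnostic; the usual culprit when CSS breaks after a move to HTTPS is a rewrite that still forces http://, as the asker's last rule does. A hedged sketch of what an HTTPS-aware version of those rules might look like (domain is the asker's; adapt before use):

```apache
# The 404 handler is identical for HTTP and HTTPS
ErrorDocument 404 /404-error.html

RewriteEngine On
RewriteBase /

# Hotlink protection: allow empty referers and the site's own
# referers over either protocol
RewriteCond %{HTTP_REFERER} !^$
RewriteCond %{HTTP_REFERER} !^https?://(www\.)?privatemoneyhardmoneyloan\.com/ [NC]
RewriteRule \.(gif|jpg|js|css)$ - [F]

# Canonical-host redirect that sends traffic to HTTPS instead of
# bouncing assets back to http://
RewriteCond %{HTTP_HOST} !^www\.privatemoneyhardmoneyloan\.com$ [NC]
RewriteRule ^(.*)$ https://www.privatemoneyhardmoneyloan.com/$1 [R=301,L]
```

With the original rules, any asset request reaching the non-canonical host was 301'd to http://, so stylesheets on an https:// page could be blocked or double-redirected, which matches the "CSS not attached" symptom.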
Duplicate content. Wordpress and Website
Hi all, Will Google punish me for having duplicate blog posts on both my website's blog and my WordPress blog? Thanks
| Mike.NW0 -
Advice on whether we 301 redirect a page or update existing page?
Hi guys, any advice would be really appreciated. We have an existing page that ranks well for 'red widgets'. The page isn't monetised right now, but we're bringing a new product onto our site that we optimised for 'blue widgets'. Unfortunately, not enough research was done for this page, and we've now realised that consumers actually search for 'red widgets' when looking for the product we created as 'blue widgets'. The problem with this is that the 'red widgets' page is in a completely different category of our site than where it needs to be (with 'blue widgets'). So, my question is: should we do a 301 redirect from our 'red-widgets' page to our 'blue-widgets' page, which we then update and optimise for 'red-widgets'? Or should we update the existing 'red-widgets' page to have the right products and content on there, even though it is in the wrong place on our site and users could get confused as to why they are there? If we do a 301 redirect to our new page, will we lose our rankings and have to start again, or is there a better way around this? Thanks! Dave
| davo230 -
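If the 301 route is chosen, the rule itself is a one-liner; a sketch using the question's placeholder paths, assuming an Apache .htaccess:

```apache
# Permanently move the old ranking page to the page that will now
# carry the 'red widgets' content (paths are the question's placeholders)
Redirect 301 /red-widgets /blue-widgets
```

A 301 is generally the right tool here because it passes most of the old page's link equity to the destination; the main caveat is to update the destination's content before or at the moment of the redirect, so the target page is relevant to the queries the old page ranked for.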
Iframes, AJAX, JS, Etc.
Just started SEO on some legacy sites running JS navigation. Are there any proven ways to stop Google from parsing links and passing internal link juice? E.g. iframes, AJAX, JS, etc. Google is parsing some JS links on a couple of our legacy sites. The problem is that some pages are getting link juice and others aren't, and it's unpredictable which links are parsed and which aren't. The choice is to rebuild the navigation (ouch), or figure out a way to block JS links entirely and build a simple text-based secondary nav for link juice distribution. I definitely don't want to use nofollow. Any thoughts?
| AMHC0 -
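One commonly used pattern for keeping navigation links out of a crawler's reach (with the caveat that Google keeps getting better at executing JavaScript, so none of these are guaranteed) is to keep the URL out of any `href` and assemble it client-side from a data attribute. A hedged sketch; the class name and attribute are hypothetical:

```javascript
// Sketch: build the navigation URL from a data attribute instead of
// exposing it in an href the crawler can lift directly.
function buildNavUrl(el) {
  // e.g. data-path="legacy/section" -> "/legacy/section"
  return "/" + el.dataset.path;
}

// Wire up the nav items in a browser; guarded so the sketch also
// runs outside one.
if (typeof document !== "undefined") {
  document.querySelectorAll("span.nav-item").forEach(function (el) {
    el.addEventListener("click", function () {
      window.location.href = buildNavUrl(el);
    });
  });
}
```

The more predictable fix is the one the asker is dreading: plain `<a href>` links in a crawlable secondary nav, so link equity flows where intended rather than wherever the renderer happens to discover a URL.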
Hiding Price html component for all countries except US
Hello everybody, We are planning to launch a new website soon, which will be an e-commerce website for people in the US and a non-e-commerce website for people in other countries. In other words, on the product pages we would like the price of the product to be shown to users from the US, and to be invisible to users outside of the US. We thought about setting the HTML element of the price to be visible only to US users (by IP). My question is: can Google's crawler see this as potential cloaking, since we hide some of the content from some of the users (while Google might scan it from a US IP address)? Thanks in advance...
| skifr0 -
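For what it's worth, the serving decision itself is trivial to express server-side; the real question is the cloaking risk. A minimal sketch, where `country_from_ip` is a hypothetical GeoIP lookup (in practice it would be backed by a GeoIP database or service, not the demo table below):

```python
def country_from_ip(ip_address):
    """Hypothetical GeoIP lookup -- demo data stands in for a real
    GeoIP database or service."""
    geo_table = {"203.0.113.7": "US", "198.51.100.9": "FR"}
    return geo_table.get(ip_address, "UNKNOWN")

def show_price(ip_address):
    """Render the price block only for visitors geolocated to the US."""
    return country_from_ip(ip_address) == "US"

# Demo: a US visitor sees the price, a French visitor does not.
print(show_price("203.0.113.7"))   # True
print(show_price("198.51.100.9"))  # False
```

On the cloaking point: the risk comes from treating Googlebot differently from a human visitor at the same location. Varying content purely by visitor geography, with the bot served exactly what any visitor from its IP's country would see, is consistent treatment rather than classic cloaking, though it does mean Google (which crawls largely from US IPs) will mostly index the US version.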
Spike in server errors
Hi, we've recently changed shopping cart platforms. In doing so, a lot of our URLs changed, but I 301'd all of the significant landing pages (as determined by Google Analytics) prior to the switch. However, WMT is now warning me about a spike in server errors for all the pages that no longer exist. Google is only crawling them because they used to exist, or because they are linked from pages that used to exist. Is this something I should worry about, or should I let it run its course?
| absoauto0 -
GWT returning 200 for robots.txt, but it's actually returning a 404?
Hi, Just wondering if anyone has had this problem before. I'm just checking a client's GWT and I'm looking at their robots.txt file. In GWT, it's saying that it's all fine and returns a 200 code, but when I manually visit (or click the link in GWT) the page, it gives me a 404 error. As far as I can tell, the client has made no changes to the robots.txt recently, and we definitely haven't either. Has anyone had this problem before? Thanks!
| White.net0 -
Omitted results
Hello, We have been facing a loss in rankings and organic traffic for three months on our e-commerce website. We have mostly lost rankings on our product pages, which have disappeared into Google's "omitted results". It all started three months ago, when we had to deal with a duplicate content issue caused by a technical problem with our servers: two other domains that we own were pushed live on Google when they shouldn't have been. Within a few days they created millions of links to our main domain, plus duplicate versions of / redirections to our main website. We fixed this a long time ago, but in GWT we still see these domains bringing links to our main e-commerce site; the count has dropped from 36 million links to 3 million, even though today there are no links at all! We have done a lot of on-site optimisation, such as adding specific content to our most important pages, rebuilding the navigation, adding microdata, and adding canonical URLs on product pages that we found were very similar (we sell very technical products, and many of them are very similar; we have now chosen one product to serve as the canonical wherever necessary). But still our product pages don't rank in Google. They stay in the "omitted results", whereas before they ranked very well on the first page of Google's results. We have also noticed that some ads we placed on ad listing websites now rank well in Google's results, as if the ads had more authority on the subject than our own webpages... We have started to delete some of these ads, but it's not always possible, and 2-3 of them are still online. Any advice on getting our most important webpages back to the top of Google's results? Regards
| Poptafic0
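On the canonical point raised above, the tag on each near-duplicate product page is a single line in the page `<head>`; a sketch with a hypothetical domain and product slug:

```html
<!-- Placed on every near-duplicate variant page, all pointing at the
     one product chosen to rank (example.com and the slug are hypothetical) -->
<link rel="canonical" href="https://www.example.com/products/chosen-product" />
```

Worth double-checking in a situation like this: the canonical target should itself return 200 (not redirect), and the variant pages should not simultaneously be noindexed, or the consolidation signal gets muddied.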