Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Intermediate & Advanced SEO

Looking to level up your SEO techniques? Chat through more advanced approaches.


  • Hello, We have a site that sells a certain product on www.example.com. This site contains thousands of pages, including a whole section of well-written content that we invested a lot of money in creating. The site ranks for many keywords, both brand and non-brand. SERPs include the homepage and many of the articles mentioned. We receive traffic and clients from around the world, BUT our main geo-target is the UK. Due to a lack of resources and some legal requirements, we now have to create a new site, www.example.co.uk, and all UK traffic will be able to purchase the product only from this site and no longer from the .com site. We have no resources to create new content for the new .co.uk site, which is why we want to duplicate the site on both domains and use a canonical tag to point to the .co.uk site as the primary site. Does anyone have experience with such a setup? Will it work across the whole site? We need a fast solution here, as we do not have much time to wait because of the legal issue I mentioned. What are the best solutions you can offer so we do not lose important SERPs? On the one hand, since our main market is the UK, we assume the main site to promote will be www.example.co.uk, but as said earlier, we still have users from other parts of the world as well. Is there any risk we are missing here? Thanks, James

    | Tit
    0

  • We have been kicking this idea around for a while now, and I wanted to get the community's honest opinion before we begin building it. We create a lot of posts on social media showcasing articles we find on SEO, tips and tricks, reviews, etc. We were thinking that rather than always linking out to the other sites, we would create a section on our site called "From Around The Web" with brief breakdowns of what was covered, then provide a link to the full article. Most of these would be between 300-500 words and be optimized around what we were linking to and writing about. Since the content would not be "in-depth", would this hurt us in any way? To me, it doesn't make sense to send people to the other article right away when we can summarize it and link to the full article from our site. (Most people don't want to read a 3,000-word article on SEO, especially small business owners who just want the breakdown.) Thoughts? Think it will help, or not be useful enough to invest labor in?

    | David-Kley
    0

  • Which CMS is good for a website selling health products?

    | JordanBrown
    0

  • Hello, How would you suggest finding content topics for this site: nlpca.com? The end goal is signups for training seminars in San Francisco, California and Salt Lake City, Utah. In the future the seminars will move more towards life coaching trainings, but right now they are mostly about NLP, which is a personal development field. I'm just looking for ideas on the process of finding topics for highly linkable, fabulous content. The owners of the site are authorities in the field. This is for both blog and article content. Thanks.

    | BobGW
    0

  • Hi, We have been trying to remove a 'partial' Google penalty for a new client by removing unnatural backlinks over a period of time, then submitting a reconsideration request, uploading a disavow file, etc. Previously Google listed the partial penalty in the 'manual actions' section of Webmaster Tools, making it possible for us to submit a reconsideration request. Having just logged in, however, we get the message 'no manual webspam actions found', so there isn't any way we can submit a reconsideration request. Does this mean that the penalty has been lifted? Or could it still exist? If the latter, is there any other way to submit a reconsideration request? Many thanks in advance, Lee.

    | Webpresence
    0

  • Is it best to noindex search results pages, exclude them using robots.txt, or both? (A sketch of both options is below.)

    | YairSpolter
    0
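
    A minimal sketch of the two options in the question above, assuming a hypothetical /search results directory. Note that the two conflict: if robots.txt blocks a page, crawlers never fetch it and so never see its noindex tag, so pick one or the other.

    # robots.txt - stops crawlers from fetching the results pages at all
    User-agent: *
    Disallow: /search

    <!-- OR, in each results page's <head> - lets crawlers fetch the page
         but keeps it out of the index -->
    <meta name="robots" content="noindex, follow">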

  • Hello, We are redesigning our product page and have considered putting our customer reviews in a 'tab' on the page, so they are not visible to the user until they click on the tab. Are there any SEO implications of this? Right now we do have problems with this, because we use a third-party tool for our reviews that injects them with JavaScript, so they do not get crawled, but going forward we will be using our native platform. We want the text of the reviews to get crawled and indexed. Thanks. (A sketch of a crawlable tab is below.)

    | Colbys
    0
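
    A minimal sketch of a crawlable tab, with hypothetical IDs: the review text ships in the initial HTML so it can be crawled, and is only hidden visually until the tab is clicked, rather than being injected by a third-party script.

    <button onclick="document.getElementById('reviews-tab').style.display='block'">
      Reviews
    </button>
    <!-- Review text is in the source HTML from the first byte; CSS merely hides it -->
    <div id="reviews-tab" style="display:none">
      <p>Great product, arrived quickly...</p>
    </div>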

  • We’ve had some success recently by reducing the number of duplicate title tags on our website. We have managed to fix all the simple cases, but there are a number of stubborn examples that we don’t know how to fix. A lot of the duplicate tags come from the website’s forums. Many questions have been asked multiple times over the years where the user has phrased the question in the same way. This has led to many cases where different forum posts have the same title tag. For example, there are six title tags with just the words "need help"! These are being highlighted as duplicates, and currently we have several thousand of these. Would this be a problem? I’d be tempted to say that we should leave them, as they don’t seem unnatural to me. One solution we are considering is to append the forum name to the question for any post after the original, falling back to appending the date if that doesn’t distinguish it. (A sketch of this scheme is below.) Do people think that this is a good solution to implement, or would it be better to leave these duplicate title tags as they are? Any help would be appreciated 🙂

    | RG_SEO
    0
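
    A minimal JavaScript sketch of the appending scheme described, with hypothetical names; it assumes posts are processed in creation order against a shared seenTitles set, so the original post keeps the bare title.

    // First occurrence keeps the bare title; later duplicates get the forum
    // name appended, then the post date if that still is not unique.
    function dedupeTitle(title, forumName, postDate, seenTitles) {
      if (!seenTitles.has(title)) {
        seenTitles.add(title);
        return title;
      }
      var withForum = title + ' - ' + forumName;
      if (!seenTitles.has(withForum)) {
        seenTitles.add(withForum);
        return withForum;
      }
      var withDate = withForum + ' (' + postDate + ')';
      seenTitles.add(withDate);
      return withDate;
    }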

  • I am currently doing a link audit on one of my sites and I am coming across some links that appear to be spam. Is there a tool that I can plug their URLs into to see if they have been deemed spam by Google?

    | Mozd
    0

  • I have 12 sitemaps submitted to Google. After about a week, Google is about 50% of the way through crawling each one. In the past week I've created many more pages. Should I wait until Google is 100% complete with my original sitemaps, or can I just go ahead and refresh them? When I refresh them, the files will have different URLs.

    | jcgoodrich
    0

  • We have a Zend Framework site that is complex to program, if you ask me, and since we have 20k+ pages that need proper titles and meta descriptions, I need to ask about using JavaScript to handle page titles (basically, the previous programming team had NOT set page titles at all). I need to get proper page titles from an h1 tag within each page. The course of action we can easily implement is to fetch the page title from the h1 tag used throughout all pages with the help of JavaScript. But doesn't this make it difficult for engines to actually read the page title, since it is being fetched with JavaScript code that we have put in? Has anyone had a similar situation before? If yes, I need some help! Update: I tried the JavaScript way and here is what it looks like: http://islamicencyclopedia.org/public/index/hadith/id/1/book_id/106 I know that Google won't read JavaScript the way we have used it on the website, but I need help on how we can work around this issue, knowing we don't have other options. (A sketch of the client-side approach is below.)

    | SmartStartMediacom
    0
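
    For reference, a minimal sketch of the client-side approach described (assuming one h1 per page). The catch is that crawlers which do not execute JavaScript still see the original empty <title>, so the reliable fix is to emit the same string server-side in the Zend layout.

    // Copy the page's h1 text into the <title> tag once the DOM is ready.
    // Crawlers that do not run JavaScript will never see this title.
    document.addEventListener('DOMContentLoaded', function () {
      var h1 = document.querySelector('h1');
      if (h1) {
        document.title = h1.textContent.trim();
      }
    });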

  • I have a semi-big site (500K pages) with lots of new pages being created. I also have a process that updates my sitemap with all of these pages automatically. I have 11 sitemap files and a sitemap index file. When I update my sitemaps and submit them to Google, should I keep the same names? (A sitemap index sketch is below.)

    | jcgoodrich
    0
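
    A minimal sketch of a sitemap index like the one described, with placeholder URLs; keeping the file names stable and updating <lastmod> lets Google re-fetch the changed files without any re-submission.

    <?xml version="1.0" encoding="UTF-8"?>
    <sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <sitemap>
        <loc>http://www.example.com/sitemap-1.xml</loc>
        <lastmod>2014-07-01</lastmod>
      </sitemap>
      <sitemap>
        <loc>http://www.example.com/sitemap-2.xml</loc>
        <lastmod>2014-07-01</lastmod>
      </sitemap>
    </sitemapindex>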

  • Given that Google has stated that duplicate content is not penalised, is this really something that will give sufficient benefits for the time involved? Also, reading some of the articles on moz.com, they seem very ambivalent about its use – for example http://moz.com/blog/rel-confused-answers-to-your-rel-canonical-questions. Will any page with a canonical link normally NOT be indexed by Google? Thanks.

    | fdmgroup
    0

  • I am moving my website to a new platform. The URLs will be exactly the same. What are the 10 most important items I should check before I swap over to the new platform?

    | robbieire
    0

  • Hey guys, I wanted to ask your opinion. If you had a website - portfolio style, for argument's sake - based on WordPress, obviously the front page won't be SEO friendly if you want to keep the minimalistic approach: there will be hardly any content to tell Google what to rank your site for. So my question is, can you use a plugin so that Google can 'see' content - such as a long unique article - that the user can't see, in order to help you rank? I.e. for Googlebot, the plugin would load the content as plain HTML, but 'hide' it from most people visiting the site. What would you do in this scenario? Your response would be much appreciated! Thanks in advance for your help!

    | geniusenergyltd
    0

  • I put the Schema.org data on my item pages and it works great. However, when an item closes, the price is removed, which left an empty price and caused an error. The site is now programmed so that when an item closes it removes the price component entirely. This was done about two weeks ago and it is still showing a lot of errors. Any ideas? (A markup sketch is below.)

    | EcommerceSite
    0
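
    One commonly suggested alternative, sketched below with hypothetical values: instead of deleting the price element when an item closes (which leaves the Offer incomplete), keep the markup valid and flag the item as unavailable via schema.org's availability property.

    <div itemscope itemtype="http://schema.org/Offer">
      <meta itemprop="priceCurrency" content="USD">
      <span itemprop="price">24.99</span>
      <!-- Mark the closed item as unavailable rather than removing the price -->
      <link itemprop="availability" href="http://schema.org/OutOfStock">
    </div>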

  • I have a question for the community about whether this is a good or bad idea. I currently have a Joomla site that displays www.domain.com/index.php in all the URLs with the exception of the home page. I have read that it's better to not have index.php showing in the URL at all. Does it really matter if I have index.php in my URL? I've read that it is a bad practice. I am thinking about installing the sh404SEF component on my site and removing the index.php. However, I rank pretty high for the keywords I want in Google, Bing and Yahoo, and all of the URLs that show up in the searches have index.php as part of the URL. Has anyone ever used sh404SEF to remove the index.php, and how did you avoid losing your search engine links? I don't want an existing search result showing www.domain.com/index.php/sales and it not linking to the correct page, which would now be www.domain.com/sales. I guess I could insert the proper redirects in the htaccess file, but I was hoping to avoid having every page of my site in the htaccess file for redirecting. (A single pattern rule, sketched below, should avoid that.) Any help or advice appreciated.

    | MedGroupMedia
    0
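
    A minimal sketch of the pattern approach, assuming Apache mod_rewrite: a single rule 301s every /index.php/... URL to its clean equivalent, so no per-page entries are needed in the htaccess file.

    RewriteEngine On
    # Redirect any /index.php/foo URL to /foo with one rule
    RewriteRule ^index\.php/(.+)$ /$1 [R=301,L]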

  • We are currently adding reviews to a client's site from The Review Centre. We are trying to use semantic markup more, so we would like to know the best way to do this. Example: <blockquote cite="http://www.example.co.uk">
    Lorem ipsum dolor sit amet, consectetur adipisicing elit, sed do eiusmod tempor incididunt ut labore et dolore magna aliqua. Ut enim ad minim veniam, quis nostrud exercitation ullamco laboris nisi ut aliquip ex ea commodo consequat. Duis aute irure dolor in reprehenderit in voluptate velit esse cillum dolore eu fugiat nulla pariatur. Excepteur sint occaecat cupidatat non proident, sunt in culpa qui officia deserunt mollit anim id est laborum.
    </blockquote> Question: Does "cite=" pass equity, and if so, should we nofollow it? (A sketch of an alternative is below.)

    | Silkstream
    0
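
    For what it's worth, browsers do not render the cite attribute as a hyperlink, so it is generally not treated like a followed link. If a visible attribution link is wanted anyway, here is a sketch with a hypothetical review URL:

    <blockquote cite="http://www.reviewcentre.com/example-review">
      Lorem ipsum dolor sit amet, consectetur adipisicing elit...
    </blockquote>
    <p>Source: <a href="http://www.reviewcentre.com/example-review" rel="nofollow">The Review Centre</a></p>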

  • We list products on our site and suspect that we have been hit by Panda, as we are duplicating listings across our own site. Not intentionally; we just have multiple pages listing the same content because products fall into multiple categories. Has anyone else had the same issue, and if so, how did you deal with it? Have you seen a change in results/rankings due to the changes you made? (A canonical-tag sketch for this case is below.)

    | nick-name123
    0
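
    A minimal sketch of the usual remedy for multi-category duplication, with placeholder URLs: every duplicate listing page declares one preferred URL in its <head>, so only that version is consolidated in the index.

    <!-- On http://www.example.com/category-b/widget (the duplicate listing) -->
    <link rel="canonical" href="http://www.example.com/category-a/widget" />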

  • This is a question that I am not sure has a "right" answer; I am just wondering what everyone's thoughts are. I can see the benefit of both sides of the coin. In your opinion, is it better to have one large e-commerce site with all of your content on the same domain, or multiple more targeted domains with your content broken up into smaller chunks? The reason I ask is, while multiple more targeted sites certainly have the benefit of focus, aren't you taking all your traffic and content, splitting it up, and leaving yourself with several sites that most likely get less traffic than one large site would? All opinions welcome.

    | unikey
    0

  • I have started working on a website that is written in Java. It has 26 URLs, but because of the way it is written, everything is served through the home page code and there is no ability to add unique title and description tags. Is there a workaround for SEO on websites like this, aside from adding content? I was wondering if there is a way to submit a sitemap with title and description tags. Any advice? Chris K.

    | CKerr
    0

  • Hi, I'm aware Google doesn't mind helpful content that users can hide/unhide through interaction. I am also aware that Google frowns upon hiding content from the user for SEO purposes. We're not considering anything like that. The situation is, we will be displaying only part of our content to the user at a time. We'll load 3 results on each page initially; these first 3 results are static, meaning on each initial page load/refresh, the same 3 results display. However, we'll have a "Show Next 3" button which replaces the initial results with the next 3 results. This content will be preloaded in the source code, so Google will know about it. I feel like Google shouldn't have an issue with this, since we're allowing user action to cycle through all results. But I'm curious: is it an issue that the user action does NOT allow them to see all results on the page at once? I am leaning towards no, this doesn't matter, but would like some input if possible. (A sketch of the pattern is below.) Thanks a lot!

    | kirmeliux
    0
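
    A minimal sketch of the pattern described, with hypothetical markup: all results are preloaded in the source (crawlable), and the button only changes which three are visible.

    <ul id="results">
      <!-- All results ship in the source HTML; only the first 3 start visible -->
      <li>Result 1</li><li>Result 2</li><li>Result 3</li>
      <li hidden>Result 4</li><li hidden>Result 5</li><li hidden>Result 6</li>
    </ul>
    <button id="next3">Show Next 3</button>
    <script>
      var page = 0;
      document.getElementById('next3').addEventListener('click', function () {
        var items = document.querySelectorAll('#results li');
        page = (page + 1) % Math.ceil(items.length / 3);
        for (var i = 0; i < items.length; i++) {
          // Show only the three items belonging to the current "page"
          items[i].hidden = Math.floor(i / 3) !== page;
        }
      });
    </script>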

  • Hi Moz, Do search engines really treat subdomains as separate domains in this regard? Or are we more likely to get more real estate on the first page with a new domain? Our goal is to have our main site and this new subdomain or domain ranking in positions 1 and 2 for our company name. This is going to be a careers site/portal. Thanks for reading!

    | DA2013
    0

  • Mozzers, I have changed my site's URL structure several times. As a result, I now have a lot of old URLs that don't really logically redirect to anything on the current site. I started out 404-ing them, but it seemed like Google was penalizing my crawl rate AND it wasn't removing them from the index after crawling them several times. There are way too many (>100k) to use the URL removal tool, even at a directory level. So instead I took some advice and changed them to 200s with a "noindex" meta tag and no rendered content. I get fewer errors, but I now have a lot of pages that do this. Should I (a) just 404 them and wait for Google to remove them, (b) keep the 200 + noindex, or (c) is there something else I can do? 410 maybe? (A 410 sketch is below.) Thanks!

    | jcgoodrich
    0
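
    If the 410 route is chosen, a minimal htaccess sketch, assuming Apache and a hypothetical /old-structure/ prefix; the [G] flag returns "410 Gone", which Google generally treats as a slightly stronger removal signal than a 404.

    RewriteEngine On
    # Serve 410 Gone for the retired URL structure
    RewriteRule ^old-structure/ - [G,L]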

  • After 6 months of effort with an SEO provider, the results of our campaign have been minimal, and we are in the process of reevaluating our effort to cut costs and improve ROI. Our site is for a commercial real estate brokerage in New York City. Which of these options would have the best shot at creating results in the near term?
    - Create a keyword matrix and optimize pages for specific terms. Maybe optimize 50 pages.
    - Add content to "thin" pages. Rewrite 150-250 listing and building pages.
    - Audit the user interface and adjust the design of forms and pages to improve conversions.
    - Run a link building campaign to improve the link profile of a site with few links (most of those being of low quality). I would really like to do something about links, but have been told this will have no effect until the next "Penguin refresh". In fact, I have been told the best bet is to improve the user interface, since it is becoming increasingly difficult to improve ranking. Any thoughts? Thanks, Alan

    | Kingalan1
    0

  • Hi, I'm noticing a huge difference between the number of pages in Google's index (using a 'site:' search) and the number of pages indexed by Google in Webmaster Tools (i.e. 20,600 in the 'site:' search vs 5,100 submitted via the dynamic sitemap). Does anyone know possible causes for this and how I can fix it? It's an ecommerce site, but I can't see any issues with duplicate content – it employs a very good canonical tag strategy. Could it be that Google has decided to ignore the canonical tag? Any help appreciated, Karen

    | Digirank
    0

  • Hi Mozzers, A quick question. In the last few months I have noticed that for a number of keywords I am having 2 different pages on my domain show up in the SERP, always right next to each other (for example, positions #7 and #8, or #3 and #4). So in the SERP it looks something like:
    1) www.mycompetition1.com
    2) www.mycompetition2.com
    3) www.mywebsite.com/page1.html
    4) www.mywebsite.com/page2.html
    5) www.mycompetition3.com
    Now, I actually need both pages, since the content on both pages is different but on the same topic. Both pages have links to them, but page1.html always tends to have more. So, what is the best practice to tell Google that I only want 1 page to rank? Of course, the idea is that by combining the SEO juice of both pages, I can push my way up to position 2 or 1. Does anybody have any experience with this? Any advice is much appreciated.

    | rayvensoft
    0

  • Hi, In the attached Google SERP example, the first listing below the paid search ads has a large box with a snippet of content from the relevant page, followed by the standard link. Does anyone know how you get Google to display a box like this in their SERPs? I checked the code on the page and there doesn't appear to be anything special about it, such as schema markup; it uses standard list code. Does this only appear for particular types of content or sites, such as medical content in this case? Is the content more likely to appear for lists? Does it only appear for high-authority sites that Google has selected? We have a similar medical information site and it would be great to get Google to display a similar box of content for some of our pages. Thanks. Damien

    | james.harris
    0

  • One strategy I have seen recommended over and over is to look at your competitors' backlinks and see if any could be relevant for your site and worth pursuing. My question is: how do I evaluate a link so I don't end up pursuing one from a penalized site? I would guess checking whether the site is in Google's index is a good idea, since some webmasters may not be aware they are penalized. Is it DA and whether they are indexed alone? Many sites I have seen have DA in the teens but are legitimate in our industry. Should they not be considered due to low DA? Also, I see links from directories on many competitor sites. It seems a controversial subject, but assuming the directory is industry specific, is it OK? Thanks in advance!

    | Chris661
    0

  • Hi all! An easy one for probably most of you. I have a client who wants to rebrand their business name and also match their URL to the new name. Their current domain name has been out there for over 5 years, and the site is performing quite well in search. Switching to a new URL will obviously be risky, but what are the options? Have newname.com redirect to the aged oldname.com? But then when visitors are on the site, or find it in search, oldname.com has nothing to do with the new brand. Or would 301 redirecting every page of the oldname.com site to newname.com be good enough? (A sketch is below.) What is recommended? Thanks!

    | BBuck
    0
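
    A minimal htaccess sketch of the page-for-page 301 option, assuming Apache and the placeholder domains from the question; every old path maps to the identical path on the new domain, which is the usual way to carry equity through a rebrand.

    RewriteEngine On
    RewriteCond %{HTTP_HOST} ^(www\.)?oldname\.com$ [NC]
    # Send each request to the same path on the new domain
    RewriteRule ^(.*)$ http://www.newname.com/$1 [R=301,L]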

  • Greetings: Our New York City commercial real estate site is www.nyc-officespace-leader.com. Key Moz metrics are as follows:
    Domain Authority: 23
    Page Authority: 33
    28 root domains linking to the site
    179 total links
    In the last six months, domain authority, page authority, and the number of domains linking to the site have all declined. We have focused on removing duplicate content and low-quality links, which may have had a negative impact on the above metrics. Our ranking has dropped greatly in the last two months. Could it be due to the above metrics? These numbers seem pretty bad. How can I reverse this without engaging in any black hat behavior that could work against me in the future? Ideas?
    Thanks, Alan Rosinsky

    | Kingalan1
    0

  • Hi all, I am wondering what people's thoughts are on using rel="nofollow" for a link on a page like this: http://askgramps.org/9203/a-bushel-of-wheat-great-value-than-bushel-of-gold The anchor text is "Brigham Young", and the page it points to is titled "Brigham Young" and goes into more detail on who he is. So it is an exact match. And as we know, if a page has too much exact match anchor text it is likely to be considered "over-optimized". I guess one of my questions is: how much exact match or partial match anchor text is too much? I have heard ratios tossed around, like for every 10 links, 7 should not be targeted at all while 3 out of the 10 would be okay. I know it's all about being natural and creating value, but using exact or partial match anchors can definitely create value, as they are almost always highly relevant. One reason that prompted my question is that I have heard this is something Penguin 3.0 is really going to look at. On the example URL I gave, I want to keep that particular link as is, because I think it adds value to the user experience, but I used rel="nofollow" so it doesn't pass PageRank. Does anyone see a problem with doing this and/or have a different idea? An important detail is that both sites are owned by the same organization. Thanks

    | ThridHour
    0

  • If a company has a handful of large sites that function as collections of unique portals into client-specific content (password protected), will it have any positive effect on search ranking to migrate all of the sites to one URL structure?

    | trideagroup
    0

  • Hi, I know that GWT will not show all my links, but is there a 3rd party tool (other than Moz, of course!) that will? And how quickly should links show up? Thanks, Ash

    | AshShep1
    0

  • I'm doing a detailed analysis of how Google sees and indexes our website, and we have found that there are 240,256 pages in the index, which is way too many. It's an e-commerce site that needs some tidying up. I'm working with an SEO specialist to set up URL parameters and put information into the robots.txt file so the excess pages aren't indexed (we shouldn't have any more than around 3,000-4,000 pages), but we're struggling to find a way to get a list of these 240,256 pages, as it would be helpful in deciding what to put in the robots.txt file and which URLs we should ask Google to remove. Is there a way to get a list of the indexed URLs? We can't find it in Google Webmaster Tools.

    | sparrowdog
    0

  • Scenario:
    A website that we manage was hit with a manual action penalty for unnatural incoming links (site-wide). The penalty was revoked in early March, and we're still not seeing any of our main keywords rank high in Google (we are found on page 10 and beyond). Our traffic metrics from March 2014 (after the penalty was revoked) - July 2014, compared to November 2013 - March 2014, were very similar. Question: Since the website was hit with a manual action penalty for unnatural links, is the content affected as well? If we were to take the current website and move it to a new domain name (without 301 redirecting the old pages), would Google see it as a brand new website? We think it would be best to use brand new content, but the financial costs associated are a large factor in the decision. It would be preferable to reuse the old content, but has it already been tarnished?

    | peteboyd
    0

  • Hi everyone, I am currently working on a website whose XML sitemap is set to update weekly. Our client has requested that this be changed to daily. The real issue is that the website creates short-term product pages (10-20 days), after which the product page URLs go 404. So the real problem is quick indexing, not daily vs. weekly sitemaps. I suspect that a daily sitemap may improve indexing time but does not completely solve the problem. So my question for you is: how can I improve indexing time on this project? The real problem is how to get the product pages indexed and ranking before the 404 page shows up. Here are some of my initial thoughts and background on the project. Product pages are only available for 10 to 20 days (auction site). Once the auction on the product ends, the URL goes 404. If the pages only exist for 10 to 20 days (a 404 shows up when the auction is over), this is bad for SEO for several reasons (BTW, I was brought onto the project as the SEO specialist after the project and site were completed). Reason 1 - It is highly unlikely that the product pages will rank (positions 1-5) since the site has a very low Domain Authority, and by the time Google indexes the link the auction is over, so the user sees a 404. Possible solution 1 - All products have authorship from a "trustworthy" author, so indexing time improves. Possible solution 2 - Incorporate G+ posts for each product to improve indexing time. There is still a ranking issue here since the site has a low DA; the product might appear, but at the bottom of page 1 or 2, etc. Any other ideas? From what I understand, even though sitemaps are fed to Google on a weekly or daily basis, this does not mean that Google indexes them right away (please confirm). Best case scenario - Google indexes the links every day (totally unrealistic in my opinion), the URL shows up on page 1 or 2 of Google and slowly starts to move up. By the time the product ranks in the first 5 positions, the auction is over and the user sees a 404. I do think that a sitemap updated daily is better for this project than weekly, but I would like to hear the community's opinion. Thanks

    | Carla_Dawson
    0

  • I have run across a couple of articles recently suggesting that using the hreflang tag could solve any SEO problems associated with having duplicate content on the different country versions (.co.uk, .com, .ca, etc.). Here is an example: http://www.emarketeers.com/e-insight/how-to-use-hreflang-for-international-seo/ Over to you and your technical colleagues, I think…. (An hreflang sketch is below.)

    | JordanBrown
    0
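
    A minimal hreflang sketch for the same-language, multi-country case described, with placeholder domains; each country version lists every alternate, itself included, in its <head>.

    <link rel="alternate" hreflang="en-gb" href="http://www.example.co.uk/" />
    <link rel="alternate" hreflang="en-us" href="http://www.example.com/" />
    <link rel="alternate" hreflang="en-ca" href="http://www.example.ca/" />
    <link rel="alternate" hreflang="x-default" href="http://www.example.com/" />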

  • On June 14th the number of indexed pages for our website in Google Webmaster Tools increased from 676 to 851. Our ranking and traffic have taken a big hit since then. The increase in indexed pages is linked to a design upgrade of our website made June 6th. No new URLs were added; a few forms were changed, the sidebar and header were redesigned, and Google Tag Manager was added to the site. My SEO provider, a reputable firm endorsed by Moz, believes the extra 175 pages indexed by Google, pages that do not offer much content, may be causing the ranking decline. My developer submitted a page removal request to Google via Webmaster Tools around June 20th, and the number in the Google search results appeared to drop to 451 for a few days, but now when a Google search is done for site:www.nyc-officespace-leader.com, 851 results display again; in Google Webmaster Tools it is still listed as 851 pages. Would these extra pages cause a drop in ranking? My ranking drops more and more every day. At the end of the displayed Google search results for site:www.nyc-officespace-leader.com, very strange URLs are displaying, like: www.nyc-officespace-leader.com/wp-content/plugins/... If we can get rid of these issues, should ranking return to what it was before? I suspect this is an issue with sitemaps and robots.txt. Are there any firms or coders who specialize in this? My developer has really dropped the ball. Thanks everyone!! Alan

    | Kingalan1
    0

  • Hello, A while back my company changed from http to https sitewide (before I started working here). We use a very standard rewrite rule that looks like this:
    RewriteEngine On
    RewriteCond %{SERVER_PORT} 80
    RewriteRule ^(.*)$ https://opiates.com/$1 [R,L]
    However, with this rule in place, some http URLs are being redirected with a 302 status code. My question is, can I safely change the above code to look like this:
    RewriteEngine On
    RewriteCond %{SERVER_PORT} 80
    RewriteRule ^(.*)$ https://opiates.com/$1 [R=301,L]
    to ensure that every redirect is returned with a 301 status code? The only change is in the [R,L] section. Thanks to whomever can help with this. I'm pretty sure it's safe, but I don't want the site to go down, even for a second, so I figured I would ask first.

    | Waismann
    0

  • What are the benefits of having Image Metadata?

    | JordanBrown
    0

  • We are about to migrate a large website with a fair few images (20,000). At the moment we include images in the sitemap.xml so they are indexed by Google and drive traffic (not sure how I can find out how much, though). Current image slugs are like:
    http://website.com/assets/images/a2/65680/thumbnails/638x425-crop.jpg?1402460458
    Like on the old site, images on the new website will also have unreadable cache slugs, like:
    http://website.com/site_media/media/cache/ce/7a/ce7aeffb1e5bdfc8d4288885c52de8e3.jpg
    All content pages on the new site will have the same slugs as on the old site. Should I go through the trouble of redirecting all these images? (An image-sitemap sketch is below.)

    | ArchMedia
    0

  • Hi all, just wondering if anyone has any experience, tips, or advice here. Moving from domain A to domain B is well documented all over the web, and the practice is often discussed on here. But my question is: has anyone ever moved a domain from A to B and then, after some time, moved back to domain A? I can't find any examples or notes anywhere on Google. Thanks in advance

    | Andy-Halliday
    0

  • Hi everyone, I'm a recent(ish) beginner to SEO, and while I feel I've got a good grounding in the industry now, there are still certain aspects that make me say "what!?". I'm looking to write a blog post on this and would love to know what parts of SEO still confuse you or make you say "what!?", so I can explain them from a semi-beginner's point of view. Any comments appreciated! Thanks!

    | White.net
    0

  • We have a client that ranks well in most Australian cities for competitive keywords, except on Google Sydney. If you toggle the cities in the search field when you search for a keyword, their positions are almost exactly the same everywhere except Sydney, where they can't be found at all in the top 100 results. The keywords are not city specific; they are commonly searched general keywords about health. This is not a Google Places issue. The search results show the right landing pages of the site for their respective keywords. Any ideas or experience with this kind of situation? Much appreciated, Louie

    | louieramos
    0

  • Hello, my team is trying to understand how to best construct slugs. We understand they need to be concise and easily understandable, but there seem to be vast differences between the three examples below. Are there reasons why one might be better than the others? http://www.washingtonpost.com/news/morning-mix/wp/2014/06/20/bad-boys-yum-yum-violent-criminal-or-not-this-mans-mugshot-is-heating-up-the-web/ http://hollywoodlife.com/2014/06/20/jeremy-meeks-sexy-mug-shot-felon-viral/ http://www.tmz.com/2014/06/19/mugshot-eyes-felon-sexy/

    | TheaterMania
    0

  • Hi, I need help with international SEO redirects. I'm going to have intelligencebank.com/au for Australian visitors and intelligencebank.com for the rest of the world. I would like to automatically redirect Australian users that land on .com to .com/au, and vice versa for non-Australian users. 1. Which automatic redirect should I use: a) JavaScript, because it will allow US-based Google bots to crawl my /au website (bots won't read the JavaScript, so they won't be redirected); b) HTTP redirects; c) 301 redirects; d) 302 redirects; e) anything else? 2. a) Should I still use rel alternate even though I only use English? b) If I should add rel alternate, can I still keep my existing rel canonical tags that are used to avoid duplicate content (I use a lot of UTM codes when advertising)? (A sketch combining the two tags is below.)

    | intelligencebank
    0
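
    On question 2b, hreflang and canonical tags can coexist as long as each page's canonical is self-referencing (pointing at its own clean, UTM-free URL). A minimal sketch for the /au home page, using the domains from the question (the en-au/en pairing is an assumption):

    <!-- In the <head> of http://intelligencebank.com/au/ -->
    <link rel="canonical" href="http://intelligencebank.com/au/" />
    <link rel="alternate" hreflang="en-au" href="http://intelligencebank.com/au/" />
    <link rel="alternate" hreflang="en" href="http://intelligencebank.com/" />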
