
Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Technical SEO

Discuss site health, structure, and other technical SEO strategies.


  • Buongiorno from 13 degrees C, heavily overcast, wet Wetherby UK... When you enter the term "Ramsdens York" this page is visible in the SERPs:
    https://www.ramsdensforcash.co.uk/buy-it/find-your-nearest-branch/york/#results When you enter the term "Ramsdens Arbroath" the following page is not visible:
    https://www.ramsdensforcash.co.uk/buy-it/find-your-nearest-branch/arbroath/#results So my question is: when both pages have equally bad markup and equally bad internal linking, why does one page rank while the other is invisible? Any insights welcome 🙂

    | Nightwing
    0

  • Hi Mozers, I'm still struggling with my London-based client with two locations and one business. Basically she has a location in W1W 'Westminster' and a location in 'WD!' Borehamwood. Does anyone have any good resources or input concerning geotargeting? I've done some searching but can't quite get the help I'm seeking. I'd like to make the pages cover a 5-mile radius and be highly specific to their locations. Is this the right way to proceed? Thanks

    | catherine-279388
    0

  • For one of my website's keywords, the wrong URL is appearing in the SERPs. I want to remove that listing from the SERPs. How can I remove it?

    | Alick300
    0

  • For the last 3 months we've been working on www.UneekJewelry.com to get it to rank for "unique engagement rings" and got it to position 14, but since September 18th, 2012 that all changed. The website no longer ranks for our money keyword, not even in the top 500. We haven't been building exact-match keywords but have been using more phrase match, and I can't tell if it's because we don't have enough exact match or something else. We have been trying to figure out what the issue is, as the site still ranks for other keywords like "unique engagement rings in los angeles", so we don't know if this is a keyword penalty or a glitch in Google. Can anyone give us any insight on what could cause such a drop for those specific keywords on our site? Most of the link building we have done for this client is from in-content blogs. The only blog we were suspicious of, we removed the article from two days ago. The rankings have not come back. This was the last blog the exact-match keyword was used on, where we removed the article: http://colorfuljewelry.blogspot.com/ Thanks,

    | harrykabadaian
    0

  • Dear SEO Gurus, I have been working on site #2 for a couple of months and I think it is a good idea to redirect #1 (the old site) to #2 (the new site) below, yes? What is the most effective way of doing this? Do I have to 301 redirect each page from the old site to a relevant page on the new site, and do this for every page... or can I do a 301 redirect for the whole old site to the new site? Thank you for your time in advance for helping me out! Sheryl
    1. Gazpacho's Restaurante Y Cantina - http://www.gazpachorestaurant.com/ - Page Authority 42 | Page Linking Root Domains 36 | Domain Authority 30 | Root Domain Linking Root Domains 37
    2. Gazpacho's Restaurant - http://www.restaurantsdurango.com/ - Page Authority 21 | Page Linking Root Domains 1 | Domain Authority 6 | Root Domain Linking Root Domains 1

    | TOMMarketingLtd.
    0
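
    For reference, a minimal .htaccess sketch of a whole-site 301 (assuming, as the listing above suggests, that gazpachorestaurant.com is the old site, that it runs on Apache with mod_rewrite, and that old paths map one-to-one onto the new site - adjust or add per-page rules where they don't):

        # Send every request on the old domain to the same path on the new domain
        # (assumes old URLs have matching pages on the new site)
        RewriteEngine On
        RewriteCond %{HTTP_HOST} ^(www\.)?gazpachorestaurant\.com$ [NC]
        RewriteRule ^(.*)$ http://www.restaurantsdurango.com/$1 [R=301,L]

    Where a one-to-one path mapping doesn't exist, individual Redirect 301 lines from old URLs to the most relevant new pages are the usual fallback.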

  • I have searched Google for 'how to set up an .htaccess file' and it seems that every website has some variation. For example:
    RewriteCond %{HTTP_HOST} ^yoursite.com
    RewriteRule ^(.*)$ http://www.yoursite.com/$1 [R=permanent,L]
    On SEOmoz someone posted this:
    RewriteCond %{HTTP_HOST} !^www.yoursite.com [NC]
    RewriteRule (.*) http://www.yoursite.com/$1 [L,R=301]
    On yet another website, I found this:
    RewriteEngine On
    RewriteCond %{HTTP_HOST} !^your-site.com$ [NC]
    RewriteRule ^(.*)$ http://your-site.com/$1 [L,R=301]
    As you can see, there are slight differences. Which one do I use? I'm on Apache on CentOS and I have HTML5 websites and several Joomla! websites. Would the .htaccess file be different for the two?

    | maxduveen
    0
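
    For reference, the three snippets above all aim at the same thing: forcing one hostname with a 301 ([R=permanent] and [R=301] are equivalent). A minimal consolidated sketch, assuming the goal is to force the www host:

        # Redirect any non-www host to www.example.com, preserving the path
        # (example.com is a placeholder domain)
        RewriteEngine On
        RewriteCond %{HTTP_HOST} !^www\.example\.com$ [NC]
        RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]

    The same rule works for a plain HTML5 site and for Joomla!, although Joomla! ships its own .htaccess (htaccess.txt) with additional rewrite rules that this block would sit alongside.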

  • Buongiorno from 10 degrees C, lightly raining Wetherby UK 🙂 Every so often SEO feels like a game of snakes & ladders. One minute your rankings go up and then, within the click of a mouse, they drop back down. Like a Greek play, you begin to feel that our mortal lives as SEO pundits are controlled by the Google gods. A case in point is illustrated in this graph:
    http://i216.photobucket.com/albums/cc53/zymurgy_bucket/lincoln-drop_zpseeb04690.jpg Now if I want to explain why the rapid dip has occurred for the target term "Lincoln Solicitors", here's what I'd do: 1. Go to Webmaster Tools and check for crawl errors 2. See if a Google algo change has changed the rules of engagement 3. Check that another site administrator hasn't tinkered with the original layout. But I wonder what process other SEO practitioners follow to explain to a disgruntled client - "Why have the rankings that I pay you to look after nosedived?" Any insights welcome 🙂

    | Nightwing
    0

  • Hi Guys, Some time ago one of the SEO experts said to me that if I get links from the same IP address, Google doesn't count them as having much value. For example, I am a web developer and I host all my clients' websites on one server and link them back to me. I'm wondering whether those links have any value when it comes to SEO, or should I consider getting different hosting providers? Regards Uds

    | Uds
    0

  • I have a website, www.Hindi-comedy.com. On 12 September the traffic gradually fell from 320 clicks to 50 clicks a day, and my website's ranking, which was on the 1st or 2nd page, is now nowhere. Has Google applied some update on 12 September? I used to rank well for the keyword "Hindi comedy" but now it's nowhere. Regards Gaurav

    | gaurav.agarwal
    0

  • A manufacturer's product website (product.com) has an associated direct online store (buyproduct.com). The online store has a lot of duplicate content: product detail pages and key article pages such as technical/scientific data are duplicated on both sites. What are some ways to lessen the duplicate content here? product.com ranks #1 for several key keywords, so any penalties can't be too bad, and buyproduct.com is moving its way up the SERPs for similar terms. Ideally I'd like to combine the sites into one, but that's not in the budget right away. Any thoughts?

    | Timmmmy
    0

  • This has happened several times during this year. After launch, as soon as we reach exactly 100K uniques, Google kills 90% of the traffic. Then there's a sudden recovery (pretty much without any action from us) after several weeks, not connected to any algo updates/refreshes. No warnings. No malware. WMT clean as a baby. Only good old whitehat SEO. Not even close to the edge of wrongdoing :) This time it happened again on Aug. 22nd, right after Panda 3.9.1. What is different now is that on the same exact date Bing traffic went down as well. http://bit.ly/eEu27I Need advice :)

    | LocalLocal
    0

  • On the product listing pages of our e-commerce site we use POST forms as 'Add to Cart' buttons. Because of that we have dozens (~40-80) of forms on any product listing page, and two questions regarding them: Do these forms affect the link juice of the other links on the page? Are there cases when forms are somehow counted by Google as links? Regards, Lucek

    | lucek
    0

  • Search results for my keyword (engraved wedding glasses) produce several pages of linked domains. (My domain is giftthings.net.) Some are good. And admittedly, some are not so good. My question then is simply, why does SEOmoz link analysis show such a small number of links? And the second part of my question is, is there some sort of "magic number", some sort of threshold that triggers Google's interest? With a link list that is small but growing, am I missing something in my concern that I'm not moving up in the search listings? I've written a few articles and am continuing my work on link building, but I remain buried in the search results.

    | AhmadS
    1

  • Hi, I have 2 questions about the "call" button in mobile Google SERPs when doing a business-name search: since when has this button been available in SERPs, and is there anything specific you can do to actually have Google display that call button (schema.org, ...)? Kind regards Pieter

    | TruvoDirectories
    0
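
    For reference, a minimal schema.org sketch for marking up the business phone number (microdata; there is no guarantee that markup alone triggers the call button - Google decides when to show it):

        <!-- Business name and phone number below are placeholders -->
        <div itemscope itemtype="http://schema.org/LocalBusiness">
          <span itemprop="name">Example Business</span>
          Call us: <span itemprop="telephone">+32 3 123 45 67</span>
        </div>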

  • Hello All, I am currently trying to establish the TITLE tag of my homepage. I am trying to target 2 terms plus my company name. For example purposes, the two keywords are: Widget Program, Widget Software. My company name is: Widget Direct. I originally had the title as: Widget Program | Software | Widget Direct. My thought was that I didn't want to repeat the word "Widget" too many times. However, the SEOmoz on-page report card keeps telling me I should have the exact keyword in my title tag. In that case it would make the title: Widget Program | Widget Software | Widget Direct. Do you think that is better, so that I have each keyword in the title, or will that result in a penalty because it looks like I'm stuffing the title with the keyword 'widget'? Any insight is greatly appreciated! Thanks!

    | Robert-B
    0
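
    For reference, the two variants in question as they would appear in the page <head> (the "Widget" terms are the question's own placeholders):

        <!-- Shortened version: second keyword only partially matched -->
        <title>Widget Program | Software | Widget Direct</title>

        <!-- Full exact-match version flagged as preferable by the report card -->
        <title>Widget Program | Widget Software | Widget Direct</title>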

  • Hello SEO Gurus! I have a client who runs 7 web properties. 6 of them are satellite websites, and the 7th is his company's main website. For a long while, my company has, among other things, blogged on a hosted blog at www.hismainwebsite.com/blog, and when we were optimizing for one of the other satellite websites, we would simply link to it in the article. Now, however, the client has gone ahead and set up separate blogs on every one of the satellite websites as well, and he has a nifty plug-in set up on the main website's blog that pipes articles that we write to their corresponding satellite blog as well. My concern is duplicate content. In a sense, this is like autoblogging -- the only thing that doesn't make it heinous is that the client is autoblogging himself. He thinks that it will be a great feature for giving users of his satellite websites some great fresh content to read -- which I agree with, as I think the combination of publishing and e-commerce is a thing of the future -- but I really want to avoid the duplicate content issue and a possible SEO/SERP hit. I am thinking that noindexing each of the satellite websites' blog pages might suffice. But I'd like to hear from all of you whether you think that even this may not be a foolproof solution. Thanks in advance! Kind Regards, Mike

    | RCNOnlineMarketing
    0
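
    For reference, a minimal sketch of the noindex approach described above: each syndicated post on the satellite blogs would carry a robots meta tag in its <head> so it can still be crawled (and its links followed) but is kept out of the index:

        <meta name="robots" content="noindex, follow">

    A commonly suggested alternative, where the plug-in allows it, is a cross-domain rel="canonical" on each satellite copy pointing back to the original post on www.hismainwebsite.com/blog.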

  • On my SEOmoz report, there are several 404 pages that I assume need deletion. Yes? When I am looking at my pages from the back end of WordPress, how do I identify these so I can delete or fix them? In the list of pages I have created, it is not at all apparent when I click into "edit" on a page that any of these are broken pages. I think the 404 pages are URLs from pages where I changed the URL to be more SEO friendly, but they don't really exist. I hope this makes sense - it is baffling to me : ) Thank you for any insight and help with getting these cleared. The errors listed in the report are below. Sheryl
    404 : Error http://durangocodentists.com/durango-dentists-why-greg-mann/dentists-in-durango-co/Cosmetic_Dentistry_Services_Teeth_Whitening_Montezuma_CO.html
    404 : Error http://durangocodentists.com/durango-dentists-why-greg-mann/dentists-in-durango-co/General_Dentistry_Services_White_Fillings_Montezuma_CO.html
    404 : Error http://durangocodentists.com/durango-dentists-why-greg-mann/dentists-in-durango-co/Request_an_Appointment.html
    404 : Error http://durangocodentists.com/videos/repairing-teeth/pid%3A4078865
    404 : Error http://durangocodentists.com/videos/teeth-whitening/pid%3A4078865
    404 : Error http://durangocodentists.com/videos/veneers/pid%3A4078865

    | TOMMarketingLtd.
    0
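
    For reference, if those old URLs ever had links or traffic, a 301 redirect to whatever page replaced them is usually preferred over letting them 404. A hypothetical .htaccess sketch (the destination path below is made up for illustration and would need to be the real replacement page; on WordPress a redirect plugin can do the same job without editing files):

        # Old reported URL -> hypothetical new, SEO-friendly URL (placeholder destination)
        Redirect 301 /durango-dentists-why-greg-mann/dentists-in-durango-co/Request_an_Appointment.html http://durangocodentists.com/request-an-appointment/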

  • In November 2010 Google introduced the "standout tag": http://support.google.com/news/publisher/bin/answer.py?hl=en&answer=191283 I can't find any articles/blog posts/etc. in Google after that date, but its use was suggested in a Google forum today to help with original-content issues. Has anyone used them? Does anyone know what's the latest with them? Are they worth trying for SEO? Is there a possible SEO penalty for using them? Thanks, Jean

    | JeanYates
    0
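
    For reference, per the Google News help page linked above, the standout tag is a link element placed in the <head> of the article page; whether it has any effect outside Google News is exactly the open question being asked:

        <!-- URL below is a placeholder for the original, standout article -->
        <link rel="standout" href="http://www.example.com/your-original-scoop-article.html" />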

  • I'm using the e-commerce platform Shopify to host an e-store. We've put our products into different collections. Shopify automatically creates different URL paths to a product that is in multiple collections. I'm worried that the same product listed in different collections is seen as different pages, and therefore duplicate content, by Google/Bing/Yahoo. Would love to get your opinion on this concern! Thanks! Matthew

    | HappinessDigital
    0
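
    For reference, the usual remedy is a rel="canonical" tag in the <head> of every collection-scoped product URL pointing at the single /products/ URL. A sketch (store domain and product handle are placeholders; many Shopify themes can output this automatically from the product's canonical URL):

        <!-- Placed on e.g. /collections/some-collection/products/example-product -->
        <link rel="canonical" href="http://example-store.myshopify.com/products/example-product" />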

  • Hi all, I checked one of my company's websites: the sitelinks in the SERP have the correct URLs, but one of the sitelinks' titles is completely irrelevant. Is it possible that it was changed from "outside"? Or maybe it's a bug? Thank you, Imre

    | DDL
    0

  • We are moving a site to a new domain. I have set up an .htaccess file and it is working fine. My problem is that Google Webmaster Tools now says it cannot access the robots.txt file on the old site. How can I make it still see the robots.txt file when the .htaccess is doing a full-site redirect? .htaccess currently has: Options +FollowSymLinks -MultiViews
    RewriteEngine on
    RewriteCond %{HTTP_HOST} ^(www.)?michaelswilderhr.com$ [NC]
    RewriteRule ^ http://www.s2esolutions.com/ [R=301,L]
    Google Webmaster Tools is reporting: "Over the last 24 hours, Googlebot encountered 1 errors while attempting to access your robots.txt. To ensure that we didn't crawl any pages listed in that file, we postponed our crawl. Your site's overall robots.txt error rate is 100.0%."

    | RalphinAZ
    0
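
    For reference, one common fix (a sketch based on the rules quoted above) is to exempt robots.txt from the site-wide redirect so Googlebot can still fetch it on the old host:

        # Same redirect as before, plus an exemption so /robots.txt is not redirected
        Options +FollowSymLinks -MultiViews
        RewriteEngine on
        RewriteCond %{HTTP_HOST} ^(www\.)?michaelswilderhr\.com$ [NC]
        RewriteCond %{REQUEST_URI} !^/robots\.txt$ [NC]
        RewriteRule ^ http://www.s2esolutions.com/ [R=301,L]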

  • I just received an error message from Google Webmaster Tools. I wonder whether it has something to do with the Yoast plugin. Could somebody help me with troubleshooting this? Here's the original message: Over the last 24 hours, Googlebot encountered 189 errors while attempting to access your robots.txt. To ensure that we didn't crawl any pages listed in that file, we postponed our crawl. Your site's overall robots.txt error rate is 100.0%. Recommended action If the site error rate is 100%: Using a web browser, attempt to access http://www.soobumimphotography.com//robots.txt. If you are able to access it from your browser, then your site may be configured to deny access to googlebot. Check the configuration of your firewall and site to ensure that you are not denying access to googlebot. If your robots.txt is a static page, verify that your web service has proper permissions to access the file. If your robots.txt is dynamically generated, verify that the scripts that generate the robots.txt are properly configured and have permission to run. Check the logs for your website to see if your scripts are failing, and if so attempt to diagnose the cause of the failure. If the site error rate is less than 100%: Using Webmaster Tools, find a day with a high error rate and examine the logs for your web server for that day. Look for errors accessing robots.txt in the logs for that day and fix the causes of those errors. The most likely explanation is that your site is overloaded. Contact your hosting provider and discuss reconfiguring your web server or adding more resources to your website. After you think you've fixed the problem, use Fetch as Google to fetch http://www.soobumimphotography.com//robots.txt to verify that Googlebot can properly access your site.

    | BistosAmerica
    0
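
    For reference, a quick check is to load http://www.soobumimphotography.com/robots.txt (single slash) in a browser and with Fetch as Google. If the file is simply missing or blocked, a minimal robots.txt at the web root that allows crawling looks like this:

        # Allow all crawlers; the Sitemap line is optional and assumes a sitemap exists at this path
        User-agent: *
        Disallow:

        Sitemap: http://www.soobumimphotography.com/sitemap.xml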

  • I have a Google Alert set up which is pulling information from a blog. I am receiving images as part of the alert. The issue I am having is that the images have nothing to do with the blog post. Is there a way to control which images are included in the alert? From what I have gathered, if it grabs an image it should be part of the blog post.

    | ricknakao
    0

  • I have a client that owns one franchise location of a franchise company with multiple locations. They have one large site with each location owning its own page on the site, which I feel is the best route. The problem is that each location page has basically duplicate content, resulting in something like 80 pages of duplicate content. I'm looking for advice on how to create unique content for each location page. What types of information can we write about to make each page unique? You can only twist sentences and content around so much before it all sounds cookie cutter and therefore offers little value.

    | RonMedlin
    0

  • With Bing's new app that will integrate their news feed into Facebook, I'd like to optimize for inclusion in Bing news pickup. Does Bing accept news sitemaps yet?

    | Aggie
    0

  • I have been whittling away at the duplicate content on my clients' sites, thanks to SEOmoz's pro report, and have been getting push back from the account manager at register.com (the site was built here and the owner doesn't want to move it).  He says these are the exact same page and he can't access one to redirect to the other.  Any suggestions? The SEOmoz report says there is duplicate content on both these urls: Durango Mountain Biking | Durango Mountain Resort - Cascade Village http://www.cascadevillagehotel.com/index.htm Durango Mountain Biking | Durango Mountain Resort - Cascade Village http://www.cascadevillagehotel.com/ Your help is greatly appreciated! Sheryl

    | TOMMarketingLtd.
    0
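
    For reference, if the hosting allows an .htaccess file (which may not be the case on a hosted site builder like register.com's), the standard fix is to 301 the index file to the root so only one URL exists:

        # Redirect direct requests for /index.htm or /index.html to the site root
        RewriteEngine On
        RewriteCond %{THE_REQUEST} ^[A-Z]+\ /index\.html?\ HTTP [NC]
        RewriteRule ^index\.html?$ http://www.cascadevillagehotel.com/ [R=301,L]

    Failing that, a rel="canonical" tag on the page pointing at http://www.cascadevillagehotel.com/ consolidates the two URLs without any redirect.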

  • Hi, I'm putting together a URL for a product we are selling. We sell IT training courses and the structure is normally: top folder = main courses section, sub folder = vendor, specific page = course name + term. An example is courses/microsoft/mcse-training. However, I have a product where the vendor and course name are the same. How should I best organise the URL - double mention or single mention? So a) courses/togaf/togaf-foundation-training or b) courses/togaf/foundation-training

    | RobertChapman
    0

  • When looking through my google webmaster tools, I clicked into the advanced settings under index status and was surprised  to see that google has marked around 90% of my pages on my site as "Not Selected" when crawling.  Please take a look and offer any suggestions. www.luxuryhomehunt.com

    | Jdubin
    0

  • I am not a website programmer, and all of our websites are in WordPress. I never change the coding on the back end. Is doing so a necessity if one wants to use Open Graph?

    | dahnyogaworks
    0
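
    For reference, Open Graph is just a set of meta tags in the page <head>; a minimal sketch looks like this, and on WordPress these are typically added by an SEO or social-sharing plugin rather than by hand-editing theme code:

        <!-- All values below are placeholders -->
        <meta property="og:title" content="Page Title" />
        <meta property="og:type" content="website" />
        <meta property="og:url" content="http://www.example.com/page/" />
        <meta property="og:image" content="http://www.example.com/image.jpg" />
        <meta property="og:description" content="Short description of the page." />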

  • On many publication sites I have noticed weird links like I have never seen before: <a href="http://test.com" onclick="linkClick(this.href)">Test</a> Are these still followed links? Is the only thing that determines a nofollow link "rel=nofollow"? So as long as the link doesn't have that, it's good to go? Why might they have used a link like this? For tracking?

    | BlueLinkERP
    0
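
    For reference, a sketch of the two cases (linkClick is presumably a site-specific tracking function quoted from the question, not a standard API):

        <!-- An onclick handler does not change how the link is treated; this is a normal, followed link -->
        <a href="http://test.com" onclick="linkClick(this.href)">Test</a>

        <!-- Only an explicit rel="nofollow" (or a page-level robots nofollow) marks it as nofollow -->
        <a href="http://test.com" rel="nofollow" onclick="linkClick(this.href)">Test</a>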

  • Buongiorno from Latitude 53.92705600 Longitude -1.38481600 🙂 On this site http://www.collegeofphlebology.com/ there are multiple banners pointing to 3rd-party sites, illustrated here: http://i216.photobucket.com/albums/cc53/zymurgy_bucket/banner-links-to-other-sites.jpg So my question is please: 1. What effect, if any, will these banners have - will they leak SEO juice? (Love that phrase - not.) 2. If they are detrimental, will adding nofollow to the links resolve the problem, or... is linking out no problem in terms of losing authority? Grazie tanto, David

    | Nightwing
    0
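
    For reference, a sketch of a banner link with nofollow added; this asks search engines not to pass authority through the banner while leaving it fully clickable for visitors:

        <!-- URL and image path are placeholders -->
        <a href="http://www.third-party-site.com/" rel="nofollow">
          <img src="/images/partner-banner.jpg" alt="Partner banner">
        </a>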

  • We have this site set up in Analytics as www.domain.com, and Analytics is showing referral traffic as coming from domain.com. I just wanted to make sure I'm right in the theory that Google is counting the 301 (from the non-www to the www version) as a different site and showing what is otherwise direct traffic as traffic coming from domain.com. If that's wrong, let me know. Otherwise I'll just go with that theory, since no one on any forums that I could find had an answer to it.

    | KateGMaker
    0

  • Hi all you SEO experts from SEOmoz, I have a question about one of my webshops where I have the same product listed in different categories, and where on the duplicate pages I use a rel=canonical tag pointing to the main product URL. I just have to verify with you guys that I did it correctly. Example on the shop (this is just an example):
    www.phonetech.dk/shop/product1.html - this is the main URL; the following are duplicates
    www.phonetech.dk/shop/iphone3G/product1.html - canonical tag on this one pointing to the main
    www.phonetech.dk/shop/iphone3g/backcovers/product1.html - canonical tag on this one pointing to the main
    www.phonetech.dk/shop/iphone3gs/colorbackcovers/product1.html - canonical tag here also pointing to the main
    I hope you guys can confirm that my use of the canonical tag is correct. Regards Christian - Denmark

    | noerdar
    0
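
    For reference, with that setup each duplicate URL would carry a tag like this in its <head> (using the example URLs from the question), and the main URL itself would either have no canonical or a self-referencing one:

        <!-- Placed in the <head> of each duplicate category-path URL -->
        <link rel="canonical" href="http://www.phonetech.dk/shop/product1.html" />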

  • Experimenting a little bit to recover from Panda and added "noindex" tag for quite a few pages. Obviously now we need Google to re-crawl them ASAP and de-index. Should we leave these pages in sitemaps (with updated "lastmod") for that? Or just patiently wait? 🙂 What's the common/best way?

    | LocalLocal
    0
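
    For reference, a common approach is to leave the noindexed URLs in the sitemap with a fresh lastmod so they get re-crawled sooner, then drop them from the sitemap once they fall out of the index. A sketch of one entry:

        <!-- URL and date below are placeholders -->
        <url>
          <loc>http://www.example.com/noindexed-page/</loc>
          <lastmod>2012-10-01</lastmod>
        </url>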

  • Hi, I have been monitoring some of the authority sites and I noticed something with one of them. This high-authority site suddenly started using multiple tags for each post. And I mean loads of tags, not just three or four. I see that each post comes with at least 10-20 tags. And these tags don't always make sense either. Let's say there is a video for "Bourne Legacy"; they list tags like bourne, bourne legacy, bourne series, bourne videos, videos, crime movies, movies, crime etc. They don't even seem to care about duplicate content issues. Let's say the movie is named The Dragon; they would include dragon and the-dragon in the tags list, and despite those two category pages (/dragon and /the-dragon) being exactly the same now, they still wouldn't mind listing both tags underneath the article. And no, they don't use the canonical tag. (There isn't even a canonical meta element on any page of that site.) So I am curious. Do they just know they have a very high DA, so they don't need to worry about duplicate content issues? Or am I missing something here? Maybe the extra tags are doing more good than harm?

    | Gamer07
    0

  • Each month I upload my auction catalogue in different formats (Word, PDF and RTF). I have about 9 years of catalogues online that have all been indexed by Google. In each catalogue there is a link to my terms and conditions page (which has given that page quite high page authority for some unusual, but desired, keywords); there are also many, many mentions of non-desired keywords in each of those documents, and links to my domain. Is it worth updating all these old, previously indexed catalogues with better keyword juice and more relevant links? Would they even get re-visited by Google? I suppose that leads to the next question... is it worth adding each of these pages to my sitemap? To this point I have only added my major pages, not any of the subordinate pages etc.

    | blinkybill
    0

  • Hi, on our new website, which is just a few weeks old, upon logging into Webmaster Tools I am getting the following message: Googlebot can't access your site - The overall error rate for DNS queries is 50%. What do I need to do to resolve this? I have never had this problem before with any of my sites. The domains are with Fasthosts (UK) and hosting is with DreamHost. What is the recommended course of action? Google mentions contacting your host, in my case DreamHost - but what do you need to ask them in a support ticket? When doing a fetch in WMT the fetch status is a success.

    | ocelot
    0

  • Hi all, After searching around, there doesn't seem to be any clear agreement in the SEO community on the best way to implement a shareable dynamic infographic for other people to put into their sites, i.e. one that will pass credit for the links to the original site. Consider the following example for the web application that we are putting the finishing touches on: The underlying site has a number of content pages that we want to rank for. We have created a number of infographics showing data overlaid on top of a Google map. The data continuously changes and there are JavaScript files that have to load in order to achieve the interactivity. There is one infographic per page on our site, and there is a link at the bottom of the infographic that deep-links back to each specific page on our site. What is the ideal way to implement this infographic so that the maximum SEO value is passed back to our site through the links? In our development version we have copied the YouTube approach and implemented this as an iframe, e.g. <iframe height="360" width="640" src="http://www.tbd.com/embed/golf" frameborder="0"></iframe>. The link at the bottom of that then links to http://www.tbd.com/golf This is the same approach that YouTube uses, however I'm nervous that the value of the link won't pass from the sites that are using the infographic. Should we do this as an embed object instead, or some other method? Thanks in advance for your help. James

    | jtriggs
    0
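
    For reference, a common hedge (a sketch using the placeholder URLs from the question) is to keep the iframe for the interactive part but put a plain, crawlable anchor in the copy-and-paste embed snippet, outside the iframe, since a link rendered inside an iframe on a third-party page is exactly what the poster is unsure about:

        <iframe height="360" width="640" src="http://www.tbd.com/embed/golf" frameborder="0"></iframe>
        <!-- Plain anchor outside the iframe so crawlers see a normal link on the embedding page -->
        <p>Source: <a href="http://www.tbd.com/golf">Golf data infographic</a></p>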

  • I have a WordPress blog. I am getting an error message from SEOmoz: "too many on-page links". However, SEOmoz is counting a full month of blog posts as one page. For example, 3 on-page internal links in each post times 30 different blog articles in a month is recorded as 90 on-page links. Is there any mechanism to fix this in WordPress?

    | wianno168
    0

  • Hey guys, I'm new to SEO and have the following error msg: 'Duplicate Page Content'. Of course I know what it means, but my question is how do you delete the old pages that have duplicate content? I used to run my website through Joomla! but have since moved to Shopify. I see that the duplicated site content is still from the old Joomla! site and I would like to learn how to delete this content (or best practice in this situation). Any advice would be very helpful! Cheers, Peter

    | pjuszczynski
    0

  • Hi All, We have a multinational client that would like servers in different countries with localised language. The DNS will determine which server in which country to serve from. Are there any SEO implications in terms of content duplication? Thanks Chris Byrnes

    | SEOBrisbane90
    0
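
    For reference, the standard way to signal localised variants to Google is rel="alternate" hreflang annotations in the <head> of each version (or in the sitemap), so that largely similar pages are treated as language/country alternates rather than as plain duplicates. A sketch with placeholder domains:

        <!-- One line per language/country version; domains are placeholders -->
        <link rel="alternate" hreflang="en-gb" href="http://www.example.co.uk/" />
        <link rel="alternate" hreflang="en-au" href="http://www.example.com.au/" />
        <link rel="alternate" hreflang="en-us" href="http://www.example.com/" />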

  • I have an old domain. When I use the link explorer I get way more juice out of the www version of my domain. I will be using WordPress to set up a new site on the same domain name. My question is: how do I make it proper for SEO? Do I just set the site address to the www version in WordPress and be done with it? Does it even matter? (I'm thinking it does.)

    | imagatto2
    0

  • I'm trying to figure out why, when I search for "international news" or "world news", for example, some sites in the SERPs have links to news articles, while others don't. For "international news", the results for Fox News and the New York Times have links to articles, while CNN (the top result) only has sitelinks. I would appreciate any theories on why this happens. Thanks.

    | seoFan21
    0

  • Hi Everyone, I have a question. Your input will be very much appreciated. My company's new website design is using a popup. I have some reservations about it and I want to know what your thoughts are. OK, some information on what this popup is like: when a user clicks on a subcategory page, there's a popup that asks for size, color, etc. - it's like a form and those are the criteria. If nothing is selected, the product list on the subcategory page doesn't load - so the only thing showing is the H1 and description, but everything else is empty. When a user does select criteria, the landing page is no longer the subcategory but another page with that ID. So basically the user never really lands on the subcategory page but on another page with a different query string. Is this bad for SEO? Would you recommend keeping the popup? Thanks,

    | truckguy77
    0

  • I have a site which has various white-label sites with the same content on each. I have canonical tags on the white-label sites pointing to the main site. I have changed some URLs on the main site and 301'd the previous URLs to the new ones. Is it OK to have the canonicals pointing to the old URLs that now have a 301 redirect on them?

    | BeattieGroup
    0

  • Hi Mozers, I keep puzzling over this one! I work from home and really don't want to plaster my address all over the web. The GP page now allows me to hide my exact location, which is great. However, as far as I can see this is not the case with all the potential local directories and listings. I have been trying to get around this by not adding my house number and the last digit and 2 characters of my postcode. So far this has been allowed by the local listings I have signed up with. When I tried doing as recommended by the excellent Miriam and checking my business name with 'Getlisted', I found that I could only see these local listings if I added the doctored address, i.e. no house number or full postcode. My question, finally, is: if I continue in this fashion for businesses based at home addresses, am I going to confuse the search engines? I want to provide a consistent NAP, but GPP insists that I add a full postcode. The only way I can see around this is to add: street name, city, full postcode, and omit the house name/number. Would this be a reasonable workaround to maintain client confidentiality and satisfy the NAP requirement of local search?

    | catherine-279388
    0

  • SEOmoz is saying that I have duplicate content on: http://www.XXXX.com/content.asp?ID=ID http://www.XXXX.com/CONTENT.ASP?ID=ID The only difference I see in the URL is that the "content.asp" is capitalized in the second URL. Should I be worried about this or is this an issue with the SEOmoz crawl? Thanks for any help. Mike

    | Mike.Goracke
    0
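
    For reference, since the two URLs differ only in letter case (Windows/IIS servers, which typically serve .asp pages, treat paths case-insensitively, so both resolve to the same page), a canonical tag on the page that always uses one consistent casing lets the variants consolidate. A sketch using the redacted URL from the question:

        <!-- Always emit one consistent (lowercase) casing; XXXX.com is the question's redacted domain -->
        <link rel="canonical" href="http://www.XXXX.com/content.asp?ID=ID" />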

  • PensacolaRealEstate.com I am new to SEO. I lost my first place ranking on google. I held it for YEARS. I think I lost it with the last update... How do I analyze my site to find what is penalizing me? How can I get back to the top?

    | JML1179
    0

  • Hello, If your website is getting flagged for duplicate content between your main domain www.domain.com and your multilingual English domain www.domain.com/en/, is it wise to 301 redirect the English multilingual pages to the main site? Please advise. We recently installed the Joomish component on one of our Joomla websites in an effort to streamline a Spanish translation of the website. The translation was a success and the new Spanish webpages were indexed, but unfortunately one of the web developers enabled the English part of the component and some English webpages were also indexed under the multilingual English path www.domain.com/en/, and that flagged us for duplicate content. I added a 301 redirect to send all visitors from the www.domain/en/ webpages to the main www.domain.com/ webpages. But is that the proper way of handling this problem? Please advise.

    | Chris-CA
    0
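
    For reference, a sketch of that redirect in .htaccess (www.domain.com stands in for the real domain, as in the question, and this assumes the unwanted English pages mirror the main site's paths):

        # Send every indexed /en/ URL to the corresponding page on the main site
        RedirectMatch 301 ^/en/(.*)$ http://www.domain.com/$1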

  • Hi guys, We are launching a new product. The web pages are being built by a 3rd party and fall outside our current CMS. We're considering hosting it on 1) a subdomain, 2) a folder within the existing site (although that will be tricky to implement) or 3) a different URL altogether. What would you say is best for SEO? Many thanks in advance.... Nigel

    | Richard555
    0
