
Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Intermediate & Advanced SEO

Looking to level up your SEO techniques? Chat through more advanced approaches.


  • Hi, I have a question about the "4xx Status Code" errors appearing in the Analysis Tool provided by SEOmoz. They are indicated as the worst errors for your site and must be fixed. I get this message from the good people at SEOmoz: "4xx status codes are shown when the client requests a page that cannot be accessed. This is usually the result of a bad or broken link." OK, my question is the following: how do I fix them? Those pages are shown as "404" pages on my site... isn't that enough? How can I fix the "4xx status code" errors indicated by SEOmoz? Thank you very much for your help. Sal

    | salvyy
    0
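The usual fix is to either repair the broken links themselves or 301-redirect each dead URL to the closest live page, so the 4xx warnings clear on the next crawl. A minimal Apache .htaccess sketch, assuming an Apache host; all paths are hypothetical examples, not URLs from the question:

```apache
# Map each dead URL reported in the crawl to the nearest live
# equivalent with a permanent (301) redirect.
Redirect 301 /old-page.html /new-page/
Redirect 301 /removed-section/ /
```

Each redirect removes one 404 from the report; URLs with no sensible target can simply keep returning 404, which is correct behavior for genuinely gone pages.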

  • Hi everyone, I apologize if the answer to this question is obvious, but I wanted some input on how changing our site's web address will affect our SERPs. We are looking to change our website address from a.com to b.com due to a rebranding of our company (primarily to expand our product line, as our current URL and company name are restricting). I understand that this can be done using 301 redirects and via Google Webmaster Tools. My question is: how does this work exactly? Will our old website address show in SERP rankings, and when a user clicks on the listing, are they redirected to our new address? With regard to building new links from press releases etc., do we have links point to our new web address or the old one in order to increase SERPs? Does Google see our old address and new address as the same website, and therefore it does not matter where inbound links point, since both will increase our ranking positions? It took 6 years of in-house SEO to get our website to rank on the first page of all the major search engines for our keywords, so we are being very cautious before we do anything. Thanks everyone for your input, it is greatly appreciated 🙂

    | AgentMonkey
    0
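Mechanically, the move is usually one site-wide 301 on the old domain so every old URL forwards to its counterpart on the new one. A sketch using the a.com/b.com names from the question, assuming an Apache server with mod_rewrite:

```apache
# In the .htaccess (or vhost config) of the OLD domain, a.com:
# forward every request to the same path on the new domain.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?a\.com$ [NC]
RewriteRule ^(.*)$ http://b.com/$1 [R=301,L]
```

With this in place, a visitor clicking an old listing lands on the matching b.com page, and the old listings are gradually replaced in the index as the 301s are processed.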

  • Hi! We are building a site that is going to be available in several countries with the same language (Spanish), and we have some doubts about which is the best way to do it. Option 1) Subdomains. Example: españa.mydomain.com, mexico.mydomain.com (the problem here is that there are some issues with link building for subdomains). Option 2) Language folders. Example: mydomain.com/es/es, mydomain.com/es/mx (the problem here is that the category slot in the URL ends up in 3rd position, e.g. mydomain.com/es/es/category, which is not recommended for SEO). Option 3) Country domains. Example: mydomain.es, mydomain.mx (the link building required is going to be much greater, because we have to multiply the links we need to rank well across the different domains for each country). I am not sure which one is the best option, what do you think? The only thing I am sure of is to use the tag rel="alternate" hreflang="x" to avoid duplicate content, because the index and categories are going to be the same; the only thing that is going to change is the products for each country. Looking forward to your suggestions! Thanks, regards, Exequiel

    | SeoExpertos
    0
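For reference, the rel="alternate" hreflang="x" annotations mentioned above look like this in the head of each regional page; the same full set is repeated on every version. URLs follow the folder structure from option 2 and are illustrative:

```html
<!-- In the <head> of the Spain edition of a page; mirror the same
     set of tags on the Mexico edition and any other versions. -->
<link rel="alternate" hreflang="es-es" href="http://mydomain.com/es/es/" />
<link rel="alternate" hreflang="es-mx" href="http://mydomain.com/es/mx/" />
<link rel="alternate" hreflang="es" href="http://mydomain.com/es/" />
```

The generic `hreflang="es"` entry acts as a catch-all for Spanish speakers outside the targeted countries.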

  • I've got an Exact Match Domain that has just started to do well in Google for, say, the past year. I've always received good rankings from Bing and Yahoo, but I love the traffic levels that Google sends. Long story short: on the 25th, according to Webmaster Tools, my impressions on their search engine have been destroyed. No problems, not de-indexed, just not showing my site anymore. I like this site and have been careful; I built some links, and the anchor text is suspect but also not suspect, because it's the same as the domain. What I feel the problem may be is the site structure. I set it up a long time ago like this: Exact-Match-Keyword.com/Exact-match-keyword.php/state. I thought it looked kinda spammy at the time but also thought it might help. Now I'm wondering: if I shorten all the page titles to the state name and 301 the old links, will I regain rankings, or might I lose some from other search engines? I used to think penguins were cute......

    | TEGS
    1

  • Good evening guys, I changed my titles last month, in preparation for the over optimisation penalty and the result was an instant and quite dramatic loss in traffic. I believe the reason is, the change resulted in a lot of duplicate titles. My website is similar to deviant art, but for mobile phones. So the titles include the brand of mobile phone for example. The titles were: Upload name + Brand + Content type - 3 tags - FILEID So an example would be Black Nokia wallpaper - black, abstract, grey - 12345 I changed them to Black Nokia wallpaper by artist name on domain name. But this resulted in thousands of duplicate titles and a dramatic loss in traffic. For example a user could upload 20 black wallpapers. With this in mind, I need to change my titles and fast. But I don't want to make another mistake. The one I am quite keen to try is: Black Nokia Wallpaper - Tag1, tag2 wallpapers - on domain name. So the main variable would be the name of the upload and then the 2 tags, to mix things up a little. Another option would be to throw the file ID in there somewhere? As that will always be unique. Perhaps the file ID could be in the place of the "wallpapers" after the two tags? I'd like to keep the domain name, for branding reasons. Any other suggestions are warmly welcomed. Thanks a lot.

    | seo-wanna-bs
    0
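The "throw the file ID in there somewhere" idea from the post above can be sketched as: build the preferred "name - tag1, tag2 wallpapers - domain" pattern first, and append the always-unique file ID only when a collision actually occurs, so most titles stay clean. A rough Python sketch; the record field names are hypothetical:

```python
def build_title(name, tags, domain):
    # e.g. "Black Nokia Wallpaper - black, abstract wallpapers - domain.com"
    return f"{name} - {', '.join(tags[:2])} wallpapers - {domain}"

def dedupe_titles(records, domain):
    """records: dicts with 'id', 'name', 'tags' (hypothetical schema).
    Returns a mapping of file ID -> unique page title."""
    seen, titles = set(), {}
    for rec in records:
        title = build_title(rec["name"], rec["tags"], domain)
        if title in seen:
            # Collision: append the unique file ID to break the tie.
            title = f"{title} - {rec['id']}"
        seen.add(title)
        titles[rec["id"]] = title
    return titles
```

This keeps the branding suffix while guaranteeing uniqueness even when a user uploads twenty wallpapers with the same name and tags.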

  • We have a chance to purchase a domain with our main KW dot net. We are already a competitor for this KW in its other variations. This domain is currently being used as a redirect to another site. What are the risks associated with changing domain names, and how do we best evaluate whether this domain will even help us win that KW in Google results?

    | devonkrusich
    0

  • I noticed bad 404 error links in Google Webmaster Tools; they were pointing to directories that do not have an actual page but hold information. Example: there are links pointing to our PDF folder, which holds all of our PDF documents. If I type in example.com/pdf/, it brings up an unformatted webpage that displays all of our PDF links. How do I prevent this from happening? Right now I am blocking these in my robots.txt file, but if I type them in, they still appear. Or should I not worry about this?

    | hfranz
    0
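The unformatted page described above is Apache's auto-generated directory listing, which robots.txt does not switch off (robots.txt only affects crawlers, not browsers). A .htaccess sketch, assuming an Apache server with mod_headers available:

```apache
# Turn off auto-generated folder listings so /pdf/ no longer
# renders an index of files.
Options -Indexes

# Optionally also keep the PDF files themselves out of the index;
# unlike a robots.txt block, noindex removes already-discovered URLs.
<FilesMatch "\.pdf$">
  Header set X-Robots-Tag "noindex"
</FilesMatch>
```

With `-Indexes` set, a request for the bare folder returns 403 Forbidden instead of the file list.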

  • I am attempting to figure something out with a site I'm trying to fix. The problem is that I've got two categories that are basically related keywords. I set this up when I first started doing this work and didn't know what I was doing. The site at one time was ranking on the first page for a specific term (example: 'project manager salary', posted in the category 'project manager salary'). But then we added 'project manager salary in Vermont' and the posts for all the other states in a different category called 'project manager salaries and benefits'. So my question is this: would this cause some kind of keyword rank cannibalization? How do I fix this properly? Thanks! Michael

    | mtking.us_gmail.com
    0

  • I have a complex SEO issue I've been wrestling with, and I'd appreciate your views on this very much. I have a sports website, and most visitors are looking for the games that are played in the current week (I've studied this; it's true). We're creating a new website from scratch and I want to do this as well as possible. We want to use the most elegant and best way to do this. We do not want to use work-arounds such as iframes, hiding text using AJAX, etc. We need a solid solution for both users and search engines. Therefore I have written down three options: using a canonical URL; using 301 redirects; using 302 redirects. Introduction: the page 'website.com/competition/season/week-8' shows the soccer games that are played in game week 8 of the season. The next week, users are interested in the games that are played in that week (game week 9). So the content a visitor is interested in is constantly shifting, because of the way competitions and tournaments are organized. After a season, the same goes for the season, of course. The website we're building has the following structure: competition (e.g. 'premier league'), season (e.g. '2011-2012'), play week (e.g. 'week 8'), game (e.g. 'Manchester United - Arsenal'). This is the most logical structure one can think of; this is what users expect. Now we're facing the following challenge: when a user goes to http://website.com/premier-league he expects to see a) the games that are played in the current week and b) the current standings. When someone goes to http://website.com/premier-league/2011-2012/ he expects to see the same. When someone goes to http://website.com/premier-league/2011-2012/week-8/ he expects to see the same: the games that are played in the current week and the current standings. So essentially there are three places, within every active season within a competition, where logically the same information has to be shown.
To deal with this from a UX and SEO perspective, we have the following options. Option A - Use a canonical URL. Using a canonical URL could solve this problem. You could point a canonical tag from the current week page and the season page to the competition page. So: the page on 'website.com/$competition/$season/playweek-8' would have a canonical tag that points to 'website.com/$competition/', and the page on 'website.com/$competition/$season/' would have a canonical tag that points to 'website.com/$competition/'. The next week, however, you want the canonical tag on 'website.com/$competition/$season/playweek-9', and the canonical tag on 'website.com/$competition/$season/playweek-8' should be removed. So then: the page on 'website.com/$competition/$season/playweek-9' would have a canonical tag that points to 'website.com/$competition/', and the page on 'website.com/$competition/$season/' would still have a canonical tag that points to 'website.com/$competition/'. In essence, the canonical tag is constantly traveling through the pages. Advantages: UX - for a user this is a very neat solution; wherever a user goes, he sees the information he expects. SEO - the search engines get very clear guidelines as to how the website functions, and we prevent duplicate content. Disadvantages: I have some concerns regarding the weekly changing canonical tag from an SEO perspective. Every week, within every competition, the canonical tags are updated. How often do search engines update their index for canonical tags? I mean, say it takes a search engine a week to visit a page, crawl it, and process the canonical tag correctly; then the search engines will be a week behind on figuring out the actual structure of the hierarchy. On top of that: what do the changing canonical URLs do to the 'quality' of the website? In theory this should all work, but I have some reservations.
Also: if there is a canonical tag on 'website.com/$competition/$season/week-8', what does this do to the indexation and ranking of its subpages (the actual match pages)? Option B - Use 301 redirects. With 301 redirects, the user and the search engine are essentially treated the same: when the season page or the competition page is requested, both are redirected to the game week page. The same applies here as for the canonical URL: every week there are changes in the redirects. So in game week 8: the page on 'website.com/$competition/' would 301-redirect to 'website.com/$competition/$season/week-8', and the page on 'website.com/$competition/$season' would 301-redirect to 'website.com/$competition/$season/week-8'. A week goes by, so then: the page on 'website.com/$competition/' would 301-redirect to 'website.com/$competition/$season/week-9', and the page on 'website.com/$competition/$season' would 301-redirect to 'website.com/$competition/$season/week-9'. Advantages: there is no loss of link authority. Disadvantages: before a play week starts, the play week page can be indexed; however, in the current play week the play week page 301-redirects to the competition page, and after that week the 301 redirect is removed again and it's indexable. What do all the (changing) 301 redirects do to the overall quality of the website for search engines (and users)? Option C - Use 302 redirects. Most SEOs will refrain from using 302 redirects; however, a 302 redirect can be put to good use: serving a temporary redirect. Within my website, the content that's most important to users (and therefore search engines) is constantly moving; in most cases, after a week a different piece of the website is the most interesting for a user. So let's take our example above. We're in play week 8.
If you want 'website.com/$competition/' to redirect to 'website.com/$competition/$season/week-8/', you can use a 302 redirect, because the redirect is temporary. The next week, the 302 redirect on 'website.com/$competition/' will be adjusted to point to 'website.com/$competition/$season/week-9'. Advantages: we're putting the 302 redirect to its intended use, and the pages that 302-redirect (for instance 'website.com/$competition' and 'website.com/$competition/$season') will remain indexed. Disadvantages: I'm not quite sure how Google will handle this; they're not very clear on how exactly they handle a 302 redirect and in which cases a 302 redirect might be useful, and in most cases they advise webmasters not to use it. I'd very much like your opinion on this. Thanks in advance, guys and gals!

    | StevenvanVessum
    0
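Option C from the post above can be sketched in a few lines of PHP (matching the variables in the question's URL scheme). The week-lookup helper is hypothetical, standing in for whatever schedule logic the site uses:

```php
<?php
// Option C sketch: the competition page 302s to the current game week.
// current_game_week() is a hypothetical helper returning e.g. 8.
$week = current_game_week($competition, $season);
$url  = "/{$competition}/{$season}/week-{$week}/";

// Temporary by design: next week this same code targets week 9,
// so the competition and season URLs themselves stay indexed.
header('Location: ' . $url, true, 302);
exit;
```

Because the redirect target is computed on every request, nothing has to be "updated" weekly; the 302 simply follows the schedule.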

  • Hello, I had all the content of my product reviews in the product URL, i.e. /computer-brand-y-intel-i5/, and there I wrote my reviews and users did the same thing, attached images, etc. Now, it has been proposed that I do it this way: /computer-brand-y-intel-i5/ (here, the main specifications), /computer-brand-y-intel-i5/review/, /computer-brand-y-intel-i5/user-reviews/, /computer-brand-y-intel-i5/multimedia/. Many other websites have the same specifications, so with my reviews etc. I enriched the main URL. Which do you think is the best option? Thanks a lot and regards

    | antorome
    0

  • We are laying the foundation for a domain change. I'm gathering all of the requirements listed from Google (301s, signing up the new domain with WMT, etc.), customer communications, email system changes, social updates, etc. But through everything I've read, I'm not quite clear on one thing. We have the option of keeping our current domain and the new domain running off the same eCommerce database at the same time. This means that we have the option of running two exact duplicates simultaneously. The thought is that we would slowly, quietly turn on the new domain, start the link-building and link-updating processes, and generally give the new domain time to make sure it's not going to croak for some reason. Then, after a week or so, flip on a full 301 rewrite for the old domain. There are no concerns regarding order databases, as both domains would be running off the same system. The only concern I have for the user experience is making sure all internal links are set to relative, so visitors to the new domain aren't flipped over and freaked out by an absolute URL. I'm not confident that this co-existing strategy is the best approach, though. I'm wondering if it would be better from an SEO (and customer) perspective to: have the new domain active but performing a 302 redirect from the new domain to the corresponding page on the old domain; then, when we're ready to flip the switch, implement the 301 redirect from old to new (removing the 302, of course) at switch time. Any thoughts or suggestions?

    | Goedekers
    0
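The two-phase plan described above maps onto two small rewrite blocks, assuming Apache with mod_rewrite; the domain names are placeholders for the real old/new pair:

```apache
# Phase 1 (testing): the NEW domain temporarily bounces visitors back
# to the old domain, so only one copy is publicly reachable/indexed.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?newdomain\.com$ [NC]
RewriteRule ^(.*)$ http://olddomain.com/$1 [R=302,L]

# Phase 2 (switch): remove the rule above, then on the OLD domain:
#   RewriteCond %{HTTP_HOST} ^(www\.)?olddomain\.com$ [NC]
#   RewriteRule ^(.*)$ http://newdomain.com/$1 [R=301,L]
```

The 302 in phase 1 signals "temporary", so the old domain keeps its rankings until the permanent 301 flips direction in phase 2.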

  • Hi all, I know that social bookmarks are considered to be a good thing when pointing to a specific page. Is the same true of simple Facebook "Likes" and tweets? By Likes I mean on-page Likes, not Likes of a Facebook fan page. Thanks

    | BeytzNet
    0

  • Cookstr appears to be syndicating content to shape.com and mensfitness.com. a) They integrate their data into partner sites with an attribution back to their site, skinned with the partner's look. b) They link the image back to the image hosted on Cookstr. c) The syndicated page does not have microformats or as much data as their own page does, so their own page is better for SEO. Is this the best strategy, or is there something better they could be doing to safely allow others to use our content? We don't want to share the content if we're going to get hit by a duplicate content filter or have another site outrank us with our own data. Thanks for your help in advance! Their original content page: http://www.cookstr.com/recipes/sauteacuteed-escarole-with-pancetta Their syndicated content pages: http://www.shape.com/healthy-eating/healthy-recipes/recipe/sauteacuteed-escarole-with-pancetta
    http://www.mensfitness.com/nutrition/healthy-recipes/recipe/sauteacuteed-escarole-with-pancetta

    | irvingw
    0

  • I launched a website for a client in mid-March. The site is already indexed, I have built quite a few links to it (the links are also indexed), and it ranks well for some targeted keywords. However, when I try to check backlinks to the site with Open Site Explorer, it comes back with "No Data Available For This URL". Is this something I should be worried about, or merely a case of 'recency' of page creation? I know it says that it can take 45-60 days for a site to be included in Linkscape, but I'm approaching the 60-day mark and still nothing.

    | Igor-Avidon
    0

  • In this post, http://www.seomoz.org/blog/restricting-robot-access-for-improved-seo, it mentions that the noindex tag is more effective than using robots.txt for keeping URLs out of the index. Why is this?

    | nicole.healthline
    0
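The short version of the answer the linked post gives: robots.txt only blocks crawling, not indexing, so a blocked URL can still appear in results based on external links alone; a noindex tag, once crawled, actually removes the page. The tag itself looks like this (illustrative snippet):

```html
<!-- In the <head> of the page you want kept out of the index.
     The crawler must be ABLE to fetch the page to see this tag,
     which is why pairing it with a robots.txt block defeats it. -->
<meta name="robots" content="noindex, follow" />
```

`follow` keeps the page's outgoing links passing value even though the page itself stays out of the index.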

  • One of our clients has a blog with an English and Spanish version of every blog post. It's in WordPress and we're using the Q-Translate plugin. The problem is that my company is publishing blog posts in English only. The client is then responsible for having the piece translated, at which point we can add the translation to the blog. So the process is working like this: We add the post in English. We literally copy the exact same English content to the Spanish version, to serve as a placeholder until it's translated by the client. (*Question on this below) We give the Spanish page a placeholder title tag, so at least the title tags will not be duplicate in the mean time. We publish. Two pages go live with the exact same content and different title tags. A week or more later, we get the translated version of the post, and add that as the Spanish version, updating the content, links, and meta data. Our posts typically get indexed very quickly, so I'm worried that this is creating a duplicate content issue. What do you think? What we're noticing is that growth in search traffic is much flatter than it usually is after the first month of a new client blog. I'm looking for any suggestions and advice to make this process more successful for the client. *Would it be better to leave the Spanish page blank? Or add a sentence like: "This post is only available in English" with a link to the English version? Additionally, if you know of a relatively inexpensive but high-quality translation service that can turn these translations around quicker than my client can, I would love to hear about it. Thanks! David

    | djreich
    0

  • What are your surefire ways to build up PageRank quickly and effectively for long-term gains? Do you have a checklist?

    | therealmarkhall
    0

  • Our site has 500 rel canonical issues. This is the way I understand the issue: all our blog posts automatically include a rel=canonical tag pointing to themselves.
    E.g. a blog post about content marketing has a canonical tag pointing to its own URL. Should this tag point to one of the main pages instead, so the link juice is sent back to our home page?

    | acs111
    0
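For reference, the self-referencing tag the post above describes looks like this (URL illustrative); each post's canonical simply names its own preferred URL:

```html
<!-- On http://www.example.com/blog/content-marketing/ -->
<link rel="canonical" href="http://www.example.com/blog/content-marketing/" />
```

Note that pointing every post's canonical at the home page instead would declare all posts duplicates of the home page, which is a different (and much stronger) claim than consolidating link juice.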

  • Hi, I have an important question and I hope you'll be able to help me find the answer. My site http://www.pokeronlineitalia.com had a PR of 3 (three). Then I had it restructured to make it look more appealing, in order to increase the conversion rate. The problem is that after it was redesigned, the PR suddenly dropped from 3 to 0. This is really bad, as it took me over three years to reach that point. Could you please analyze the site and find out what happened? I used SEOmoz's research tools to try to understand, and noticed the following message: "Accessible to Engines - Easy fix. Crawl status: Status Code: 200; meta-robots: noindex, follow; meta-refresh: None; X-Robots: None. Explanation: Pages that can't be crawled or indexed have no opportunity to rank in the results. Before tweaking keyword targeting or leveraging other optimization techniques, it's essential to make sure this page is accessible. Recommendation: Ensure the URL returns the HTTP code 200 and is not blocked with robots.txt, meta robots or x-robots protocol (and does not meta refresh to another URL)." Basically, the message said that the search engines cannot access the homepage (http://www.pokeronlineitalia.com). Could this be the reason why the PR dropped? What do I have to do to solve this problem? Is there a chance I can reach a PR of 3 again? Thank you very much for your help. It'd be great if you could help my site regain its SEO strength.

    | salvyy
    0

  • I've just signed up and now I want to start using all the information that your site is providing. How do I go about it? I know how to get to the 'back end' of my site, Joomla (CMS), and can alter all the information. I just need to know how to implement all the data you give me. Sorry, but I am new to this.....

    | Aim4fun
    0

  • Hey there! I was wondering if there were any particular pros or cons to sharing the same server IP address as a competitor's website which links back. Site 'A' = http://goo.gl/N1JUO // stockist of site 'B' brand products. Site 'B' = http://goo.gl/fU9hc // links to all stockists of its products with a follow link, including site 'A'. It is desirable for site 'A' to rank above site 'B', to reduce the volume of visitors finding other stockists linked from the stockists page of site 'B' when searching for site 'A''s primary, broadest keyword. However, site 'B' is one of site 'A''s best inbound links. So does it matter that site 'A' is sharing the same server IP address as site 'B'? Also, should site 'A' have only no-follow links to site 'B'? Look forward to your input, fellow SEOmozers 🙂 Cheers! Ben

    | chichesterdesign
    0

  • On a WordPress site, I have one blog post that performs extremely well for AdSense revenue. But the post is getting older and older, and requires me to place some updates into the article from time to time. It's a blog post, but it really feels more like a reference-type page (it's about stocks in a particular industry). Now that I see so many people landing on this page through search (#1 for the term), I'm thinking I really should develop this information further, make a reference page out of it, keep it updated, and link to it from the nav menu. However, I don't know if it will be bad to have both the reference page and the old post trying to rank for the same keyword term. (They won't be duplicate content; the new page will just be the same topic rewritten and expanded.) Is that something I can get penalized for? I'm getting very good income off this existing blog post and don't want to mess it up, but I also know that only keeping this info on a post that's getting older and older is not a good long-term plan, and I need to pounce on the interest in the subject matter. So, I see these options: 1. Create the new expanded page and let Google sort it out in the SERPs. 2. Create the new page and redirect the old blog post to the new page. That just doesn't seem right, though, to remove access to my old blog post. Which of these is the right thing to do, or is there some way I'm not thinking of?

    | bizzer
    0

  • Hi, I want to dynamically create unique page titles (and possibly meta descriptions too) on a 10k-page site. Many of the page titles are either duplicates or are missing. I heard about the option of grabbing the page titles from a database, or possibly using the H1 as the page title. solmelia.com (the website consists of mostly static pages). Any suggestions would be much appreciated. Best regards,

    | Melia
    0
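Both ideas mentioned above (titles from a database, H1 as fallback) can be combined in one small generator. A Python sketch, assuming a hypothetical record schema with a `title` field and the stored page HTML; the solmelia.com suffix comes from the question:

```python
import re

def page_title(row, site="solmelia.com", max_len=65):
    """Build a page title from a record (hypothetical schema):
    prefer an explicit title field, fall back to the page's H1."""
    title = (row.get("title") or "").strip()
    if not title:
        # Fall back to the first <h1> in the stored HTML body.
        m = re.search(r"<h1[^>]*>(.*?)</h1>", row.get("html", ""),
                      re.IGNORECASE | re.DOTALL)
        title = m.group(1).strip() if m else "Untitled page"
    # Brand suffix, truncated to a display-friendly length.
    return f"{title} | {site}"[:max_len]
```

Running this over the 10k records and flagging any remaining duplicates or "Untitled page" results gives a worklist of pages that still need hand-written titles.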

  • Hi all, I am in the process of creating a number of sites within the garden products sector; each site will have unique, original content and there will be no crossover. So for example I will have one on lawn mowers, one on greenhouses, another on garden furniture, etc. My original thinking was to create a single limited company that would own each of the domains, so all the registrant details would be identical. Is this a sensible thing to do? (I want to be totally white hat.) And what, if any, are the linking opportunities between each of the sites (16 in total)? Not to increase rankings, more from an authoritative perspective. And finally, how should I link between each site? Should I no-follow the links? Should I use keyword contextual links? Any advice or ideas would be appreciated 🙂 Please note: it has been suggested that I just create one BIG site. I've decided against this, as I want to use the keyword for each website in the domain name, and I believe this still has value. Thanks

    | danielparry
    0

  • Does using a slider widget (such as the one Skype uses) harm my SEO? Meaning, is the text still fully readable to search engines, or will it be in JavaScript? Are there any SEO-friendly sliders you would recommend?

    | theLotter
    0

  • Hi all, just wondering who in the SEOmoz community has come out on top after Penguin, and who has been hit, and why. Personally my site has come out on top. I started working on the site back in December and NOTHING had been done: no link development, no on-page, nothing, a virginal website. The site was chock-a-block with issues, both technical and in content. After 4 months of hard work, we have climbed from 100+ to the top ten on most of our phrases, and post-Penguin we have climbed even higher as some of our competitors were dragged down into the murky depths. So I think that's a win (for now). My focus has been on guest posting, social outreach, reviews, and getting my on-page right (still a ways to go, but our CMS is clunky, to say the least). A little humour attached 😉 (Why has no one yet stuck Matt Cutts' head on a penguin?) Are you a Penguin winner, or have you experienced the wrath of the Penguin?

    | Aran_Smithson
    0

  • If you have links in the navigation on the left-hand side of the website and in the content at the bottom of the page, and they link to the same page with different (or the same) anchor text, does it help the page (as it is surrounded by similar text), or is only the first link counted and that's it?

    | BobAnderson
    0

  • We are currently working on a new site structure, and we would like to remove a WP plugin that adds .html to the end of every URL. From my understanding it is best not to use this plugin. Once I take out this plugin, will I need to do anything for all the external links to count? Will the link juice pass through? If you type my URL now without the .html, the browser adds the .html back in. However, all the external links we built have the .html in the URL. Do I need to do any 301s or canonicals to pass link juice, or will I be fine after taking out the plugin?

    | SEODinosaur
    0
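Since the external links were built with .html in them, simply removing the plugin would 404 every one of them; a blanket 301 preserves the link authority. A .htaccess sketch, assuming an Apache host with mod_rewrite; apply it only after the plugin's own rewrite is removed, or it will loop:

```apache
# 301 the old .html URLs to their extensionless equivalents so
# existing external links keep passing their authority.
RewriteEngine On
RewriteRule ^(.+)\.html$ /$1 [R=301,L]
```

One rule covers the whole site, so there is no need to enumerate the individual links that were built.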

  • So I have a question. We have used nearly every trick in the book to rank our site, including a ton of white hat stuff.... but then also a lot of black hat practices that resulted in us dropping in the rankings by about 30-40 positions. And getting back to where we were (top 10 for most keywords) is proving to be nearly impossible. We have a ton of great content coming off of the site and we actually offer a quality product. We follow most of the guidelines advocated here on SEOmoz. But the black hat stuff we did has really taken a toll. And it's gonna be pretty much impossible to go back in time and erase all of the Black Hat stuff we did. So what should we do? Should we design a completely new website with a new domain? What can be done to help?

    | LilyRay
    0

  • We publish popular general-interest content. Using a commercial scanning service, we've found our copied content in many places. Is there SEO value in getting copied content removed from websites that infringe/copy our content? It is a time-consuming process, and there are many infringements. And... what if they copy the content but include the original links to our site in the content? Ironically, this is actually generating links for us; does this affect the answer?

    | sftravel
    0

  • If I have a title tag with 28 words (around 90 letters) with some repetition, and Google has decided not to use it in the SERPs, is this a bad thing (basically a sign that Google does not like the title tag)?

    | BobAnderson
    0

  • I have been hit by the latest Google update. I don't have enough money to pay SEO consultants; I'm just wondering if any expert can offer a free analysis of my site and point out the problems. It's not an AdSense site. My living depends on this site, so I can't just keep on doing experiments; by the looks of it, I have been hit hard by negative SEO by a competitor. I'll give the URL in private. Is there anyone kind here who can help in private? I would really appreciate it.

    | HateDoingSEO
    0

  • OK, does anyone know what to do with a temporary redirect to the login page? In our e-commerce system we have a checkout page which requires the user to be logged in; if they are not, we redirect them to the login page using a simple PHP header("Location: url"). However, this has been flagged as a warning because it's a temporary redirect. I can't really put a permanent redirect there, for obvious reasons, so if someone could give me some clue on this situation, that would be much appreciated.

    | coremediadesign
    0
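For this case a 302 really is the correct code, since the redirect only applies to logged-out visitors and the checkout URL itself should stay canonical. A small PHP sketch making the temporary status explicit; the session key and /login.php path are illustrative:

```php
<?php
// A login bounce is genuinely temporary, so 302 is the right code.
// PHP's header('Location: ...') sends 302 by default; the third
// argument just makes that status explicit.
if (!isset($_SESSION['user_id'])) {
    header('Location: /login.php', true, 302);
    exit;
}
```

Crawler warnings about temporary redirects on login gates are usually informational; the alternative of a 301 here would wrongly tell engines the checkout page has permanently moved.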

  • Hi, we are suddenly getting an "Unreachable Page" error when any page of our site is accessed as Googlebot from Webmaster Tools. There are no DNS errors shown in "Crawl Errors". We have two web servers, named web1 and web2, which are controlled by a software load balancer, HAProxy. The same network configuration has been working for over a year now, and we never had any Googlebot errors before the 21st of this month. We tried to check whether there could be an error in the sitemap, .htaccess, or robots.txt by excluding the load balancer and pointing DNS to web1 and web2 directly; Googlebot was able to access the pages properly and there was no error. But when the load balancer was made active again by pointing the DNS to it, the "Unreachable Page" error started appearing again. This very same configuration had been working properly for over a year until the 21st of this month. The website is properly accessible from a browser, and there are no DNS errors either, as shown by "Crawl Errors". Can you guide me on how to diagnose the issue? I've tried all sorts of combinations, even removed the firewall, but no success. Is there any way to get more details about the error instead of just the "Unreachable Page" error? Regards, shaz

    | shaz_lhr
    0
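One way to narrow this down is to reproduce the fetch from the command line with Googlebot's user agent, once through HAProxy and once against web1/web2 directly, and compare the responses. A diagnostic sketch; the hostname is a placeholder for the real site:

```shell
# Fetch through the load balancer while presenting Googlebot's UA;
# -v shows the full request/response exchange.
curl -v -A "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)" \
     http://www.example.com/ -o /dev/null

# Repeat against web1 and web2 directly (e.g. via /etc/hosts overrides).
# If only the balanced path hangs or errors, inspect the HAProxy config
# and logs for anything keyed on User-Agent, header size, or timeouts.
```

Since the backends answer fine when hit directly, a difference in these two traces points at the HAProxy layer rather than the application.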

  • Hi all - I've been trawling for duplicate content and stumbled across a development URL, set up by a previous web developer, which nearly mirrors the current site (a few content and structure changes since then, but otherwise it's all virtually the same). The developer didn't take it down when the site was launched. I'm guessing the best thing to do is tell him to take down the development URL (which is specific to the pizza joint, btw) immediately. Is there anything else I should ask him to do? Thanks, Luke

    | McTaggart
    0
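
Taking the development URL down is the right call; until that happens, a stopgap is to make the dev copy both deindexed and uncrawlable. A sketch, assuming the dev copy runs on Apache with mod_headers available (paths and realm name are hypothetical):

```apache
# Tell crawlers to drop any dev pages they have already indexed
Header set X-Robots-Tag "noindex, nofollow"

# And lock the whole dev copy behind HTTP Basic auth
AuthType Basic
AuthName "Development site"
AuthUserFile /etc/apache2/.htpasswd
Require valid-user
```

Note that a robots.txt Disallow alone would not remove pages already in the index - it only stops future crawling - which is why the X-Robots-Tag header is worth adding.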

  • Hello, I am using Open Site Explorer as well as Link Builder from Wordtracker to find good links that point either to my competitors or to the top 20 sites for my niche keyword. My team then follows each link and identifies directory links, forum profile links, bookmark links, PR/article site links, and guest blog post site links. Then we manually build links on those sites for our websites as well. Is this a good white-hat strategy for long-term SEO? I believe Open Site Explorer's high page authority links will be worth it in the long run. I also regularly post articles to my blog and then distribute them to my Twitter account, as well as run a few social bookmarks on each article posted on my blog. I want to know from the community whether I am doing link building the right way, or whether the honorable SEOmoz members have any suggestions. I know content is key; however, we are mostly an e-commerce site, so we need to keep creating backlinks to stay competitive. I look forward to the community's feedback on whether we are on the right track with our SEO.

    | andishm
    0

  • So, how many of you got killed on the recent updates?

    | TheGrid
    0

  • I have 2 clients that have apparently random examples of the 'show map of' link in Google search results. The maps/addresses are accurate and for airports. Both clients are aggregators that service the airports, e.g. lax airport shuttle (not an actual example), BUT they DO NOT have Google Places listings for these pages, either manually created OR auto-populated by Google, and they DO NOT have the map or address info on the pages that are returned in the search results with the map link. Does anyone know how this is the case? It's great that this happens for them, but I'd like to know how/why so I can replicate it across all their appropriate pages. My understanding was that for this to happen you HAD to have Google Places pages for the appropriate pages (which they can't do, as they are aggregators). Thanks in advance, Andy

    | AndyMacLean
    0

  • Hi there, I am new to external link building and would like some help from you guys! I would like to learn the best ways to do link building for e-commerce. Can you help me, please? Thank you!

    | Fabricio_Sahdo
    0

  • Calling all SEO ninjas! I'm currently developing single web pages for various clients which function as abbreviated versions of their main websites. They are all related and sit under a single domain. When a user visits these pages on a mobile device, CSS is used to display mobile-friendly versions of them. My clients are thrilled with these mobile versions and now want to also redirect mobile visitors from their main sites (which are not mobile-optimised) to these pages. My questions are: Are there any negative implications if we did this, i.e. redirecting to a different domain? What is the best method for redirection, e.g. JavaScript? Can this be achieved by adding a single line of code to their main site? Can this be done in an SEO-friendly way so that the redirection acts like a backlink? Many thanks.

    | martyc
    0
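
On the JavaScript option: a client-side redirect can be done in essentially one line, though note that search engines treat a JavaScript redirect as a redirect, not as a backlink, so no link equity is passed that way. A minimal sketch - the user-agent pattern and target URL are hypothetical, and UA sniffing is inherently approximate:

```javascript
// Decide from the User-Agent string whether this looks like a mobile browser
function isMobile(ua) {
  return /Mobi|Android|iPhone|iPad|iPod/i.test(ua);
}

// In the main site's pages (browser-only line, shown here as a comment):
// if (isMobile(navigator.userAgent)) window.location.replace("https://example.com/client-mobile-page");
```

A server-side 302 keyed on the User-Agent header would achieve the same thing without requiring JavaScript, which is generally the more robust choice.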

  • Does anyone know what we can expect next from Google Panda/Penguin? We did prepare for this latest update and so far so good.

    | jjgonza
    0

  • Dear all, I am managing a Belgian online pharmacy (www.pharma2go.be). The online pharmacy has a quite high bounce rate (+/- 79%) and a low avg. visit time (< 1 minute). This could be related to choices made in the past to also build the site in English (but without English product texts available - only Dutch and French), so that people all over Europe could order. Another reason could be that prescription-only products are shown even though they cannot be ordered; this was chosen so that visitors could still access the product leaflets as a service. I am wondering if it would be beneficial for SEO to remove the English version and the prescription-only products - at least if this would lower the bounce rate and increase the average visit time. Thanks for your input. Kindest regards, Stefaan

    | stefaanva
    0

  • Sorry, it's been a long day and I wanted a second opinion on this, please... I am developing an affiliate store which will have dozens of products in each category. We will not be indexing the product pages themselves as they are all duplicate content. The plan is to have just the first page of each category's results indexed, as this will have unique content about the products in that section. The later paginated pages (i.e. pages 2, 3, 4, 5, etc.) will have 12 products each but no unique content. Would the best advice be to add a canonical tag to all pages in the 'chairs' category pointing to the page with the first 12 results and the descriptions? This would ensure that visitors are able to browse many pages of products, but Google won't index products 13 and onwards. Am I right in my thinking? A supplemental question: what is the best way to block Google from indexing/crawling 90,000 product listings which are pulled directly from the merchant and so are not unique in the least? I have previously played with banning Google from the product folder, but that reports health issues in Webmaster Tools. Would the best route be a noindex tag on all the product pages, and to nofollow all the product links in the category listings? Many thanks, Carl

    | Grumpy_Carl
    0
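
One caveat on the plan above: a canonical from pages 2+ back to page 1 tells Google those pages are duplicates of page 1, which can take products 13+ out of the crawl path entirely; a noindex,follow meta tag on the later pages (and on the duplicate product pages) arguably fits the goal better, and avoids the Webmaster Tools health warnings that robots.txt blocking produces. A sketch of both tags, with hypothetical URLs:

```html
<!-- Option on paginated category pages 2+ : canonicalise to page 1 -->
<link rel="canonical" href="http://www.example.com/chairs/" />

<!-- On duplicate product pages and/or later paginated pages: keep them
     crawlable, keep link equity flowing, but stay out of the index -->
<meta name="robots" content="noindex, follow" />
```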

  • Hi, I have been doing SEO for this client based in Sri Lanka for almost 8 months now. Since we started, we have had the geographic target set to the UK through Google Webmaster Tools. At the moment the site ranks high on Google UK and in other countries - except Sri Lanka. On Google.lk the site doesn't even appear within the first 5 pages for keywords that rank on the 1st page in other countries. What do you think about this? How does it happen?

    | pyxle
    0

  • I've been investigating a serious Google ranking drop for a small website in the UK. They used to rank in the top 5 for about 10 main keywords, and overnight on 24/3/12 they lost their rankings. They have not ranked in the top 100 since. Their pages are still indexed and they can still be found for their brand/domain name, so they have not been removed completely. I've covered all the normal issues you would expect to look for, and no serious errors exist that would lead to what in effect looks like a penalty. The investigation has led to an issue with their domain registration setup. The whois record (at DomainTools) shows the status as "Registered and Parked or Redirected", which seems a bit unusual. Checking the registration details, they had DNS settings pointing correctly to the webhost but also had web forwarding to the domain registrar's standard parked-domain page. The domain registrar has suggested that this duplication could have caused ranking problems. What do you think? Is this a realistic reason for their ranking loss? Thanks

    | bjalc2011
    0

  • Hello, We are a quality site hit by Panda. Our article collection, http://www.nlpca(dot)com/DCweb/NLP_Articles.html, is partially articles written by the site owners and partially articles that appear elsewhere on the web. We have permission to post every article, but I don't know if Google knows that. Could this be why we were hit by Panda? And if so, what do we do? We've dropped way down in rank but have worked our way halfway back up. Two of our main keywords are: NLP, NLP Training. Thanks!

    | BobGW
    0

  • I have a client who is using the rel=canonical tag across their e-commerce site. Here is an example of how it is set up. URLs: 1. http://www.beautybrands.com/category/makeup/face/bronzer.do?nType=2 2. http://www.beautybrands.com/category/makeup/face/bronzer.do The canonical tag points to the second URL. Both pages are indexed by Google. The first page has a higher page authority (most of the internal site links go to the first URL) than the second one. Should the page with the highest authority be the one that the canonical tag points to? Is there a better way to handle these situations? Does any authority get passed through the tag? Thanks!

    | AlightAnalytics
    0
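
For reference, the tag in question, placed on the first (parameterised) URL, would look like the line below. Pointing the canonical at the clean URL is the conventional setup even when the parameterised version has more internal links, since the tag is generally understood to consolidate most link signals onto the canonical target - though Google describes it as a strong hint rather than a directive:

```html
<!-- In the <head> of the ?nType=2 version, consolidating signals onto the clean URL -->
<link rel="canonical" href="http://www.beautybrands.com/category/makeup/face/bronzer.do" />
```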

  • Hello, My company just launched a new website, and it looks like it's a competitive market - it's for moving boxes and moving supplies. They want a bullet-point list (nothing real specific) of what I will be doing for SEO for the new website. I have been out of the loop for more than a year with SEO, so I'm not sure what the best things to do first are. Any help would be great. Thanks, John

    | maximumrank
    0

  • One of the SEOmoz help desk professionals told me this today regarding some of my website pages: "it looks like you have pages hosted as separate pages on both the root domain and the www subdomain, which means that these pages are competing for rankings and authority. You may want to consider a 301 redirect or the use of rel=canonical tags." Can anyone help me understand this? How can I tell which pages are which?

    | webestate
    0
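
The usual fix for root-vs-www duplication is a site-wide 301 from one host onto the other, so every page has a single address. A minimal sketch, assuming Apache with mod_rewrite enabled and using example.com as a placeholder for the actual domain:

```apache
RewriteEngine On
# Send any request for the bare domain to the same path on the www host
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

You can tell which pages are which by requesting the same path at both hosts (e.g. with curl -I): after the rule is in place, the bare-domain version should answer 301 with a Location header pointing at the www version.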

  • Hi, I just read on ViperChill that Matt Cutts told Glen (owner of ViperChill) that ping services can help your blog posts. Now let's say you have a list of 10 that you ping and you put an article up every day - that's 300 pings a month. Is that not spammy? Here is the link to the post: http://www.viperchill.com/future-of-blogging/ If you scroll down you'll see a screen print of the Google search box; it's the paragraphs above and below this screen print.

    | activitysuper
    1
