Category: Intermediate & Advanced SEO
Looking to level up your SEO techniques? Chat through more advanced approaches.
-
How important is it to create a subdomain?
I was just reading an article about how HubPages claims it pulled through from Panda by dividing its content up across subdomains. I'm wondering if anyone else has had similar success? Also, Panda aside, how important do you think it is to separate different types of content onto separate subdomains?
| nicole.healthline -
Image Links Vs. Text Links, Questions About PR & Anchor Text Value
I am searching for testing results to find out the value of text links versus image links with alt text. Do any of you have testing results that can answer or inform these questions? Suppose 2 separate pages on the same domain have the same Page Authority, the same number of internal and external links, virtually the same strength, and the image or text link sits in the same spot on both pages (in the middle of the body, within paragraphs). Would an image link with alt text pass the same amount of Page Authority and PR as a text link? Would an image link with alt text pass the same amount of textual value as a text link? For example, if the alt text on the image on one page said "nike shoes" and the text link on the other page said "nike shoes", would both pass the same value to drive up the rankings of the page for "nike shoes"? Would a single link wrapped around both an image and a text phrase be better than creating 2 links, one around the image and one around the text, pointing to the same page? The following questions apply when you have an image link and a text link on a page right next to each other, like when you link a compelling graphic to a category page and then list a text link underneath it to pass text link value to the linked-to page. If the image link displays before the text link pointing to a page, would first-link priority use the alt text and not even apply the anchor text phrase to the linked page? Would it be best to link the image and text phrase together, pointing to the product page, to decrease the link count on the page, thus allowing more PageRank and Page Authority to pass to the other pages being linked to? And would this also pass anchor text value to the linked-to page, since the link would include both an image and text? I know the questions sound a bit repetitive, so please let me know if you need any further clarification.
I'd like to solve these to further look into ways to improve some user experience aspects while optimizing the link strength on each page at the same time. Thanks!
| abernhardt
Andrew -
Index.php canonical/dup issues
Hello my fellow SEOs! I would LOVE some additional insight/opinions on the following... I have a client who is an industry leader: big site, ranks for many competitive phrases, blah blah... you get the picture. However, they have a big dup content/canonical issue. Most pages resolve both with and without /index.php at the end of the URL. Obviously this is a dup content issue, but more importantly the SEs sometimes serve an "index.php" version of the page and sometimes they don't; it constantly changes which version is served, and the rank goes up and down. Now, I've instructed them that we are going to need to write a sitewide redirect to enforce a uniform structure. Most people would say redirect to the non-index.php version, but: 1. The index.php pages consistently outperform the non-index.php versions, except the homepage. 2. The client really would prefer to have "index.php" at the end of the URL. The homepage performs extremely well for a lot of competitive phrases. I'd like to redirect all pages to the "index.php" version except the homepage, and I'm thinking that if I redirect all pages EXCEPT the homepage to the index.php version, it could cause some unforeseen issues. I can not use rel=canonical because they have many different versions of their pages with different country codes in the URL. For example, if I make the US version canonical, it will hurt the pages trying to rank with a fr URL, de URL, etc. (where fr/de are country codes in the URL; depending on where the user is, it serves the correct version). Any advice would be GREATLY appreciated. Thanks in advance! Mike
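For what it's worth, the sitewide rule Mike describes could be sketched in .htaccess roughly like this. This is only an illustration; the directory-style URL assumption and the rule order are guesses about a site we can't see, and it would need testing on a staging copy first:

```apache
RewriteEngine On

# Homepage is the exception: strip index.php there (the bare root outperforms it)
RewriteRule ^index\.php$ / [R=301,L]

# Everywhere else, send directory-style URLs to their index.php version
RewriteCond %{REQUEST_URI} !index\.php$
RewriteRule ^(.+)/$ /$1/index.php [R=301,L]
```
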
| MikeCoughlin -
Local hosts for sites in foreign countries?
Hi everyone. I'm going to be launching localized websites in 5 different European countries (.de, .it, etc). Must I have a local host with servers in those countries, or can I use a U.S. based host? Would having a U.S. based host hurt SEO?
| TexaSEO -
Advanced Question on Synonym Variation Pages!
Hi, This is quite an advanced question, so I'll go through in detail - please bear with me! I launched the new version of our website exactly a week ago, and all the key metrics are in the right direction: Pages/Visit +5%, Time on Site +25%, Bounce rate down 1%. I work in an industry where our primary keyword has 4 synonyms and our long tail keywords are location related. So as an example I have primary synonyms like: Holiday, Vacation, Break, Trip (not actually these, but they are good enough as an example). Add the pluralised versions and you have 8 in total. So my longtail keywords are like: Las Vegas Vacation / Las Vegas Vacations
| James77
Las Vegas Holiday / Las Vegas Holidays
Las Vegas Trip / Las Vegas Trips
Las Vegas Breaks / Las Vegas Breaks. All these synonyms effectively mean the same thing, so my thinking on the new website was to specifically target each of these synonyms with its own unique page and optimise the meta and page titles to those exact words. To make these pages truly unique, I got a bunch of copywriters to write about 600 unique words for every long tail synonym (well over 750,000 words in total!). So now I have my page "Las Vegas Holidays" with 600 words of unique content, "Las Vegas Vacations" with 600 words of unique content, etc.

The problem is, when users search for these words, their primary goal is not to read 600 words of content on "Las Vegas Holidays"; their primary goal is to get a list of Las Vegas holidays that they can search, view and purchase (they may want to read 600 words of content, but it is not their primary goal). So this puts me in a dilemma: I need to display the nuts and bolts (i.e. the actual holidays in Las Vegas) as the primary content on whichever synonym page the user lands on, but to make sure these pages are unique I also need that unique content on the page.

So here's what I did: on every synonym version of the page I display the exact same information. However, on each page I have an "Information" link, and on click this pops up a layer which contains my unique content for that page. To further optimise using perfect anchors in this content pop-up, I have cross linked the synonym pages (totally naturally). I.e. on my "Las Vegas Holidays" page, the content may contain the words "Las Vegas Breaks", linked to the "Las Vegas Breaks" synonym page.

In theory I don't think there is anything wrong with what I am doing in the eyes of the customer, but I have a big concern that this may well look "fishy" to SEs. I.e. the pages are almost identical to the user except for this information pop-up layer of unique content, titles and meta.
We know that Google at least can tell exactly what the user sees when they land on a page (from their "Preview") and can distinguish between user-visible and hidden text. Therefore, even though from a user experience standpoint I think we are making a page that is perfect for them (they get the list of vacations etc. as the primary content, and can read the information if they want by clicking a button), I am concerned that SEs are going to say: hold on a minute, there are loads of pages here that are identical except for a chunk of text that is not visible to the user by default (even though it is visible if they click the "Information" button), and this content cross links to a load of almost identical pages with the same thing. Today I checked our rankings, and we have taken a fair whack from Google. I'm not overly concerned at the moment, as I expected big fluctuations in ranking for the first few weeks, but I'd be a lot more confident if they were fluctuating in the right direction!! So what do I do?
As far as I can see my options break down as follows: Content Display:
1/. Keep it as it is, and hope the SEs don't see it as spammy. Even though I think what we are doing is best for customer experience, I'm concerned SEs won't agree. 2/. On every synonym page, below the list of products, packages etc. that the customer wants to see, display the unique content as a block of subtext which is visible by default. This however could make the page a bit ugly. 3/. Display a visible snippet of the unique content below all the packages, with a "more" button which expands the rest of the content, i.e. a part-visible layer. This is slightly better for display, but again I'm only displaying a portion of visible content, and the rest will still be flagged as "hidden" by default to the SEs. Cross linking within the content:
1/. Keep it as it is, where synonym keywords link to the synonym version of the page. 2/. Alter it so that every synonym keyword links to the "primary" synonym version of the page. E.g. if "Las Vegas Holidays" is now my main keyword, then the "Las Vegas Vacations" keyword would not link to my "Las Vegas Vacations" page as it currently does, but would link to my "Las Vegas Holidays" page. I apologise for the in-depth questions, but it requires a lot of explanation to get across clearly. I would be grateful for any of your thoughts. Many thanks in advance. -
Should the sitemap include just menu pages or all pages site wide?
I have a Drupal site that utilizes Solr, with 10 menu pages and about 4,000 pages of content. Redoing a few things and we'll need to revamp the sitemap. Typically I'd jam all pages into a single sitemap and that's it, but post-Panda, should I do anything different?
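Post-Panda, the common tweak isn't so much the sitemap's contents as its structure: splitting it by content type lets you watch indexation rates per section in Webmaster Tools. A hypothetical sitemap index for a site shaped like this one (filenames are made up):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- the 10 menu/landing pages -->
  <sitemap><loc>http://example.com/sitemap-menu.xml</loc></sitemap>
  <!-- the ~4,000 content pages, split so each file stays manageable -->
  <sitemap><loc>http://example.com/sitemap-content-1.xml</loc></sitemap>
  <sitemap><loc>http://example.com/sitemap-content-2.xml</loc></sitemap>
</sitemapindex>
```
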
| EricPacifico -
Culling 99% of a website's pages. Will this cause irreparable damage?
I have a large travel site that has over 140,000 pages. The problem I have is that the majority of pages are filled with dupe content. When Panda came in, our rankings were obliterated, so I am trying to isolate the unique content on the site and go forward with that. The problem is, the site has been going for over 10 years, with every man and his dog copying content from it. It seems that our travel guides have been largely left untouched and are the only unique content that I can find. We have 1000 travel guides in total. My first question is, would reducing 140,000 pages to just 1,000 ruin the site's authority in any way? The site does use internal linking within these pages, so culling them will remove thousands of internal links throughout the site. Also, am I right in saying that the link juice should now move to the more important pages with unique content, if redirects are set up correctly? And finally, how would you go about redirecting all these pages? I will be culling a huge number of hotel pages; would you consider redirecting all of these to the generic hotels page of the site? Thanks for your time, I know this is quite a long one, Nick
| Townpages -
Any ideas for capturing keywords that your client rejects because they aren't politically correct?
Here's the scenario: you need to capture a search phrase that is very widely used in common search, but the term is considered antiquated, overly vernacular, insensitive or outright offensive within the client's industry. In this case, searchers overwhelmingly look for "nursing homes," but the term has too many negative connotations to the client's customers, so they won't use it on-page. Some obvious thoughts are to build IBLs or write an op-ed/blog series about why the term is offensive. Any other ideas?
| Jeremy_FP -
Need Guidance On SEO Campaign
Hello, my website is www.mybluedish.com and I am launching an SEO campaign to increase our rankings, especially for the keyword 'satellite internet'. We used to be ranked #5 for this keyword, and Google made some changes about 9 months ago that dropped us way back. We have recovered a lot, back to #18 now, but have been stuck here for a while. I want to step up our SEO campaign and have come up with the following plan. Could any SEOs please tell me if you think the following is the makeup of a solid SEO campaign, or if it should be adjusted? Thank you. 12 new blog articles/month 24 articles submitted to 10 different directories each (total of 240 article directory submissions)/month 1 basic press release from PR.com/month 200 dofollow blog comments/month 10 paid blog posts (on other people's blogs of PR 1-5)/month
| MyNet -
Will changing domain registration details affect ranking?
Hi all, I've got a .co.uk site that I want to update the domain registrant account details for. The change will involve registering the domain to a limited company name rather than an individual and possibly changing the registered address. Does anyone know whether this could affect google rankings? I've heard of stories of sites losing their PR because the registrant details have changed and I don't really want that to happen! Thanks in advance for any help
| PeterAlexLeigh -
How do I index these parameter generated pages?
Hey guys, I've got an issue with a site I'm working on. A big chunk of the content (roughly 500 pages) is delivered using parameters on a dynamically generated page. For example: www.domain.com/specs/product?=example - where "example" is the product name. Currently there is no way to get to these pages unless you enter the product name into the search box and access them from there. Correct me if I'm wrong, but unless we find some other way to link to these pages, they're basically invisible to search engines, right? What I'm struggling with is a method to get them indexed without doing something like creating a directory-map type page of all of the links, which I guess wouldn't be a terrible idea as long as it was done well. I've not encountered a situation like this before. Does anyone have any recommendations?
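Alongside (or instead of) a directory-map page, an XML sitemap would at least get these parameter URLs discovered, though it passes no internal link equity. A sketch using the URL pattern from the question (product names are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://www.domain.com/specs/product?=example</loc></url>
  <url><loc>http://www.domain.com/specs/product?=another-product</loc></url>
  <!-- ...one entry per product, ideally generated from the product database -->
</urlset>
```
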
| CodyWheeler -
301 Redirect All Url's - WWW -> HTTP
Hi guys, This is part 2 of a question I asked before which got partially answered; I clicked "question answered" before I realized it only fixed part of the problem, so I think I have to post a new question now. I have an Apache server, I believe, on HostGator. What I want to do is redirect every URL to its corresponding alternative (www redirects to non-www). So for example if someone typed in www.mysite.com/page1 it would take them to http://mysite.com/page1. Here is a code that has made all of my site's links go from WWW to HTTP, which is great, but the problem is that if you try to access the WWW version by typing it, it still works, and I need it to redirect. It's important because Google has been indexing SOME of the URLs as http and some as WWW; my site was just HTTP for a long time until I made the mistake of switching it, and now I'm having a problem with duplicate content and such. Updated it in Webmaster Tools, but I need to do this regardless for other SEs. Thanks a ton!
RewriteEngine On
RewriteBase /
RewriteCond %{HTTP_HOST} ^www\.yourdomain\.com [NC]
RewriteRule ^(.*)$ http://yourdomain.com/$1 [L,R=301]
| DustinX -
Is there a development solution for AJAX-based sites and indexing in Bing/Yahoo?
Hi. I have outlined a solution for an AJAX-based site in order to preserve indexing and rank in Google using the hashbang. I'm curious if anyone has some insight for doing the same for Bing/Yahoo! (a development question)
| OveritMedia -
Legit Domain Masking
I am working with a real estate client. They have one main site (i.e. company.com) that contains all the info, then they have several name domains (i.e. salesrepresentative.com) that forward to the main site but use domain masking to appear as if there is a separate site for each representative. My question is: how can I make this legit in Google's eyes, or is this totally not advised?
| ukao -
301 Redirect To Corresponding Link No Matter The URL?
Hey guys, I have hosting on HostGator with, I believe, an Apache web server. I need a code to put in the .htaccess to redirect all WWW URLs to their corresponding http URL. I haven't been able to get a code to work. For example, http://www.mysite.org/page1.html -> http://mysite.org/page1.html, without having to redirect hundreds of pages individually. Here is the format my server uses in the .htaccess file for 301 redirects. RewriteCond %{HTTP_HOST} ^mysite.org$ [OR] RewriteCond %{HTTP_HOST} ^www.mysite.org
| DustinX
$RewriteRule ^Electric-Pressure-Cookers.html$ "http://mysite.org/Pressure-Cookers.html" [R=301,L] Thanks -
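For reference, the usual pattern is one blanket rule rather than a rule per page; the capture group carries the path across. A sketch (swap in the real domain; assumes mod_rewrite is enabled):

```apache
RewriteEngine On
RewriteCond %{HTTP_HOST} ^www\.mysite\.org$ [NC]
# $1 is whatever followed the domain, so every page redirects to its own non-www twin
RewriteRule ^(.*)$ http://mysite.org/$1 [R=301,L]
```
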
I need help with htaccess redirect
Hi guys, we have the domain cheats.co.uk; it has always displayed as cheats.co.uk, without the www. However, it is now showing 2 versions of the site, both the www. and the non-www. version. I know how to add to the .htaccess file to get the non-www. version going to the www. version, but I am worried about doing this because the non-www. version has always been the one indexed in Google and has a PageRank of 3. Should I in fact be redirecting the www. version to the non-www. version to keep PageRank etc? Or will PageRank be passed over if I redirect to the www. version? I hope that's clear. Thanks guys, Jon
| imrubbish -
No index, follow vs. canonical url
We have a site that consists almost entirely as a directory of videos. Example here: http://realtree.tv/channels/realtreeoutdoorsclassics We're trying to figure out the best way to handle pagination and utility features such as sort for most recent, most viewed, etc. We've been reading countless articles on this topic, but so far have been unable to determine what might be considered the industry standard. Two solutions seem to stand out... Using the canonical url on all the sorted and paginated pages. However, after reading many blog posts, it seems that you should NEVER use the canonical url to solve the issue of paginated, and thus duplicated content because the search bots will never crawl past the first page leaving many results not in the index. (We are considering ruling this method out.) Another solution seems to be using the meta tag for noindex, follow so that a search engine like Google will crawl your directory pages but not add them to the index themselves. All links are followed so content is crawled and any passing link juice remains unchanged. However, I did see a few articles skeptical of this solution as well saying that there are always better alternatives, or that there is no verification that search engines obey this meta tag. This has placed some doubt in our minds. I was hoping to get some expert advice on these methods as it would pertain to our site. Thank you.
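For concreteness, the noindex, follow option from the question is just one tag in the <head> of each paginated or sorted view. An illustrative snippet, not a recommendation for this specific site:

```html
<!-- On page 2+ and on sorted variants: crawl and follow links, but keep out of the index -->
<meta name="robots" content="noindex, follow">
```
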
| grayloon -
Navigation - Balancing UX & SEO
I'm currently evaluating our navigation in the course of a site relaunch. From reading a number of articles and posts on seoMOZ, here are the elements I've found important to consider: Use CSS (not Javascript) for the primary drop-down navigation menu Get rid of two design elements from our earlier days: The 30 something site-wide category links in the footer, and many no-followed internal links (in an attempt to sculpt PR) Keep all pages within 3 clicks of the homepage, and have ample cross-links within internal pages. The one major problem I'm facing is how to balance UX and SEO in the primary navigation bar. To illustrate, let's assume I sell Tennis equipment. If one of the top-level categories on my navigation bar was "Rackets", if I was designing purely with SEO in mind the category names would be: Tennis Rackets -> Wilson Tennis Rackets Head Tennis Rackets Prince Tennis Rackets ....as the full, three word anchor text will be most specific and valuable to pass reputation to the category pages. However, from a UX perspective, writing "Tennis Rackets" after each category is unnecessary, and it would look MUCH cleaner to instead have: Tennis Rackets -> Wilson Head Prince ....but this would obviously be less beneficial from a SEO standpoint for each individual, manufacturer racquet page as the entire search term ("Wilson Tennis Rackets") is not in the anchor text. As these links will be on every page of the site, I'm struggling with which to choose - clean navigation or improved SEO. My Questions: I would love to hear the communities thoughts on how to weigh the balance of these two - clean UX navigation vs. SEO-rich specific anchor text - in navigation. Also, I'd appreciate hearing if any of my original 3 assumptions for the re-design are off-base or incorrect. Thank you!
| AndrewY -
Is this a good strategy?
Okay, so let's say I have a landing page or an ecommerce website with limited content. If I start a blog and write quality posts that have anchor text linking back to my homepage, then bookmark the hell out of those blog posts, post to Twitter, cite the posts on Q&A websites, etc... would that be an effective strategy beyond the normal stuff like directory submission and blog comments?
| DanHenry -
Different pages ranking for search terms, often irrelevant.
Website: www.templatemonster.com
| templatemonster
Problem: Positions dropped, pages which were ranking previously disappeared from the top 100, and now different (often completely irrelevant) pages are ranking. Examples:
Search term: Joomla Templates
Previous Position: 8
Current Position: 35
Previously Ranked Page: http://www.templatemonster.com/joomla-templates.php
Currently Ranked Page: http://www.templatemonster.com/logo-templates.php Similar situation with the following search terms: virtuemart templates, virtuemart themes, prestashop templates, prestashop themes, magento themes, zencart templates, zencart themes, zen cart templates, zen cart themes When: according to the Google Analytics (drop in visitors stats) this happened on July, 2nd Preconditions: we had 45 minutes downtime on July 2-nd - but could this 45 mins have had such disastrous results?
No redirects or canonical URLs were used which could lead to such a change of ranking page.
No changes in the site's informational structure and design.
In Webmaster Tools (inbound links report) we saw a website yesterday which had over 800,000 links pointing to our domain (http://moviebestwatch.com/), and today this site is NOT found in the Webmaster Tools report! Also, the site is down, the domain is quite new (how could it have possibly developed 800,000 links in such a short time?) and whois is privacy protected. Is this some dirty trick from competitors? Could it have possibly influenced our positions? Still, what I completely fail to understand is how a page like http://www.templatemonster.com/logo-templates.php could be the top ranking page for 'Joomla templates' if there is: not a single mention of the word 'Joomla' on the page (or in the source code), i.e. the page is completely irrelevant to the search term; not a single link with 'Joomla templates' anchor text pointing to that page, neither external nor internal. PS. No similar changes noticed in other search engines. Also, the pages in question were re-spidered July 4th and the cache shows the right pages, i.e. it is not that Googlebot has seen the logo templates page instead of the Joomla templates page. I checked every possible reason I could think of (see "Preconditions") but still have no clue what is going on. -
Is site:domain.com + keyword a good indicator of the quality of a page?
Are the results provided by site:domain.com + keyword a good indicator of the quality of certain pages? For example, should the first result be more relevant, have a higher number of links, etc than the second result?
| nicole.healthline -
Non www has 110 links the www has 5 - rankings have gone
A site I'm working on resolves on the non-www address and has 100+ links pointing at this address. Last month it started to rank and had various phases within the top 50; this month it's totally gone from the search results. The www version has 5 links. My questions: Which is best, www or non-www? How do you fix it? Any reason why the rankings have disappeared? It's a WordPress site. domainname.co.uk = 100+ links www.domainname.co.uk = 5 links
| therealmarkhall -
Quick URL structure question
Say you've got 5,000 articles. Each of these are from 2-3 generations of taxonomy. For example: example.com/motherboard/pc/asus39450 example.com/soundcard/pc/hp39 example.com/ethernet/software/freeware/stuffit294 None of the articles were SUPER popular as is, but they still bring in a bit of residual traffic combined. Few thousand or so a day. You're switching to a brand new platform. Awesome new structure, taxonomy, etc. The real deal. But, historically, you don't have the old taxonomy functions. The articles above, if created today, file under example.com/hardware/ This is the way it is from here on out. But what to do with the historical files? keep the original URL structure, in the new system. Readers might be confused if they try to reach example.com/motherboard, but at least you retain all SEO weight and these articles are all older anyways. Who cares? Grab some lunch. change the urls to /hardware/, and redirect everything the right way. Lose some rank maybe, but its a smooth operation, nice and neat. Grab some dinner. change the urls to /hardware/ DONT redirect, surprise Google with 5k articles about old computer hardware. Magical traffic splurge, go skydiving. Panic, cry into your pillow. Get job signing receipts at CostCo Thoughts?
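If option 2 wins out, the old taxonomy paths can usually be collapsed with a handful of pattern redirects instead of 5,000 one-off rules. A hypothetical .htaccess sketch; the patterns assume the old top-level sections are known, and everything is pointed at the section root since the old slugs won't exist on the new platform:

```apache
# Old multi-generation taxonomy paths -> new flat /hardware/ section
RedirectMatch 301 ^/motherboard/.*$ /hardware/
RedirectMatch 301 ^/soundcard/.*$ /hardware/
RedirectMatch 301 ^/ethernet/.*$ /hardware/
```
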
| EricPacifico -
Understanding Canocalization, domain structure, redirects
Hey guys, My background is more in the marketing aspect of SEO, and I'm afraid my technical knowledge is not where it should be. I'm confused about how to find out whether a site is splitting link juice by having too many domains that are not redirected properly. Am I asking that right? How do you figure that out? And, once you know, do you just go to the ones that are not redirecting and add a 301? Where is the best place to add a 301? I know there's a difference in the eyes of the search engines between, say, example.com and www.example.com and probably other forms, correct? I'm not a programmer or IT specialist, I'm a marketing consultant, but I feel like I'm really missing it when it comes to understanding all this stuff (looking at HTTP headers, using GWT, reading source code, etc.) and am not sure of the best way to learn it effectively so I can be sure I'm not missing something when consulting with clients. Help? Please? Thanks, David
| DavidPPeters -
Internal competition
If we have two different subdomain pages that talk about the same service but with different content, how will Google react when ranking them? Example: xxxx.abc.com/company/solutions/service1 yyyy.abc.com/service1 Suppose www.abc.com has good authority; which URL will benefit more?
| gmk1567 -
Moving External TLD To Subfolder of Corporate Domain?
Our Challenge: Our corporate site receives about 20,000 visits per week. Unfortunately, nearly half of those visitors are "looking" for a link in our homepage navigation that takes them to our "Employee"-focused site, which resides on a separate TLD. Because so much of the traffic to our corporate site lands on our homepage only as a stepping stone en route to the "Employee" site, the bounce rate, time on site and average page views for our corporate site are all negatively impacted. Meanwhile, the "Employee" site gets more than 100,000 visits per month and enjoys enviable metrics: low bounce rate, high average page views and high average time on site. Our Goals: minimize or eliminate the negative impact of so many visitors using our corporate site as a stepping stone to reach the employee site; leverage the traffic volume and positive metrics enjoyed by the employee site to improve the search engine authority of our corporate site. Our Solution: Move the "Employee" site to a subfolder of our corporate site, for example www.oursite.com/employees. Install Google Analytics on all pages within the subfolder. Provide a 301 redirect from the old "Employee" domain to the new employee subfolder. The expected result is an increase in overall corporate site traffic, more page views, higher time on site, and a lower overall bounce rate from merging these two website properties. Our Need: After comparing the subfolder option to the subdomain approach, we feel that the proposed solution is our best course of action and are looking for validation or an alternate recommendation.
| ChrisMakara -
Solving Keyword Cannibalisation WITHOUT exact match internal links
Hi guys, I have an ecommerce client I'm working with (they are a tour operator). The client has multiple variations of very, very similar tours, which has created a keyword cannibalisation issue. I've read the blog post from Rand on the issue, and I understand that I need to use internal links to show the bots which page I want to rank for which term. Problem is, I can't use exact match anchor text as it wouldn't adequately describe the tour from a user's perspective. E.g. I want a single page to rank for 'Los Angeles Tour'; however, because the tour also takes in San Francisco, I can't use the exact match anchor text 'Los Angeles Tour' because it doesn't give users a realistic indication of the page they are going to. My solution is to use internal anchor text like 'San Francisco & Los Angeles Tour'. This has the keyword phrase I want to optimise for within the anchor text. Does this have the same effect as using the exact match anchor text? I can't really see any other solution, so I'm guessing that's the right course of action. Your thoughts would be much appreciated.
| jamesjackson -
Get Higher in Google Shopping
Hello, A few days ago I imported my product list into Google Shopping and everything got accepted, but when I look in Google Shopping for my product, it's on page 3. How can I get my product higher in Google Shopping? I assume this works differently from normal SEO? Regards, Yannick
| iwebdevnl -
+entrylink+
Hey All, Has anybody else been getting 404 error page reports in campaigns detailing URLs with +entrylink+ added to every page on our site? Any help or advice will be greatly appreciated. This is messing with my OCD-like state of having no errors 🙂 Thanks
| CPASEO -
Best way to stop pages being indexed and keeping PageRank
If, for example, on a discussion forum, what would be the best way to stop pages such as the posting page (where a user posts a topic or message) from being indexed AND avoid diluting PageRank too? If we added them to the Disallow list in robots.txt, would PageRank still flow through the links to those blocked pages, or would it stay concentrated on the linking page? Your ideas and suggestions will be greatly appreciated.
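The trade-off in the question can be made concrete: a robots.txt Disallow stops the bots fetching the page at all, so any links on it are never seen, whereas a meta robots tag keeps the page crawlable. A sketch of the meta tag route for the posting page template:

```html
<!-- In the posting/reply page template: stay out of the index, let link equity keep flowing -->
<meta name="robots" content="noindex, follow">
```
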
| Peter264 -
Duplicate page content
Hi. I am getting a duplicate content error on my website, and the pages it's showing are: www.mysitename.com and www.mysitename.com/index.html. To my best knowledge it is only one page. I know this can be solved with a canonical tag in the header, but I do not know how. Can anyone please tell me about that code or any other way to get this solved? Thanks
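For reference, the canonical tag the poster mentions is a single line in the <head> of the page (the domain here is the placeholder from the question); putting it in the shared header template covers both URLs:

```html
<link rel="canonical" href="http://www.mysitename.com/">
```

A 301 redirect from /index.html to the root in .htaccess is a common alternative and avoids serving the duplicate at all.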
| onlinetraffic -
Google Places Duplicate Listings
Hey Mozzers- I know the basic process for handling duplicate listings, but I just want to make sure and ask because this one is a little sensitive. I have a client with a claimed and verified listings page, which is here: http://maps.google.com/maps/place?q=chambers+and+associates&hl=en&cid=9065936543314453461 There is also another listing (which I have not claimed yet) here: http://maps.google.com/maps/place?q=dr.+george+chambers&hl=en&cid=14758636806656154330 The first listing has 0 reviews, where the 2nd unverified listing has 12 fantastic 5 star reviews. We can all agree that if I can get these two listings to merge, his general listing will perform much better than it already is (the first listing has about 200 actions per months). So, what is the best way to merge these two without losing any reviews and without suspending my places account? Thanks in advance! Ian
| itrogers0 -
Best website structure for product benefits and features.
I'm in disagreement with my partner over how best to represent our products' benefits and features on the homepage of our website. I'm interested in this primarily from an SEO perspective, but it obviously has an impact on conversions as well. I believe that a homepage shouldn't contain too much information, so as not to overwhelm the user: a brief sentence or two about each benefit, with a link to another page with in-depth info about the related feature. Each of these inner pages would be optimized and contain much more content than you could put on the homepage. Please see wireframe A below.
| Riona0 -
SEO for Global Navigations
I did my first SEO audit, from the book SEO Secrets by Danny Dover, on my new website at http://melo4.melotec.com:4010/ In the book he says to disable JavaScript and see if the global navigation still works. When I did that, the dropdown menus in my navigation didn't show. I'm assuming this is a problem, but when I check the cached text-only version of the site, the dropdowns are in the text-only version. Are there any experienced SEOs out there who can weigh in on this issue? Should I have my developer redo the navigation without any JavaScript? Thanks, Shawn
| Romancing0 -
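As an illustration of the alternative being discussed: dropdown menus can be built from plain nested lists styled with CSS :hover, so every link stays in the HTML with no JavaScript required. A hedged sketch with placeholder labels and URLs (not taken from the site in question):

```html
<ul class="nav">
  <li>
    <a href="/products/">Products</a>
    <!-- The sub-menu is plain HTML, so crawlers and users with
         scripting disabled still see the links -->
    <ul class="dropdown">
      <li><a href="/products/widgets/">Widgets</a></li>
      <li><a href="/products/gadgets/">Gadgets</a></li>
    </ul>
  </li>
</ul>
<style>
  .nav .dropdown { display: none; }
  .nav li:hover .dropdown { display: block; }
</style>
```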
One page wordpress site - what are the steps for SEO
Hello, I am launching 5 sites with exact-match keyword domains. I am developing the sites in WordPress as one-page sales funnel sites. What do I need to do to optimize my sites? I'd really appreciate any bullet points or direction. Thanks
| brianmaher0 -
Links from tumblr
I have two links from hosted Tumblr blogs which are not on tumblr.com. So, website1 has a Tumblr blog: tumblr.website1.com And another site, website2.com, also uses the A record/custom domains option from Tumblr, but not on a subdomain, which is described below: http://www.tumblr.com/docs/en/custom_domains Does this mean that all links from such sites count as coming from the same IP in Google's eyes? Or is there value in getting links from multiple sites, because the A record doesn't affect SEO in a negative way? Many thanks, Mike.
| team740 -
Competitors and Directory Links
Hi guys, wanted to get some input and thoughts here. I'm analyzing many competitor links for a specific client (and for other clients as well) and keep coming across pretty heavy directory backlink profiles. Has anyone here had success with directory listings? It seems many of the competitors' backlinks are coming from directories. What say you?
| PaulDylan1 -
Does Schema.org markup require a HTML 5 doctype?
I would like to implement Schema.org microdata on a client's website, but the site is currently XHTML. Is HTML5 required for the microdata attributes to be recognised? Will it work if I implement microdata on my XHTML site? Thanks for any advice you can offer.
| cmaddison1 -
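For context, the microdata attributes in question (itemscope, itemtype, itemprop) are HTML5 additions, so an XHTML validator will flag them, though parsers that read microdata generally pick the attributes up regardless of doctype. A minimal sketch with placeholder business details:

```html
<div itemscope itemtype="http://schema.org/LocalBusiness">
  <!-- itemscope/itemtype/itemprop are the HTML5-only attributes
       that an XHTML doctype will not validate -->
  <span itemprop="name">Example Business</span>
  <span itemprop="telephone">+1-555-0100</span>
</div>
```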
Blog - keep on the domain or move to a separate site, now that Panda factors bounce rate, time on page, and depth of visit?
Over 10 years ago, we decided to run our blog external to our main website. Contrary to conventional wisdom at the time, we thought we'd have more control/opportunities for generating external anchor text links, plus be working in a bona fide blog software environment (WP). As we had hoped, the blog generated a lot of strong inbound links, captured inbound links of its own from other sites and, I think, helped improve our SERPs and traffic. Once the blog was established, and with the redesign of the website, we capitulated and finally moved the blog onto the main domain. After reading a number of pieces on Panda and the new reality of SEO, it sounds like bounce rate (in particular), time on page, and other GA measures may have a more profound influence on Google rankings now. Given that blogs are notorious for high bounce rates (ours is), low time on site, and shallow depth of visit, it seems logical that this adversely affects the site averages for the main domain. Is it time to reconsider pulling our blog off the main domain to reassert the 'true' GA measures of the main domain? I guess it still comes down to the question... is the advantage of all the inbound links to the blog on the main domain of greater value than moving the blog off-site and reasserting better site stats for Google's Panda algo? Thanks.
| ahw0 -
In-House SEO - Doubt about one SEO issue - Plz guys help over here =)
Hello, We want to promote some of our software products. I will give you one example below: http://www.mediavideoconverter.de/pdf-to-epub-converter.html We also have this domain: http://pdftoepub.de/ How can we deal with the duplicate content, and how can we improve the first domain's product page? If I use the canonical tag, don't index the second domain, and make a link to the first domain, will that help, or will it not make any difference? Keywords: pdf to epub, pdf to epub converter. What do you think about this technique? Good / bad? Is the second domain giving any value to the first domain's page? Thanks in advance.
| augustos0 -
Keyword Ranking Question
I have recently hired an SEO company to help with our keyword rankings. My question is: what are the best tools to use to verify what they are reporting? I can do an unpersonalized search, but I am likely still getting my local results. I have been using the SEOmoz Rank Tracker in the past, but for some reason it has not been able to retrieve results over the past day or so. Are there any other good tools to check rankings for an exact URL with non-localized, non-personalized results? Thanks for the suggestions.
| fertilityhealth0 -
Panda Prevention Plan (PPP)
Hi SEOmozers, I'm planning to prepare for the Panda rollout by creating a checklist of things to do in SEO to prevent mass traffic loss. I would like to share these ideas with the SEOmoz community and the SEOmoz staff in order to build help resources for other marketers. Here are some ideas for content websites: the main one is to block duplicate content (robots.txt, noindex tag, according to the different canonical cases); the same applies to very low quality content (questions/answers, forums), by inserting a canonical redirect or noindex on threads with few answers.
| Palbertus1 -
I have a duplicate content problem
The website guy who made the website for my business, Premier Martial Arts Austin, disappeared and didn't set the site up so that www. begins each URL, so I now have a duplicate content problem and don't want to be penalized for it. I tried to set the preferred domain in Webmaster Tools, but can't get it to verify that I'm the website owner. Any idea what to do?
| OhYeahSteve0 -
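For readers with the same www/non-www split on an Apache server, the usual fix is a site-wide 301 redirect in .htaccess; a hedged sketch, assuming mod_rewrite is enabled (example.com stands in for the real domain):

```apache
RewriteEngine On
# Redirect every non-www request to the www version with a 301,
# preserving the requested path
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

This consolidates the two hostnames for both search engines and visitors, independent of the Webmaster Tools preferred-domain setting.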
How can I change my website's content on specific pages without affecting ranking for specific keywords?
My client's website (www.nursevillage.com) content has not been touched for 4 years, and we are currently ranking #1 for "per diem nursing". They do not want to make any changes to the site for fear that it might decrease our rankings. We want to utilize that keyword ranking on the specific page (www.nursevillage.com/nv/content/careeroptions/perdiem.jsp) ranking for "per diem nursing" by redirecting traffic or placing some banners and links on that page to specific pages or other sites related to "per diem nursing" jobs, so we can get nurses to apply to our new nursing jobs. Any advice on why "per diem nursing" is ranking so high for us, and on what we can change on the site without messing up our ranking, would be greatly appreciated. Thanks
| ryanperea1000 -
What are the best joomla seo plugins?
When optimizing a Joomla site, what are the best plugins etc. for SEO tweaking?
| DavidKonigsberg0 -
Should I Combine 30 websites into one?
I have a private health care company that I have just begun consulting for. In addition to the main website serving the whole group, there are currently 30 individual sites, one for each of the hospitals in the group. Each has its own domain. Each site has practically identical content: something that will be addressed in my initial audits. But should I suggest that they combine all the sites into one domain, providing individual category pages for each hospital, or am I really going to suggest that each of the 30 sites create unique content of their own? The latter means thirty pages of content on "hip replacements", thirty different versions of "our treatment", etc., and bearing in mind they all run off the same CMS, even with different body text the pages are going to be practically identical. It's a big call either way! The reason they started out with all these sites is that each hospital is its own cost centre, whilst the web development team is a centralized resource. They each have their own sites to try and rank individually for local searches; naturally, each tends to get customers from its own local area. Not every hospital provides the full range of treatments.
| Ultramod0 -
Can Anyone show me a site that has followed the seomoz seo rules
Hi, I have been reading the SEO information on here, which is very interesting, and I would like to know if anyone can point to any sites that have followed the rules and advice. It is great to read the info and rules, but I feel it is also better to see a site that has followed them, and to hear from people who have put the information into practice and can explain what results they have got. I am currently building the following website: http://www.womenlifestylemagazine.com so it would be great to see a site that has followed all the rules, from someone who can explain whether they work or not.
| ClaireH-1848860 -
Usage of Schema.org Microdata?
I am trying to figure out the correct usage of Schema.org for a business. Example: http://schema.org/Restaurant There is information like opening times or payments accepted. Would you populate this data within meta tags on every page (i.e. in the header), or really target specific pages? This could also apply to general info such as address, contact details, etc. Interested in hearing your thoughts 🙂 Cheers Noel
| noeltock0 -
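To illustrate the kind of markup being asked about: the usual approach is to mark the details up where they already appear visibly on the page (e.g. a contact page or site-wide footer), rather than duplicating them in hidden meta tags on every page. A hedged sketch with placeholder values:

```html
<div itemscope itemtype="http://schema.org/Restaurant">
  <span itemprop="name">Example Bistro</span>
  <span itemprop="address" itemscope
        itemtype="http://schema.org/PostalAddress">
    <span itemprop="streetAddress">1 Example Street</span>,
    <span itemprop="addressLocality">Springfield</span>
  </span>
  <!-- Opening hours: machine-readable value in datetime,
       human-readable text between the tags -->
  <time itemprop="openingHours" datetime="Mo-Fr 11:00-22:00">
    Mon-Fri, 11am-10pm
  </time>
  <span itemprop="paymentAccepted">Cash, credit card</span>
</div>
```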
How to position in local Google
Hello, It's been easy for me to rank in .com, English-only results. But regional Google is giving me problems. How do I rank there? What are the top 3 key elements for ranking locally that don't matter much internationally? Thanks!
| DaBomb110 -
Entering into a new website franchise model, currently subdomains, client wants scalability. Best approach?
This is my first experience with a franchise-model business. It is less than 1 year old, and on-page SEO is in pitiful shape, with hundreds of subdomains already created for specific locations. What is the best approach to take here? I've seen a lot of debate regarding subdomains and folders, and it seems the folder structure may be the best long-term course of action, but I'm still a bit unclear on this. What is the best approach to ensure that all SEO work on the site has the most impact, and moving forward, what is the best method for scaling SEO for franchisee owners? What is the best practice to help each location be positioned well in search in the future, how much should the corporate franchise site typically provide in terms of SEO services to franchisees, and how does the lead SEO consultant scale those services to franchisees?
| methods0