Category: Intermediate & Advanced SEO
Looking to level up your SEO techniques? Chat through more advanced approaches.
-
Retail Store Detail Page and Local SEO Best Practices
We are working with a large retailer that has a specific page for each store they run, and we are interested in leveraging the best practices out there for local search. Our current issue is around URL design for the store pages themselves. Currently we have store URLs such as /store/12584, where the number is a GUID-like identifier that means nothing to search engines or, frankly, humans. Is there a better way we could model this URL for increased relevancy in local retail search? For example, adding the store name:

www.domain.com/store/1st-and-denny-new-york-city/23421
(example: http://www.apple.com/retail/universityvillage/)

or a fully explicit URI:

www.domain.com/store/us/new-york/new-york-city/10027/bronx/23421
(example: http://www.patagonia.com/us/patagonia-san-diego-2185-san-elijo-avenue-cardiff-by-the-sea-california-92007?assetid=5172)

The idea with the second version is that we'd make the URL structure richer and more detailed, which might help for local search. Is there a best practice or recommendation for how we should model this URL? We are also working on on-page optimization, but we're specifically interested in local SEO strategy and URL design.
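Whichever structure is chosen, the legacy numeric URLs would need to 301 to the new ones so existing equity carries over. A minimal sketch (Apache .htaccess), using a hypothetical slug for store 12584 and keeping the numeric ID as the final segment so lookups stay simple:

```
RewriteEngine On
# map the old opaque URL onto a descriptive one, preserving the store ID
RewriteRule ^store/12584$ /store/new-york-city/1st-and-denny/12584 [R=301,L]
```

In practice a rule like this would be generated per store (or handled by an application-level lookup) rather than hand-written.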
| mongillo
-
Blocked from Google
Hi, I used to get a lot of traffic from Google, but suddenly there was a problem with the website and it seems to be blocked. We are also in the middle of changing the root domain because we are making a new webpage. I have looked at Webmaster Tools and corrected all the errors, but the page is still not visible in Google. I have also ordered a new crawl. Does anyone have any tricks? Do I lose a lot when I move the domain name, or is this a good thing in this matter? The old domain is smakenavitalia.no and the new one is Marthecarrara.no. Best regards, Svein Økland
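On the domain move itself, the standard approach is a site-wide 301 from the old host to the new one. A minimal sketch (Apache .htaccess on the old domain), assuming both sites share the same URL paths; if the new site's URLs differ, each old URL should map to its closest new equivalent instead:

```
RewriteEngine On
# send every request on the old domain to the same path on the new one
RewriteCond %{HTTP_HOST} ^(www\.)?smakenavitalia\.no$ [NC]
RewriteRule ^(.*)$ http://marthecarrara.no/$1 [R=301,L]
```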
| sveinokl
-
Canonical OR redirect
Hi, I have a sports site that covers matches, with a page for each match. Last week there was a match between T1 and T2, so a page was created: www.domain.com/match/T1vT2 (Page1). This week T2 hosts T1, so there's a new page: www.domain.com/match/T2vT1 (Page2). Each page has unique content with authorship markup, but the URL, title, description, and H1 look very similar, because the only difference is that T2 comes before T1. Although Page2 has been available for a few days via on-site links and the sitemap, for the search query "T2 T1 match" it's Page1 that appears in the SERP (in a high position). Of course I want Page2 in the SERP for that query, since it's the relevant match; I don't see Page2 anywhere in the SERP and I think it wasn't indexed. Questions:

1. Do you think Google sees both pages as duplicates, even though the content is different?
2. Is there a difference between searching for "T1 vs T2" and "T2 vs T1"?
3. Should I 301-redirect Page1 to Page2, considering that all content for Page1 and its G+ authorship would be lost?
4. Should I put rel=canonical on Page1 pointing to Page2?
5. Should I let Google sort it out?

I know it's a long one - thanks for your patience. Thanks, Assaf
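For option 4, the tag itself is a single line in the head of Page1; a minimal sketch, assuming Page2 is the preferred URL. Note that a canonical, unlike a 301, leaves Page1 visitable, which matters here since both matches are real events:

```html
<link rel="canonical" href="http://www.domain.com/match/T2vT1" />
```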
| stassaf
-
"Category" word in URLs of blog is it SEO Friendly URL ??
Hello respected community members, I have often seen the word "category" appear in blog URLs. My question is whether this is negative or positive for SEO - and if we don't want "category" to appear in the URL, how can we remove it when optimizing our URLs?
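In WordPress (where this URL segment usually comes from), several SEO plugins can strip the category base; whatever generates the shorter URLs, the old /category/ form should then 301 to the new one. A minimal sketch (Apache .htaccess), assuming category archives move from /category/slug/ to /slug/:

```
RewriteEngine On
# redirect the old category-base URLs to the shortened form
RewriteRule ^category/(.+)$ /$1 [R=301,L]
```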
| sourabhrana39
-
Ecommerce: remove duplicate product pages or use rel=canonical
Say we have a white widget that is in our white-widget collection and also in our wedding-widget collection. Currently we have three different URLs for that product (white-widgets/white-widget, wedding-widgets/white-widget, and all-widgets/white-widget). We automatically generate a rel=canonical tag on those collection-specific product pages pointing to the original product page (/all-widgets/white-widget). This guide says that is the structure Zappos uses, and notes: "There is an elegance to this approach. However, I would re-visit it today in light of changes in the SEO world." I noticed that Zappos, and many other shops, now actually just link back to the parent product page (e.g., if I am in the wedding-widget section and click on the widget, I go to all-products/white-widget instead of wedding-widgets/white-widget). So my question is: should we even have these individual product URLs, or just get rid of them altogether? My original thought was that a product URL like wedding-widgets/white-widget would help SEO for the search term "white wedding widget", but we won't even be taking advantage of that by using rel=canonical anyway.
| birchlore
-
What constitutes a duplicate page?
Hi, I have a question about duplicate page content and wondered if someone is able to shed some light on what actually constitutes a "duplicate". We publish hundreds of bus timetable pages that are similar, but technically have unique URLs and content. For example: http://www.intercity.co.nz/travel-info/timetable/lookup/akl. The template of the page is obviously duplicated, but the vast majority of the content is unique to each page, with data being refreshed each night. Our crawl shows these as duplicate page errors, but is this just a generalisation because the URLs are very similar? (Only the last three characters change for each page - in this case /akl.) Thanks in advance.
| BusBoyNZ
-
Are video sharing sites still useful for SERPs?
Well, I am not talking about audience views; I am asking whether it is good to submit videos to multiple video sites for backlinks, and whether that produces any sharp movement for keywords. I have seen that most of these sites are nofollow, which is not useful - but is it something good for link diversification?
| chandubaba
-
Link Research Tools - Detox Links
Hi, I was doing a little research on my link profile and came across a tool called LinkResearchTools.com. I bought a subscription and tried it out. The report rated my overall risk as low, but identified 78 links rated Very High Risk to Deadly (are they venomous?), around 5% of the total, and advised removing them. It also flagged many links as suspicious or low risk, but this seems to be because it has no knowledge of them, so it defaults to a negative rating. So before I do anything rash and start removing my Deadly links, I was wondering: a) have you used the tool and would you recommend it? b) would you recommend detoxing/removing the Deadly links? c) are there cases where removing so-called Deadly links causes more problems than it solves - such as maintaining a normal-looking profile, since everyone is likely to have some bad links? (Although my thinking may be off on that one.) What do you think? Adam
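If outreach-based removal fails, Google's disavow tool accepts a plain-text file; a minimal sketch of the format, with hypothetical domains:

```
# lines starting with # are comments
# disavow every link from an entire domain
domain:spammy-directory.example
# or disavow a single URL
http://another-site.example/page-linking-to-you.html
```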
| NaescentAdam
-
Where's all the text?
Hi, we recently (yesterday) had a developer make a new site for us on Wix - http://www.appointeddhq.com/ - as the one we were planning to put up had a few teething issues (the back-end booking system wasn't ready and we needed something up immediately for a TV show we were being featured in). Having now had the chance to look through it, I'm not quite sure what's going on. None of the text appears to be there on any page, I can't find any of the descriptions we gave the developer, the alt tags behind pictures (and even the pics themselves) don't appear to be there, the URLs are messed up, titles are incorrect, and there are no title tags to be found. Am I misunderstanding, or is the whole site rendered in JavaScript? Obviously this is quite a huge issue and I'll want to get it sorted immediately, but I thought it best to see what the good folks here thought. Thanks!
| LeahHutcheon
-
Issue with Robots.txt file blocking meta description
Hi, can you please tell me why the following error is showing up in the SERPs for a website that was just re-launched 7 days ago with new pages (301 redirects are built in)? "A description for this result is not available because of this site's robots.txt - learn more." Once we noticed it yesterday, we made some changes to the file and reduced the number of items in the disallow list. Here is the current robots.txt file:

```
# XML Sitemap & Google News Feeds version 4.2 - http://status301.net/wordpress-plugins/xml-sitemap-feed/
Sitemap: http://www.website.com/sitemap.xml
Sitemap: http://www.website.com/sitemap-news.xml

User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
```

Other notes: the site was developed in WordPress and uses the following plugins: WooCommerce, All-in-One SEO Pack, Google Analytics for WordPress, and XML Sitemap & Google News Feeds. Currently, in the SERPs, the result keeps jumping back and forth between showing the meta description for the www domain and showing the error message above. Originally, WP Super Cache was installed; it has since been deactivated, removed from wp-config.php, and deleted permanently. One other thing to note: we noticed yesterday that an old XML sitemap was still on file, which we have since removed and replaced with a new one resubmitted via WMT. Also, the old pages are still showing up in the SERPs. Could it just be that this will take time for Google to review the new sitemap and re-index the new site? If so, what kind of timeframes are you seeing these days for new pages to show up in SERPs - days, weeks? Thanks, Erin
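For reference, the file above doesn't block regular pages; the snippet-suppressed message usually means Google last fetched a robots.txt that did. A minimal fully-permissive file to diff against while debugging - an empty Disallow allows everything, whereas `Disallow: /` (with the slash) blocks the whole site:

```
User-agent: *
Disallow:
```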
| HiddenPeak
-
URL with a # but no ! being indexed
Given that it contains a #, how come Google is able to index this URL: http://www.rtl.nl/xl/#/home? It was my understanding that Google can't handle # properly unless it's paired with a ! (the hashbang / hash-fragment convention). site:http://www.rtl.nl/xl/#/home returns nothing, but site:http://www.rtl.nl/xl returns http://www.rtl.nl/xl/#/home in the result set.
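For context, a sketch of the (since-deprecated) AJAX crawling scheme the ! enables - everything after a plain # is a client-side fragment that crawlers normally ignore, while #! signals that a crawlable equivalent exists (hypothetical URLs):

```
http://example.com/page#/home   ->  fragment dropped; the crawler sees http://example.com/page
http://example.com/page#!/home  ->  the crawler fetches http://example.com/page?_escaped_fragment_=/home
```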
| EdelmanDigital
-
Traffic down 60% - about to cry, please help
Hiya guys and girls, I've just spent six months, a lot of blood, sweat and tears, and money developing www.happier.co.uk. In the last few weeks the site started to make a trickle of money - still loss-making, but showing green shoots. But then on Friday the traffic dropped, due to my rankings on google.co.uk dropping. Visits: Thu 25th April = 1950; Fri 26th April = 1284; Sat 27th April = 906. So it looks like I've been hit with some sort of penalty. I did get a warning on the 20th April about an increase in the number of 404 errors, currently showing 77. I've now removed the links to those 404 pages and left the 404 pages as-is, as was suggested here: http://www.seomoz.org/blog/how-to-fix-crawl-errors-in-google-webmaster-tools. Could that be the reason? We have spent a lot of time on site design and content. We think the site is good - I agree it has a long way to go, but without income that is hard, so we have been struggling through. Any ideas on the reason(s) for the penalty? Big thanks, Julian.
| julianhearn
-
Penguin Penalty On A Duplicate URL
Hi, I have noticed a distinct drop in traffic to a page on my web site, which occurred around April of last year. Doing some analysis of the links pointing to this page, I found that most were sitewide with exact-match commercial anchor text. I think the obvious conclusion is that I got slapped by Penguin, although I didn't receive a warning in Webmaster Tools. The page in question was ranking highly for our targeted terms, with a URL structured like this: companyname.com/category/index.php. The same page is still ranking for some of those terms, but under the duplicate URL: companyname.com/category/. The sitewide problem is associated with links going to the index.php page; there aren't many links pointing to the non-index.php page. My question is this: if we were to 301-redirect index.php to the non-PHP page, would this be detrimental to the rankings we are getting today? I.e., would we simply redirect the Penguin effect to the non-PHP page? If anybody has come across a similar problem or has any advice, it would be greatly appreciated. Thanks
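If you do decide to consolidate, the usual index.php-to-directory redirect needs a guard against loops, since Apache internally serves index.php for the directory URL. A minimal sketch (Apache .htaccess) that matches only what the client explicitly requested:

```
RewriteEngine On
# only act on explicit /index.php requests from the client,
# not on Apache's internal DirectoryIndex subrequests
RewriteCond %{THE_REQUEST} ^[A-Z]+\ /(.*)index\.php
RewriteRule ^ /%1 [R=301,L]
```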
| sicseo
-
Updating existing content - good or bad?
Hi all, there are many situations where I encounter the need (or the wish) to update existing content. Here are a few reasons:

1. An update turned up on the subject that does not justify a new post/article, just adding a couple of lines.
2. The article was simply poorly written, yet the page has PR, as it covers a good subject and has been online for quite some time (alternatively, I could create a new and improved article and 301 the old one to it).
3. Improving the titles and subtitles of old existing articles.

I would love to hear your thoughts on each of these reasons. Thanks
| BeytzNet
-
Duplicate content across hundreds of Local sites and they all rank #1
Usually when we discuss duplicate content, we're addressing the topic of penalties or non-indexing. In this case, we're discussing ranking high with duplicate content. I've seen lots of dental, chiropractor and veterinarian sites built by companies that give them cookie-cutter sites with the same copy - and they all rank #1 or #2. Here are two companies that do that:

http://www.rampsites.com/rampsites/home_standard.asp?sectionid=4
http://mysocialpractice.com/about/

The latter uses external blogs to provide inbound links to their clients' sites, but not all services do that; in fact, this is the first time I've seen them with external blogs. Usually the blog with duplicate copy is on-site, and the sites still rank #1. Query "Why Your Smile Prefers Water Over Soft Drinks" to see duplicate content on external blogs, or "Remember the Mad Hatter from the childhood classic, Alice in Wonderland? Back then, the process of making hats involved using mercury compounds. Overexposure could produce symptoms referred to as being" for duplicate content on chiropractor sites that rank high. I've seen well-optimized sites rank under them, even though those sites have just as much quality content - all of it original - with more engagement and inbound links. It appears to me that Google is turning a blind eye to duplicate content. Maybe because these are local businesses with local clientele, it doesn't care that a chiropractor in NY has the same content as one in CA, just as a visitor in CA isn't generally looking at a chiropractor's site in NY. So maybe geo-targeting the site has something to do with it. As a test, I should take the same copy, put it on a non-geo-targeted site and see if it gets indexed. I asked another local SEO expert if she has run across this - probably the best in my opinion. She has, and she finds it difficult to rank above them as well. It's almost as if Google is favoring those sites. So the question is: should all dentists, chiropractors and veterinarians give in to these services? I shudder to think that, but hey, it's working, and it's a whole lot less work - and maybe expense - for them.
| katandmouse
-
Temporary Duplicate Sites - Do anything?
Hi Mozzers - we are about to move one of our sites to Joomla. This is one of our main sites; it receives about 40 million visits a month, so the dev team is a little concerned about how the new site will handle the load. Dev's solution, since we control about 2/3 of that traffic through our own internal email and cross-promotions, is to launch the new site without taking down the old one. They would leave the old site on its current URL and make the new site something like new.sub.site.com. Traffic we control would continue to the old site; traffic we detect as new would be redirected to the new site. Over time (they think 3-4 months) they would shift all traffic to the new site, then eventually change the URL of the new site to be the URL of the old site and be done. So this seems to be a duplicate-content (whole-site) issue from the outset. I think the best course of action is to preserve all SEO value on the old URL, since the new URL will eventually go away and become the old URL. I could consider temporary no-crawl/no-index tags on the new site while both sites exist, but would that be risky, since that site will eventually need to drop those tags and become the only site? A temporary rel=canonical from the new site to the old site also seems like it might not be the best answer. Any thoughts?
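For reference, the temporary tag under discussion is a single element in the head of every page on the staging host, removed at cutover; a minimal sketch:

```html
<meta name="robots" content="noindex,follow">
```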
| Kenn_Gold
-
Should you cache redirects?
I would like to know what fellow SEO people think: should you cache a redirect? Problems I see with caching redirects are meta refreshes and a possible slowdown in page load, but is that a big issue? Should we cache redirects? Do pages get indexed more if you cache redirects? Our ecommerce product pages are all dynamic, and currently we cache redirects, but I'm seeing a lot of meta refresh issues. Another area that cropped up is that the redirect doesn't pass on query parameters. Our system outputs dynamic URLs that are redirected to SEO-friendly ones, but the redirect doesn't pass on parameters like Google Analytics tracking tags. What are your thoughts? Thanks
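On the dropped parameters: meta refreshes don't reliably forward query strings, but a server-side rewrite can. A minimal sketch (Apache mod_rewrite, hypothetical URL patterns) - the QSA flag merges the original query string (e.g. GA tags) into the target instead of discarding it when the target defines its own:

```
RewriteEngine On
# 301 the dynamic URL to its SEO form, appending any tracking parameters
RewriteRule ^product\.php$ /widgets/blue-widget/ [R=301,L,QSA]
```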
| Bio-RadAbs
-
Canonical URL question
I just ran the SEOmoz tool and it reports duplicate content for www.mysite.com and www.mysite.com/index.php. Should I use a canonical URL for this? If yes, what is the right way to do it?
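A minimal sketch of the tag, placed in the head of the /index.php version (a 301 from /index.php to the root is the common alternative):

```html
<link rel="canonical" href="http://www.mysite.com/" />
```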
| constructionhelpline
-
Canonical Tag for Pages with Less Content
I am considering using a cross-domain canonical tag for pages that are very similar, but where one has less content than the other. The domains are geo-specific. For example: www.page.com with content xxx, yyy, zzz, and www.page.fr with content xxx. Is this a problem? While there is clearly duplicate content here, the pages are not actually significantly similar, since there is so much less content on one page than the other.
| theLotter
-
Getting a Sitemap for a Subdomain into Webmaster Tools
We have a subdomain that is a WordPress blog, and it takes days, sometimes weeks, for most posts to be indexed. We are using the Yoast plugin for SEO, which creates the sitemap.xml file. The problem is that the sitemap.xml file is located at blog.gallerydirect.com/sitemap.xml, and Webmaster Tools will only allow the insertion of the sitemap as a directory under the gallerydirect.com account. Right now we have the sitemap listed in the robots.txt file, but I really don't know if Google is finding and parsing it. As far as I can tell, I have three options, and I'd like to get thoughts on which is the best choice (that is, unless there's an option I haven't thought of):

1. Create a separate Webmaster Tools account for the blog.
2. Copy the blog's sitemap.xml file from blog.gallerydirect.com/sitemap.xml to the main web server, list it as something like gallerydirect.com/blogsitemap.xml, then notify Webmaster Tools of the new sitemap on the gallerydirect.com account.
3. Do an .htaccess redirect on the blog server, such as RewriteRule ^sitemap.xml http://gallerydirect.com/blogsitemap_index.xml, then notify Webmaster Tools of the new blog sitemap in the gallerydirect.com account.

Suggestions on the best approach to be sure that Google finds and indexes the blog ASAP?
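On the robots.txt route mentioned above: a Sitemap line there is a documented discovery mechanism, and it is also what enables cross-host sitemap submission. A minimal sketch for blog.gallerydirect.com/robots.txt:

```
User-agent: *
Disallow:

# sitemap discovery; an absolute URL is required here
Sitemap: http://blog.gallerydirect.com/sitemap.xml
```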
| sbaylor
-
.COM or .ORG - Which is better?
I work for a non-profit association. We currently use a .com as our primary, but also own the .org. Should we switch to the .org address? What would the benefits be?
| vpoffunk
-
Huge Google index with irrelevant pages
Hi, I run a site about sports matches; every match has a page, and the pages are generated automatically from the DB. Pages are not duplicated, but over time some come to look a little similar. After a match finishes, its page has no internal links or sitemap entry, but it's reachable by direct URL and stays in Google's index. So over time we have accumulated more than 100,000 indexed pages. Since past matches have no significance, aren't linked, a match-up can repeat, and the pages may look like duplicate content, what do you suggest we do when a match is finished (not linked, but still appearing in the index and SERPs)?

1. 301-redirect the match page to its match category, which is one level up in the hierarchy and always relevant?
2. Use rel=canonical from the match page to the match category?
3. Do nothing?

Notes: a 301 redirect will shrink my index status, and some say a high index status is good; also, is it safe to 301-redirect 100,000 pages at once - wouldn't it look strange to Google? And would a canonical remove the past match pages from the index? What do you think? Thanks, Assaf.
| stassaf
-
Google drop down - keyword gone, why?
Hi guys, I used to receive traffic from a year-based search term. This year I noticed the '2013' version is delivering nowhere near what the yearly term did the year before. I believe Google has stopped showing the yearly term in the drop-down menu beneath a big-volume related term. My question is: how does Google determine what goes in the drop-down menu for related/relevant searches?
| pauledwards
-
Ranking 1st on Google, but not in top 50 on Bing and Yahoo?
Hi Mozzers, roughly two weeks ago we were ranked for "African American Business Owner Mailing Lists":

#2 on Google
#2 on Bing
#2 on Yahoo

Now we are ranked:

#1 on Google
#50 on Bing
#50 on Yahoo

I noticed a lot of our other keywords improved on Google during this period but vanished from the other two search engines. Other keywords include "Apartment Owner Mailing Lists" (#4 on Google), "Community College Mailing Lists" (#3 on Google), etc. What gives? Thoughts?
| Travis-W
-
Bing and Yahoo Vanished
I currently have a website that is ranking #1 or #2 for almost every keyword we are targeting in Google; however, when I go to Yahoo or Bing, the site doesn't appear at all. In fact, even when I search for the domain in these engines, I don't appear - only a sub-domain that's hosted on the site does, and at the bottom of page 1 for the domain search. Any insight as to what might be causing this, or where I should start looking?
| VERBInteractive
-
Website is not indexed in Google, please help with suggestions
Our client's website was removed from Google's index. Can anybody recommend how to speed up the process of re-indexing? So far:

Webmaster Tools: done
Social media (Twitter, FB): done
sitemap.xml: done
Backlinks: in progress
PPC: done
Robots.txt: is fine

Guys, any recommendations are welcome - the client is very unhappy. Thank you
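One small extra nudge worth knowing: Google accepts sitemap pings over plain HTTP, so a resubmission can be scripted or triggered from the address bar. A sketch, with a hypothetical sitemap URL:

```
http://www.google.com/ping?sitemap=http://www.example.com/sitemap.xml
```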
| ThinkBDW
-
Outbound link to PDF vs outbound link to page
If you're trying to create a site that is an information hub, linking out to authoritative sites is obviously a good idea. However, does linking to a PDF have the same effect? E.g., linking to Google's SEO Starter Guide PDF, as opposed to linking to a Google article on SEO. Thanks!
| underscorelive
-
301 Redirection and apostrophes in URLs
Hi, I am having trouble getting any redirects with apostrophes in the URLs to 301 in order to eliminate 404 errors. I have tried replacing the apostrophe in the source URL field with %27, and variations of this, but to no avail. The site is a WordPress site (the old URLs are legacies from the old Business Catalyst site) and I am using the Redirection plugin. I have gone into some detail with a helpful soul here - http://wordpress.org/support/topic/how-to-deal-with-apostrophes-in-source-url - but unfortunately with no result. If anyone has any idea how to solve this puzzle, I would be grateful for the help. Example: http://www.tesselaars.com/blog/Inside_Flowers/post/Online_Marketing_for_Florists_Part_1%E2%80%93_A_Website_You_Won%27t_Regret/
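If the plugin keeps mangling the encoding, one workaround is to sidestep it in .htaccess: Apache matches RewriteRule patterns against the already-decoded path, so a wildcard over the troublesome span avoids spelling out %27 or %E2%80%93 at all. A minimal sketch, with a hypothetical destination slug:

```
RewriteEngine On
# match everything from the stable prefix onward, dodging the
# apostrophe and en-dash encoding entirely
RewriteRule ^blog/Inside_Flowers/post/Online_Marketing_for_Florists_Part_1.* /blog/online-marketing-for-florists-part-1/ [R=301,L]
```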
| Seamoose
-
Two websites in different niches. Should I create separate G+ authorship profiles?
I have two different websites. One of them is among the most authoritative e-commerce websites in its niche; I own forums, installation resource websites and various other sites that provide excellent user information and customer interactions. The other e-commerce website I own is very young and not as authoritative, and I am about to start building it out like I did for my first site. My question is whether I should link my G+ profile and become a "contributor/author" on the new e-commerce site(s), or have someone else in the company be the "face" of this website. Since they're in two completely different niches, I don't know if it will send mixed signals to Google if my G+ profile is all about niche A and then I start adding rel=author and contributing to sites that have nothing to do with the original niche. Should I create another G+ profile for all of the guest posting and second-tier site creation for the new niche site, or just use the one I have now for the time being?
| SWWebTeam
-
How Do I Generate a Sitemap for a Large Wordpress Site?
Hello everyone! I am working with a WordPress site that is in Google News (i.e., every day we have about 30 new URLs to add to our sitemap). The site has years of articles, resulting in about 200,000 pages. Our strategy so far has been to use a sitemap plugin that only includes the last few months of posts; however, we want to improve our SEO and submit all the URLs on the site to search engines. The issue is that the plugins we've looked at generate the sitemap on the fly - i.e., when you request the sitemap, the plugin dynamically generates it. Our site is so large that even a single request for our sitemap.xml ties up tons of server resources and takes an extremely long time to generate (if the request doesn't time out in the process). Does anyone have a solution? Thanks, Aaron
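One common pattern is to pre-generate the sitemaps on a schedule (e.g. a nightly cron job) and serve them as static files: a sitemap index plus chunks kept under the protocol's 50,000-URL-per-file limit. A minimal sketch in Python - the URL source, output paths, and base URL are hypothetical:

```python
import math
from datetime import date

URLS_PER_FILE = 50000  # the sitemap protocol's per-file maximum

def write_sitemaps(urls, out_dir=".", base_url="http://example.com"):
    """Write chunked sitemap files plus an index that points at them."""
    n_files = max(1, math.ceil(len(urls) / URLS_PER_FILE))
    for i in range(n_files):
        chunk = urls[i * URLS_PER_FILE:(i + 1) * URLS_PER_FILE]
        with open(f"{out_dir}/sitemap-{i + 1}.xml", "w") as f:
            f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
            f.write('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
            for url in chunk:
                f.write(f"  <url><loc>{url}</loc></url>\n")
            f.write("</urlset>\n")
    # the index file is what gets submitted to Webmaster Tools
    today = date.today().isoformat()
    with open(f"{out_dir}/sitemap.xml", "w") as f:
        f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
        f.write('<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
        for i in range(n_files):
            f.write(f"  <sitemap><loc>{base_url}/sitemap-{i + 1}.xml</loc>"
                    f"<lastmod>{today}</lastmod></sitemap>\n")
        f.write("</sitemapindex>\n")
```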
| alloydigital
-
Google Reconsideration - Denied for the Third Time
I have been trying to get past a "link scheme" penalty for just over a year. I took on the client in April 2012; they had received their penalty in February 2012, before I started. Since then we have been manually removing links, contacting webmasters for link removal, blocking over 40 different domains via the disavow tool, and requesting reconsideration multiple times. All I get in return is "Site violates Google's quality guidelines." So we regrouped and did some more research, and found that about 90% of the offending spam links pointed to only three pages of the website. We decided to delete those pages, display a 404 error in their place, and create new pages with new URLs. At first everything was looking good: the new pages were ranking and receiving page authority, and the old pages were gone from the indexes. So we resubmitted for reconsideration a third time - and got exactly the same response! I don't know what else to do; I've done everything I can think of short of deleting the whole site. Any advice would be greatly appreciated. Regards - Kyle
| kchandler
-
Reducing on-page links - manufacturer list
Hi guys, I am relaunching one of my sites, and one of the categories has 746 links on it due to a list of boating manufacturers. Ideally I need to cut this down - does anyone have any tips on how I can do this without losing the user experience, while still allowing Google to crawl all the manufacturers? Cheers
| Sayers
-
How to make Google include our recipe pages in its main index?
We have developed a recipe search engine, www.edamam.com, and serve the content of over 500+ food bloggers and major recipe websites. Our legal obligations do not allow us to show the actual recipe preparation instructions (i.e., the most valuable part of the content); we can only show a few images, the ingredients, and nutrition information. Most of the unique content sits with the source blog. By submitting XML sitemaps in GWT we now have around 500K pages indexed, yet only a few hundred appear in Google's main index, and we are looking for a solution to include all of them. Also good to know: it appears all of our top competitors are in exactly the same situation, so it is a challenging question. Any ideas will be highly appreciated! Thanks, Lily
| edamam
-
Need help creating sitemap
Hello, my question is sitemap-related. Background: we are an ecommerce site with around 4,000 pages and 20,000 images, and we don't yet have a sitemap implemented on our site. I have checked a lot of sitemap tools (G-Sitecrawler, XML Sitemap, A1 Sitemap Builder, etc.) and tried to create sitemaps with them, but they all give different results. The major links are all there, but the results start to vary for level-2 links, level-3 links and so on. No matter how much I read up on sitemaps, the more confused I get - with my limited SEO and technical knowledge, the extra information in these articles gets more confusing, even the SEOmoz ones. I also just read an article on SEOmoz saying that instead of one sitemap, having multiple smaller sitemaps is a very good idea, especially if we are adding lots of new products (which we are). Now my question: having understood the immense value of a sitemap (and having previously had one very poorly implemented), how can I make sure I get a very good sitemap, both XML and HTML? I do not want to repeat old mistakes with another poorly implemented sitemap for our site. I am hoping one of the professionals out there can help me build and implement the sitemap - please point me in the right direction.
| kannu1
-
Canonical tags required when redirecting?
Hello, my client bought a new domain and wants it to be the main domain of his company. His current domain, though, has been online for 10 years and ranks pretty well for a few keywords. I feel it is necessary to redirect the old domain to the new one to take advantage of its rankings and avoid any broken links. The sites are exactly the same: same sections and same content. Is it necessary to place canonical tags on one of the sites to avoid duplicate content/sites? Any thoughts? Thanks
| Eblan
-
If parent domain is www, does it matter if subdomain on a different server is non-www?
If you have a main website (www.example.com) with a subdomain (service.example.com) that lives on a separate server with a separate IP address, is there an SEO benefit/advantage to having the www included in the URL, given that the parent URL includes the www? Assume: 1. applicable 301 redirects are in place on both sites; 2. no duplicate content issues. Additionally, would your answer be different if the site were a .gov or .edu site vs. a .com?
| SEOteamfl
-
Switching from Google Plus Local to Google Plus Business
Greetings, we have a website design firm located in India. We wanted to target customers in our city who are looking for website design locally, and with Google+ Local and a bit of content marketing we would get onto the first page very soon, because no one in the competition is using social signals or even content marketing. BUT, unfortunately, for the last month, even though our Google Places listing is verified, we can't verify our Google+ Local page (https://plus.google.com/b/116513400635428782065/). It just shows error 500. It's a bug, and it's been a year of people reporting it without it being addressed (feedback, emails, calls - nothing worked). So we are skeptical whether our strategy would work without Google+. At the least, we've decided we could just make a company local page and connect it with the website, but it might not have the same local effect. So we are still unsure which step to take: wait for Google to fix it, or start the process with the Google Business category.
| hard
-
Should I build & try to rank several pages for similar keywords?
I have a client whose domain already ranks #1 on Google for 'automotive advertising agency'. However, we want several listings on the first page. Should I create a few more pages, like www.domain.com/automotive-advertising-agency, www.domain.com/advertising-agency, and www.domain.com/automotive-advertising? I'm assuming I can get these pages to rank well, but I'm wondering if Google will penalize us for this. Is this a good or bad idea?
| Branden_S
-
Correct strategy for long-tail keywords?
Hi, we are selling log houses on our website. Every log house is listed as a "product", and each "product" consists of many separate parts that are technically also products. For example, a log house product consists of doors, windows, and a roof - and all these parts are also products, each with its own content page. The question is: should we let Google index these detail pages, or should we mark them noindex? These pages have no content beyond the headline, which is great for long-tail SEO. We are probably the only manufacturer in the world with a separate page for "log house wood beam 400x400mm", but otherwise these pages are empty. So what should we do? Let Google index all of them (we have over 3,600) and maybe try to insert an automatic FAQ section on every one to put more content on the page? Or will 3,600 low-content pages hurt our rankings? Otherwise we are ranking quite well. Thanks, Johan
| JohanMattisson
-
Google's Exact Match Algorithm Reduced Our Traffic!
Google's first Panda update devalued our web store, www.audiobooksonline.com, and our traffic went from 2,500-3,000 visits per month (mostly organic referrals) to 800-1,000. Google's undervaluing of our store continued to reduce our traffic to 400-500 over the past few months. From 4/5/2013 to 4/6/2013 our traffic dropped another 50%, because (I believe) of Google's "exact domain match" algorithm implementation. Even after Panda, and up to 4/5/2013, we were getting a significant amount of organic traffic for search terms such as "audiobooks online," "audio books online," and "online audiobooks." We no longer get traffic for these generic keywords. What I don't understand is why a UK company, www.audiobooksonline.co.uk/, with a very similar domain name, ranks #5 for "audio books online" and #4 for "audiobooks online" while we've almost disappeared from Google's rankings. By every measurement I am aware of, our site should rank higher than audiobooksonline.co.uk. Market Samurai reports for "audio books online" and "audiobooks online" show that our web store is significantly "stronger" than audiobooksonline.co.uk, yet they show up on Google's first page and we are down several pages. I also checked a few titles on audiobooksonline.co.uk and confirmed they use the same publisher descriptions that we and many other online book/audiobook merchants do - duplicate content. We have never received notice that our web store was being penalized. Why would audiobooksonline.co.uk rank so much higher than audiobooksonline.com? Does Google treat non-USA sites differently than USA sites?
| lbohen
-
Daily Link Building tactics to move the needle
I know most of you may frown upon this question, but for those of you still going after blog commenting, forum posting, and Q&A sites (even if that means getting nofollow links): do you have any recommendations for a guide/blog post that describes how to create a daily "low-level" link building program to supplement the higher-level, relationship-dependent link building you're already doing? Thanks in advance!
| pbhatt
-
Is there any way to recover my site's rankings?
My site had been top 3 for 'speed dating' on Google.co.uk since about 2003, and it dropped below the top 50 for a lot of its main keywords shortly after 27 Oct 2012. I submitted a reconsideration request and was told there was 'no manual spam action'. My conclusion is that I was dropped by Google because of poor-quality links I've gained over 10+ years. I have a Domain Authority of 40, a regular blog (http://bit.ly/oKyi88), a Klout score of 42, user reviews, and quality content. Since Oct 2012 I've made some technical improvements and managed to get a few questionable links removed. I've continued blogging regularly and become more active on Twitter. I've seen no improvement, and my traffic is 80% down on last year. It would be great to be able to produce content that others want to link to, but I've not had much success with that in over 10 years of trying, and I've not seen many others in my sector with small budgets having much success either. Is there anything I can do to regain favour with Google?
| benners
-
Why does this site have a PR of 0? Can anyone figure this out? LegionSafety.com
Our site dropped in PR and we haven't changed anything, so we're not sure why. Does anyone have any recommendations?
| legionsafety
-
Using the WP All Import CSV plugin for WordPress to update products daily on a large ecommerce site - category naming and other issues
We have just got an automated solution working to upload about 4,000 products daily to our site. We get a CSV file from the wholesaler's server each day, and the way they have named products and categories is not ideal. Although most of the products remain the same (and don't need to be overwritten), some go out of stock, prices change, etc. The problem is we have no control over the CSV file, so we need to keep the categories they have given us. We might be able to create new categories and have products listed under multiple categories? If anyone has used WP All Import or has knowledge in this area, please let me know. I have plenty more questions, but this should start the ball rolling! Thanks in advance, Mozzers.
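One way around the fixed supplier file is a small pre-processing step that rewrites their category labels into your own taxonomy before the import plugin reads the file. A minimal sketch in Python - the file names, column name, and mapping are all hypothetical:

```python
import csv

# hypothetical mapping from the wholesaler's labels to our taxonomy
CATEGORY_MAP = {
    "BOAT-ACC": "Boating Accessories",
    "ELEC": "Marine Electronics",
}

with open("wholesaler.csv", newline="") as src, \
     open("import-ready.csv", "w", newline="") as dst:
    reader = csv.DictReader(src)
    writer = csv.DictWriter(dst, fieldnames=reader.fieldnames)
    writer.writeheader()
    for row in reader:
        # leave unknown categories untouched so nothing is lost
        row["category"] = CATEGORY_MAP.get(row["category"], row["category"])
        writer.writerow(row)
```

A cron job could run this each day between the wholesaler fetch and the scheduled import.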
| weebro
-
My landing page changed in Google's SERP - I used to have a product page, now I have a PDF?
I have been optimizing this page for a few weeks now and have seen our page rise from 23rd to 11th in the SERPs. I came to work today and not only have I dropped to 15th, but my relevant product page has also been replaced by this page. Not to mention, the second page is a PDF! I am not sure what happened here, but any advice on how I could fix this would be great. My site is www.mynaturalmarket.com and the keyword I'm working on is Zyflamend.
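If the PDF is on your own server and shouldn't compete with the product page, one lever is to keep PDFs out of the index with an X-Robots-Tag response header (PDFs can't carry a meta robots tag). A minimal sketch (Apache, with mod_headers enabled):

```
<FilesMatch "\.pdf$">
  Header set X-Robots-Tag "noindex"
</FilesMatch>
```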
| KenyonManu3-SEOSEM
-
How do you make short-tail keyword determinations for long-tail combinations when there is not enough search volume to provide data?
Confusing question - allow me to elaborate. We have a few pages that each target a particular doctor. One of those pages is about his background. His short-tail term is his name, "Dr Irving Weiss" for example - low competition, of course, and already too low in search volume to show in Google's keyword tool (which I know isn't the best tool). Now, one of his tab pages covers his medical background (credentials, schools, awards, sanctions). If you search for those single keywords alone, you get something like: hospitals 74,000; background 673,000; credentials 49,000. But that doesn't necessarily mean more people will search for "dr irving weiss background" than "dr irving weiss credentials" just because "background" has more searches. Both "dr irving weiss background" and "dr irving weiss credentials" have far too little search volume to yield any data, so how can you come to a proper keyword-targeting conclusion when the data is not there? THANKS IN ADVANCE for any insight!
| irvingw
-
Robots.txt error
I currently have this in my robots.txt file:

```
User-agent: *
Disallow: /authenticated/
Disallow: /css/
Disallow: /images/
Disallow: /js/
Disallow: /PayPal/
Disallow: /Reporting/
Disallow: /RegistrationComplete.aspx
WebMatrix 2.0
```

In Webmaster Tools > Health Check > Blocked URLs, I copy and paste the code above and click Test, and everything looks OK. But when I log out and log back in, I see the following under Blocked URLs instead:

```
User-agent: *
Disallow: /
WebMatrix 2.0
```

Currently, Google doesn't index my domain, and I don't understand why this is happening. Any ideas? Thanks, Seda
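One side observation while debugging: "WebMatrix 2.0" is not a valid robots.txt directive (it looks like an editor artifact), so it's worth removing. The goal is for the live file to read like the sketch below, rather than the Disallow-everything version Webmaster Tools is reporting:

```
User-agent: *
Disallow: /authenticated/
Disallow: /css/
Disallow: /images/
Disallow: /js/
Disallow: /PayPal/
Disallow: /Reporting/
Disallow: /RegistrationComplete.aspx
```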
| Rubix
-
Same content pages in different country versions of Google - is it duplicate?
Here's my issue: I have the same page twice for content, but on different URLs per country. For example: www.example.com/gb/page/ and www.example.com/us/page/ - one for the USA and one for Great Britain (it could also be a subdomain: gb., us., etc.). Is it duplicate content if the US version of Google indexes one page and the UK version indexes the other (same content, different URLs)? The UK search engine will only surface the UK page and the US engine the US page. Is this bad under the Panda update, or does it get away with it? People suggest it is OK, and good for localised search for an international website - I'm not so sure. I'd really appreciate advice.
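For same-language regional variants like this, hreflang annotations are the mechanism Google provides to tie the versions together and point each country's searchers at the right one. A minimal sketch with hypothetical URLs, placed in the head of both pages:

```html
<link rel="alternate" hreflang="en-GB" href="http://www.example.com/gb/page/" />
<link rel="alternate" hreflang="en-US" href="http://www.example.com/us/page/" />
```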
| pauledwards
-
Will changing Google Places address hurt rankings?
I have a client transferring ownership of their service business (photo booth rental). The currently listed address will change, so my main concern is preserving the rankings during the transition. Should I change the Google Local listing to the new physical address, or change it to "serve a surrounding area"? It seems best to set it as "serving a surrounding area", but I know Google is really weird about local listing changes - I've seen and heard about countless listings falling completely off the map after being updated. Any advice appreciated.
| Joes_Ideas
-
Maintaining SEO with Ecommerce Search Refinement
Hey everyone, I have an interesting scenario I'd appreciate some feedback on. I'm working on restructuring a client's site for a store redesign. He had previously built a bunch of landing pages, mostly for SEO value - some of them aren't even accessible from the main nav, and they contain a lot of long-tail targets. These pages are generating organic traffic, but the whole thing is not user-friendly, because it's cumbersome to drill down into specific categories (which many of the landing pages fulfil) without going through three or four pages to get there. For example, if I want to buy orange shoes, I can see specific kinds of orange shoes, but not ALL the orange shoes - even though there is an SEO page for orange shoes that is otherwise inaccessible from the main navigation. If that wasn't too confusing: essentially, the usability solution is to implement search refinement so that specific subcategories can be drilled into easily, in fewer steps. My issue is that I'm hesitant to implement this, even though I know it would be an overall benefit to the site, because these SEO pages exist and I'm wary of destroying the organic traffic they're already receiving. My plan was to see to it that the specific category pages are built with the necessary keywords and content to attract those organic visits, but I'm still nervous it might not be enough. Does anyone have suggestions for this circumstance, and more generally for maximizing SEO efforts on a site with search refinement while minimizing loss? From a usability standpoint, search refinement is great, but how do you counter the significant SEO risks that come with it? Thanks for your help!
| BrandLabs