Category: Technical SEO
Discuss site health, structure, and other technical SEO strategies.
-
Unavoidable duplicate page
Hi, I have an issue where I need to duplicate content on a new site that I am launching. Visitors to the site need to think that product x is part of two different services, e.g. domain.com/service1/product-x and domain.com/service2/product-x. Rewriting the content for product x for each service section is not an option, but I could possibly get around this by making sure only one product-x page is indexed by search engines. What's the best way to do this? Any advice would be appreciated. Thanks, Stuart
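One common way to handle this is a rel="canonical" tag: both URLs stay live for visitors, but search engines are pointed at a single preferred version to index. A minimal sketch, assuming the service1 copy is chosen as the preferred page and the tag is placed in the head of both product-x pages:
<link rel="canonical" href="http://domain.com/service1/product-x" />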
| Stuart260 -
Site removed from Google Index
Hi mozers, Two months ago we published http://aquacion.com We registered it in Google Webmaster Tools and after a few days the website was in the index, no problem. But now Webmaster Tools tells us the URLs were manually removed. I've looked everywhere in Webmaster Tools for more clues but haven't found anything that would help me. I sent the access details to the client, who might have been stupid enough to remove his own site from the Google index, but now, even though I delete and add the sitemap again, the website won't show in Google SERPs. What's weird is that Google Webmaster Tools tells us all the pages are indexed. I'm totally clueless here... Ps.: Added screenshots from Google Webmaster Tools. Update Turns out it was my mistake after all. When my client developed his website a few months ago, he published it, and I removed the website from the Google index. When the website was finished I submitted the sitemap, thinking it would void the removal request, but it doesn't. How to solve In Webmaster Tools, on the [Google Index => Remove URLs] page, you can reinclude pages.
| RichardPicard0 -
Looking for a technical solution for duplicate content
Hello, Are there any technical solutions to duplicate content similar to the nofollow tag? A tag which can indicate to Google that we know that this is duplicate content but we want it there because it makes sense to the user. Thank you.
| FusionMediaLimited0 -
Phone Number In Meta Description
People are more likely to call us than email us. However, if they're using a mobile device, there's a click-to-call button on that site. My question is this: Google does not include our phone number in our meta description. I could try to get the description changed, but it doesn't seem like it would make that big of a deal for just the desktop site. Am I missing something about the importance of the phone number on a desktop site? Any experience with this situation? Thanks, Ruben
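For illustration, if the number is worth surfacing in desktop results, it would simply be written into the meta description itself; a sketch with placeholder wording and number:
<meta name="description" content="Personal injury law firm offering free consultations. Call us today on (555) 123-4567 or send us a message online.">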
| KempRugeLawGroup3 -
Unable to Get Back to Performance Levels Six Months After Attack
We work on a fine art site. In May we were attacked and blocked by Google for about 12hrs until we cleared all affected pages and files. Our response was pretty fast and, to be fair, so was Google's action to unblock us. However, shortly afterwards, the entire site's PageRank disappeared and never came back. People are also now speculating that Google's PR updates are either overdue or have stopped, so I can't gauge PR at all. One way we can gauge potential for position is that we remain in the top 1-6 in Bing for most terms. Moz authority reflects this as well. An example: http://www.bing.com/search?q=doug+hyde+prints&qs=n&form=QBLH&filt=all&pq=doug+hyde+prints&sc=6-15&sp=-1&sk= https://www.google.co.uk/search?rls=en&q=doug+hyde+prints These are the types of position we occupied on Google prior to the drop. We've been told there are no manual actions against us; nevertheless we've worked to clean out any duplicate content and disavow some link trawling/people directory sites that duplicated a lot of our content. Page code has been cleaned up, pages optimised, loading times reduced where possible, etc. This chap seems to have had similar issues back in 2011: http://moz.com/community/q/not-ranking-well-after-site-was-hacked Any input/experience gratefully received. Working our wotsits off to generate traffic off no PR and AdWords at the moment and it's a struggle. I've actually got new sites to rank faster and better than this in 3 months, so this is one big penalty. Roland
| designgroop0 -
Google Disavow Tool
Some background: My rankings have been wildly fluctuating for the past few months for no apparent reason. When I inquired about this, many people said that even though I haven't received any penalty notice, I was probably affected by Penguin. (http://moz.com/community/q/ranking-fluctuations) I recently did a link detox with LinkRemovalTools and it gave me a list of all my links; 2% were toxic and 51% were suspicious. Should I simply disavow the 2%? There are many sites with no contact info.
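For what it's worth, the disavow file itself is just a plain text upload, and whole domains can be disavowed where no contact info can be found; a minimal sketch with placeholder domains:
# spammy directory network, no contact details found
domain:spammy-directory-example.com
domain:linkfarm-example.net
http://example.org/one-specific-spammy-page.html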
| EcomLkwd0 -
Using 302 redirect for SEO
Hello, I'm in charge of SEO for an information website on which articles are only accessible if you have a login and password. Most of the natural links we get point to our subscribers' subdomain: subscribers.mywebsite.com/article1 If they follow these natural links, visitors who are not logged in get redirected (302) to www.mywebsite.com/article1, on which there is an extract of the article and they can request a free test subscription to read the end of the article. My goal is to optimize SEO for the www.mywebsite.com/article1 page. Does this page benefit from the links I get to the subscribers.mywebsite.com/article1 page, or are these links lost in terms of SEO? Thanks for your help, Sylvain
| Syl200 -
Will the Google juice flow after a redirect?
If someone links to this page : http://abonnes.hospimedia.fr/articles/20131004-plfss-2014-les-federations-de-l-aide-a Will the Google juice flow to this page? http://www.hospimedia.fr/actualite/articles/20131010-gestion-des-risques-un-projet-de-decret-vient
| Syl200 -
Subdomain question for law firm in Indiana, Michigan, and New Mexico.
Hi Gang, Our law firm has offices in the states of Indiana, Michigan, and New Mexico. Each state is governed by unique laws, and each state has its own "flavor," etc. We currently are set up with the main site as: http://www.2keller.com (Indiana) Subdomains as: http://michigan.2keller.com (Michigan) http://newmexico.2keller.com (New Mexico) My client questions this strategy from time to time, and I want to see if anyone can offer some reassurance I haven't already thought of. Our reason for setting up the sites in this manner is to ensure that each site speaks to state-specific practice areas (for instance, New Mexico does nursing home abuse, whereas the other states don't, etc.) and state-specific ethics law (for instance, in some states you can advertise your dollar amount recoveries, and in others you can't). There are so many differences between each state that the content would seem to warrant it. Local citations and listings are another reason these sites are set up in such a fashion. The firm is a member of several local state directories and memberships, and by having these links go directly to the subdomain they reference, I can see this being another advantage. Also, inside each state there are separate pages set up for specific cities. We geo-target major cities in each state, and trying to do all of this under one domain for 3 different states would seemingly get very confusing, very quickly. I had thought of setting up the various state pages through folders on the main domain, but again, there is too much state-specific info to make this seem like a logical approach. Granted, the linking and content creation would be easier for one site, but I don't think we can accomplish this in a clean way with the offices being in such different locales? I guess I'm wondering if there are some things I'm overlooking here? Thanks guys/gals!
| puck991 -
What can I do with 8000+ 302 temporary redirects?
Hi I'm working on a client's site that has an odd structure with 8000+ 302 temporary redirects. They are related to actions on the site (they have to be there and work this way for the site to function) but they also have a stupid number of parameters. Would it be ok to block them all in the robots.txt file? Would that make any difference?
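If the aim is simply to keep crawlers out of these action URLs, a pattern rule in robots.txt can cover them; a sketch assuming the action URLs share a common path such as /action/ (the path is a placeholder):
User-agent: *
Disallow: /action/
Disallow: /*?
The second rule blocks any URL containing a query string, so it only makes sense if no parameterised pages need to be crawled.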
| TomVolpe0 -
Googlebot Crawl Rate causing site slowdown
I am hearing from my IT department that Googlebot is causing a massive slowdown/crash on our site. We get 3.5 to 4 million pageviews a month and add 70-100 new articles to the website each day. We provide daily stock research and market analysis, so it's all high quality, relevant content. Here are the crawl stats from WMT: http://imgur.com/dyIbf I have not worked with a lot of high volume, high traffic sites before, but these crawl stats do not seem to be out of line. My team is getting pressure from the sysadmins to slow down the crawl rate, or block some or all of the site from Googlebot. Do these crawl stats seem in line with similar sites? Would slowing down crawl rates have a big effect on rankings? Thanks
| SuperMikeLewis0 -
Still ok to use noarchive?
This is the noarchive flag, to prevent Google storing a cached copy of your webpage. I want to use it for good reasons, but in 2013 is it still safe to use? My website's not spammy, but it's still very fresh with little to no links. Each item I sell takes a lot of research to both buy and sell with the correct info. Once one is sold I may just come across another, and I want to keep the advantage of having already done my research, and my sold price, to myself. Competitors will easily find my old page from a long tail search. Some of my old sold pages keep getting hits and high bounce rates from people using them as research and a price benchmark. I want to stop this. So: noarchive first, then a 301 to the category page once sold. Will the two cause a problem in Google's eyes?
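For reference, the directive is just a robots meta tag in the head of each item page; a minimal sketch:
<meta name="robots" content="noarchive">
It only suppresses the cached copy link, and as far as I know it has no ranking effect of its own, so combining it with a later 301 to the category page shouldn't conflict.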
| Peter24680 -
Bay Area E-Commerce SEO Firm Needed
So my e-commerce site recently got hit badly by the latest Penguin update. Traffic is down by 60%. We were using a cheap Indian SEO firm who did get us great results, but it seems there was a lot more spamming than I realized. I am now looking to clean up my backlinks and create a new relationship with a local business so I can be more hands-on with my SEO. Does anyone have any recommendations for SEO firms that have experience in e-commerce? Ideally somewhere in the Bay Area or even Sacramento?
| premierchampagne0 -
If the order of products on a page changes each time the page is loaded, does this have a negative effect on the SEO of those pages?
Hello, a client of mine has a number of category pages that each have a list of products. Each time the page is reloaded the order of those products changes. Does this have a negative effect on the pages' rankings? Thank you
| Kerry_Jones2 -
Add Small Relevant Posts to website?
Hello Everyone, I'm starting a new WordPress-based community site, WP Temple. As part of this site, I'm going to be adding a "Code Snippets" section which will show visitors how to do very specific things on their site. These snippets, by nature, will be small articles, but they contain content that is highly relevant for search. Should this content be indexed and followed, as well as added to my XML sitemap? I appreciate the feedback! Zach
| Zachary_Russell0 -
Disavowing links, Is it effective?
Looking for your experiences with disavowing backlinks. We've been flooded with new clients who need spammy link removal services and wanted to hear more about your experience with the disavow tool. For sites that have been penalized, how long did it take for them to come back using the disavow tool? Did you see sites come back after the next algo update? Here are the basics of our services for link deletion: 1. Find all the spammy links
| Keith-Eneix
2. Contact webmasters to delete them
3. Disavow all spammy links that are part of an obvious network
4. Implement a content plan for new quality links to get the site healthy again.
5. Report on all links removed and new links attained Just want to make sure our processes are in line with what everyone else is doing?0 -
Panda Recovery ETA?
I have a blog hit by Panda in 2011 and 2012. The thing is, I've no-indexed around 1000 posts out of 11xx, and no-indexed tags and archives. But Google was taking a very long time to remove them from its index, so I had to do a manual removal from Google WMT. I removed /2011/ and /2013/ as directories, and removed /pages/ (this is a WordPress site), so all of them are now no longer in the index. It was a smartphone blog started in 2011 which I turned into a tech blog on a new domain (I let the old PR3, DA 30+ domain expire and now someone's asking me $200 if I want to get it back). I had a team when it was a smartphone blog. Our articles had been featured on places like Engadget, PhoneArena, UberGizmo etc. So, with the loss of the domain, we've lost quite a few important backlinks as well. Also, Authorship doesn't work for the site. The Rich Snippets testing tool says everything's all right, but it never really works / shows up in SERPs. I fear it's because of a penalty. It seems to me like no one has ever thought about a penalty that affects Authorship. So, now that you know the problem and the things I did in order to fix it, could you tell me whether: Google will lift the penalty whenever they wish (and an ETA?); they'll lift it when the next major algorithmic update occurs (I made the changes on September 28th), though I don't see how this is a possibility since Panda has now been integrated into the core algorithm; or anything else. Thanks in advance everyone!
| RohitPalit0 -
To 301 redirect or not...
Hi guys, I'd like to get your opinion on this. We currently have two sites: site A is the old one with PA 44 and DA 33. Site B is the new one which is going to replace site A; it currently has PA 37 and DA 24. Our plan for the future is to shut down site A and redirect all pages using 301s to the relevant pages on site B. Currently we have some links in place for a couple of keywords from site A to site B, which seems to be working great for our rankings. Now I'm wondering whether it's a good option to keep these links from A to B, or will I pass through more link juice by redirecting everything? (PS: both are e-commerce sites hosted and registered with different companies)
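If you do go the full redirect route and site A runs on Apache, a page-to-page 301 of everything can be done with mod_rewrite in site A's .htaccess (domain below is a placeholder); where the URL structures differ, mapping each old page to its closest equivalent on site B is better than sending everything to the homepage:
RewriteEngine On
RewriteRule ^(.*)$ http://www.site-b-example.com/$1 [R=301,L]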
| Immanuel0 -
Hashtag in URL seems to remove the Google plus one
My site has a catalogue page (catalog in US) with #anchors so that customers can get straight to the right section. I even link from other pages to the #anchor on the catalogue page, so I have for example: www.example.co.uk/catalogue.htm www.example.co.uk/catalogue.htm#blueitems www.example.co.uk/catalogue.htm#redtems I understand Google doesn't index anything after the #; here is the post I found: http://moz.com/community/q/hashtag-anchor-text-within-content#reply_91192 So I shouldn't have an SEO problem. BUT, if I navigate to www.example.co.uk/catalogue.htm and plus-one the page it will show the plus one, and then if I navigate to www.example.co.uk/catalogue.htm#blueitems the plus one is gone. The same happens in reverse: if I plus-one www.example.co.uk/catalogue.htm#redtems, then that plus one doesn't show on www.example.co.uk/catalogue.htm. I added rel=canonical and that fixed the plus one problem; now if you plus-one /catalogue.htm#redtems it still shows on catalogue.htm. This seems a bit extreme, so did I do the right things??
| Peter24680 -
What is the best practice to re-index the de-indexed pages due to a bad migration
Dear Mozers, We have a Drupal site with more than 200K indexed URLs. Six months ago a bad website migration happened without proper SEO guidelines. All the high authority URLs got rewritten by the client. Most of them have been returning 404s or 302s for the last 6 months. Due to this, site traffic has dropped more than 80%. I found today that around 40K old URLs with good PR and authority have been de-indexed from Google (most of them are 404s and 302s). I need to pass all the value from the old URLs to the new URLs. Example URL structure:
| riyas_
Before Migration (Old)
http://www.domain.com/2536987
(Page Authority: 65, HTTP Status:404, De-indexed from Google) After Migration (Current)
http://www.domain.com/new-indexed-and-live-url-version Does creating mass 301 redirects help here without re-indexing the old URLs? Please share your thoughts. Riyas0 -
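For illustration, if the server runs Apache, thousands of one-to-one 301s are usually easier to manage as a rewrite map than as individual rules; a sketch for the main server/vhost config (file path and URLs are placeholders):
RewriteEngine On
RewriteMap oldurls txt:/etc/apache2/redirect-map.txt
RewriteCond ${oldurls:$1} !=""
RewriteRule ^/(.+)$ ${oldurls:$1} [R=301,L]
The map file then holds one "old-path new-URL" pair per line, e.g.:
2536987 http://www.domain.com/new-indexed-and-live-url-version
Google still has to recrawl the old URLs to see the 301s, but once it does, the redirects should pass most of the old value to the new pages.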
Pointing Other URL to My Site? Good or bad for ranking.
A few years ago I purchased a few keyword rich domain names and set up some satellite sites. Spammy I now know. What should I do now? I own the domain names for at least another 3 years. Should I point them to my main site or would that hurt my main site ranking?
| caisson0 -
Which address do I use for citations
Hello, When I created my Google Places listing, I entered my address, and when I got my Google Places page activated I noticed that the address Google Places was displaying was a short abbreviation of my address. So my question is: when it comes to creating citations for my listing, do I grab the address Google Places generated for me in the listing, or the long version of my address? I've just heard that when it comes to creating citations, you need to make sure the address is identical across the board. I hope this makes sense. Thanks!
| fbbcseo0 -
Affiliate Link is Trumping Homepage - URL parameter handling?
An odd and slightly scary thing happened today: we saw an affiliate string version of our homepage ranking number one for our brand, along with the normal full set of site-links. We have done the following: 1. Added this to our robots.txt : User-agent: *
| LawrenceNeal
Disallow: /*? 2. Reinserted a canonical on the homepage (we had removed this when we implemented hreflang, as we had read the two interfered with each other; we haven't had a canonical for a long time now without issue). Is this anything to do with the algo update perhaps?! The third thing we're reviewing I'm slightly confused about: URL parameter handling in GWT. As advised, with regard to affiliate strings, for the question "Does this parameter change page content seen by the user?" we have NO selected, which means they should be crawling one representative URL. But isn't it the case that we don't want them crawling or indexing ANY affiliate URLs? You can tell Googlebot not to crawl any URL containing a particular string, but only if you select "Yes. The parameter changes the page content." Should they know an affiliate URL from the original and not index them? I read a quote from Matt Cutts which suggested this (along with putting a "nofollow" tag on affiliate links just in case). Any advice in this area would be appreciated. Thanks.
$360 charged to embed 2 youtube video clips on web page with CMS system - Realistic?
Hi All, I have just had a bill from our webmaster / SEO provider for 3 hours' work to embed two YouTube video clips onto one page of our site (http://www.compoundsecurity.co.uk/raidervision-visual-verification-module-use-rdas-and-vx-gprs-wireless-security-systems) Now I am not an HTML programmer, but I have the ability to insert HTML code into our eBay pages to embed videos where I want them on a page, and it takes less than a minute. I would love to get feedback from several people on how long this should have taken. Is it really anywhere near 3 hours' work? Everything else on the page was already there. If it means anything, the site is done in Apache. Cheers
| DaddySmurf0 -
What effect does HTTPS have on SEO for a public site?
I have a client who I've been working with for 4 months but getting NO TRACTION at all in their SERPs. This is unusual for me! The only difference to their site from other clients is that the whole site is HTTPS, so I'm wondering if that's making a big difference. The site is: https://www.cnc-ltd.co.uk Any help or hints would be great. Thanks in advance, Steve
| stevecounsell0 -
Non Moving #1 Website
Hi Guys, Bit of an odd one. We recently took our SEO in house (6-7 months ago) and since doing this have had some great results. We have identified our competitors utilising Ahrefs, Moz and other tools that we have selected. However we have come up against something that we really cannot work out. One of our competitors ranks at position #1 for everything related to their terms, website and industry, yet: they are not actively doing SEO; they have a poor backlink profile, from the research that we have carried out; and no fresh content is being generated. I was wondering if someone would perhaps mind spending 5 mins having a quick look to see what you think, as I can't work out what we might have missed. Tim.
| fordy0 -
Can you be penalised in Google for excessive internal keyword linking?
I have an online shop and 3 blogs (with different topics), all set up on subdomains (for security reasons; I don't want WordPress installed in the same hosting space as my shop in case one gets hacked). I have been on the front page of Google for a keyword, let's say 'widgets', for months now. I have been writing blogs about 'widgets'; probably about 1/4 of all my blog posts link to the 'widgets' page in my shop. I write maybe 1-2 blogs a week, so it's not excessive. This morning I have woken to find that the widgets page in my shop has vanished from Google's index. So typing in 'widgets' brings up nothing. It hasn't dropped in the rankings, it's just vanished. A few weeks ago I ranked 3 or 4. Then I dropped to about 6. A couple of days ago, I jumped back up to 5 and now it's vanished. If you type in 'buy widgets', or 'widgets online' or 'widgets australia', I have the #1 spot for all those, but for 'widgets', I just don't exist anymore. Could I have been penalised for writing too many posts and keyword linking internally? They're not keyword stuffed and they're well written. I just don't understand what's happened. Right now I'm freaking out about blogging and putting internal links on my website.
| sparrowdog0 -
Is this nofollow tag written wrong?
I'm doing a link audit and came across a nofollow link pointing to www.jampaper.com/Envelopes where the nofollow attribute appears at the front of the anchor tag, before the URL. Does it matter that the nofollow is at the front? Shouldn't it come after the URL?
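For comparison, attribute order in HTML has no effect; both of these declare the same nofollow link, and all that matters is that rel="nofollow" sits somewhere inside the opening <a> tag (URL from the audit used for illustration):
<a rel="nofollow" href="http://www.jampaper.com/Envelopes">Envelopes</a>
<a href="http://www.jampaper.com/Envelopes" rel="nofollow">Envelopes</a>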
| jampaper0 -
Can Google Read schema.org markup within Ajax?
Hi All, as a local business directory, we also display opening hours on a business listing page, e.g. http://www.goudengids.be/napoli-kontich-2550/
| TruvoDirectories
At the same time I also have schema.org markup for Openinghours implemented.
But, for technical reasons (performance), the opening hours (and the markup alongside them) are displayed using AJAX. I'm wondering if Google is able to read the markup. The rich snippet tool and markup plugins like Semantic Inspector can't "see" the markup for the opening hours. Any advice here?0 -
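One workaround worth testing is to keep the human-readable hours in the AJAX response but emit the structured data itself as static JSON-LD in the initial HTML, so it no longer depends on script execution; a sketch with placeholder business details:
<script type="application/ld+json">
{
  "@context": "http://schema.org",
  "@type": "LocalBusiness",
  "name": "Napoli Kontich",
  "openingHours": ["Mo-Fr 11:30-22:00", "Sa-Su 17:00-23:00"]
}
</script>
The rich snippets testing tool can then confirm whether it is picked up.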
Rel no follow question
Hello, I probably already know the answer to this question, but when you use a rel nofollow tag on an internal link or external link, will Googlebot still navigate to the link in question? Thanks for your help.
| PeterRota0 -
Is it a good idea to use an old domain name for a new product
Hi guys, I have a domain name XYZ.com which hosts the site of a technology service company as of now. The company, however, didn't do well and shut down a few years ago. Now, that company wants to launch a new set of technology products and wants to use the same domain name. Is it a good idea? The issues that I can see here are: 1. Google has previous pages indexed. 2. There are a couple of subdomains totally irrelevant to the business, like employees.xyz.com. 3. Can the previous indexing be completely undone? Regards, Mayank
| mayanksaxena0 -
Multiple Sitemaps
Hello everyone! I am in the process of updating the sitemap of an ecommerce website, and I was thinking of uploading three different sitemaps for different parts of the site (general, categories and subcategories, product groups and products) in order to keep them easy to update in the future. Am I allowed to do so? Would that be a good idea? Open to suggestions 🙂
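If you do split them, multiple sitemaps are fine as long as they are tied together; a sitemap index file referencing the three parts (file names are placeholders) can then be submitted as a single entry:
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap><loc>http://www.example.com/sitemap-general.xml</loc></sitemap>
  <sitemap><loc>http://www.example.com/sitemap-categories.xml</loc></sitemap>
  <sitemap><loc>http://www.example.com/sitemap-products.xml</loc></sitemap>
</sitemapindex>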
| PremioOscar0 -
Local Search: Technically optimised for Reviews & Stars, but not showing in SERPs
Hi, for over a year now we have been actively using schema.org markup on our yellow pages platform.
| TruvoDirectories
Simultaneously we managed to set up a review platform to attract more users to write reviews. We also closely monitor local search experts (Blumenthal and co 😉 ). So I learned in this post http://blumenthals.com/blog/2013/07/19/how-many-reviews-to-get-the-star-treatment-somewhere-between-4-and-5/ that it takes 4-5 reviews to get the star treatment from Google. But at this moment, I cannot find any star treatment. For example, on this listing http://www.goudengids.be/hollywok-kortrijk-kortrijk-8500/1/ you can notice the presence of 6 reviews (http://www.google.com/webmasters/tools/richsnippets?q=http%3A%2F%2Fwww.goudengids.be%2Fhollywok-kortrijk-kortrijk-8500%2F1%2F), but in Google itself it is not displayed as such. So my question is: in your experience, are there any other parameters that will trigger the stars to appear?0
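For what it's worth, one thing to double-check on the listing template is that an aggregate figure is exposed to crawlers, not just the individual reviews; a minimal AggregateRating sketch in microdata (values are placeholders based on the example listing):
<div itemscope itemtype="http://schema.org/LocalBusiness">
  <span itemprop="name">Hollywok Kortrijk</span>
  <div itemprop="aggregateRating" itemscope itemtype="http://schema.org/AggregateRating">
    <span itemprop="ratingValue">4.2</span> stars from <span itemprop="reviewCount">6</span> reviews
  </div>
</div>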
Author schema and Wordpress Author Page
Hi everyone, Has anyone tried using the author schema on their WordPress author page, or on their G+ profile, or on their Moz profile? Would it be a good idea to always use it where you publish? I publish on several blogs. Thanks Carla example: Use it here - http://www.posicionamientowebenbuscadores.com/blog/author/carla/ http://moz.com/community/users/392216 It seems like I would be overdoing it.
| Carla_Dawson0 -
How long does it take to reindex the website
Generally speaking, how long does it take for Google to recrawl/reindex an (ecommerce) website? After changing a number of product subcategories from 'noindex' back to 'index', I regenerated the sitemap and have fetched as Google in WMT. This was a couple of weeks ago and no action yet. Second question: Does Google treat these pages as if they're brand new? I 'noindexed' them back in April, and they were ranking ok then. (I had noindexed them on the back of advice from my SEO, due to concerns about these pages being seen as duplicate content). Help!
| Coraltoes770 -
Why are some pages now duplicate content?
It is probably a silly question, but all of a sudden, the following pages of one of my clients are reported as Duplicate content. I cannot understand why. They weren't before... http://www.ciaoitalia.nl/product/pizza-originale/mediterranea-halal
| MarketingEnergy
http://www.ciaoitalia.nl/product/pizza-originale/gyros-halal
http://www.ciaoitalia.nl/product/pizza-originale/döner-halal
http://www.ciaoitalia.nl/product/pizza-originale/vegetariana
http://www.ciaoitalia.nl/product/pizza-originale/seizoen-pizza-estate
http://www.ciaoitalia.nl/product/pizza-originale/contadina
http://www.ciaoitalia.nl/product/pizza-originale/4-stagioni
http://www.ciaoitalia.nl/product/pizza-originale/shoarma Thanks for any help in the right direction 🙂0 -
What are we doing wrong with Rich Snippets?
So, a client webstore has rich snippets on all products, and it seems they are working fine and showing up; the only problem is that the price for an article is not showing up. The part of the code that shows the price is this: Redna cena:
| Red_Orbit
29,99 € Is the problem that we have the itemprop="price" in a meta tag? I've read around the internet that having a lot of meta tags can be a problem. Can we change it into: Redna cena: Would this work, or is there anything else we can try? The URL for this article is http://www.bigbang.si/igre/sleeping-dogs-le-x360-498998
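If the idea is to move the price onto a visible element inside the Offer block, a sketch along those lines (structure assumed, since the original markup isn't shown here):
<div itemprop="offers" itemscope itemtype="http://schema.org/Offer">
  Redna cena: <span itemprop="price">29.99</span>
  <meta itemprop="priceCurrency" content="EUR"> €
</div>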
Google rankings strange behaviour - our site can only be found when searching repeatedly
Hello, We are experiencing something very odd at the moment and I hope somebody could shed some light on this. The rankings of our site dropped from page 2 to page 15 approx. 9 months ago. At first we thought we had been penalised and filed a reconsideration request. Google got back to us saying that there were no manual actions applied to our site. We have been working very hard to try to get the rankings up again and they seem to be improving. Now, according to several SERP monitoring services, we are on page 2/3 again for the term "holiday lettings". However, the really strange thing is that when we search for this term on Google UK, our site is nowhere to be found. If you then right away hit the search button again, searching for the same term, then voila! our website www.alphaholidaylettings.com is on page 2/3! We tried this on many different computers at different locations (private and public computers), making sure we had logged out from Google Accounts (so that customised search results are not returned). We even tried the computers at various retail outlets including different Apple stores. The results are the same. Essentially, we are never found when someone searches for the term for the first time; our site only shows up if you search for the same term a second or third time. We just cannot understand why this is happening. Somebody told me it could be due to the "Google dance" when indices on different servers are being updated, but this has now been going on for nearly 3 months. Has anyone experienced similar situations or have any advice? Many thanks!
| forgottenlife0 -
20 000 duplicates in Moz crawl due to Joomla URL parameters. How to fix?
We have a problem of massive duplicate content in Joomla. Here is an example of the "base" URL: http://www.binary-options.biz/index.php/Web-Pages/binary-options-platforms.html For some reason Joomla creates many versions of this URL, for example: http://www.binary-options.biz/index.php/Web-Pages/binary-options-platforms.html?q=/index.php/Web-Pages/binary-options-platforms.html?q=/index.php/Web-Pages/binary-options-platforms.html?q=/index.php/Web-Pages/binary-options-platforms.html?q=/index.php/Web-Pages/binary-options-platforms.html?q=/index.php/Web-Pages/binary-options-platforms.html?q=/index.php/Web-Pages/binary-options-platforms.html?q=/index.php/Web-Pages/binary-options-platforms.html or http://www.binary-options.biz/index.php/Web-Pages/binary-options-platforms.html?q=/index.php/Web-Pages/binary-options-platforms.html?q=/index.php/Web-Pages/binary-options-platforms.html?q=/index.php/Web-Pages/binary-options-platforms.html So it appends the URL parameter ?q= and then repeats part of the preceding URL. This leads to tens of thousands of duplicate pages on our content-heavy site. Any ideas how to fix this? Thanks so much!
| Xmanic0 -
Penguin 2.1 update, ranking dropped.
Hi, My website was hit by Google's new update like never before. My rankings first dropped back in May when Google rolled out their second Penguin update; back then I was outsourcing my SEO since most of the time I was working on optimizing my local maps. As soon as I noticed my traffic had dropped I started doing SEO myself. I removed over-optimized keywords on the website (title, meta description), then I analyzed my link profile and found that I had a lot of commercial anchor text linking from spammy websites, mostly blog comments. It's been over 4 months since I used Google's disavow tool in the hope that it would help me get my rankings back, but I still see these spammy links in my profile; 90% of them are nofollow anyway, so I'm not sure if the disavow tool helped me at all. With this Penguin 2.1 update my website ended up on pages 4-10 for 90% of my keywords http://screencast.com/t/3bcw8LUxpj I'm not sure what would be the best way out of this; should I buy a new domain and start fresh? Please let me know if you want to take a look at my link profile and give your opinion. Looking forward to your help!
| mezozcorp0 -
Webmaster Tools Manual Actions - Should I Disavow Spammy Links??
My website has a manual action against it in Webmaster Tools stating: Unnatural links to your site—impacts links Google has detected a pattern of unnatural artificial, deceptive, or manipulative links pointing to pages on this site. Some links may be outside of the webmaster’s control, so for this incident we are taking targeted action on the unnatural links instead of on the site’s ranking as a whole. I have checked the link profile of my site and there are over 4,000 spammy links from one particular website, which I am guessing this manual action refers to. There is no way that I will be able to get these links removed, so should I be using Google's Disavow Tool, or is there no need? Any ideas would be appreciated!!
| Pete40 -
Can you noindex a page, but still index an image on that page?
If a blog is centered around visual images, and we have specific pages with high quality content that we plan to index and drive our traffic, but we have many pages with our images...what is the best way to go about getting these images indexed? We want to noindex all the pages with just images because they are thin content... Can you noindex,follow a page, but still index the images on that page? Please explain how to go about this concept.....
| WebServiceConsulting.com0 -
Can I use a 410'd page again at a later time?
I have old pages on my site that I want to 410 so they are totally removed, but later down the road if I want to utilize that URL again, can I just remove the 410 error code and put new content on that page and have it indexed again?
| WebServiceConsulting.com0 -
Domain Migration Information
Hi, We are in the process of switching from *.net to *.com and I am looking for some resources on this. Any suggestions?
| EcomLkwd0 -
How to Remove a website from your Bing Webmaster Tools account
I have a site in Bing Webmaster Tools that I no longer work on. I can't seem to find where to delete this website from my webmaster tools account. Anyone know how (there doesn't seem to be anything obvious under Bing Help or on a Google Search).
| TopFloor0 -
If Google's index contains multiple URLs for my homepage, does that mean the canonical tag is not working?
I have a site which is using canonical tags on all pages, however not all duplicate versions of the homepage are 301'd due to a limitation in the hosting platform. So some site visitors get www.example.com/default.aspx while others just get www.example.com. I can see the correct canonical tag on the source code of both versions of this homepage, but when I search Google for the specific URL "www.example.com/default.aspx" I see that they've indexed that specific URL as well as the "clean" one. Is this a concern... shouldn't Google only show me the clean URL?
| JMagary0 -
Umbrella company and multiple domains
I'm really sorry for asking this question yet again. I have searched through previous answers but couldn't see something exactly like this, I think. There is a website called example.com. It is a sort of umbrella company for 4 other separate domains within it (4 separate companies). The home page of the "umbrella" company website is example.com. It is just an image with no content except navigation on it to direct to the 4 company websites. The other pages of the website example.com are the 4 separate companies' domains. So on the navigation bar there is: Home page = example.com company1page = company1domain.com company2page = company2domain.com etc. etc. Clicking "home" will take you back to example.com (which is just an image). How bad or good is this structure for SEO? Would you recommend any changes to help them rank better? The "home" page has no authority or links, and neither do 3 out of the 4 other domains. The 4 companies' websites are independent in content (although the theme is the same). What's bringing them all together is this umbrella website, example.com. Thank you
| AL123al0 -
Image Height/Width attributes, how important are they and should a best practice site include them as standard
Hi, How important are the image height/width attributes, and would you expect a best practice site to have them included? I hear not having them can slow down page load time; is that correct? Any other issues from not having them? I know some relate to social sharing (I know Buffer prefers images with h/w attributes when drawing in its selection of image options when you post). Most importantly though, would you expect them to be intrinsic to sites that have been designed according to best practice guidelines? Thanks
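For reference, the attributes in question are just fixed dimensions on the img tag; they let the browser reserve the right amount of space before the file has downloaded, which avoids the page reflowing as images arrive (file name and sizes below are placeholders):
<img src="/images/artwork-photo.jpg" alt="Artwork photo" width="600" height="400">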
| Dan-Lawrence0 -
Multi-domain content and meta data feed
Hi, I am working with a client whose web developer has offered to build a CMS that auto-feeds meta data and product descriptions (on-page content) to two different websites which have two completely different URLs (primary domain names) associated with them. Please see the screenshots attached for examples. The entire reason this has been offered is to avoid duplicate content issues. The client has two e-commerce websites but only one content management system that can update both simultaneously. The work-around shown in the screenshots is the developer's attempt at ensuring that both sites have unique meta data and on-page content associated with each product. Can anyone advise whether they foresee that this may cause any issues from an SEO perspective? Thanks in advance
| SteveK640 -
XML Sitemap without PHP
Is it possible to generate an XML sitemap for a site without PHP? If so, how?
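For reference, a sitemap is just a static XML file, so it can be written by hand or generated by any crawler or desktop tool and uploaded to the site root with no server-side code at all; a minimal sketch with placeholder URLs:
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://www.example.com/</loc></url>
  <url><loc>http://www.example.com/about/</loc></url>
</urlset>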
| jeffreytrull11