Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Technical SEO

Discuss site health, structure, and other technical SEO strategies.


  • When I have a long article about a single topic with sub-topics, I can make it more user-friendly by limiting the visible text and showing only the next headlines, using expandable/collapsible divs. My question is whether Google is really able to read onclick text links (with JavaScript), or whether this could be "seen" as hidden text? I think I read in the SEOmoz Users Guide that all JavaScript-"manipulated" content will not be crawled. So from SEOmoz's point of view, should I instead use old-school named anchors and a side navigation to jump to the sub-topics? (I asked a similar question in my post before, but I did not use the right terms to describe what I really wanted. Also, my text is not so long (<1000 words) that I should use pagination with rel="next" and rel="prev" attributes.) THANKS for every answer 🙂

    | inlinear
    0
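
    A note on the mechanics: content that is already present in the HTML source and merely hidden or shown with CSS and a small toggle is readable by crawlers that parse the raw source; it is content injected only after an onclick event that they may never see. A minimal sketch of a crawlable expand/collapse section (IDs, class names, and text are illustrative):

    ```html
    <h2 onclick="toggle('sub1')">Sub-topic headline</h2>
    <div id="sub1" class="collapsed">
      <!-- The full text is in the source, so it is present for crawlers
           even while visually collapsed -->
      <p>Sub-topic text goes here…</p>
    </div>
    <script>
      function toggle(id) {
        document.getElementById(id).classList.toggle('collapsed');
      }
    </script>
    ```

    Whether engines give hidden-by-default text full weight is a separate question, so treat this as a sketch of crawlability, not a ranking guarantee.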

  • Hey guys, Just signed up for a pro account and I am getting Duplicate Page Title warnings on links that are duplicates, rewritten for SEO, but have a canonical link tag. I have two sets of links in my store: SEO-friendly: http://www.mysite.com/item/iphone-case Operational link: http://www.mysite.com/shop/product.php?pid=11 This operational link, however, has a canonical link tag pointing to the SEO-friendly link as the primary link. My question is: do I need to worry about this Duplicate Page Title warning if I am using a canonical tag on the operational link pointing to the SEO-friendly link? Thanks!

    | jason360
    0
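
    For readers following along, the setup described above looks like this (URLs taken from the question):

    ```html
    <!-- In the <head> of the operational URL /shop/product.php?pid=11 -->
    <link rel="canonical" href="http://www.mysite.com/item/iphone-case" />
    ```

    Crawl-based audit tools may still report such pairs as duplicate titles because they flag what they crawl, not how engines consolidate it, so a warning here is not necessarily a problem.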

  • Back in March 2011 this conversation happened. Rand:    You don't want rel=canonicals. Duane:    Only end state URL. That's the only thing I want in a sitemap.xml. We have a very tight threshold on how clean your sitemap needs to be. When people are learning about how to build sitemaps, it's really critical that they understand that this isn't something that you do once and forget about. This is an ongoing maintenance item, and it has a big impact on how Bing views your website. What we want is end state URLs and we want hyper-clean. We want only a couple of percentage points of error. Is this the same with Google?

    | DoRM
    0

  • Hi, For the example I will use a computer e-commerce store... I'm working on creating guides for the store -
    How to choose a laptop
    How to choose a desktop I believe that each guide will be great on its own and that it answers a specific question (meaning that someone looking for a laptop will search specifically for laptop info, and the same goes for desktops). This is why I didn't create a "How to choose a computer" guide. I also want each guide to have all the information and not to start sending the user to secondary pages in order to fill in missing info. However, even though there are several details that differ between laptops and desktops, like the importance of weight, screen size etc., a lot of the checklist items (like deciding how much memory is needed, graphics card, core etc.) are the same. Please advise on how to proceed. Should I just write two guides and make sure that the duplicated content ideas are simply written in a different way?

    | BeytzNet
    0

  • Hi, I asked our IT team to make it possible to write custom page titles in our CMS, and they came up with a solution that writes the title dynamically with JavaScript. When I look at the page, I see the title in the browser, but when I look in the source code, I see the original page title. I am thinking that Google won't see the new JavaScript title, so it will not be indexed and will have no impact on SEO. Am I right?

    | jfmonfette
    0
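
    The behaviour described can be illustrated with a sketch like this (titles are hypothetical): a crawler reading the served HTML sees the original `<title>`; the JavaScript assignment only takes effect in the browser.

    ```html
    <head>
      <title>Original CMS Title</title> <!-- what appears in view-source -->
      <script>
        // Runs client-side only; the HTML sent to a crawler is unchanged
        document.title = "Custom SEO Title";
      </script>
    </head>
    ```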

  • Hi, We took over a website last July and no matter what we do we just can't get it ranking in Google, even for noncompetitive terms. Here is the website in question: http://www.alignandsmile.co.uk Ideally the client would like to rank for Canary Wharf, but that location is competitive; the site doesn't even rank for 'Dentist New Providence Wharf E14' despite it being included in the title tag on the home page and in the content throughout the website. Directories with Align and Smile's business information do rank, however. I opened a case with Google through Webmaster Tools and they 'reviewed your site and found no manual actions by the webspam team that might affect your site's ranking in Google.' So I'm a bit stuck. The site ranks top for the keyphrase in Bing and Yahoo... we are really struggling with Google! Any help would be much appreciated. Many thanks, Marcus

    | dentaldesign
    0

  • Hi, Someone has copied our website's design layout and I want to report it to Google. Can you suggest options for reporting it? There is the DMCA process for content, but can we file a DMCA claim for a design as well? Please guide me through the steps. How can we deal with a duplicated website theme?

    | RuchiPardal
    0

  • I have a job site and I am planning to introduce a search feature. The question I have is, is it a good idea to index search results even if the query parameters are not there? Example: A user searches for "marketing jobs in New York that pay more than 50000$". A random page will be generated like example.com/job-result/marketing-jobs-in-new-york-that-pay-more-than-50000/ For any search that gets executed, the same procedure would be followed. This would result in a large number of search result pages automatically set up for long tail keywords. Do you think this is a good idea? Or is it a bad idea based on all the recent Google algorithm updates?

    | jombay
    0

  • My site is http://www.clairehegarty.co.uk/ Hi, my site has always done amazingly well in the rankings; for a few years I have been number one for the term "gastric band hypnotherapy" as well as many other keywords, including "hypno band". But in the past couple of weeks I have seen some of my keywords drop and end up on pages two and three of Google instead of page one. Can anyone please give me advice on what I need to do to change this situation?

    | ClaireH-184886
    0

  • Can disavowing links cause deindexing from Google? We had about 2.5M pages indexed until Dec 30th; since then it has dropped to about 600K. We received an unnatural link warning in July, got hit in September, and since then we have been suffering substantially in rankings and also in business revenue. We've used the disavow tool and also got tons of links removed within the past 3-4 months. Since October we haven't had any response from Google about what is going on, despite sending another reconsideration request in Nov 2012. Now the site is getting deindexed. What should we do at this point? Any help is greatly appreciated. Here is the URL: http://goo.gl/Ai17f Thank you, Nick

    | orion68
    0

  • I am trying to find out the best way to do this. Do you use hreview? Thanks,

    | netviper
    0

  • I'm trying to do my best to control the link juice from my home page to the most important category landing pages on my client's e-commerce site. I have a couple of questions regarding how NOT to pass link juice to insignificant pages and how best to pass juice to my most important pages. INSIGNIFICANT PAGES: How do you tag links so they don't pass juice to unimportant pages? For example, my client has a "Contact" page off of their home page. We aren't trying to drive search traffic to the contact page, so I'm worried about the home page's link juice being passed to it. Would you tag the Contact link with "nofollow" so it doesn't pass the juice, but then include it in a sitemap so it gets indexed? Are there best practices for this sort of stuff?

    | Santaur
    0
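
    For reference, the markup in question is just an attribute on the anchor (path illustrative):

    ```html
    <!-- Asks engines not to pass equity through this internal link -->
    <a href="/contact" rel="nofollow">Contact</a>
    ```

    One caveat worth hedging on: since Google's 2009 change to how nofollow is handled, the equity assigned to nofollowed links generally evaporates rather than being redistributed, so internal nofollow "sculpting" may not conserve juice the way this plan assumes.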

  • Good day! I am not sure if my company has a Canonicalization issue? When typing in www.cushingco.com the site redirects to http://www.cushingco.com/index.shtml A visitor can also type in http://cushingco.com/index.shtml into a web browser and land on our homepage (and the url will be http://www.cushingco.com/index.shtml) A majority of websites that link to our company point to: http://www.cushingco.com/index.shtml We are in the process of cleaning up citations and pulling together a content marketing strategy/editorial calendar. I want to be sure folks interested in linking to us have the right url. Please ask me any questions to help narrow down what we might be doing incorrectly. Thanks in advance!! Jon

    | SEOSponge
    0
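
    One common way to consolidate the variants described above is a pair of 301 rules at the server. An illustrative .htaccess sketch for Apache (assumes mod_rewrite is enabled; test before deploying):

    ```apache
    RewriteEngine On
    # Force the www hostname
    RewriteCond %{HTTP_HOST} ^cushingco\.com$ [NC]
    RewriteRule ^(.*)$ http://www.cushingco.com/$1 [R=301,L]
    # Collapse /index.shtml onto the root URL
    RewriteRule ^index\.shtml$ http://www.cushingco.com/ [R=301,L]
    ```

    With rules like these in place, whichever form people link to, visitors and crawlers end up on a single canonical URL.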

  • Hi there, I have a rather large site that has duplicate content on many pages due to how it's being spidered by google. I was hoping I could set the internal link to this page as "nofollow." My question is that I have hundreds of other sites with backlinks to these duplicate content pages.. will this affect me negatively if I tell google not to index the duplicated pages?

    | trialminecraftserverfinder
    0

  • As new as they come to SEO, so please be gentle.... I have a WordPress site set up for my photography business. Looking at my crawl diagnostics I see several 4xx (client error) alerts. These all point to non-existent pages on my site, e.g.: | http://www.robertswanigan.com/happy-birthday-sara/109,97,105,108,116,111,58,104,116,116,112,58,47,47,109,97,105,108,116,111,58,105,110,102,111,64,114,111,98,101,114,116,115,119,97,110,105,103,97,110,46,99,111,109 | Totally lost on what could be causing this. Thanks in advance for any help!

    | Swanny811
    0
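
    A diagnostic note: the run of comma-separated numbers in that URL decodes, as ASCII character codes, to a malformed `mailto:` link containing the site's own email address, which suggests a JavaScript email-obfuscation plugin is emitting character codes that crawlers interpret as a relative URL. A quick Python check (the sample codes are the first few from the URL above):

    ```python
    def decode_ascii_codes(codes: str) -> str:
        """Decode a comma-separated list of ASCII character codes."""
        return "".join(chr(int(c)) for c in codes.split(","))

    # First seven codes from the broken URL above:
    print(decode_ascii_codes("109,97,105,108,116,111,58"))  # -> mailto:
    ```

    Finding and reconfiguring (or removing) the obfuscation plugin would be the place to start.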

  • Hi, I was always told that you only want a .co.uk or a .com. Is this still true? I am setting up an ecommerce site and the ideal .co.uk is gone! Am I better off with a longer URL, maybe with a hyphen or a "uk" in it, or is it OK to have a .biz or a .net these days? Your help would be greatly appreciated! Thanks, James

    | JamesBryant
    0

  • I have recently started an e-commerce website and have now changed the URL structure, adding another level to my category pages. So where it was www.website.com/shirts before, it is now www.website.com/clothes/shirts. I added the clothes category (just an example) above the shirts category and am now finding that the old URL is still in the search index and still live on my site. How can this be? I use WordPress and simply changed the URLs in the backend. The products are still under www.website.com/product/blue-shirt-123, so they won't be affected, but I suppose it now means I have duplicate category pages? So my question is: should I 301 the old category page (www.website.com/shirts) to the new URL (www.website.com/clothes/shirts)? And how can the old URL still be live on my site? If this was a bit unclear, please let me know. Appreciate your replies!

    | bitte
    0
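
    If the old category URL is being retired, a single permanent redirect covers it. An illustrative Apache .htaccess line using the example paths from the question (a WordPress redirect plugin can do the same without editing files):

    ```apache
    # Permanently send the old category URL to its new location
    Redirect 301 /shirts /clothes/shirts
    ```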

  • I am moving my whole site over to WordPress (150+ pages). In the process I assigned pages to appropriate parent pages via "page attributes". I was really excited about this. I like how it organizes everything in the pages dashboard. I also think that the sitemap that comes with my theme can create something really great for visitors with this info. What I realized after doing that is that it changed my URLs to include the parent page. Basically, the URL is now "domain.com/parent-page/child-page.html". This is rather disastrous because the URLs of these newly created child pages on my old site are simply "domain.com/child-page". Not that they're defined as parent or child pages on my existing Dreamweaver/HTML site... but you know what I mean, right?! I got a plugin called "Permalink Editor" to let me customize the URLs. So, I went through all of the child pages and got rid of the parent page in the URL. Then when I woke up this morning I realized that what I've created is a "permalink alias". That sounds a little bit scary to me. Perhaps Google could consider it spam, as if I'm trying to "sculpt link flow". I'm not... I'm just trying to recreate my site as it is in WordPress. I want the site to be exactly the same in terms of the URLs. But I want the many benefits of WordPress' CMS. Should I go and unassign all of the parent/child pages in the "Page Attributes"? Or am I being paranoid and should I leave it as is? FYI - this is the first page that came up when I searched for "permalink alias". It looks kind of black-hatty to me?!
    - http://www.seodesignsolutions.com/blog/wordpress-seo/seo-ultimate-4-7/ Thanks so much. I look forward to a response!

    | nsjadmin
    0

  • Hi, We recently changed our eCommerce site structure a bit and separated our product reviews onto a different page. There were a couple of reasons we did this: We used pagination on the product page, which meant we got duplicate content warnings. We didn't want to show all the reviews on the product page because this was bad for UX (and diluted our keywords). We thought having a single page was better than paginated content, or at least safer for indexing. We found that Googlebot quite often got stuck in loops, and we didn't want to bury the reviews way down in the site structure. We wanted to reduce our bounce rate a little, so having a separate reviews page could help with this. In the process of doing this we tidied up our microformats a bit too. The product page used to have three main microformats: hProduct, hReview-Aggregate, and hReview. The product page now only has hProduct and hReview-Aggregate (which is now nested inside the hProduct). This means the reviews page has hReview-Aggregate and an hReview for each review itself. We've taken care to make sure that we're specifying that it's a product review and the URL of that product. However, we've noticed over the past few weeks that Google has stopped feeding the reviews into the SERPs for product pages, and is instead only feeding them in for the reviews pages. Is there any way to separate the reviews out and get Google to use the microformats for both pages? Would using microdata be a better way to implement this? Thanks,
    James

    | OptiBacUK
    0

  • Hi there, I have been a pro member of SEOmoz for a while now, but this is my first question in the forum, and although I have looked through so much helpful information I was wondering if someone could give me some further advice and guidance? I have a 3-year-old ecommerce website, personalisedmugs.co.uk, which until May 2012 had some excellent growth; we then lost around 50% of traffic due to reduced organic rankings in Google. We then noticed a further drop again in September. From researching, I believe this drop was from the Penguin update and the EMD update? Since these updates we have: stopped working with a company in India who was looking after SEO for us for 18 months; redeveloped/redesigned the website and upgraded the software version; constantly refreshed the website with content, as we always have done; and modified internal anchor text (this did seem keyword-rich). My next step, I believe, before giving up 😞 is checking our links coming into the website. Is anybody able to please help me with regards to our links, or point me in the right direction? I have no idea where to start or what to do now. Someone may see something really obvious, so any help or guidance is greatly appreciated to assist me in gaining some UK organic rankings back. Kind Regards, Mark

    | SparkyMarky
    0

  • Hi there, I was wondering if anyone was an expert on galleries and using canonical URLs? URL: http://www.tecsew.com/gallery In short, I'm doing SEO for a site and it has a large gallery (3000+ images) where each specific image has its own page and each category (there are 200+) also has its own page. Now, what I'm thinking is that this should be reduced, and that asking Google to index/rank each page is wrong (I also think this because the quality of the pages is relatively low, i.e. little text & content etc.). Therefore, what should be suggested/done with the gallery? Should just the main gallery categories get indexed (e.g. http://www.tecsew.com/3d-cad-showcase)? Or should I continue to allow Google to crawl through all of it? Or should canonical URLs be used? Any help would be greatly appreciated. Best Wishes, Charlie S

    | media.street
    0

  • We have a large number of product pages that contain links to a .pdf of the technical specs for that product. These are all set up to open in a new window when the end user clicks. If these pages are being crawled, and a bot follows the link for the .pdf, is there any way for that bot to continue to crawl the site, or does it get stuck on that dangling page because it doesn't contain any links back to the site (it's a .pdf) and the "back" button doesn't work because the page opened in a new window? If this situation effectively stops the bot in its tracks and it can't crawl any further, what's the best way to fix this? 1. Add a rel="nofollow" attribute 2. Don't open the link in a new window so the back button remains functional 3. Both 1 and 2, or 4. Create specs on the page instead of relying on a .pdf Here's an example page: http://www.ccisolutions.com/StoreFront/product/mackie-cfx12-mkii-compact-mixer  - The technical spec .pdf is located under the "Downloads" tab [the content is all on one page in the source code - the tabs are just a design element] Thoughts and suggestions would be greatly appreciated. Dana

    | danatanseo
    0

  • What are the best practices for generating SEO-friendly URLs from headlines? Dashes between words? Underscores between words? etc. Looking for a programmatically generated solution that uses editor-written headlines to produce an SEO-friendly URL. Thanks.

    | ShaneHolladay
    0
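
    On the dashes-versus-underscores point, hyphens are the conventional word separator in URLs. As a sketch of a programmatic approach (a minimal Python version, assuming ASCII headlines; real editorial headlines may also need transliteration of accented characters):

    ```python
    import re

    def slugify(headline: str) -> str:
        """Turn an editor-written headline into a URL-friendly slug."""
        slug = headline.lower()
        slug = re.sub(r"[^a-z0-9]+", "-", slug)  # collapse non-alphanumerics to hyphens
        return slug.strip("-")                   # trim leading/trailing hyphens

    print(slugify("10 Quick SEO Wins (2013 Edition)"))  # -> 10-quick-seo-wins-2013-edition
    ```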

  • Hi Mozzers, I recently received a project to promote a hotel website in a third-world country. They have no street names, no landline phone, no zip code. So far I have tried to give a good address description in all social networks and on the homepage (footer), and signed up with hotel directories. Suddenly a new website for another hotel came up on Google and made it to number 1. They put a fake telephone number (landline) on the website. Is that a good way of localizing a business? Do you have recommendations for how I can improve? Thanks

    | reisefm
    0

  • Should you avoid slashes and use all dashes, or use just a few slashes and dashes for the rest? For example: domain.com/category/brand/product-color-etc OR domain.com/anythinghere-color-dimensions-etc. Which structure would you rather go for, and why?

    | Zookeeper
    0

  • I have a site that was ranking in the top two for my search terms. We had a funky URL (it contained hyphens) and I was advised to change it for SEO, so I set up a permanent redirect through my web host (before, it was a temporary one, I think). At the same time I installed a sitemap plugin for WordPress and also registered for a Google Places account. I can't remember the exact order I did this in -- does it matter? Anyway, within a couple of days of doing the above, my ranking dropped to the bottom of the second page. I would like to fix this, but I'm not sure how. I need help please!

    | fsvatousek
    0

  • Hi, I have just been into Google Webmaster Tools and I have noticed that five of my websites are no longer verified. I have tried putting the code back into the head, and I have also tried verifying through Google Analytics, but nothing is working. Can anyone let me know what has happened, and has anyone else noticed this? Regards

    | ClaireH-184886
    0

  • I will be the first to admit I am never really 100% sure when to use canonical URLs. I have a quick question, and I am not really sure if this is a situation for a canonical or not. I am looking at my friend's building website and there are issues with which pages are ranking. Basically their homepage is focusing on the building refurbishment location, but for some reason an internal page is ranking for that keyword, and it is not mentioned at all on that page. Would this be a time to add a canonical on the ranking page pointing to the homepage URL (using the Yoast plugin) to tell Google that the homepage is the preferred page? Thanks, Paul

    | propertyhunter
    0

  • Hi all, Looking through the lovely SEOmoz report, by far its biggest complaint is perceived duplicate content. It's hard to avoid given the nature of e-commerce sites that ostensibly list products in a consistent framework. Most advice about duplicate content is about canonicalisation, but that's not really relevant when you have two different products being perceived as the same. Thing is, I might have ignored it, but Google ignores about 40% of our sitemap for, I suspect, the same reason. Basically I don't want us to appear "spammy". Actually we do go to a lot of effort to photograph and write a little flavour text for each product (in progress). I guess my question is: given over 700 products, why would 300-ish of them be considered duplicates and the remaining not? Here is a URL and one of its "duplicates" according to the SEOmoz report: http://www.1010direct.com/DGV-DD1165-970-53/details.aspx
    http://www.1010direct.com/TDV-019-GOLD-50/details.aspx Thanks for any help, people

    | fretts
    0

  • Hi: I got a report indicating 17 rel=canonical notices. What does this mean in simple language, and how do I go about fixing things?

    | Shaaps
    0

  • Hi, Google is having problems accessing my site. Each day it brings up access-denied errors, and when I checked what this means I found the following: Access denied errors. In general, Google discovers content by following links from one page to another. To crawl a page, Googlebot must be able to access it. If you're seeing unexpected Access Denied errors, it may be for the following reasons: Googlebot couldn't access a URL on your site because your site requires users to log in to view all or some of your content. (Tip: You can get around this by removing this requirement for user-agent Googlebot.) Your robots.txt file is blocking Google from accessing your whole site or individual URLs or directories. Test that your robots.txt is working as expected. The Test robots.txt tool lets you see exactly how Googlebot will interpret the contents of your robots.txt file. The Google user-agent is Googlebot. (How to verify that a user-agent really is Googlebot.) The Fetch as Google tool helps you understand exactly how your site appears to Googlebot. This can be very useful when troubleshooting problems with your site's content or discoverability in search results. Your server requires users to authenticate using a proxy, or your hosting provider may be blocking Google from accessing your site. Now I have contacted my hosting company, who said there is not a problem, but they said to read the following page: http://www.tmdhosting.com/kb/technical-questions/other/robots-txt-file-to-improve-the-way-search-bots-crawl/ I have read it, and as far as I can see I have my file set up right, as listed below. They said if I still have problems then I need to contact Google. Can anyone please give me advice on what to do? The errors are response code 403.
    User-agent: *
    Disallow: /administrator/
    Disallow: /cache/
    Disallow: /components/
    Disallow: /includes/
    Disallow: /installation/
    Disallow: /language/
    Disallow: /libraries/
    Disallow: /media/
    Disallow: /modules/
    Disallow: /plugins/
    Disallow: /templates/
    Disallow: /tmp/
    Disallow: /xmlrpc/

    | ClaireH-184886
    0

  • On my e-commerce site, I have user reviews that cycle in the header section of my category pages. They appear/cycle via a snippet of code that the review program provided me with. My question is... because the actual user-generated content is not in the page content, does Googlebot not see this content? Does it not treat the page as having fresh content even though the reviews are new? Does the bot only see the code that provides the reviews? Thanks in advance. Hopefully this question is clear enough.

    | IOSC
    0

  • Our own site and client sites (that we design) are either hosted on the same IP or within the same C-class range of IPs. Will there be an issue with Google if we include a link in the client site's home page footer (as agreed with the client) that links to a specific project page on our website for that client, i.e. they are not linking to our home page?

    | NeilD
    0

  • I use one GoDaddy shared Linux hosting account for 4 separate websites. In Google Webmaster Tools, specifically "Site Errors," I noticed that inner pages from another site are being listed as broken links on the original, now-shared site. I checked and the files are not mis-installed. My question is, should each of the four sites have a unique hosting plan and/or static IP? Thanks, Eric

    | monthelie1
    0

  • I have a small site and write original blog content for my small audience. There is a much larger, highly relevant site that is willing to accept guest blogs and they don't require original content. It is one of the largest sites within my niche and many potential customers of mine are there. When I create a new article I first post to my blog, and then share it with G+, twitter, FB, linkedin. I wait a day. By this time G has seen the links that point to my article and has indexed it. Then I post a copy of the article on the much larger site. I have a rel=author tag within the article but the larger site adds "nofollow" to that tag. I have tried putting a link rel=canonical tag in the article but the larger site strips that tag out. So G sees a copy of my content on this larger site. I'm hoping they realize it was posted a day later than the original version on my blog. But if not will my blog get labeled as a scraper? Second: when I Google the exact blog title I see my article on the larger site shows up as the #1 search result but (1) there is no rich snippet with my author creds (maybe because the author tag was marked nofollow?), and (2) the original version of the article from my blog is not in the results (I'm guessing it was stripped out as duplicate). There are benefits for my article being on the larger site, since many of my potential customers are there and the article does include a link back to my site (the link is nofollow). But I'm wondering if (1) I can fix things so my original article shows up in the search results, or (2) am I hurting myself with this strategy (having G possibly label me a scraper)? I do rank for other phrases in G, so I know my site hasn't had a wholesale penalty of some kind.

    | scanlin
    0

  • I just discovered that my company's 'dev website' (which mirrors our actual website, but which is where we add content before we put new content to our actual website) is being indexed by Google.  My first thought is that I should add a rel=canonical tag to the actual website, so that Google knows that this duplicate content from the dev site is to be ignored.  Is that the right move?  Are there other things I should do? Thanks!

    | williammarlow
    0
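
    Two common options for a dev mirror, sketched below: block crawling with a robots.txt served only on the dev host, or, for pages that are already indexed, serve a noindex directive so they drop out. An illustrative robots.txt for the dev host:

    ```
    User-agent: *
    Disallow: /
    ```

    Note that Disallow stops crawling but does not remove URLs Google already knows about; a meta robots "noindex" tag (or simply password-protecting the dev host) handles removal. The cross-domain rel=canonical to the live pages, as suggested above, is also an option, though removal is usually the cleaner goal for a dev mirror.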

  • Hello there, Please help! I am getting this kind of error across the whole site. http://www.mileycyrus-online.co.uk/leaked-hannah-montana-the-movie-pictures.html/comments It is running on a WordPress site. I changed the template a few times... most of the errors end with /comments. In fact, all my posts have the same issue: http://www.mileycyrus-online.co.uk/miley-cyrus-at-golden-globes-ceremony.html/comments http://www.mileycyrus-online.co.uk/miley-cyrus-at-president-obamas-inauguration-concert.html/comments 404 Error.

    | ExpertSolutions
    0

  • Hello, On 9/11/12 we submitted a reconsideration request to Google for http://macpokeronline.com; at the time we had received both a Penguin penalty and a manual removal. We have since worked on cleaning up our link profile, and got this response from Google: We received a request from a site owner to reconsider how we index the following site: http://www.macpokeronline.com/. We've now reviewed your site. When we review a site, we check to see if it's in violation of our Webmaster Guidelines. If we don't find any problems, we'll reconsider our indexing of your site. If your site still doesn't appear in our search results, check our Help Center for steps you can take. I honestly don't know how to take this; we always showed up #1 when doing a site: search, so that part is kind of irrelevant to us in this case. Is this them accepting our request? Thanks, Zach

    | Zachary_Russell
    0

  • I'm doing a SEO site audit, and I've discovered that the site uses a Content Delivery Network (CDN) that's being crawled and indexed by Google. There are two sub-domains from the CDN that are being crawled and indexed. A small number of organic search visitors have come through these two sub domains. So the CDN based content is out-ranking the root domain, in a small number of cases. It's a huge duplicate content issue (tens of thousands of URLs being crawled) - what's the best way to prevent the crawling and indexing of a CDN like this? Exclude via robots.txt? Additionally, the use of relative canonical tags (instead of absolute) appear to be contributing to this problem as well. As I understand it, these canonical tags are telling the SEs that each sub domain is the "home" of the content/URL. Thanks! Scott

    | Scott-Thomas
    0
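
    One way to keep the CDN hosts fetchable but out of the index is an X-Robots-Tag response header served only from the CDN subdomains. An illustrative Apache directive (assumes mod_headers; a CDN's own configuration panel may expose an equivalent):

    ```apache
    # Served only on the CDN subdomains: fetchable, but not indexed
    Header set X-Robots-Tag "noindex"
    ```

    Switching the canonical tags to absolute URLs, as the post suggests, addresses the other half of the problem: relative canonicals resolve against whichever subdomain served them, so each CDN host effectively declares itself the home of the content. Blocking the CDN in robots.txt would also stop crawling, but then engines can never see the canonicals at all.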

  • Hello Community! We are rewriting URLs to better rank and provide better visual usability to our visitors. Would it be better to serve a URL that looks like this: www.domain.com/category-subcategory or www.domain.com/category/subcategory Please note the slight difference--the 2nd URL calls out a category that has a subcategory under it. Would it give us more value? Does it make a difference? Thanks in advance!

    | JCorp
    0

  • Hey SEOMozers, I'm working with a client that has a suspicious traffic pattern going on. In October, a referral domain called profitclicking.com started passing visits to the site. Almost, in parallel the overall visits decreased anywhere from 35 to 50%. After checking out profitclicking.com more, it promises more traffic "with no SEO knowledge". The client doesn't think that this service was signed up for internally. Regardless, it obviously smells pretty fishy, and I'm searching for a way I can disallow traffic from this site. Could I simply just write a simple disallow statement in the robots.txt and be done with it? Just wanted to see if anyone else had any other ideas before recommending a solution. Thanks!

    | kylehungate
    0

  • It was suggested in Quick Sprout's Advanced SEO guide that it's good form to place your Feedburner RSS link into the header tag of your blog. Anyone know if this needs to be done for every page header of the blog, or just the home/main/index page? Thanks

    | Martin_S
    0

  • Do search engines pay attention to periods in abbreviated queries? If I use Mt. Bachelor all over my site, would SE's not rank my site well for queries that use Mt Bachelor?

    | Shawn_Huber
    0

  • My domain authority is 35 (homepage Page Authority = 45) and my website has been up for years: www.rainchainsdirect.com Most random pages on my site (like this one) have a Page Authority of around 20. However, as a whole, the individual pages of my products rank exceptionally low. Like these: http://www.rainchainsdirect.com/products/copper-channel-link-rain-chain (Page Authority = 1) http://www.rainchainsdirect.com/collections/todays-deals/products/contempo-chain (Page Authority = 1) I was thinking that for whatever reason they have such low authority, that it may explain why these pages rank lower in google for specific searches using my exact product name (in other words, other sites that are piggybacking of my unique products are ranking higher for my product in a specific name search than the original product itself on my site) In any event, I'm trying to get some perspective on why these pages remain with the same non-existent Page Authority.  Can anyone help to shed some light on why and what can be done about it? Thanks!

    | csblev
    0

  • For a long time our sitemap was http://www.efurniturehouse.com/sitemap.xml Recently our hosting company changed the sitemap to: http://www.efurniturehouse.com/xml-sitemap.ashx I went ahead and submitted the new sitemap to both Google Webmaster Tools and Bing. I submitted the Google one on Monday and it still says PENDING (a day later). I just submitted the map to Bing. I now have two sitemaps in each. 1) Is having two a problem? 2) Will they ignore the old sitemap, or can we delete it, and if so, when? I appreciate your input. Regards, Tony www.eFurnitureHouse.com

    | OCFurniture
    0

  • I have a client who is consolidating multiple EMD domains into a single domain, for SEO reasons and for practical reasons, like not having to produce content and perform SEO for 20 domains. My question is this: Do I need to 301 every single page from these old EMD domains? I bill this client hourly, and while I could take the time to write 301s for literally thousands of pages, I feel that this might not be the best use of his money; I could strategically 301 the landing pages that get traffic and then route everything else to the new root domain... thoughts? I've researched this and have not been able to hear a really strong opinion yet.

    | BrianJGomez
    0
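
    The strategic approach described (explicit 301s for URLs that earn traffic or links, plus a catch-all for everything else) can be sketched in a few rewrite rules per old domain. An illustrative Apache fragment for one old domain's configuration (domain names and paths are hypothetical):

    ```apache
    RewriteEngine On
    # Map the old domain's valuable landing pages to their new equivalents
    RewriteRule ^best-widgets$ http://www.newdomain.com/widgets [R=301,L]
    # Sweep every remaining URL to the new root
    RewriteRule ^(.*)$ http://www.newdomain.com/ [R=301,L]
    ```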

  • I have a regular site map on my site and I also have a Wordpress site installed within it that we use for blog/news content.  I currently have an auto-sitemap generator installed in Wordpress which automatically updates the sitemap and submits it to the search engines each time the blog is updated. The question I have (which I think I know the answer to but I just want to confirm) is do I have to include all of the articles within the blog in the main site's sitemap despite the Wordpress sitemap having them in there already? If I do include the articles in the main website's sitemap, they would also be in the Wordpress sitemap as well, which is redundant. Redundancy is not good, so I just want to make sure.

    | iresqkeith
    0

  • Do the urls towards the top get higher priority?

    | iresqkeith
    0

  • Hi, My menu has an image with links to some of the main pages on the site and text underneath it explaining what the banner is. Would it be beneficial or harmful to have the text hyperlinked to the same pages the images point to?

    | theLotter
    0
