Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Intermediate & Advanced SEO

Looking to level up your SEO techniques? Chat through more advanced approaches.


  • Hi everyone, I've read a lot about different techniques to fix duplicate content problems caused by eCommerce faceted navigation (e.g. redundant URL combinations of colors, sizes, etc.). From what I've seen, suggested methods include using AJAX or JavaScript to make the links functional for users only and prevent bots from crawling through them. I was wondering if this technique would work instead: if we detect that the user is a robot, instead of displaying a link, we simply display its anchor text. So what would be, for a human, COLOR <li><a href="red">red</a></li> <li><a href="blue">blue</a></li> would be, for a robot, COLOR <li>red</li> <li>blue</li> Any reason I shouldn't do this? Thanks! *** edit: Another reason to fix this is crawl budget, since robots can waste their time going through every possible combination of facets. This is also something I'm looking to fix.

    | anthematic
    0
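A rough sketch of the idea in this question, for discussion (Python, with an illustrative user-agent check; the function names and bot list are made up for the example). Worth noting that serving different HTML to bots than to users is a form of cloaking, which is the main reason to be cautious with this approach:

```python
# Render a facet list either as links (for humans) or as bare text (for bots).
# Caution: differentiating output by user-agent is cloaking and may breach
# search engine guidelines; nofollow or robots.txt rules are safer options.

BOT_SIGNATURES = ("googlebot", "bingbot", "slurp")  # illustrative list only

def is_bot(user_agent: str) -> bool:
    """Very naive bot check based on the User-Agent string."""
    ua = user_agent.lower()
    return any(sig in ua for sig in BOT_SIGNATURES)

def render_facet(label: str, options: list[str], user_agent: str) -> str:
    """Return <li> items with links for humans, bare text for bots."""
    items = []
    for opt in options:
        if is_bot(user_agent):
            items.append(f"<li>{opt}</li>")
        else:
            items.append(f'<li><a href="?{label.lower()}={opt}">{opt}</a></li>')
    return f"{label.upper()} <ul>" + "".join(items) + "</ul>"
```

With a browser user-agent the facet renders as links; with a bot user-agent it renders as bare anchor text, exactly as the question describes.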

  • The on-page report indicates that the maximum is 15. Which keywords are the best to use? Does that count include keywords in the title, the description, and image names?

    | Naghirniac
    0

  • I can't really get my head around this one. I've read a few times that when building links you should make sure to pick up some low-value links as well. So here is an example (and let's say each link takes half an hour to get): I've got 5 hours of link building, and this is what I managed to get with the time. 1. 10 high-value links, all with PA/DA 50-60+. 2. 5 high-value links with PA/DA 50-60+ AND another 5 low-value links with PA/DA 10 or less. Surely #1 beats #2 hands down?

    | activitysuper
    0

  • Hello, I have a blog, www.digitaldiscovery.eu, and I have been working on link building. Now I have a few links pointing to my blog, and in Google Webmaster Tools and in Open Site Explorer I can see the URLs of those websites. On a scale from 1 to 10, how useful is it to have a blogroll in my blog pointing back to those high-PR sites? How useful is this as a link-building strategy? Thanks in advance! PP

    | PedroM
    0

  • I run the website http://cvcsports.com for myself and my parents. We offer custom varsity jackets for athletes/companies/etc. We rank first in Google for "letterman jackets" and near the top for "varsity jackets". I really want to reach #1 for "varsity jackets" (we were briefly #1 a few days ago but didn't stay there). Does anyone have any advice on what I can do to achieve that? Thanks in advance for the tips!

    | BrandonDoyle
    0

  • Sometimes it's nice to get a pat on the back, and it's always helpful to share tactics that work! The question is this: what's your biggest success story? Care to share any pictures or stories on how you got to the top? What do you attribute your success to? Here's one of our success stories: one of our clients wanted to be on the first page for the keyword "Morse Code." We never thought we'd be able to get them to the top for such a competitive keyword, but we never gave up. Since this is a hobby website and not a website generating a huge amount of cash, it was easier to get incoming links from other people who were interested in the topic. I'd attribute our success to having great content and getting others to write blog posts and join in on the fun. What's your success story, and how did you do it?

    | alhallinan
    0

  • I have just had my site updated, and we have put index.php at the end of all the URLs. Not long after, the site's rankings dropped. Checking the backlinks, they all go to (for example) http://www.website.com and not http://www.website.com/index.php. So could this change have affected rankings, even though the old URL redirects to the new one?

    | authoritysitebuilder
    0

  • I have a client who has been ranking well in the 7-pack for local searches for 1.5+ years.  I recently noticed a competitor's Google Places link has little sitelinks attached, but my client's link doesn't have them.  This makes me sad. To provide a concise question: what can I do to help my client get sitelinks along with his Google Places listing in the 7-pack / blended / local results? Some example data: My client's business is called Ambiance Dental and his website is www.mycalgarydentist.com.  An example search to see what I'm talking about is "calgary family dentist".  The competitor that's showing sitelinks is www.aestheticdentalstudio.ca which has a title of "Dentist in Calgary | Cosmetic Treatment in Calgary".  The sitelinks you'll see are "Dr. Gordon Chee", "Links", "Dr. Alexa Geminiano".  Notice that my client doesn't have the same sitelinks. Some further data: If you do a search for "calgary aesthetic dentist" you'll see the competitor's 1-box local result (is that what it's called?) with his Google Places data and sitelinks.  If you search for "calgary ambiance dentist" you'll get a similar layout SERP for my client, again with no sitelinks. My client's sitelinks: If you search for "ambiance dental calgary" you'll see that Google does offer sitelinks for his site, just not in Google Places, it seems. My client's website: My client's website has the navigation coded as a list (UL) without any javascript or complicated code messing things up.  The competitor's navigation is built similarly, though he has about 40 more pages in his main navigation.  My client's page names are concise, which I've read helps with sitelinks, the website is coded very cleanly, and the URLs of his site are clear and concise without a complicated folder structure, so it seems like we're doing everything right. I appreciate any input other mozzers can provide, and discussion on the topic.  I'm sure there are others who would benefit from local sitelinks as well!

    | Kenoshi
    0

  • Is it worth submitting a blog's RSS feed to as many RSS feed directories as possible? Or would this have a similar negative impact to the one you'd get from submitting a site to loads of "potentially spammy" site directories?

    | PeterAlexLeigh
    0

  • Hi, I have one domain whose root was redirected today to a new subdomain of the same root domain. The site was ranking for several keywords and had an authority of 35 on the root domain. Twelve hours after the redirect, the site disappeared from Google's results; it doesn't even rank for its brand name. What will happen to the new subdomain? Will it rank again? It targets the same keywords plus some more: the old keywords are 3 and the new are 2. For the old keywords the on-page optimization gets a C, and for the new ones an A and a C. For the new keywords there are no backlinks at all. When will my site start to rank for its old keywords again? How long will it be down? Will the link juice and authority pass? If yes, how soon? I will lose too much money if it stays down longer, like 1-2 weeks. Thanks

    | nyanainc
    0

  • Hi,
    I have the opportunity to build a new website and use a domain name that is older than 5 years or buy a new domain name. The aged domain name is a .net and includes a keyword.
    The new domain would include the same keyword as well as the U.S. state abbreviation. Which one would you use and why? Thanks for your help!

    | peterwhitewebdesign
    0

  • We currently use Power Reviews for our item reviews, but we want to move towards something that can be hosted internally, and thus forgo the cost of Power Reviews and actually see user-generated content play a role in our site.  Any suggestions?

    | MichealGooden
    0

  • Hey Mozzers, I would first like to thank everyone in advance for replying to my question 😉 Actually, my question is in 2 parts: hosting & domains. 1) We are currently researching product-related domains and would like to build out review-style mini-sites on WordPress that link back to our main site's product pages. We're using the X-Cart platform, and X-Cart offers a WordPress module. My developer recommends installing a main WordPress mini-site template on my server and replicating this template under different domains with unique content, obviously ;-). My question is: for backlink purposes, would it be better to host these WordPress pages in a different location/on a different server? 2) Domains (which domain extensions are best): I have read mixed reviews on this subject... a) Do dashes (i.e. brand-model.com) have an impact as well? I read a post regarding this, http://www.commonsensemarketing.net/do-domain-name-extensions-matter/, and the general feeling was that .com and .net ranked higher and faster, but that .info wasn't a bad runner-up. I was a bit excited to hear that .info wasn't a bad choice, as those are actually "available" and cheap as well (under 3 bucks), until a comment was posted about a "Market Samurai" study. They reported testing 4 domain names (below) with the same article and date & time of post: 1. domainname.com
    2. domainname.org
    3. domainname.net
    4. domain-name.com My question is: can anyone give any advice on which domain extensions work better/rank higher faster? .com / .net / .org / .info / etc.? Also, is it better to have more product-related keywords in the domain? For example, one of my products is the "Dogtra 280ncp Platinum". WordStream exact match tells me that "dogtra 280ncp" gets 210 searches per month and that "dogtra 280ncp platinum" gets another 91 searches per month. I'm guessing that it's better to buy www.Dogtra280ncpPlatinum.com instead of www.Dogtra280ncp.com, as we would pick up the searches for the "platinum" term as well? Question summary: Is it better to host these mini-sites on another server than my main site? Which domain extensions work better? Is it better to use as many product-related keywords in the domain as possible, and maybe even throw modifiers in there as well, such as "buy" or "review"? Thanks again!
    Byron-

    | k9byron
    0

  • Scenario: Two sites, exactly the same with a form to capture customer details on the home page (e.g. name, address). Would Google rank a site that uses HTTPS over a site that uses HTTP? From what I've heard, they would trust the HTTPS site more than HTTP and therefore rank it higher. Forum opinions?

    | PeterAlexLeigh
    0

  • Hello everybody, I've got a technical question about server responses. Imagine this scenario: www.domain.com/not-existing-page/ --> 404, but domain.com/not-existing-page/ --> www.domain.com/not-existing-page/ --> 404. I use WordPress for my websites and I can't seem to be able to configure it (or the server, where I have total control) to stop doing this. Ideally, a non-existent URL should return a 404 instantly, not first redirect to the "correct" URL and then return a 404. Is anyone else experiencing this, and can you help? Here's a neat tool that allows you to quickly check server response codes, for those of you who are new to this: http://responsetester.appspot.com/ Much appreciated! Alex

    | pwpaneuro
    1

  • Hi, which is the best-quality site for outsourcing my backlink building? freelancer.com? odesk.com? Any other? Elance left me very disappointed. Thanks

    | nyanainc
    0

  • The main pages of my .au site are all in .au, but once you go to the inner pages, the users will be directed to my .com site. The .com will act as the content for the top pages of the .au. Would that be ok?

    | MicroSourcing_PRM
    0

  • Hi, in my time as an SEO I have never come across the following two scenarios. I am an advocate of using unique content, and therefore always suggest, and in some cases demand, that all content is written or rewritten. These are the scenarios I am facing right now. For example, we have www.abc.com (which has over 200 original recipes) and then we have www.xyz.com with the same recipes, but translated into another language, as they are targeting different audiences. Will Google penalize this as duplicate content? The other issue is that the client took the recipes from www.abc.com (which have been translated) and used them on www.xyz.com as well. Both sites are owned by the same company, so it's not plagiarism, they have the legal rights, but I am not sure how Google will see it and whether it will penalize the sites. Thanks!

    | M_8
    1

  • What is better from an SEO point of view: a page with a PageRank of 5 and 0 clicks linking to your site, or a page with a PageRank of 3 and 1,000 clicks linking back to your site? Is link juice important? Do search engines count link juice?

    | SEODinosaur
    0

  • When we start to study SEO and how Google sees our webpage, one important point is to have good content. But beginners like me get lost on this, as it's not so black and white. What, for you, is good content? Does the amount of text matter? Is there any trick that all good-content websites share?

    | Naghirniac
    0

  • Is some JavaScript SEO-friendly? I know that Google's Webmaster Guidelines say you should avoid the use of JavaScript (http://support.google.com/webmasters/bin/answer.py?hl=en&answer=35769), but does anyone know if Google can read some JavaScript, or generally not?

    | nicole.healthline
    0

  • Should I use a video player and host the videos on my website, or should I put my videos on YouTube and use the YouTube player?

    | Naghirniac
    0

  • I am reading a thread right now and I came across this statement: "Search engines can view clicks only if websites have Google Analytics or some toolbar installed. Obviously that's not the case with over 50% of websites. That's why I don't agree with your comment." True or false?

    | SEODinosaur
    0

  • My site is: http://goo.gl/JgK1e My main keyword is: Plastic Bins. I have been going back and forth between pages 1 and 2 for this keyword, and I was wondering if any of you could provide guidance as to why I can't get to the top of page 1 and stay there... My site has been around for a while; we believe we have a great user experience, all unique, fresh content, and the lowest prices... I must be missing something major if I cannot get a steady page 1 ranking... Any thoughts? Thanks in advance...

    | Prime85
    0

  • I just received an error notification because our website is available over both http and https: http://www.quicklearn.com & https://www.quicklearn.com. My tech tells me that this isn't actually a problem. Is that true? If not, how can I address the duplicate content issue?

    | QuickLearnTraining
    0

  • Hi, I'm in the process of instructing developers to stop producing duplicate content, however a lot of duplicate content is already in Google's Index and I'm wondering if I should bother getting it removed... I'd appreciate it if you could let me know what you'd do... For example one 'type' of page is being crawled thousands of times, but it only has 7 instances in the index which don't rank for anything. For this example I'm thinking of just stopping Google from accessing that page 'type'. Do you think this is right? Do you normally meta NoIndex,follow the page, wait for the pages to be removed from Google's Index, and then stop the duplicate content from being crawled? Or do you just stop the pages from being crawled and let Google sort out its own Index in its own time? Thanks FashionLux

    | FashionLux
    0

  • Our IT department wants to make a change to our website and serve all the pages under SSL (https). This will be happening at the same time as a move from classic ASP to ASP.Net so the file extensions for non re-written urls will change (this doesn't equate to many). They will be ensuring everything is 301 redirected correctly. Even with this I can't help being very nervous about the change. We have tens of thousands of links to the website gained over many years, and I understand even with 301'ing them they will lose some of their value. We receive tens of thousands of natural visitors per day. Has anyone done anything like this before, or have any advice on whether it is the right thing to do?

    | BigMiniMan
    0

  • Hi guys and gals, Our CMS has just been updated to its latest version, which finally adds support for rel=canonical. HUZZAH!!! However, it doesn't add the absolute URL of the page. There is a base href tag which looks like <base href="http://shop.confetti.co.uk/" /> On a page such as http://shop.confetti.co.uk/branch/wedding-favours the canonical tag looks like <link rel="canonical" href="/branch/wedding-favours" /> Does Google recognise this as a legitimate canonical tag? The SEOmoz On-Page Report Card doesn't recognise it as such. Any help would be great. Thanks in advance, Brendan.

    | Confetti_Wedding
    0
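Independent of whether Google honours a relative canonical, you can check what it resolves to by combining it with the page's base URL the way browsers do; Python's urljoin follows the same RFC 3986 resolution rules:

```python
from urllib.parse import urljoin

base = "http://shop.confetti.co.uk/"          # from the <base> tag
canonical_href = "/branch/wedding-favours"    # from the rel="canonical" href

# A root-relative href resolves against the base URL's scheme and host.
resolved = urljoin(base, canonical_href)
print(resolved)  # http://shop.confetti.co.uk/branch/wedding-favours
```

Since the ambiguity is exactly whether crawlers do this resolution consistently, emitting the already-resolved absolute URL in the tag sidesteps the question entirely.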

  • Hi there, I have some keywords which vary in difficulty from 1%-30%. Can I rank my URLs with some blog comments from high-PR blog pages? Is there any way to rank them all fast? I am looking for the cheapest and easiest way to rank them. Would one article of, let's say, 1000 words and some 200-300 blog comments be enough? The site is new and already ranks for some other small keywords, but has almost zero backlinks. Thanks

    | nyanainc
    0

  • I'm trying to refine my keyword research process and will take any pointers you can give. Also, please share the tools you use these days 🙂 I need to make my process fast and efficient; right now it feels bulky.

    | Hyrule
    0

  • Hi, I hope that all is going well in Seattle! I just made this site and I would like it to be judged! The site is http://mangakaotaku.com I am open to recommendations and reviews. Thanks

    | nyanainc
    0

  • Hi Mozzers, I launched www.carbodypanels4u.co.uk 3 weeks ago. It's a website that sells aftermarket car body panels. I want this website to rank on the first page for: "Body Panels" - position 91 on Google UK; "Car Body Panels" - position 33 on Google UK. The above are the two main keywords for the home page, and I'm pleased with the progress we have made in 3 weeks; however, I want to ensure I haven't missed anything. Apart from link building, can anyone suggest anything else I can do on the website to improve my rankings? I was thinking of making all the makes on the home page Header 2 tags? shivun

    | seohive-222720
    0

  • Hi there, I was wondering if there is any benefit to an aged domain. Can anyone advise how to get the most SEO benefit out of one? White-hat techniques only. Thanks in advance

    | nyanainc
    0

  • Hi All, I was wondering when I should write something as a post in my blog and when I should simply add an article in my articles section? What are the advantages of each way? Thanks

    | BeytzNet
    0

  • Where an existing website has duplicate content issues - specifically the www vs. non-www type - what is the most effective way to inform searchers and spiders that there is only one page? I have a site where the ecommerce software (Shopfitter 4) allows a fair bit of meta data to be inserted into each product page, but I am uncertain, after a couple of attempts to deduplicate some pages, which is the most effective way to ensure that the www-related duplication is eliminated sitewide - if there is such a solution. I have to own up to having looked at .htaccess 301 redirects and Webmaster Tools, and I have become increasingly bamboozled by the conflicting advice as to which is the most effective way, or combination of ways, to get rid of this problem. Too old to learn new tricks, I reckon 😉 Your help and clarification would be appreciated, as this may help head off more fruitless work.

    | SkiBum
    0

  • Hello all, if any one of you is using PrestaShop for your eCommerce portal or for your clients, please suggest some good PrestaShop addons for SEO. Thanks

    | idreams
    0

  • Almost everyone seems to agree that /%category%/%postname%/ is the best blog structure. I'm thinking of changing my structure to that, because right now it's structured by date, which is bad. But almost all of my posts are assigned to more than one category. Won't this create duplicate pages?

    | UnderRugSwept
    0

  • I have a client who has about 100 portfolio entries, each with its own HTML page. Those pages aren't getting indexed because of the way the main portfolio menu page works: It uses javascript to load the list of portfolio entries from an XML file along with metadata about each entry. Because it uses javascript, crawlers aren't seeing anything on the portfolio menu page. Here's a sample of the javascript used, this is one of many more lines of code: // load project xml try{ var req = new Request({ method: 'get', url: '/data/projects.xml', Normally I'd have them just manually add entries to the portfolio menu page, but part of the metadata that's getting loaded is project characteristics that are used to filter which portfolio entries are shown on page, such as client type (government, education, industrial, residential, industrial, etc.) and project type (depending on type of service that was provided). It's similar to filtering you'd see on an e-commerce site.  This has to stay, so the page needs to remain dynamic. I'm trying to summarize the alternate methods they could use to load that content onto the page instead of javascript (I assume that server side solutions are the only ones I'd want, unless there's another option I'm unaware of). I'm aware that PHP could probably load all of their portfolio entries in the XML file on the server side. I'd like to get some recommendations on other possible solutions. Please feel free to ask any clarifying questions. Thanks!

    | KaneJamison
    0
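As a proof of concept of the server-side option mentioned in the question, the same XML file the JavaScript fetches can be rendered into crawlable HTML on the server, with the filter metadata kept as data- attributes so the client-side filtering can stay. The element and attribute names below are invented for the example; the real projects.xml schema would dictate them:

```python
import xml.etree.ElementTree as ET

# Hypothetical stand-in for /data/projects.xml
SAMPLE_XML = """
<projects>
  <project client-type="government" project-type="design">
    <title>City Hall Renovation</title>
    <url>/portfolio/city-hall/</url>
  </project>
  <project client-type="education" project-type="build">
    <title>Campus Library</title>
    <url>/portfolio/library/</url>
  </project>
</projects>
"""

def render_menu(xml_text: str) -> str:
    """Emit a crawlable <ul> of portfolio links; filter metadata goes into
    data- attributes so JavaScript can still show/hide entries client-side."""
    root = ET.fromstring(xml_text)
    items = []
    for p in root.findall("project"):
        items.append(
            '<li data-client="{}" data-project="{}"><a href="{}">{}</a></li>'.format(
                p.get("client-type"), p.get("project-type"),
                p.findtext("url"), p.findtext("title"),
            )
        )
    return "<ul>" + "".join(items) + "</ul>"
```

Crawlers then see plain links to every portfolio page, while the existing filtering script can operate on the data- attributes instead of fetching the XML itself.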

  • The website is for an attorney who serves several nearby cities. The main page is optimized for the biggest central city. I have several options for how to go after the smaller surrounding cities: 1. Create optimized pages inside the main domain. 2. Get more-or-less exact keyword domains for each city, e.g. for the city ABC get yourABClawyer.com, and then a) use 1-page websites that use the same template as the main website and link all the menu items to the main website, b) use a 1-page website with a link "for more information go to our main website", or c) point the exact keyword domains to the optimized pages within the main domain. Which option would be the best in terms of SEO and user experience? Would people freak out if they clicked on a menu item and went to a different domain, even though it uses the same template (option 2a)? Would I get more bounces with option 2b, in your opinion? Would option 2c have any positive SEO effect? Should I not even bother with exact keyword domains and go with option 1?

    | SirMax
    1

  • Hello guys, I have a doubt. If I redirect a URL with a PageRank of 2 to a new URL, will I lose the PR? My problem is that I have a long URL on one page which is not effective for targeting a keyword that I'm pursuing. I'm climbing in Google; however, I want 1st place, and I don't think that with this long URL I will make it. Advice? Cheers! Pedro M Pereira

    | PedroM
    0

  • Hi, I am new to eCommerce. I am planning to launch my shopping website, which sells multiple products like Amazon, on Magento. Can anyone please suggest: 1) some of the best and most necessary Magento extensions for SEO, and extensions that help increase sales; 2) the best SEO tactics that need to be followed for a multi-product eCommerce site? Please specify the keywords for the SEO methods and I will research them, e.g. "product level leverage".

    | prakash.moturu
    0

  • Sorry for my bad English. Is there any way to increase the ranking for a keyword for an entire site? I know that SEO is done on a per-page basis. My site contains thousands of posts and I can't get backlinks for each and every post, so I picked 4 keywords which are mostly used when searching for my products. Is there any method to increase my ranking for those keywords, like increasing domain authority? Example: if I want to increase my ranking for "buy laptop", then if any user searches Google for "buy laptop", I want my site, or any of its related pages that match the user's search query, to show up in front.

    | prakash.moturu
    0

  • We recently disallowed a wide variety of pages for www.udemy.com which we do not want Google indexing (e.g., /tags or /lectures). Basically, we don't want to spread our link juice around to all these pages that are never going to rank. We want to keep it focused on our core pages, which are for our courses. We've added them as disallows in robots.txt, but after 2-3 weeks Google is still showing them in its index. When we look up "site:udemy.com", for example, Google currently shows ~650,000 pages indexed... when really it should only be showing ~5,000 pages indexed. As another example, if you search for "site:udemy.com/tag", Google shows 129,000 results. We've definitely added "/tag" to our robots.txt properly, so this should not be happening... Google should be showing 0 results. Any ideas re: how we get Google to pay attention and re-index our site properly?

    | udemy
    0
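Two separate points are worth checking in a case like this. First, that the rule actually blocks the paths you think it does; Python's standard robots parser is handy for that. Second, the catch: Disallow only stops crawling - URLs already in the index can linger until Google recrawls them or they are removed via a noindex tag (which requires the pages to be temporarily crawlable) or the URL removal tool.

```python
from urllib.robotparser import RobotFileParser

# A minimal robots.txt matching the rules described in the question.
ROBOTS_TXT = """\
User-agent: *
Disallow: /tag
Disallow: /lectures
"""

rp = RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# Disallow rules match by path prefix, so /tag/anything is blocked.
print(rp.can_fetch("Googlebot", "https://www.udemy.com/tag/python/"))  # False
print(rp.can_fetch("Googlebot", "https://www.udemy.com/courses/"))     # True
```

So if the parser agrees the rule is correct, the lingering results are an index-freshness problem, not a robots.txt syntax problem.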

  • I had this idea that I would reach out to webmasters in my niche and offer to teach them something - for example, how to set up a killer Facebook landing page.  In return I would ask them for a link from their site. I have a few questions: 1. Would this be considered "white hat"? 2. How would you word the email to the webmasters?  I was thinking of something like this: "Hi [webmaster name].  I was checking out your website and your Facebook page.  I was wondering if you would like me to show you how to create a great Facebook landing page that will gain you more fans.  (As an example, you can see my Facebook page here: [insert link].)  In return, all I ask for is a mention, in the form of a link, from your site.  What do you think?" What do you guys think?

    | MarieHaynes
    0

  • I know it sounds crazy, but things like this can happen. A client has a legal problem with his domain name, and it's possible the site will be offline for 1 to 3 weeks. The website is in the travel sector and has some good rankings on Google. I didn't find any fresh information on the Internet about what can happen to rankings if a site goes offline for a week. So... have any of you been through an experience like this? Thanks. Matias

    | maty
    0

  • We are getting ready to re-launch our e-commerce site and are trying to decide how many products to list per category page.  Some of of our category pages have upwards of 100 products.  While I'd love to list ALL the products on the root category page (to reduce hassle for customer, to index more products on a higher PR page), I'm a little worried about having it be too long, and containing too many on-page links. Would love some guidance on: Maximum number of internal links on a page If Google frowns on really long category pages Anything else I should be considering when making this decision Thanks for your input!

    | AndrewY
    2

  • Hi, I was wondering what is the best way to run a blog? The options I thought of are: 1. A completely separate domain with many links to my main site. 2. blog.domain.com 3. www.domain.com/blog Thanks

    | BeytzNet
    1

  • Google states that use of rel="alternate" hreflang="x" is recommended when: You translate only the template of your page, such as the navigation and footer, and keep the main content in a single language. This is common on pages that feature user-generated content, like a forum post. Your pages have broadly similar content within a single language, but the content has small regional variations. For example, you might have English-language content targeted at readers in the US, GB, and Ireland. Your site content is fully translated. For example, you have both German and English versions of each page. Does this mean that if I write new content in different language for a website hosted on my sub-domain, I should not use this tag? Regards, Shailendra Sial

    | IM_Learner
    0


