
Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Intermediate & Advanced SEO

Looking to level up your SEO techniques? Chat through more advanced approaches.


  • I submitted a link disavowal file for a client a few weeks ago and before doing that I read up on how to properly use the tool. My understanding is that if you received a manual penalty then you need to submit a reconsideration request after cleaning up links. We didn't receive a penalty so I didn't submit one. I'm wondering if anyone has used the tool (not stemming from a penalty) and if you did or didn't submit a recon. request, and what the results were. I've read that if a site is hit algorithmically, then filing a recon request won't help. Should I just do it anyway? Would be great to hear from anyone who has gone through a similar situation.

    | Vanessa12
    0

  • Sorry for the long title, but that's the whole question. Notes: the new site is on the same domain, but URLs will change because the URL structure was horrible. The old site has awful SEO - like real bad. Canonical tags point to a dev subdomain (which is still accessible and has a robots.txt, so the end result is that the old site IS NOT INDEXED by Google). The old site has links and domain/page authority north of 50; I suspect some shady links, but there have to be good links as well. My guess is that since there are likely incoming links that are legitimate, I should still attempt to use 301s to the versions of the pages on the new site (note: the content on the new site will be different, but in general it'll be about the same thing as the old page, just much improved and more relevant). So yeah, I guess that's it. Even though the old site's pages are not indexed, if the new site is set up properly, the 301s won't pass along the 'non-indexed' status, correct? Thanks in advance for any quick answers!

    | JDMcNamara
    0

  • A new client (aqmp.com.br/) called me yesterday and told me that since they moved to Magento, their monthly sales revenue has dropped by more than US$20,000... I've just checked Webmaster Tools and discovered the number of crawled pages went from 3,260 to 75,000 since Magento started... Magento is creating lots of pages with queries like search and filters. Example: http://aqmp.com.br/acessorios/lencos.html http://aqmp.com.br/acessorios/lencos.html?mode=grid http://aqmp.com.br/acessorios/lencos.html?dir=desc&order=name Is adding an instruction to robots.txt the best way to remove these unnecessary pages from the search engine (see the sketch below)?

    | SeoMartin1
    0
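
    A minimal robots.txt sketch along the lines the question above describes; the parameter names are taken from the example URLs, and Googlebot honours the * wildcard. Note that robots.txt only stops crawling - it does not by itself remove URLs already in the index - so a rel=canonical from the filtered variants to the clean category URL (or parameter handling in Webmaster Tools) is often used alongside or instead.

      User-agent: *
      # Block Magento sort/filter query-string variants (parameter names from the question)
      Disallow: /*?*mode=
      Disallow: /*?*dir=
      Disallow: /*?*order=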

  • A client's site currently uses the URL structure: www.website.com/%category%/%postname% Which I think is optimised fairly well, as the categories are keywords being targeted. However, as they are using a category hierarchy, oftentimes the URL looks like this: www.website.com/parent-category/child-category/some-post-titles-are-quite-long-as-they-are-long-tail-terms Best practice often dictates (such as point 3 in this Moz article) that shorter URLs are better for several reasons. So I'm left with a few options: remove the category from the URL; flatten the category hierarchy; shorten post titles to a word or two - which would hurt my long tail search term traffic; or leave it as it is. What do we think is the best route to take? Thanks in advance!

    | underscorelive
    0

  • Hi, We run a Magento website - when I log in to Google Webmaster Tools, I am getting this message: "Severe health issues are found on your site. - Check site health. Is robots.txt blocking important pages? Some important page is blocked by robots.txt." Now, this is the weird part - the page being blocked is the admin page of Magento, under
    www.domain.com/index.php/admin/etc..... This message just won't go away - it's been there for days now - so why does Google think this is an "important page"? It doesn't normally complain if you block other parts of the site. Any ideas? Thanks

    | bjs2010
    0

  • Hi mozers, In WMT Google says total indexed pages = 5080. If I do a site:domain.com command it says 6080 results. But I've only got 2000 pages on my site that should be indexed. So I would like to see all the pages they have indexed, so I can consider noindexing them or 404ing them. Many thanks, Julian.

    | julianhearn
    0

  • We are looking at buying a business that has a number of websites. Is there anything against buying a business that has a Google News website and continuing to use the site? Once the business is sold, would Google remove the site from Google News?

    | JohnPeters
    0

  • In the last Panda update on 22nd January my site traffic dropped 30% to 40%, but some of my keywords are still ranking on the first and second page of the SERPs. With the latest Penguin 2.0 update, all of my keyword rankings are out of the top 100. Both times I sent a reconsideration request and got a message that no manual actions were found on the site. I just don't know what steps are best to get my rankings back. Should I use the disavow tool and remove backlinks to recover from Penguin, or work more on creating quality links? My site: http://goo.gl/sSBes Thanks, Steve

    | SteveSchmidt
    0

  • Hi Guys, I have a website that has plenty of links with parameters. For example:
    http://www.domainname.co.uk/index.php?app=ecom&ns=catshow&ref=Brandname-Golf-Shorts&sid=201v04gxs2hlozv161tfo43qk98583el I want to place a wildcard redirect in the .htaccess but don't know the exact code for this (see the sketch below). Ideally I want the URLs above to be: http://www.domainname.co.uk/Category/Brandname-Golf-Shorts Any help please. Thanks,
    Brucz

    | UrbanMark
    0
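
    A hedged .htaccess (Apache mod_rewrite) sketch for the question above. Two assumptions to flag: that the "ref" parameter maps one-to-one onto the new slug, and that "Category" is a fixed segment - the old URLs carry no category information, so per-category rules or a rewrite map would be needed otherwise.

      RewriteEngine On
      # Capture the value of the "ref" query parameter...
      RewriteCond %{QUERY_STRING} (^|&)ref=([^&]+)
      # ...and 301 /index.php?...&ref=Brandname-Golf-Shorts&... to /Category/Brandname-Golf-Shorts
      # (the trailing "?" drops the old query string)
      RewriteRule ^index\.php$ /Category/%2? [R=301,L]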

  • We have a client about to enroll with us for SEO. The client has about 50 EMD sites, out of which 9 are ranking. An EMD naturally attracts [exact] match anchors, and the sites in question are all EMDs - the link profiles show it. The client wants to 301 the EMDs to a brand page. We would want to 301 the 9 EMD sites to the new site. Here is the thing: if the site domain has an exact match to the anchor text profile, when we 301 the page to www.brand.com/EMD, will the link profile matter? One of the EMDs is on page one, spot 2. If we make this change, will Google look at the new brand page (www.brand.com/EMD) as having an unnatural link profile?

    | Bryan_Loconto
    0

  • I am looking for a list of SEO audit tools and strategies for a complex website. The things I am looking for include (but not limited to): finding all the subdomains of the website listing all the 301's, 302's, 404's, etc finding current canonical tags suggesting canonical tags for certain links listing / finding all current rel=nofollow's on the website listing internal links which use & don't use 'www.' finding duplicate content on additional domains owned by this website I know how to find some of the items above, but not sure if my methods are optimal and/or the most accurate. Thank you in advance for your input!

    | CTSupp
    0

  • I'm looking for insight into mobile sitemap best practices when building sites with responsive design. If a mobile site has the same URLs as the desktop site, the mobile sitemap would be very similar to the regular sitemap. Is a mobile sitemap necessary for sites that utilize responsive design? If so, is there a way to have a mobile sitemap that simply references the regular sitemap, or is a new sitemap required with each URL carrying the mobile-specific tag? (See the sketch below.)

    | AdamDorfman
    0
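
    On the mobile sitemap question above: with responsive design the mobile and desktop URLs are identical, so the ordinary XML sitemap is generally all that is needed; the dedicated mobile sitemap namespace and its annotation are aimed at separate feature-phone pages, not responsive URLs. A hedged sketch (example.com is a placeholder):

      <?xml version="1.0" encoding="UTF-8"?>
      <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
              xmlns:mobile="http://www.google.com/schemas/sitemap-mobile/1.0">
        <url>
          <loc>http://www.example.com/page/</loc>
          <!-- <mobile:mobile/> would only be added here for feature-phone pages -->
        </url>
      </urlset>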

  • Hello All
    I have a question about my Magento setup.  I have lots of categories which have many products so the categories paginate.  I've seen info about making sure the Canonical tag doesn't simply send Search Engines back to the first page meaning the paginated pages won't get indexed.  I've also seen info about using the rel=next & rel=prev to help Search Engines understand the category pages are paginated... Is it okay to use both? I've made sure that: category/?p=1 has a canonical of category/ to make sure there isn't duplicate content. Here's an example of category/?p=2 meta data:
    <link rel="canonical" href="http://website.com/category/?p=2" />
    <link rel="prev" href="http://website.com/category/" />
    <link rel="next" href="http://website.com/category/?p=3" />

    | Vitalized
    0

  • I have just read http://www.seomoz.org/blog/duplicate-content-in-a-post-panda-world and I would like to know which option is the best fit for my case. I have the website http://www.hotelelgreco.gr and every image in the image library http://www.hotelelgreco.gr/image-library.aspx has a different URL but is considered duplicate content with others in the library. Please suggest what I should do.

    | socrateskirtsios
    0

  • Is there a good 301 code snippet to change just the root domain but keep the ending extensions (see the sketch below)? I just bid on a domain that I think would be much better for me moving forward, but I do not want to have to go through thousands of pages to do their 301s individually. My site is almost 4 yrs old, well established, and has a large fanbase. Several of our social networks are under the name of the new branded domain, hence part of the desire to switch.

    | Atomicx
    0
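
    A hedged .htaccess (mod_rewrite) sketch for the question above, assuming the old and new domains share exactly the same paths; olddomain.com and newdomain.com are placeholders.

      RewriteEngine On
      # Redirect every request on the old domain to the same path on the new one
      RewriteCond %{HTTP_HOST} ^(www\.)?olddomain\.com$ [NC]
      RewriteRule ^(.*)$ http://www.newdomain.com/$1 [R=301,L]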

  • Has anyone ever experienced "collateral damage" when disallowing some URLs? Some old URLs are still present on our website and, while we are "cleaning" them off the site (which takes time), I would like to avoid their indexation through the robots.txt file. The old URL syntax is "/brand//13" while the new one is "/brand/samsung/13" (note that there are two slashes in the old URL after the word "brand"). Do I risk erasing the new, good URLs from the SERPs if I add the line "Disallow: /brand//" to the robots.txt file? I don't think so (see the sketch below), but thank you to everyone who is able to help me clear this out 🙂

    | Kuantokusta
    0
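
    On the question above: robots.txt Disallow rules are prefix matches against the URL path, so a rule for the double-slash prefix should not touch the single-slash URLs. A minimal sketch, worth double-checking in the Webmaster Tools robots.txt tester before deploying:

      User-agent: *
      # Matches /brand//13 (double slash) but not /brand/samsung/13
      Disallow: /brand//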

  • I hear conflicting reports on whether to show WordPress archive links on the blog or not. Some say it is important for viewers to see; others say it is not and creates way too many links. I think both have good points, but for SEO purposes I lean towards removing them. What do Moz users think?

    | seomozinator
    0

  • I am in the process of doing research on building content for some state-to-state transactions. Through our PPC ad history I can see people have searched relatively evenly for full state names versus state abbreviations: Texas vs. TX, or California vs. CA. If I do a Google search for one of our key terms with the state abbreviation, it seems that Google returns results with the full state name and bolds the full state name in the meta description, even though only the abbreviation, and not the full state name, was part of the search. I guess I'm trying to figure out if it's worth building out and targeting two sets of content... one around the full state names and one around the state abbreviations. Any advice?

    | ChrisClever
    0

  • Hi, I am just double checking to see if these parameters are OK - I have added an attachment to this post. We are using an e-commerce store and dealing with faceted navigation, so I excluded a lot of parameters from being crawled as I didn't want them indexed (they got indexed anyway!). Advice and recommendations on the use of GWT would be very helpful - please check my screenshot. Thanks, B

    | bjs2010
    0

  • So I wanted to get everyone's opinion. We have a client in online retail on ASP, and their developers built a mobile site a while back before we took the client on. For the sake of this post, just assume resources are limited and the developers are not good (they constantly break things we request to get fixed). They never installed analytics on the mobile site, so all I have to go off of is referral data for m.example.com in the main store's GA account. However, if I look at what is indexed by doing site:m.example.com, I am not seeing many pages. The mobile site has a ton of internal links in GWT, and I am questioning its negative impact as there are no canonicals and no mobile sitemap present. In an ideal world, I would implement proper mobile SEO practices, but given that there is no dev budget and the devs are not good, I was thinking about noindexing the mobile site since I can RDP into the site and access robots (see the sketch below). Thoughts?

    | Sean_Dawes
    0
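
    A hedged sketch for the "noindex the mobile site" option in the question above: since the site is on ASP/IIS, an X-Robots-Tag response header can be set in a web.config deployed only on the m. subdomain, without touching templates (the file placement is an assumption). Note the header only works if the mobile URLs are not also blocked in robots.txt, because Googlebot has to fetch a page to see it.

      <!-- web.config on the m. subdomain only: sends "X-Robots-Tag: noindex" on every response -->
      <configuration>
        <system.webServer>
          <httpProtocol>
            <customHeaders>
              <add name="X-Robots-Tag" value="noindex" />
            </customHeaders>
          </httpProtocol>
        </system.webServer>
      </configuration>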

  • We have asked a few questions related to this process on Moz and wanted to give a breakdown of our journey, as it will likely be helpful to others! A couple of months ago, we updated our robots.txt file with several pages that we did not want to be indexed. At the time, we weren't checking WMT as regularly as we should have been, and in a few weeks we found that one of the entries we were blocking apparently covered a dynamic URL, which led to the blocking of over 950,000 of our pages according to Webmaster Tools. Which page was causing this is still a mystery, but we quickly removed all of the entries.
    From research, most people say that things normalize in a few weeks, so we waited. A few weeks passed and things did not normalize. We searched, we asked, and the number of "blocked" pages in WMT, which had increased at a rate of a few hundred thousand a week, was decreasing at a rate of a thousand a week. At this rate it would be a year or more before the pages were unblocked. This did not change. Two months later and we were still at 840,000 pages blocked. We posted on the Google Webmaster Forum and one of the mods there said that it would just take a long time to normalize. Very frustrating indeed, considering how quickly the pages had been blocked. We found a few places on the interwebs suggesting that if you have an issue/mistake with robots.txt, you can submit a reconsideration request. This seemed to be our only hope. So, we put together a detailed reconsideration request asking for help with our blocked pages issue.
    A few days later, to our horror, we did not get a message offering help with our robots.txt problem. Instead, we received a message saying that we had received a penalty for inbound links that violate Google's terms of use. Major backfire. We used an SEO company years ago that posted a hundred or so blog posts for us. To our knowledge, the links didn't even exist anymore. They did... So, we signed up for an account with removeem.com. We quickly found many of the links posted by the SEO firm, as they were easily recognizable via the anchor text. We began the process of using removem to contact the owners of the blogs. To our surprise, we got a number of removals right away! Others we had to contact another time, and many did not respond at all. For those we could not find an email for, we tried posting comments on the blog. Once we felt we had removed as many as possible, we added the rest to a disavow list and uploaded it using the disavow tool in WMT. Then we waited...
    A few days later, we already had a response. DENIED. In our request, we specifically asked that if the request were to be denied, Google provide some example links. When they denied our request, they sent us an email including a sample link. It was an interesting example: we actually already had this blog in removem. The issue in this case was that our version was a domain name, i.e. www.domainname.com, and the version Google had was a WordPress subdomain, i.e. www.subdomain.wordpress.com. So, we went back to the drawing board. This time we signed up for Majestic SEO and tied it in with removem. That added a few more links. We also had records from the old SEO company we were able to go through and locate a number of new links. We repeated the previous process, contacting site owners and keeping track of our progress. We also went through the "sample links" in WMT as best as we could (we have a lot of them) to try to pinpoint any other potentials.
    We removed what we could and again disavowed the rest. A few days later, we had a message in WMT. DENIED AGAIN! This time it was very discouraging, as it just didn't seem there were any more links to remove. The difference this time was that there was NOT an email from Google, only a message in WMT. So, while we didn't know if we would receive a response, we responded to the original email asking for more example links so we could better understand what the issue was. Several days passed and we received an email back saying that THE PENALTY HAD BEEN LIFTED! This was of course very good news, and it appeared that our email to Google was reviewed and received well.
    So, the final hurdle was the reason we originally contacted Google: our robots.txt issue. We did not receive any information from Google related to the robots.txt issue we originally filed the reconsideration request for. We didn't know if it had just been ignored, or if there was something that might be done about it. So, as a last-ditch final effort, we responded to the email once again and requested help with the robots.txt issue as we had before. The weekend passed and on Monday we checked WMT again. The number of blocked pages had dropped over the weekend from 840,000 to 440,000! Success! We are still waiting and hoping that number will continue downward back to zero.
    So, some thoughts: 1. Was our site manually penalized from the beginning, yet without a message in WMT? Or, when we filed the reconsideration request, did the reviewer take a closer look at our site, see the old paid links, and add the penalty at that time? If the latter is the case then... 2. Did our reconsideration request backfire? Or was it ultimately for the best? 3. When asking for reconsideration, make your requests known. If you want example links, ask for them. It never hurts to ask! If you want to be connected with Google via email, ask to be! 4. If you receive an email from Google, don't be afraid to respond to it. I wouldn't overdo this or spam them - keep it to the bare minimum and don't pester them - but if you have something pertinent to say that you have not already said, then don't be afraid to ask. Hopefully our journey might help others who have similar issues, and feel free to ask any further questions. Thanks for reading! TheCraig

    | TheCraig
    5

  • Hello everyone, I just have one question: how do you approach link building for e-commerce sites?

    | backlinkmag
    0

  • Hi Mozzers, I am an SEO at uncommongoods.com and looking for your opinion on our site nav. Currently our nav & URLs are structured in 3 levels. From the top level down, they are: 1. Category ex: http://www.uncommongoods.com/home-garden 2. Subcat ex: http://www.uncommongoods.com/home-garden/bed-bath 3. Family ex: http://www.uncommongoods.com/home-garden/bed-bath/bath-accessories Right now, all levels are accessible from our top nav, but we are considering removing the family pages. If we did that, Google could still find & crawl links to the family pages, but they would have to drill down to the subcat pages to find them. Do you guys think this would help or hurt our SEO efforts? Thanks! -Zack

    | znotes
    0

  • Hi there, I was studying our competitors' SEO strategies, and I have noticed that one of our major competitors has set up something pretty weird from an SEO standpoint which I would like your thoughts on, because I can't find a clear explanation for it. Here is the deal: the site is musicnotes.com, and their product pages are located inside the /sheetmusic/ directory, so if you want to see all their product pages indexed on Google, you can just type in Google: site:musicnotes.com inurl:/sheetmusic/ Then you will get about 290,000 indexed pages. Now, here is the tricky part: try to click on one of those links, and you will get a 302 redirect to a page that includes a meta "noindex, nofollow" directive. Isn't that pretty weird? Why would they want to "noindex, nofollow" a page from a 302 redirect? And how in the heck is the redirecting page still in the index?! And how can Google allow that?! All this sounds weird to me and reminds me of spammy techniques from the '90s called "cloaking"... what do you think?

    | fablau
    0

  • Hi mozzers, We launched a mobile site a couple of months ago following the parallel mobile structure, with URLs on m.example.com. A week later my Moz crawl detected thousands of dupes, which I resolved by implementing canonical tags on the mobile version and rel=alternate on the desktop version. The problem is that I still also have dupes that were generated by the CMS: ?device=mobile ?device=desktop One of the options to resolve those is to add canonicals on the desktop versions as well, on top of the rel=alternate tag we just implemented. So my question: is it dangerous to add rel=canonical and rel=alternate tags on the desktop version of the site or not (see the sketch below)? Will it disrupt the rel=canonical on mobile? Thanks

    | Ideas-Money-Art
    0
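
    A hedged sketch of the bidirectional annotation Google documents for separate mobile URLs, plus a self-referencing canonical on the desktop page to absorb the ?device= variants (URLs are placeholders). A self-referencing canonical on the desktop page does not conflict with its rel=alternate, and it leaves the mobile page's canonical untouched.

      <!-- Desktop page: http://www.example.com/page -->
      <link rel="canonical" href="http://www.example.com/page">
      <link rel="alternate" media="only screen and (max-width: 640px)"
            href="http://m.example.com/page">

      <!-- Mobile page: http://m.example.com/page -->
      <link rel="canonical" href="http://www.example.com/page">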

  • Hi, Is a 303 redirect a good thing or not? I have a homepage in 2 languages, FR and EN > mywebsite.com/fr/ and mywebsite.com/en/. There is a 303 redirect from mywebsite.com to mywebsite.com/fr/ (see the sketch below). Thanks D.

    | android_lyon
    0
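
    On the 303 question above: a 303 (See Other) is treated as a temporary redirect, so a 301 is generally preferred if /fr/ is meant to be the permanent default; either way, hreflang annotations make the language relationship explicit. A hedged sketch using the question's placeholder URLs, with x-default marking the redirecting root (each language version should carry the full set):

      <link rel="alternate" hreflang="fr" href="http://mywebsite.com/fr/">
      <link rel="alternate" hreflang="en" href="http://mywebsite.com/en/">
      <link rel="alternate" hreflang="x-default" href="http://mywebsite.com/">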

  • Hi to all, Maybe this question has already been answered (in that case, sorry), but I didn't find it. With the latest changes, is it currently really useful to have an 'SEO footer'? I mean, it seems that it can give you more problems than benefits. In my case the idea of the footer is only to obtain more traffic. With this in mind, am I right in thinking that it is better not to write anything? Thanks in advance

    | nsimpson
    0

  • Wow, I just found this awesome infographic created by Brian Dean of Backlinko. It's a really nice share by him, putting all that effort into creating this Top 200 Ranking Factors list that everybody wants to know about now. I am going through the points mentioned in this infographic right now and would like all Moz members to discuss it.

    | Futura
    1

  • I have a website that targets national keywords. I would like to be able to rank locally for these keywords as well without having the city in the title. What is the best strategy for this?

    | cprodigy29
    0

  • Google advises using rel canonical URLs to tell them which page with similar information is more relevant. You are supposed to put a rel canonical on the non-preferred pages to point back to the desired page. How do you handle this with a product catalog using AJAX, where the additional pages do not exist? An example would be:
    /productcategory.aspx?page=1 /productcategory.aspx?page=2 /productcategory.aspx?page=3 /productcategory.aspx?page=4 The page=1, 2, 3 and 4 do not physically exist; they are simply referencing additional products. I have rel canonical URLs on the main page www.examplesite.com/productcategory.aspx, but I am not 100% sure this is correct or how else it could be handled (see the sketch below). Any ideas, pro mozzers?

    | eric_since1910.com
    0
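
    One hedged way to handle the question above: treat the ?page= URLs as a paginated series with rel="prev"/"next" and let each paginated URL canonical to itself rather than to page 1. The example shows what ?page=2 might carry (the domain is the question's placeholder):

      <link rel="canonical" href="http://www.examplesite.com/productcategory.aspx?page=2">
      <link rel="prev" href="http://www.examplesite.com/productcategory.aspx">
      <link rel="next" href="http://www.examplesite.com/productcategory.aspx?page=3">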

  • We are using review schema markup on our product pages, which showed up well in the Google SERPs. Now we have added schema video markup as well, and Google shows only the video snippet in the SERP, not the review snippet anymore. Any idea whether there may be a solution to show both video and review snippets (see the sketch below)?

    | lcourse
    0
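
    A hedged microdata sketch showing review and video markup coexisting on one product page (all names and values are placeholders). Even with both marked up correctly, Google chooses which rich snippet type to display for a given result, so valid markup does not guarantee both appear.

      <div itemscope itemtype="http://schema.org/Product">
        <span itemprop="name">Example Product</span>
        <div itemprop="aggregateRating" itemscope itemtype="http://schema.org/AggregateRating">
          <span itemprop="ratingValue">4.5</span> stars from
          <span itemprop="reviewCount">27</span> reviews
        </div>
      </div>

      <!-- Separate block on the same page for the product video -->
      <div itemscope itemtype="http://schema.org/VideoObject">
        <meta itemprop="name" content="Example Product demo">
        <meta itemprop="description" content="Short demo of the product">
        <meta itemprop="thumbnailUrl" content="http://www.example.com/thumb.jpg">
      </div>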

  • Hello, I have a question for you: our website virtualsheetmusic.com includes thousands of product pages, and due to Panda penalties in the past, we have no-indexed most of the product pages hoping for a sort of recovery (not yet seen, though!). So, currently we have about 4,000 "index" pages compared to about 80,000 "noindex" pages. Now, we plan to add an additional 100,000 new product pages from a new publisher to offer our customers more music choice, and these new pages will still be marked as "noindex, follow". At the end of the integration process, we will end up having something like 180,000 "noindex, follow" pages compared to about 4,000 "index, follow" pages. Here is my question: can this huge discrepancy between 180,000 "noindex" pages and 4,000 "index" pages be a problem? Can this kind of scenario have or cause any negative effect on our current natural search profile? Or is this something that doesn't actually matter? Any thoughts on this issue are very welcome. Thank you! Fabrizio

    | fablau
    0

  • We have just recently launched a new website in Australia and, as I am new to the SEO community, I was looking for a little advice on link building. Where is best to start? There are not many authoritative websites for our industry. Are there specific websites that are good to link to? Are there any good tools to assist with this? Any help would be great. Thank you.

    | RobSchofield
    0

  • I'm currently getting to grips with schema, and one thing I'm using is author markup on my blog posts, seeing my photo etc. on related organic searches. I see one of my competitors is using author on every page of their website, not just blog posts. Are there any recommendations on when it should be used? Should it be site-wide, or is it really intended for blog posts? Would it be wrong for me to use it on every page of my website, as one of my businesses is just myself as a lone person (see the sketch below)? This is what you get when searching for driving lessons in just about any town! https://www.google.co.uk/#gs_rn=15&gs_ri=psy-ab&tok=LS_DOrAHswmHC9_8AJZEJA&suggest=p&pq=driving instructor brighton&cp=20&gs_id=1k2&xhr=t&q=driving+lessons+crawley&es_nrs=true&pf=p&sclient=psy-ab&oq=driving+lessons+craw&gs_l=&pbx=1&bav=on.2,or.r_cp.r_qf.&bvm=bv.47244034,d.d2k&fp=45c2f917e11bca99&biw=1680&bih=843 Any comments welcome! Antony

    | Ant71
    0
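
    For reference, a hedged sketch of the authorship markup as Google documented it at the time (the profile URL is a placeholder); it is intended for pages with a single identifiable author, such as posts and articles, rather than every template on a site.

      <!-- In the <head> of an authored page: link to the author's Google+ profile -->
      <link rel="author" href="https://plus.google.com/YOUR_PROFILE_ID">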

  • What type of links do you think Penguin 2.0 targeted most - anchor text abuse , directory links, paid links, low quality guest posts, article directories etc????

    | DavidKonigsberg
    0

  • I would like to move all my old domain content (dicasdogoogle.com.br), with more than 1,200 tutorial pages, to a new one (seomartin.com)... and then unify them. I'm using WordPress on both, but the permalinks are different (see the sketch below)... Any tips for me, folks?

    | SeoMartin1
    0
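
    A hedged .htaccess sketch for the migration question above, placed on the old domain. The permalink structures shown are hypothetical (old: /YYYY/MM/post-name/, new: /post-name/) since the question does not state either, so the patterns would need adjusting to the real structures.

      RewriteEngine On
      # Map old date-based permalinks onto the new slug-only structure (hypothetical structures)
      RewriteRule ^[0-9]{4}/[0-9]{2}/(.+)$ http://seomartin.com/$1 [R=301,L]
      # Send anything else to the equivalent path on the new domain
      RewriteRule ^(.*)$ http://seomartin.com/$1 [R=301,L]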

  • Here is an interesting issue we are noticing lately: Google keeps scrambling and changing the titles of our product pages in the SERP results. Here is an example: Keyword: "bach arioso sheet music". We are down at the 6th spot, and the title shown is different from what's defined inside the TITLE tag of that page. And that happens often for other keywords/product pages. Why is that? How can we control it? It is hard for us to optimize titles and test CTR and other metrics if Google is showing them differently to users. Similar issue with the description tag: sometimes, instead of showing users the description tag contents, Google shows part of the text taken from the page, even though the searched keywords are included in both the title and the description tag, so I can't find a justification for showing text taken from the page instead... it is quite difficult to understand the motivation behind all this! Any thoughts are very welcome. Thanks! Fab.

    | fablau
    0

  • Recently I've been employing rich snippets using the guides from schema.org. I find them a great way to let the search engines understand the content of my web pages; author and publisher tags are all thrown into the mix - great, love it. My question, however, is what to do with the existing title and description tags? Should they be left in? Do they cause conflicts with the search engines? Should I just ignore this gut feeling and leave them in? Any insights into the use of schema alongside standard tags appreciated. Best, David

    | David-E-Carey
    0

  • Can anybody shed some light on this? A competitor is ranking no. 1 for the keyword "storage", which is a very competitive keyword in our industry. Their URL is www.publicselfstorage.com.au When using Site Explorer, I can see that they have a very low PA, low external following links etc. compared to the companies ranking 2 and 3: http://www.storageking.com.au/melbourne.html http://www.kss.com.au/ How are they still managing to rank no. 1?

    | RobSchofield
    0

  • Hello! We have an internal search engine for different email, postal, and phone data products on our website (75,000 product pages... calling all direct marketers!). I've noindexed all our dynamic search pages, but I'm wondering how else I can improve these pages. Should I reduce the number of links on each page?
    Currently there are 20 search results per page. "[Variable] Mailing List" has been a pretty good source of traffic for our product pages.
    Should I change the anchor text for all the product pages listed to include the added long-tail keyword, or would that be extremely spammy, having the words "Mailing List" 20+ times on my page? We have both static and dynamic search pages - here is one of the static ones: http://www.consumerbase.com/direct-marketing-mailing-lists.html
    My main problem with adding the long-tail KWs to the anchor text is that we still want our static search pages indexed. Thanks!

    | Travis-W
    0

  • OK. I've optimized sites before that are dedicated to 1, 2 or 3 products and/or services. These sites inherently talk about one main thing, so the semantics of the content across the whole site reflect this. I get these ranked well on a local level. Now, take an e-commerce site - which I am working on - with 2,000 products, all of which are quite varied: cookware, diningware, art, decor, outdoor, appliances... there is a lot of different semantics throughout the site's different pages. Does this influence the ranking possibilities? Your opinion and time is appreciated. Thanks in advance.

    | bjs2010
    0

  • Hi Mozers, I need some help with a CMS I've been working with over the last year. The CMS is built by a team of guys here in Washington State. Basically, I'm having issues with clients' content on the blog system not ranking correctly at all. Here are a few problems I've noticed - could you confirm them and rate each as "not a problem", "a problem", or "critical, must fix"? 1. The title tag is pulled from the title of the article, which also automatically generates a URL with underscores instead of dashes. Is having a duplicate URL, title, and title tag spammy looking to search engines? Are underscores confusing to Google on long URLs, where shorter ones are fine (i.e. domain/i_pad/
    vs. http://www.ductvacnw.com/blog/archives/2013/05/20/5_reasons_to_hire_a_professional_to_clean_your_air_ducts_and_vents)? 2. The CMS resolves all URLs with a canonical instead of a 301 redirect (I've told Webmaster Tools which preferred URL should be indexed). Does using a canonical instead of a 301 redirect cause any confusion with Google? Is one better practice than the other? 3. The H1 tags on the blog pull from the "blog category" instead of the title of the blog post. Is this a problem? 4. The URLs are quite long with the added "archives/2013/05/20/5". Does this cause problems by pushing the main target keyword further away from the domain name? 5. I'm also noticing the blog post is actually not part of the breadcrumbs, where we would normally expect it to appear after the blog category name. Problem? These are some of the things I've noticed and need clarification on. If you see anything else, please let me know.

    | Keith-Eneix
    0

  • Hi guys, I have the following situation I would like some help with. Because my client is in Brazil, I will make up fictional names so it's easier to understand. My client is a shoe store whose domain is mangabeira.com. That is the brand name and will always be the main domain and reference for the website. We were offered the domain shoes.com. There is no intention of changing the brand name or anything, but there would be a redirect that would send the user to mangabeira.com. My question is how much impact that complementary domain would have on my SEO performance, and how that redirect should be handled (see the sketch below). Thanks.

    | LucasLopes
    0
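
    A hedged .htaccess sketch for the extra domain in the question above (shoes.com and mangabeira.com are the question's fictional names); a domain-level 301 is the usual way to handle a complementary domain that should never resolve as its own site, and it carries no duplicate-content risk since nothing is served from it.

      RewriteEngine On
      # Redirect every request on the complementary domain to the brand domain
      RewriteCond %{HTTP_HOST} ^(www\.)?shoes\.com$ [NC]
      RewriteRule ^(.*)$ http://www.mangabeira.com/$1 [R=301,L]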

  • I did a raw export from AHREFs yesterday and one of our sites has 18,000 backlinks coming from the same site. But they're all the same link, just with a different session ID. The structure of the URL is: [website].com/resources.php?UserID=10031529 And we have 18,000 of these with a different ID. Does Google read each of these as a unique backlink or does it realize there's just one link and the session ID is throwing it off? I read different opinions when researching this so I'm hoping the Moz community can give some concrete answers.

    | Kingof5
    0

  • Hi, In the case of a RED/GREEN/YELLOW coffeemaker, for example, I have say 6 pages that are indexed in Google. Now, I can write very unique content for each, and that gives me 6 pages in the SERPs. Or should I make it a configurable product? What is best, and how different would the descriptions need to be? My feeling is that just changing the colour word in the text would NOT be enough. Thanks, B

    | bjs2010
    0

  • Thanks in advance. I'm getting duplicate page titles because SEOmoz keeps crawling through my URL parameters. I added forcefiltersupdate to the URL parameters in Webmaster Tools, but it does not seem to have had an effect. Below is an example of the duplicate content issue that I am having: http://qlineshop.com/OC/index.php?route=product/category&path=59_62&forcefiltersupdate=true&checkedfilters[]=a.13.13.387baf0199e7c9cc944fae94e96448fa Any thoughts (see the sketch below)? Thanks again. -Patrick

    | bamron
    0
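
    Two hedged notes on the question above: the URL-parameter settings in Webmaster Tools apply to Googlebot only, so the Moz crawler will not pick them up; and if the goal is simply to keep crawlers out of the filtered variants, a wildcard robots.txt rule (or a rel=canonical on the filtered pages pointing at the clean category URL) is a common approach. A minimal sketch:

      User-agent: *
      # Block any URL whose query string contains the forcefiltersupdate parameter
      Disallow: /*forcefiltersupdate=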

  • Hello, After the Penguin 2.0 update, the website I've been working on dropped in rankings; some keywords that ranked #1 are now on the second and third page. You can see a screenshot here: http://screencast.com/t/MramoXgTr 95% of my competitors were not even affected by this update at all; most of them don't even optimize their websites for SEO, rather they use paid directories. The first thing I did was analyze my backlink profile using OSE, and to my surprise I found a lot of low-quality domains pointing to my pages with a keyword in the anchor text - a lot of blog commenting and low-quality article directories. Since I don't have control over these links and I can't remove them, I used the Disavow tool to do the job. For the past 3 months, I've been doing a lot of high-quality link building, such as
    press releases once every 2 months, Squidoo lenses and HubPages (3 posts a week for each keyword), and a YouTube video - in fact my YouTube video still ranks #3 for a highly competitive search. I was also involved in social media, posting tweets every week and Facebook posts. I really hope that someone can help me here with good advice on getting my rankings back. Here's my website, let me know what you think about it. Thank You

    | KentR
    0

  • I'm working on a site for a serviced apartment company, http://www.alcove.co.in/, which offers apartments in 9 cities in India. The site was ranking on the 1st page of Google for "serviced apartment + city" for 7 cities until sometime in Jan 2013. However, organic traffic has been gradually falling since sometime in September 2012 (a 40% fall this month over the same period last year). There has been no sudden fall in traffic that we could link with any Penguin update, and there have been no warning messages in Google WMT. Even today the site ranks on the 1st page for 3 cities; however, 'serviced apartments bangalore', which was the biggest revenue earner, is not ranked in the first 5 pages. My questions: will aggressive use of branded keywords in anchor text attract Penguin's wrath? Does Google make allowance for the case when a company's name includes keywords (in our case, the company name is Alcove Service Apartments)? Could there be some other reason for the fall in ranking/traffic? The distribution of anchors (external links; multiple links from the same domain are counted) is:
    Keywords: 34%
    Brand + keywords: 43%
    Natural: 4%
    Only brand: 11%
    URL: 7%
    For the above, brand = 'Alcove Service apartments' or 'Alcove Serviced apartments'; brand + keywords = various combinations of 'alcove' + ['guest houses' or 'hotels' or 'accommodation'] + city1 + city2... Intriguingly, Open Site Explorer analysis of domain metrics (Domain Authority, Followed Linking Root Domains, etc.) ranks Alcove higher than all but one site appearing on the 1st page of Google for 'serviced apartments bangalore'. Most of Alcove's links are from article directories (no spun articles were used), directories, and link exchanges with relevant sites. Any suggestions and guidance on what we could do to remedy the situation would be greatly appreciated! Thanks

    | anand53
    0

  • Hi, Some time ago, a few thousand pages got into Google's index - they were "product pop-up" pages, exact duplicates of the actual product page but as a "quick view". So I deleted them via GWT and also put a meta noindex on these pop-up overlays to stop them being indexed and causing dupe content issues. They are no longer within the index as far as I can see; I do a site:www.mydomain.com/ajax search and nothing appears. So can I block these off now with robots.txt to optimize my crawl budget (see the sketch below)? Thanks

    | bjs2010
    0
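
    A minimal robots.txt sketch for the question above (the /ajax path comes from the site: query in the question). One caveat: once the overlays are blocked, Googlebot can no longer see their meta noindex, so this is best done only after they have dropped out of the index, as described.

      User-agent: *
      # Keep crawlers out of the quick-view overlay URLs now that they are deindexed
      Disallow: /ajax/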

  • My issue is that we have two websites with the same content. For the sake of an example, let's say they are: jackson.com and jacksonboats.com. When you go to jacksonboats.com, the website is an iframed version of jackson.com. However, all of the company's email addresses are [email protected], so a 301 is not possible. What would be the best way to forward the link juice from jacksonboats.com to jackson.com? I'm thinking a rel=canonical tag (see the sketch below), but I wanted to ask first. Thanks,

    | BenGMKT
    0
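
    A hedged sketch for the question above: a cross-domain rel=canonical in the head of the framing page(s) on jacksonboats.com, pointing at the equivalent URL on the main domain. As an aside, a 301 would not necessarily break the email addresses either, since mail is routed by MX records rather than web redirects.

      <!-- In the <head> of the jacksonboats.com page(s) -->
      <link rel="canonical" href="http://www.jackson.com/">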


