
Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Intermediate & Advanced SEO

Looking to level up your SEO techniques? Chat through more advanced approaches.


  • Hi, I found some excessive cross-domain linking from a separate blog to the main company website. It sounds like best practice is to cut back on this, but I don't have any proof of that. I'm cautious about cutting off existing links: we removed two redundant domains that had a huge number of links pointing to the main site almost a year ago, but didn't see any correlated improvement in rankings or traffic. Hoping some people can share a success story after pruning excessive cross-linking, either for their own website or for a client's. Thanks 🙂

    | ntcma
    0

  • Hi all, wondering if I could gather some views on the best approach for this please... We currently have a Magento site with about 150,000 pages (although only 9k are indexed in Google, as product pages are set to noindex by default until the default manufacturer description has been rewritten). The indexed pages are mainly category pages, filtering options and a few search results. While none of the internal pages have massive DA - they seem to average about 18-24, which isn't too bad for internal pages, I guess - I would like to transfer as much of this over to the new domain. My question is: is it really feasible to have an .htaccess with about 10,000 301 redirects on the current domain? The server is pretty powerful so it could probably serve the file without issue, but would Google be happy with that? Would it be better to use the change-of-address option in WMT instead? I've never used that, so I'm not sure how it would work in this case. Would it redirect users too? As a footnote, the site is changing for branding reasons and not because of a penalty on the site. Thanks, Carl

    | daedriccarl
    0
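    A minimal sketch of what page-by-page 301s look like in .htaccess on the old domain (the paths and the new domain below are hypothetical). Apache will serve a file with thousands of Redirect lines, though for very large mappings a RewriteMap kept in the main server config is often tidier (RewriteMap is not allowed inside .htaccess):

      # .htaccess on the old domain: one rule per old URL
      Redirect 301 /category/some-old-category https://www.newdomain.com/some-new-category
      Redirect 301 /product/some-old-product https://www.newdomain.com/products/some-new-product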

  • Long time no Moz! I've been away with some server-related issues, installing an AD at the company I work for, but I'm back. Our SSL cert just expired and I'm trying to determine the pros and cons of making the entire site SSL vs. just a single URL. Our previous setup covered just a single domain. I know Google has hinted toward an SSL preference, and I know it's a little early to know for certain how much that's going to help, but I just wanted to know what everybody thought? It expired yesterday, so I have to do something. And we lost our previous credentials, so I can't just renew the old one. Thanks!

    | HashtagHustler
    0

  • Hi there, I'm using rel="prev" and rel="next" on paginated category pages. On the 1st page I'm also setting a canonical tag, since that page happens to get hits to a URL with parameters. The site also uses mobile versions of pages on a subdomain. Here's what markup the 1st desktop page has: Here's what markup the 2nd desktop page has: Here's what markup the 1st MOBILE page has: Here's what markup the 2nd MOBILE page has: Questions: 1. On desktop pages from page 2 to page X, if these pages get traffic to their versions with parameters, will I have duplicate issues, or does the canonical tag on the 1st page keep me safe? 2. Should I use canonical tags on mobile pages from page 2 to page X? Are there any better solutions for avoiding duplicate content issues?

    | poiseo
    1
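    Since the markup samples in the question above did not survive in this copy, here is a commonly used pattern as a hedged sketch (URLs hypothetical): each paginated page carries a self-referencing canonical to its clean, parameter-free URL plus rel prev/next, and each mobile page points its canonical at the matching desktop URL rather than at page 1.

      <!-- Desktop page 1 -->
      <link rel="canonical" href="http://www.example.com/category/">
      <link rel="next" href="http://www.example.com/category/?page=2">

      <!-- Desktop page 2 -->
      <link rel="canonical" href="http://www.example.com/category/?page=2">
      <link rel="prev" href="http://www.example.com/category/">
      <link rel="next" href="http://www.example.com/category/?page=3">

      <!-- Mobile page 2 on the m. subdomain -->
      <link rel="canonical" href="http://www.example.com/category/?page=2">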

  • Do we need to have a unique 3-digit number in the URL, as stated in Google's technical guidelines here -- https://support.google.com/news/publisher/answer/40787?hl=en&ref_topic=4359866 -- or is having and submitting a Google News XML sitemap a way around that? https://support.google.com/news/publisher/answer/68323?hl=en

    | bonnierSEO
    0
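    For reference, a minimal Google News sitemap entry looks roughly like this (the publication name and URL are hypothetical); note that the <loc> URL itself carries no 3-digit number, which is what the question turns on:

      <?xml version="1.0" encoding="UTF-8"?>
      <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
              xmlns:news="http://www.google.com/schemas/sitemap-news/0.9">
        <url>
          <loc>http://www.example.com/news/some-article-title</loc>
          <news:news>
            <news:publication>
              <news:name>Example Times</news:name>
              <news:language>en</news:language>
            </news:publication>
            <news:publication_date>2014-11-04</news:publication_date>
            <news:title>Some Article Title</news:title>
          </news:news>
        </url>
      </urlset>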

  • Hi, I've recently joined the business, and as part of the cleanup process I was told that we owned the domain preferredsafaris.com, with some very similar content to our main site southernafricatravel.com. We no longer own the preferredsafaris.com domain, but looking at Google's cache for it we realised that the title, meta description & page shown when looking at the 'cached page' are for our current domain, even though it is showing the 'correct' URL there. I imagine this might have something to do with canonical tags set on those pages, but the weird thing is that all those pages now return 404s & do not show a canonical in the source code. I have used the Google removal tool https://www.google.com/webmasters/tools/removals for all those URLs & Google says that it has removed them, & yet they're still showing. What do you suggest? Any potential issue in regards to duplicate content here? Cheers, Julien

    | SouthernAfricaTravel
    0

  • Anyone know a good tutorial on how to implement browser caching (the "Leverage Browser Caching" recommendation)? Do I need something like CloudFlare, or can I add meta tags to do this?

    | Cocoonfxmedia
    0
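    Browser caching for static assets is controlled with HTTP headers rather than meta tags (meta tags only apply to the HTML document itself and are largely ignored for this purpose), so neither CloudFlare nor meta tags are strictly required. A minimal .htaccess sketch using Apache's mod_expires, assuming that module is enabled on the server:

      <IfModule mod_expires.c>
        ExpiresActive On
        ExpiresByType image/jpeg "access plus 1 month"
        ExpiresByType image/png "access plus 1 month"
        ExpiresByType text/css "access plus 1 week"
        ExpiresByType application/javascript "access plus 1 week"
      </IfModule>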

  • In the links to my site in Google Webmaster Tools I am showing over 28,000 links from an IP address. The IP address is the one my server is hosted on. For example, it shows 200.100.100.100/help, almost like there are two copies of my site: one under the domain name and one under the IP address. Is this bad? Or is it just showing up there, and Google knows that it is the same site since the IP and the domain are on the same server?

    | EcommerceSite
    0

  • I am going through links and trying to figure out what to disavow. I found a domain under "Who Links the Most". I wanted to see what the exact link was, but I can't find it when I download all links. Why would that be?

    | EcommerceSite
    0

  • Hi all, Our domain authority increased from 39 to 42 last week. We have been improving our metadata and removing bad backlinks recently. Is there any other reason or update last week that would have resulted in this increase? Thanks
    Gavin

    | gavinr
    0

  • Hi, My blog's categories for the e-commerce site are organized by subject and are similar to the product landing pages. Example: Domain.com/laptops, which sells laptops, and Domain.com/blog/laptops, which shows news and articles about laptops. Within the blog posts, links with the anchor text "laptop" point to the store. What should I do? Thanks

    | BeytzNet
    1

  • It is well known that when page A canonicals to page B, some link juice is lost (similar to a 301). So imagine I have the following pages: Page A: www.mysite.com/main-page, which has the tag <link rel="canonical" href="http://www.mysite.com/main-page">. Page B: www.mysite.com/main-page/sub-page, which is a variation of Page A, so it has a canonical tag pointing to Page A. I know that links to page B will lose some of their SEO value, as if I was 301ing from page B to page A. Question: What about this link: www.mysite.com/main-page?utm_medium=moz&utm_source=qa&utm_campaign=forum Will it also lose link juice, since the query string is being stripped by the canonical tag? In terms of SEO, is this like a redirect?

    | YairSpolter
    0

  • Hey guys, I was wondering if you could tell me your thoughts about how a URL is perceived by the algo in 2014? For example: http://www.moneyexpert.com/reviews/credit-cards/amex-platinum/ and let's say http://www.moneyexpert.com/reviews_credit-cards_review_amex-platinum.html In the eyes of Google, do both styles of URL generally help Google understand the result the same way? Or will the keyword-rich .html URL have a bigger benefit? I am looking forward to your advice on this matter. I don't plan on doing a lot of SEO but rather letting nature take its course, so to speak... so I just wanted to make sure I construct this site with best practice.

    | irdeto
    0

  • Hello everyone, I have this problem with the rich snippets: http://www.google.com/webmasters/tools/richsnippets?q=http%3A%2F%2Fwww.visalietuva.lt%2Fimone%2Ffcr-media-lietuva-uab The problem is that it reports some kind of error, but I can't figure out what it is. We implemented the same code on our other websites: http://www.imones.lt/fcr-media-lietuva-uab and http://www.1588.lt/imone/fcr-media-lietuva-uab . The snippets appear on Google and work perfectly.
    The only site that has this problem is visalietuva.lt. I attached an image to show what I mean. I really need tips for this one.

    | FCRMediaLietuva
    0

  • I am asking because I keep filing spam reports in Google Webmaster Tools when I find spam on competitors' websites, but nothing happens... Has anyone else had a better experience with spam reports?

    | max.favilli
    0

  • I'm new to SEO and heard from one of my friends that social signals are important for a website's SEO: if people have shared a website's URL on Twitter, then it will automatically rank in Google. Is that true, and how does Google treat this social sharing? And how can I do this for my website?

    | hammadrafique
    0

  • An ECWID rep stated, in regard to an inquiry about how the ECWID URLs are not customizable, that "an important thing is that it doesn't matter what these URLs look like, because search engines don't read anything after that # in URLs." Example: http://www.runningboards4less.com/general-motors#!/Classic-Pro-Series-Extruded-2/p/28043025/category=6593891 Basically all of this: #!/Classic-Pro-Series-Extruded-2/p/28043025/category=6593891 That is a snippet out of a conversation where ECWID said that dirty URLs don't matter beyond a hash... Is that true? I haven't found any rule that Google or other search engines (Google is really the most important) don't index, read, or place value on the part of the URL after a # symbol.

    | Atlanta-SMO
    0
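    For context, a plain fragment (#...) is not sent to the server and is generally ignored by search engines, but the #! form was the trigger for Google's AJAX crawling scheme (since deprecated), under which Googlebot would request an _escaped_fragment_ version of the URL instead. An illustrative mapping only, using the URL from the question (special characters in the fragment get percent-encoded in the crawled form):

      Pretty URL:  http://www.runningboards4less.com/general-motors#!/Classic-Pro-Series-Extruded-2/p/28043025/category=6593891
      Crawled as:  http://www.runningboards4less.com/general-motors?_escaped_fragment_=/Classic-Pro-Series-Extruded-2/p/28043025/category=6593891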

  • I am showing thousands of links from my server's IP address. What would cause that?

    | EcommerceSite
    0

  • Thank you kindly for taking the time to read this. The company I work with is a wedding chapel in Las Vegas. They've had the same domain since about 2001. Their organic placement has been stellar since about 2008. With the most recent Panda update some results did slip, but they are still strong & I feel that the SERPs that slipped will be back up shortly (hopefully!). The company recently bought the URL www.VegasWeddings.com, which happens to be a generic key phrase, BUT ALSO IS THE NAME OF THE BUSINESS. They want to switch, but I am in a bit of a conundrum about this. It seems really risky, but also makes a lot of sense. Help? Insight? Anything? Thank you dearly!!!

    | leslieevarts
    0

  • Here is my issue, and I've asked a related question on this one. Here is the back story. The site owner had a web designer build a duplicate copy of their site on the designer's own domain, in a subfolder, without noindexing it. The original site tanked, and the web designer's site started outranking it for the branded keywords. Then the site owner moved to a new designer, who rebuilt the site. That web designer decided to build a dev site using the dotted-quad (IP address) version of the site. It was isolated, but then he accidentally requested one image file from the dotted quad on the official site. So Google again indexed a mirror duplicate site (the second time in 7 months). Between that and the site having a number of low word-count pages, it has suffered and looks like it got hit again with Panda. So the developer 301ed the dotted-quad version to the correct version. I was rechecking it this morning and the dotted-quad version is still indexed, but it no longer lets me look at the cached version. In your experience, is this just Google getting ready to drop it from the index?

    | BCutrer
    0

  • I work on a site that has almost 20,000 URLs in its sitemap. Google WMT claims 28,000 indexed, and a search on Google shows 33,000. I'd like to find out what the difference is. Is there a way to get an Excel sheet with every URL Google has indexed for a site? Thanks... Mike

    | 94501
    0

  • Hi all! I'm currently working on a migration for a large e-commerce site. The old one has around 2.5k URLs, the new one 7.5k. I now need to sort out the redirects from one to the other. This is proving pretty tricky, as the URL structure has changed site-wide. There don't seem to be any consistent rules either, so using regex doesn't really work. By and large, the copy appears to be the same though. Does anybody know of a tool I can crawl the sites with that will export the crawled URL and related copy into a spreadsheet? That way I can crawl both sites and compare the copy to match them up. Thanks!

    | Blink-SEO
    0

  • I have a few basic questions about mobile SEO. I'd appreciate it if any of you fabulous Mozzers can enlighten me. Our site has a parallel mobile site with the same URLs, using an m. subdomain for mobile and www. for desktop. On mobile pages, we have a rel="canonical" tag pointing to the matching desktop URL, and on desktop pages we have a rel="alternate" tag pointing to the matching mobile URL. When someone visits a www. page using a mobile device, we 301 them to the mobile version. Questions: 1. Do I want my mobile pages to be indexed by Google? From Tom's (very helpful) answers here, it seems that I only want Google indexing the full-site pages, and that if the mobile pages are indexed it's actually a duplicate content issue. This is really confusing to me, since Google knows that it's not duplicate content based on the canonical tag. But he makes a good point: what is the value of having the mobile page indexed if the same page on desktop is indexed? (I know that Google is indexing both because I see them in search results. When I search on mobile, Google serves the mobile page, and when I search on desktop, Google serves me the desktop page.) Are these pages competing with each other? Currently, we are doing everything we can to ensure that our mobile pages are crawled (deeply) and indexed, but now I'm not sure what the value of this is. Please share your knowledge. 2. Is a mobile page's ranking affected by social shares of the desktop version of the same page? Currently, when someone uses the share buttons on our mobile site, we share the desktop URL (www., not m.). The reason we do this is that we are afraid that if people share our content with two different URLs (m.mysite.com/some_post and www.mysite.com/some_post), the share count will not be aggregated across both URLs. What I'm wondering is: will this have a negative effect on mobile SEO, since it will seem to Google that our mobile pages have no shares? Or is this not a problem, since the desktop pages have a rel="alternate" tag pointing to the mobile pages, so Google gives the same ranking to the mobile page as to the desktop page (which IS being shared)?

    | YairSpolter
    0
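    For comparison, Google's documented pattern for separate mobile URLs is a bidirectional annotation, sketched below with the question's own hostnames and a hypothetical path: the desktop page declares the mobile alternate, the mobile page canonicals back to the desktop page, and Google then treats the pair as one entity, so the desktop URL is the one that accumulates signals such as links and shares while mobile searchers are still served the m. page.

      <!-- On the desktop page: www.mysite.com/some_post -->
      <link rel="alternate" media="only screen and (max-width: 640px)"
            href="http://m.mysite.com/some_post">

      <!-- On the mobile page: m.mysite.com/some_post -->
      <link rel="canonical" href="http://www.mysite.com/some_post">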

  • Let's say my website is aaaaa.com and the company name is aaaaa Systems. When I search Google for aaaaa, my site does not come up at all. When I search for "aaaaa Systems", it comes up. But in WMT I see quite a few clicks for aaaaa as a keyword. Most of the traffic is brand keywords only. I have never received any manual penalty in WMT. Is the site penalized, or is this a regular algorithm issue?

    | ajiabs
    0

  • We're working with a site that has gone through a lot of changes over the years - ownership, complete site redesigns, different platforms, etc. - and we are finding that there are both a lot of pages and individual images that are returning 404 error codes in the Moz crawls. We're doing 301 redirects for the pages, but what would the best course of action be for the images? The images obviously don't exist on the site anymore and are therefore returning the 404 error codes. Should we do a 301 redirect to another similar image that is on the site now or redirect the images to an actual page? Or is there another solution that I'm not considering (besides doing nothing)? We'll go through the site to make sure that there aren't any pages within the site that are still linking to those images, which is probably where the 404 errors are coming from. Based on feedback below it sounds like once we do that, leaving them alone is a good option.

    | garrettkite
    0

  • Hi, I find my e-commerce pharmacy website is full of little snippets of duplicate content. In particular: a delivery info widget repeated on all the product pages, and product category information repeated on product pages (e.g. all medicines belonging to a certain category have identical side effects, and I also include a generic snippet about the condition the medicine treats). Do you think it will harm my rankings to do this?

    | deelo555
    0

  • Hello, I have a site I am optimizing and I can't seem to get a particular listing onto the first page, due to the fact that Google is indexing the wrong page. I have the following scenario. I have a client with multiple locations. To target the locations I set them up with URLs like this: /{cityname}-wedding-planner. The home page / is optimized for their Port Saint Lucie location. The page /palm-city-wedding-planner is optimized for the Palm City location. The page /stuart-wedding-planner is optimized for the Stuart location. Google picks up the first two and indexes them properly, BUT the Stuart location page doesn't get picked up at all; instead Google lists /, which is not optimized at all for Stuart. How do I "let Google know" to index the Stuart landing page for the "stuart wedding planner" term? Moz also shows the / page as being indexed for the stuart wedding planner term, but I assume this is just a result of what it's finding when it performs its searches.

    | mediagiant
    0

  • I have a new client who has a service centre covering each of 670 towns in the UK. The site is set up so it can break down to a county or town, e.g. www.caravanspareparts.co.uk/county-name/town-name. I'm wondering at what level to aim the SEO. If I'm building keywords to the site, looking to get local pages served up for local people, it would take a lot of link building to link to each of the 670 towns. My suspicion is that if I link-build at the county level, the 'link juice' will trickle down to the towns underneath... e.g. if I link-build to www.caravanspareparts.co.uk/lincolnshire, all the pages beneath that one (the town pages) will benefit... link text to this county-level page would be "caravan spares lincolnshire" etc. Does anyone know if my logic is valid, or can you recommend an alternative approach?

    | deployseo
    0

  • Hi, To date, we have optimized our website for branded search terms - we have done this by including the manufacturer + product name in the links to the product, product URLs, page titles, h1's, etc. Now we are looking at also optimizing for non-branded terms - but all of our products are already optimized as above, and I'm a bit lost as to how to proceed. My thoughts are: create the generic categories, e.g. "Decorative Mirrors". But then, if they link through to the existing mirrors which are optimized for branded search, how do I go about it? Do I create a duplicate of each product, remove all the branded terms from the page title, h1 and URL, and create new content for it? I.e., the same product but available twice... It doesn't seem logical from a user perspective, but I can't think of another way to do this?? Thank you

    | bjs2010
    0

  • My company operates a commercial real estate web site in New York City. We are competing against brands that can spend hundreds of thousands of dollars per year on marketing and SEO. I am an independent without employees. I am trying to replace generic home page text (my web site is www.nyc-officespace-leader.com) with more engaging language that will differentiate the site. In terms of SEO best practices, what constraints exist on home page text? Must it discuss the brand in a general way? Must it specifically describe the interior pages of a site? For the new home page text, I wanted to take a practical approach and provide visitors with a checklist that would be useful for their search. The checklist would be 500-800 words of very practical advice: specific items they should be aware of before commencing a real estate search. No other site is taking this approach, and it could save tenants time and grief. I am attaching a very rough mock-up of the new text (forgive the grammar, it really needs to be edited!). Is there anything wrong with this approach? I may have stumbled on a good idea; however, maybe nobody has taken this route because it is inappropriate for a home page. I would very much appreciate any comments or guidance anyone may have. Best, Alan Rosinsky

    | Kingalan1
    0

  • We have numerous simple websites that are not updated very often (maybe once every 12-18 months). We are trying to create the perfect XML sitemap for them. Which brings me to my 2 questions: 1. Should we include the "lastmod" tag in the XML sitemap? 2. What would you set the "changefreq" to? Once a month? Remember, these sites are almost never updated, as they are simple sites for industry sectors such as building and property maintenance. Please justify your answer and explain why, as we are trying to understand.

    | JohnW-UK
    0
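    A minimal sketch of a sitemap entry carrying both tags (URL hypothetical). lastmod is generally only worth including if it reflects the real modification date, and changefreq is treated as a hint at best, so "yearly" is a reasonable value for sites touched every 12-18 months:

      <?xml version="1.0" encoding="UTF-8"?>
      <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
        <url>
          <loc>http://www.example.com/property-maintenance/</loc>
          <lastmod>2014-06-15</lastmod>
          <changefreq>yearly</changefreq>
        </url>
      </urlset>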

  • We are moving to Sitecore, where the standard out-of-the-box behaviour is that if you change the page title it amends the URL as well. I am worried that this will lead to SEO issues, and I am considering whether we need to get it locked down so that if the page title is amended (only in a minor way) it does not also change the URL. I have never worked with readable URLs before - what are the implications of the URL not exactly matching the wording of the page title?

    | alzheimerssoc
    0

  • How much does Google value the placement of unique content in the source code vs. where it is visually displayed? I have a case where my unique content displays high on the page for the user, but in the source code the unique, quality content sits below duplicate-type content that appears across many other domains (think e-commerce category thumbs on the left side of the screen, with the unique material in the 80% of the screen on the right). I have the impression I am at a disadvantage because these pages have the unique/quality content lower in the source code. Any thoughts on this?

    | khi5
    0

  • Hi, If you have been hit by Panda 4.1 and are now putting fixes in place - for this example let's say you remove a load of duplicate content (and that's what caused the problem) - how long would it take for that fix to take effect? Do you have to wait for the next Panda update, or will it be noticed on the next crawl? Thanks.

    | followuk
    0

  • This is the scenario: a webstore has evolved into 7 sites in 3 shops: example.com/northamerica, example.de/europe, example.de/europe/en, example.de/europe/fr, example.de/europe/es, example.de/europe/it, and example.com.au. The .com/northamerica, .de/europe/en and .com.au sites all have mostly the same content on them (all 3 are in English). What would be the best way to avoid duplicate content? An answer would be very much appreciated!

    | SEO-Bas
    0
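    One common way to tell Google that these are regional variants rather than duplicates is hreflang annotations: each of the three English pages lists all of its language/region alternates, itself included, and the /fr, /es and /it versions get their own entries in the same block. A hedged sketch using the structure from the question, with hypothetical page paths and region codes:

      <link rel="alternate" hreflang="en-us" href="http://example.com/northamerica/some-page">
      <link rel="alternate" hreflang="en" href="http://example.de/europe/en/some-page">
      <link rel="alternate" hreflang="en-au" href="http://example.com.au/some-page">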

  • After years using a .net.au site, my client has purchased the .com.au version of the same domain. I've now set up a new, responsive website using a WordPress template with new content, but used a similar page structure. I've asked their web developer to now do a 301 permanent redirect on each old page from the .net.au site to its new .com.au page, but he has refused, saying it would be bad for long-term SEO. Instead, he says they should run both sites (which I thought would cause duplicate content issues). Both domains are hosted with the same company. I thought as long as the 301 redirects were done on a page-by-page basis, there were no issues? I'm no SEO expert (which he claims to be), so I just wanted to get another opinion on what best practice would be in this instance.

    | carolineraad
    0

  • Our new home page, which is in development, has been identified as being keyword-stuffed for a particular search word. The problem is that the page includes a dynamic feed pulled in from our database. It would be similar to booking.com, for example, coming up as keyword-stuffed for the word "hotel": hotels are their business, so any instance of the word hotel is probably relevant. Our problem is similar. How detrimental would this be for SEO? And does anyone have any ideas how this can be worked around?

    | striple
    0

  • Hi! We just launched our new mobile site and I am trying to get the rel="alternate" tags put on the desktop site. The specs had the tags formatted like this: They ended up like this: My developer is telling me the order does not matter. Can anyone confirm? Does the order matter? Thank You!

    | shop.nordstrom
    0
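    Since the two markup versions in the question above did not survive in this copy, here is the point as a sketch, assuming the question is about the ordering inside the link tag: HTML parsers, Googlebot included, treat attribute order as irrelevant, so the two lines below are equivalent as long as the rel, media and href values themselves are correct (URL hypothetical).

      <link rel="alternate" media="only screen and (max-width: 640px)" href="http://m.example.com/page">
      <link media="only screen and (max-width: 640px)" rel="alternate" href="http://m.example.com/page">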

  • Hi guys, We're thinking of changing the URL structure of the tutorials (we call it the knowledgebase) section on our website. We want to make the URL shorter so it sits closer to the TLD. So, for convenience, we'll call them the old page (www.domain.com/profiles/profile_id/kb/article_title) and the new page (www.domain.com/kb/article_title). What I'm looking to do is change the URL structure but keep the likes/shares we got from Facebook. I thought of two ways to do it and would love to hear which the community thinks is better. 1. Use rel=canonical. I thought we might add a rel=canonical to the new page and a "noindex" tag to the old page. That way, users will still be able to reach the old page, but the juice will still pass to the new page, the old pages will disappear from the Google SERPs, and the new pages will start to appear. I understand it will be a pretty long process, but that's the only way the likes will stay. 2. Play with the og:url property. Do the 301 redirect to the new page, but change the og:url property inside that page to the old page URL. It's a bit more tricky, but it might work. What do you think? Which way is better, or maybe there is a better way I'm not familiar with yet? Thanks so much for your help! Shaqd

    | ShaqD
    0
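    A sketch of option 2 as described above: the new page keeps its og:url pointed at the old address, so existing like/share counts stay attached to the URL Facebook already knows. Whether Facebook's scraper keeps honouring that once the old URL itself 301s to the new one is exactly the open question here, so treat this as illustrative rather than a recommendation (the title value is a placeholder).

      <!-- On www.domain.com/kb/article_title (the new page) -->
      <meta property="og:url" content="http://www.domain.com/profiles/profile_id/kb/article_title">
      <meta property="og:title" content="Article title">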

  • My firm (a real estate brokerage) works with an offshore Argentinian developer who is generally very reliable and efficient. We have requested that three pop-up forms be implemented. The pop-up forms appear once visitors have seen a fixed number of property listings: visitors will see form #1 after viewing three listings, form #2 after another three listings, and form #3 after another three. The pop-ups are in the form of a question. If the visitor clicks "Yes" the pop-up expands into a form, the visitor fills in their contact info, and subsequent forms no longer appear. I have attached mock-ups of the pop-up and the expanded pop-up. I am concerned about the 9 hours of programming time quoted to implement these pop-ups. Is this excessive or is it reasonable? To me it seems like this would be pretty quick and standardized to accomplish with a WordPress plugin, but maybe with testing and other tasks I am wrong. I would appreciate some feedback on this. I do not want to underpay or overpay my developer, but I would feel reassured knowing that they are working productively and accomplishing tasks in a reasonable amount of time. Specs provided by my developer below: | 1- We have to implement the popup (based on the approved mockup)
    2- We need to implement the form to be included in each popup (3) and validate the data properly
    3- We need to complete the logic for the cookies' status.
    4- Cross-device testing for those popups. (This setup does not let you update the content yourself; we would need to set a field in the database, or work out how to do this if you want it.) | I look forward to your feedback. THANKS, Alan

    | Kingalan1
    0

  • Hey Moz Community, Can anyone explain why a website would have a PR4 home page and most inner pages at PR3, with only a DA of 12 and a PA of 14 from OSE? The website in question is my Rotary club's: http://carymacgregorrotary.org. Thank you.
    Patrick

    | WhiteboardCreations
    0

  • A website has PR5, but in OSE I see that its Domain Authority is only 26. I've also checked, and the domain was registered in 2009. Is this normal?

    | ditoroin
    0

  • I work for a theater news website. We have two sister sites: theatermania.com in the US and whatsonstage.com in London. Both sites have largely the same codebase and page layouts. We've implemented markup that allows Google to show a search box for our site on its results page. For some reason, the search box is showing for one site but not the other: http://screencast.com/t/CSA62NT8 We're scratching our heads. Does anyone have any ideas?

    | TheaterMania
    0
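    For anyone comparing the two sites' markup: the sitelinks search box is driven by WebSite/SearchAction structured data along these lines (the search URL pattern below is an assumption, not taken from either site), and even with valid markup Google decides per site whether to actually display the box.

      <script type="application/ld+json">
      {
        "@context": "http://schema.org",
        "@type": "WebSite",
        "url": "http://www.theatermania.com/",
        "potentialAction": {
          "@type": "SearchAction",
          "target": "http://www.theatermania.com/search?q={search_term_string}",
          "query-input": "required name=search_term_string"
        }
      }
      </script>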

  • Our site gets a decent level of search traffic and doesn't have any site-wide penalty issues, but one of our sections looks like it might be under some form of filter. Unfortunately for us, they're our buy pages! Check out http://www.carwow.co.uk/deals/Volkswagen/Golf: it's unique content and I've built white-hat links into it, including about 5 from university websites (.ac.uk domains, DA 70+). If you search something like "volkswagen golf deals", the pages on page 1 have weak, thin content and pretty much no links. That content section wasn't always unique; in fact, the vast majority of it may well be classed as duplicate content, as there's no trim data, and those pages look like this: http://www.carwow.co.uk/deals/Fiat/Punto While we never had much volume, the traffic on all /deals/ pages appears to have dropped significantly around the time of the May Panda update (4.0). We're planning on completely re-launching these pages with a new design, unique trim content and a paragraph (c. 200 words) about the model. Am I right in assuming that there's a Panda filter on the /deals/ segment, so that regardless of what I do to one deals page it won't rank well, and we have to redo the whole section?

    | Matt.Carwow
    0

  • Well, I have been scratching my head on this for days; I will try throwing the ball to you in the hope that someone more experienced than me can help. The scenario is: e-commerce -> brand page -> SERP -> a comparison of how two pages rank, one from my website and one from a competitor's website. The brand is Michelin, and the keyword is "pneumatici michelin" (the Italian equivalent of "Michelin tires"). I am not looking at the first page of the SERP, where competition is surely much fiercer. I am looking at position 11: http://www.cambio-gomme.it/marchi/michelin/ And my page (not in the first 50): http://www.gomme-auto.it/pneumatici/michelin My page: Moz page grade (for "pneumatici michelin"): A; external backlinks to the page: 1; Domain Authority: 29; Page Authority: 24; on-page SEO: keyword density 0.87%, internal links 145, external links 3, page size 108kb, HTML size 24kb, words on page 2077, linked words 408, non-linked words 1669, time to first byte 0.419s. Competitor page: Moz page grade (for "pneumatici michelin"): A; external backlinks to the page: 0; Domain Authority: 26; Page Authority: 13; on-page SEO: keyword density 0.75%, internal links 70, external links 1, page size 31kb, HTML size 9kb, words on page 1521, linked words 168, non-linked words 1353, time to first byte 0.373s. Domain age is very similar; both websites launched close to each other in 2012. Ideas? Suggestions on other metrics to compare?

    | max.favilli
    0

  • I think it's best to give you an example to illustrate what I'm asking here. Current brand name: Keyword Driven Brand Name (keyworddrivenbrandname.com). New brand name: KDBN (kdbn.com). What will the effects of this change be? I'm slightly worried, as we have lots of links with the anchor text "Keyword Driven Brand Name" and we rank very well for terms like "Keyword Driven" and "Brand Name". I guess what I'm asking is: do we need to go and change all those anchors to KDBN, and will this upset our search rankings? Or do we leave the existing anchors? But will Google see those as over-optimised anchor text and penalise our website? Decisions, decisions! Also, should we leave the old brand name in our title tags, at least for the transitional period, i.e. KDBN | Targeted Keyword | Keyword Driven Brand Name? Any help with this would be really appreciated, Many thanks

    | Townpages
    0

  • What's the maximum number of HTML pages that one should put in a folder to get the best Googlebot crawl for SEO? I'm aware that there's a limit of 10,000 on most servers, but I was curious to know whether a smaller number of pages would be better for crawling and indexing purposes. Also curious about people's opinions on whether .jpg and .gif files should follow similar rules.

    | alrockn
    0

  • Hiya Moz Community, I hope you are all great. I have a question regarding one of my websites. I have the main site and, essentially, 2 subfolder sites. I decided to upgrade one of the sites and placed it in a different subfolder, then set up a 301 redirect to the new location. So far so good. I have been having a look at my link profile using Ahrefs; inside there is an SEO report facility. I ran the report, and I have over 500 pages returning a 403 Forbidden error. My question is whether the equity from those pages is being passed to the new site. I actually removed all of the old site from Google's cache to avoid misleading visitors. I suppose I could set the redirects up manually if the equity is not being passed to the new site, although I was under the impression it would be, or 85%-90% of it anyway. The reason I am asking is that I have seen a significant drop in rankings for keywords that my site has always ranked highly for. Thought I would see if you guys can clear that up for me. Thanks and regards, Wes Dunn

    | wesdunn1977
    0

  • We have a login-required section of our website that is being crawled and reported as a potential issue in Webmaster Tools. I'm not sure what the best solution to this is - is it to make URLs requiring a login noindex/nocrawl? Right now we have them 302 redirecting to the login page; since that's a temporary redirect, it seems like it isn't the right solution. Is a 301 better?

    | alecfwilson
    0
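    One frequently used pattern, sketched with a hypothetical setup: leave the 302 to the login page in place (a temporary redirect is a fair description of a login wall, whereas a 301 would wrongly tell Google the content has moved permanently), add a noindex to the login page itself, and optionally disallow the gated paths in robots.txt so they are not crawled at all.

      <!-- On the login page the protected URLs redirect to -->
      <meta name="robots" content="noindex, follow">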

  • Hi, My web site georgerossphotography.com and my e-commerce site store.georgerossphotography.com each reside on different servers. georgerossphotography.com has a domain authority of 30, and store.georgerossphotography.com has a domain authority of 30. Clearly they are considered two individual sites, but is there any way that I can boost the performance of the primary domain by passing along some of that good SEO juice from the subdomain? Any input would be gratefully received. Regards,

    | sirgeorge
    0

Got a burning SEO question?

Subscribe to Moz Pro to gain full access to Q&A, answer questions, and ask your own.



