
Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.

Category: Technical SEO

Discuss site health, structure, and other technical SEO strategies.


  • Mozzers: We have an instance where a client is looking to 301 redirect www.example.com to www.example.com/shop. I know of several issues with this, but wondered if anyone could chip in with previous experiences of doing so, and with the positive and negative outcomes that came of it. Issues I'm aware of: the root domain URL is the most-linked page, and an HTTP 301 redirect only passes roughly 85-90% of the value, so you lose about 10-15% of the link equity of those links; navigational queries (i.e. the "domain part" of "domain.tld") are less likely to produce Google sitelinks; crawling becomes shallower, since Google crawls top-down, starting with the most-linked page (most likely the root URL), and if that page no longer exists you waste that zeroth level of crawl depth; and robots.txt is only allowed at the root of the domain. (A sketch of the redirect in question follows this entry.) Your help, as always, is greatly appreciated. Sean

    | Yozzer
    0
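
    For reference, a minimal .htaccess sketch of the redirect being weighed, assuming Apache and that only the bare root URL (not every path) should forward; the domains are the placeholders from the question:

        RewriteEngine On
        # Send only the bare root URL to /shop; deeper paths are untouched
        RewriteRule ^$ /shop [R=301,L]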

  • I'm helping a client redesign their website, and they want a home page that's primarily graphics and/or Flash (or jQuery). If they are able to optimize all of their key sub-pages, what is the harm in terms of SEO?

    | EricVallee34
    0

  • We recently changed our CMS from PHP to .NET. The old CMS did not allow for folder structure in URLs, so every URL was www.mydomain/name-of-page. In the new CMS we either have to have .aspx at the end of the URL or a trailing slash; we opted for the slash, but now my PageRank is dead, and Google Webmaster Tools says my existing links are going through an intermediary page. Everything resolves to the right place, but it looks like spiders see our new pages as being 302-redirected. Example of what's happening: old page www.mydomain/name-of-page, new page www.mydomain/name-of-page/. What should I do? Should I go in and 301 redirect the old pages (a sketch follows this entry)? Will this get cleared up by itself in time?

    | rasiadmin1
    0
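
    Since the new CMS is .NET, the site is presumably on IIS; a minimal web.config sketch using the IIS URL Rewrite module (an assumption about the setup) would make the trailing-slash redirect an explicit 301 instead of the 302 the crawlers are apparently seeing:

        <configuration>
          <system.webServer>
            <rewrite>
              <rules>
                <rule name="Add trailing slash" stopProcessing="true">
                  <match url="(.*[^/])$" />
                  <conditions>
                    <!-- Only redirect URLs that are not real files or directories -->
                    <add input="{REQUEST_FILENAME}" matchType="IsFile" negate="true" />
                    <add input="{REQUEST_FILENAME}" matchType="IsDirectory" negate="true" />
                  </conditions>
                  <!-- redirectType="Permanent" issues a 301 rather than a 302 -->
                  <action type="Redirect" url="{R:1}/" redirectType="Permanent" />
                </rule>
              </rules>
            </rewrite>
          </system.webServer>
        </configuration>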

  • I have loveelectronics.co.uk, but I also own 10 other country-code-specific domains. I am short on links (I'm actually still setting up the website) and wondered: until I have country-specific content, should I 301 redirect these websites to the homepage of my main site, or could I use them as links, which would mean I have more linking root domains? Sorry if this is a beginner question, but it would be good to know so I can sort this.

    | jcarter
    0

  • I was recently hit by a Google algorithm update on my main website. I found my rankings had all dropped to the 5th to 7th page; even for the website name without the .com at the end, our business website shows on the 5th page. It was a news portal, and on a subdomain I was running my web hosting website. 1. I removed all news content from the website, since I thought the agency that sent me content was sending the same content to others, which may have caused the issue, so I removed the news area. 2. I turned off all of my old subscriptions and memberships of blog networks etc., to make sure I get proper, well-researched backlinks. Can anybody suggest what further action I should take? I will highly appreciate any SEO suggestions. I am also unsure whether it is a Google penalty or a recent Google update that changed things negatively for my website. However, I have already filed a reconsideration request as a possible alternative, explaining to Google that we are no longer a news-content website. Will wait for responses...

    | anand2010
    0

  • I am working on a site, www.progazon.ca; it's been up for almost a year now (11 months) and still has no PageRank. Is there something I can do? Does it change anything? Thanks.

    | martinLachapelle
    0

  • I recently began working at a company called Uncommon Goods. I ran a few different spider emulators on our homepage (uncommongoods.com), and I saw a 404 error on SEO-browser.com as well as URL errors on Summit Media's emulator and SEOmoz's crawler. It seems there is a serious problem here. How is this affecting our site from an SEO standpoint? What are the repercussions? Also, I know we have a lot of JavaScript on our homepage... is this causing the 404? (A quick header check is sketched after this entry.) Any advice would be much appreciated. Thanks! -Zack

    | znotes
    0
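
    A quick way to see what status code the server actually returns to a crawler, with the user-agent string as just an example:

        # Fetch headers only, presenting a bot user agent
        curl -I -A "Googlebot/2.1 (+http://www.google.com/bot.html)" http://www.uncommongoods.com/

    A 404 here (while a browser sees the page fine) would point to user-agent-specific handling on the server; a 200 would suggest the emulators are tripping over the JavaScript rather than a real 404.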

  • Could changing one's physical address for a domain, or going from public to private registration, have a negative effect on rank? Other factors? Thanks!

    | 94501
    1

  • As we add PDF documents to our website, I want to take it up a notch. In terms of SEO and software price, is Adobe Acrobat the only choice? Thanks! No Mac here. I should clarify that I can convert files to PDFs with Microsoft Word and add some basic info for the search engines, such as title, keywords, author, and links. This article inspired me: www.seomoz.org/ugc/how-to-optimize-pdf-documents-for-search. I can add links back to the page when I create the PDF, but we also have specific product PDFs that suppliers let us copy and serve from our own server (why use their bandwidth?). Much as you would stamp your name on a hard-copy brochure the vendor supplies, I want to add a link to our page from those PDFs. That makes me think I should ask our supplier to give me a version with a link to our page. Then there is the question: is that OK to do? In the meantime, I will check TriviaChicken's suggestions and dream about a Mac, Allan. Thanks

    | zharriet
    0

  • Hi, my client just had their website redeveloped in WordPress. I ran a crawl errors report for the website using Google Webmaster Tools and discovered that the client has about six hundred 404 pages, most of which originate from their previous image gallery. I already have a custom 404 page set up, but is there something else I should be doing? Is it worthwhile to 301 redirect every single page within the .htaccess file (a sketch follows this entry), or will Google filter these pages out of its index naturally? Thanks, Mozers!

    | calindaniel
    0
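
    A minimal .htaccess sketch of the bulk-redirect option, assuming Apache and a hypothetical /gallery/ path for the retired image pages (the real pattern would come from the crawl report):

        # Send every request for the retired gallery to one relevant live page
        RedirectMatch 301 ^/gallery/.* http://www.example.com/photos/

    One pattern per retired section usually beats six hundred individual lines; old URLs with no sensible destination can simply keep returning 404 and will drop out of the index on their own.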

  • Hi everyone, I have a small 8-page website I launched about 6 months ago. For the life of me, I cannot figure out why Google is only indexing 3 of the 8 pages. The pages are not duplicate content in any way, and I have a good internal linking structure. At this time I don't have many inbound links from others; those will come in time. Am I missing something here? Can someone give me a clue? (An XML sitemap sketch follows this entry.) Thanks, Tim. Site: www.jparizonaweddingvideos.com

    | fasctimseo
    0
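
    One low-effort diagnostic step, offered as a suggestion: submit an XML sitemap listing all eight pages in Google Webmaster Tools, which makes it explicit which pages Google knows about but chooses not to index. A minimal sketch with placeholder URLs:

        <?xml version="1.0" encoding="UTF-8"?>
        <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
          <url><loc>http://www.jparizonaweddingvideos.com/</loc></url>
          <url><loc>http://www.jparizonaweddingvideos.com/example-page.html</loc></url>
          <!-- ...one <url> entry per remaining page... -->
        </urlset>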

  • Our site has far too many pages for our 10K-page PRO account, most of which are not SEO-worthy; in fact, only about 2,000 pages qualify for SEO value. Limitations of the store software only permit me to use robots.txt to sculpt the rogerbot site crawl, but I am having trouble getting this to work. Our biggest problem is the 35K individual product pages and the related shopping cart links (at least another 35K); these aren't needed, as they duplicate the SEO-worthy content in the product category pages. The signature of a product page is that it is contained within a folder ending in -p. So I made the following addition to robots.txt: User-agent: rogerbot
    Disallow: /-p/ However, the latest crawl results show the 10K limit is still being exceeded. I went to Crawl Diagnostics and clicked Export Latest Crawl to CSV. To my dismay, the report was overflowing with product page links, e.g. www.aspenfasteners.com/3-Star-tm-Bulbing-Type-Blind-Rivets-Anodized-p/rv006-316x039354-coan.htm. The value of the column "Search Engine blocked by robots.txt" is FALSE. Does that mean blocked for all search engines? Then it's correct. If it means blocked for rogerbot, then it shouldn't even be in the report, as the report seems to only contain 10K pages. Any thoughts or hints on attaining my goal would REALLY be appreciated; I've been trying for weeks now (a corrected pattern is sketched after this entry). Honestly, virtual beers for everyone! Carlo

    | AspenFasteners
    0
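
    A likely culprit, offered as a hedged suggestion: "Disallow: /-p/" only blocks URLs whose path begins with the literal string /-p/, while the product folders in the example URL merely end in -p. Crawlers that honor wildcards in robots.txt (rogerbot and Googlebot both do) would need a pattern such as:

        User-agent: rogerbot
        # Block any folder whose name ends in "-p" (the product-page signature)
        Disallow: /*-p/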

  • I moved a site to a better server earlier in the year (www.keyrs.co.uk); my main keywords are equity release, equity release calculator, and equity release schemes. Since this happened, the rankings have gone down: the schemes and calculator terms have hit positions 7-8 when they were 2-3. Basically, my question is open to all: I am looking to see what the problem is with these pages, as it is driving me nuts. All the SEOmoz tools show the pages are doing well, so I must be missing something. Mike

    | TomBarker82
    0

  • My website has been improving slowly. I have been following the SEOmoz recommendations, and it is a great help to see my pages slowly coming to the first page. I am also running PPC on Google, and I see there are many visitors to my website, but I do not get good conversion: visitors are not buying products. My website: www.ommrudraksha.com. My target keyword is: rudraksha

    | Ommrudraksha
    0

  • My website, ommrudraksha.com, has 3 links on every page: 1. Login, 2. Register, 3. My trolley. My doubt is that I do not want to give any weight to these links. Will these links be counted when page links are calculated? Should I remove them as links and place them as buttons instead (styled to look like links visually)? (A nofollow sketch follows this entry.)

    | Ommrudraksha
    0
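
    A common middle ground, sketched under the assumption that the goal is simply to signal that these utility links carry no editorial weight (note that nofollow does not reliably conserve or redirect link equity to other links on the page); the paths are placeholders:

        <!-- Utility links marked nofollow so they are not treated as endorsements -->
        <a href="/login" rel="nofollow">Login</a>
        <a href="/register" rel="nofollow">Register</a>
        <a href="/trolley" rel="nofollow">My trolley</a>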

  • I'm trying to make sense of video sitemaps so I can get one up and going, but the setup seems unclear. We currently have 7 videos created and up on YouTube. I've got them embedded on the site on a "Video" landing page, and these product demo videos are also embedded on the appropriate product detail pages. So when setting up the video sitemap, it looks like I'll be using the <video:player_loc> tag, as opposed to <video:content_loc>, because I'm not linking to the file itself but rather to a page it's hosted on. Correct? Additionally, I'm adding the product detail page URL here, not the YouTube one, right? Lastly, do I need to insert an autoplay piece on the videos on the product detail page? I feel that would be an annoying user experience. So part of my sitemap might look like this: <video:player_loc allow_embed="yes" autoplay="ap=[?]">http://website/ProductDetailURL</video:player_loc> (a fuller example follows this entry)

    | dgmiles
    0
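
    For reference, a minimal video sitemap entry sketched from Google's documentation of the time: <loc> carries the product detail page URL, while <video:player_loc> points at the embeddable player itself, and the autoplay attribute is just a parameter string handed to that player rather than anything that forces autoplay on the page. URLs, IDs, and titles below are placeholders:

        <url>
          <loc>http://website/ProductDetailURL</loc>
          <video:video>
            <video:title>Product demo</video:title>
            <video:description>Short demo of the product in action.</video:description>
            <video:player_loc allow_embed="yes" autoplay="ap=1">http://www.youtube.com/v/VIDEO_ID</video:player_loc>
          </video:video>
        </url>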

  • Both SEOmoz and Google Webmaster Tools report lots of 404 errors throughout my WordPress blog. I have the URL structure set to category/title. Most of the 404 errors seem to be the crawler looking for a /home.html page, and each time I add a new post I get more 404 errors. I could, of course, add 301 redirects (a sketch follows this entry), but I presume there is an easier way to do this within the WP setup. Any ideas? Thanks

    | bjalc2011
    0
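
    If the stray /home.html requests are the only offenders, a single .htaccess line (assuming Apache, placed before WordPress's own rewrite block) is a minimal stopgap; tracking down the theme template or plugin that emits the /home.html link is the real fix:

        # Send requests for the non-existent home.html to the blog root
        Redirect 301 /home.html /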

  • I have adjusted a setting in my CMS, and the URLs have changed from http://www.ensorbuilding.com/section.php/43/1/firestone-epdm-rubbercover-flat-roofing to http://www.ensorbuilding.com/section/43/1/firestone-epdm-rubbercover-flat-roofing. This has changed all the URLs on the website, not just this example. As you can see, the .php extension has now been removed, but people can still access the .php version of each page. What I want is a site-wide 301 redirect, but I cannot figure out how to implement it (a sketch follows this entry). Any help is appreciated 🙂 Thanks

    | danielmckay7
    0
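
    A minimal mod_rewrite sketch, assuming Apache and that every old URL differs from its replacement only by the ".php" after "section" (worth verifying on a staging copy before going live):

        RewriteEngine On
        # 301 any /section.php/... URL to its extensionless twin
        RewriteRule ^section\.php/(.*)$ /section/$1 [R=301,L]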

  • Hi, we are just a small company trying to understand this SEO business. :O) I hope you will give us your input on how we look and how we can improve: www.bakerbay.com. Each week I go down the SEOmoz crawl and change the duplicate page titles, and that then increases the duplicate page content; the next week it seems to find other titles that had been fixed but are now askew. Please help. We are a bead company, and a lot of our products have similar descriptions but different colors and item numbers. Could you please advise me on how to fix these errors and increase our ranking? All the best, Beth

    | BakerBayBeadCo
    0

  • Hi, we ran a site diagnostic, and it came back with thousands of pages that have more than 100 internal links on a page; however, the actual number of links on those pages seems to be far less than what was reported. Any ideas? Thanks! Phil. UPDATE: We've looked at the source code and realized that for each product we link to the product page in multiple ways: from the product image, the product title, and the price. So we have three internal links to the same page from each product listing, which the SEOmoz crawler counts as hundreds of links on each page. But in terms of Googlebot, is this as egregious as having hundreds of links to different pages, or does it not matter as much?

    | beso
    1

  • I bought a domain about a week ago and instantly ranked number 4 for my keywords, with the domain keyword bonus. I created a landing page off the root of my domain while I'm building out my main site. I accidentally did a 301 redirect instead of a 302 from my root to my landing page, and this resulted in me losing my position; now I can only find my domain in Google if I search for it specifically. Any way to regain my original position? I have removed the redirect. Have I been put in the sandbox?

    | JohnTurner79
    0

  • I'm trying to gain a better understanding of the concept of flat site architecture, especially for larger sites. I'd like to see how larger sites manage to get their visitors to the majority of their pages within 3 to 4 clicks. Does anyone know of actual websites that exemplify the practice of flat site architecture?

    | EricVallee34
    0

  • A client would like to set up a knowledge base to work in conjunction with their website, and we are weighing up whether to go with a hosted solution (and therefore set it up as a subdomain) or find a solution that we host on the client's domain (which will presumably have more SEO benefit). We are leaning towards the latter, although we are mindful that we need to balance the client's desire for a quality KB solution. We'd appreciate your feedback.

    | E2E
    0

  • Who has the fastest hosting? Which major provider has the fastest service for page load times? Looking for affordability like GoDaddy.

    | bozzie311
    0

  • This is not a question but something to share. If you click on all of these links and compare the results, you will see why _ is not a good thing to have in your URLs. http://www.google.com/search?q=blue http://www.google.com/search?q=b.l.u.e http://www.google.com/search?q=b-l-u-e http://www.google.com/search?q=b_l_u_e http://www.google.com/search?q=b%20l%20u%20e If you have any other examples of working separators, please comment.

    | Dan-Petrovic
    3

  • Total, total noob question, I know, but is rogerbot performance-bound by bandwidth and processing capacity? I understand if it is, but I am wondering, for those of us with very large sites, whether we could offload the burden on SEOmoz's resources by running our own locally licensed copy of rogerbot, crawling the sites we want, and then uploading the data to SEOmoz for analysis. If this were possible, would we get more immediate results?

    | AspenFasteners
    0

  • Thought I'd see what the asking side of Q&A feels like 😉 We've been hearing forever that the internet is running out of IP addresses, but I finally encountered the reality of it. I just realized that one of my sites is on a shared IP (hosted by Hosting.com, formerly HostMySite.com). My other sites with them include a unique IP, so I was surprised to discover this; they claim it's due to limitations on their IP allocations. Hosting.com doesn't offer the option to buy a unique IP, but some other hosts do. I noticed, though, that many of them are using IPv6 for new accounts. Has anyone had practical experience with an IPv6 address, and is there any impact on SEO?

    | Dr-Pete
    0

  • We have our complete set of scriptures online, including the Bible, at http://lds.org/scriptures. Users can browse to any of the volumes of scripture. We've improved the user experience by allowing users to link to specific verses in context, which scrolls to and highlights the linked verse. However, this creates a significant amount of duplicate content. For example, these links: http://lds.org/scriptures/nt/james/1.5 http://lds.org/scriptures/nt/james/1.5-10 http://lds.org/scriptures/nt/james/1 All of those link to the same chapter in the book of James, yet the first two highlight verse 5 and verses 5-10 respectively. This is a good user experience, because in other sections of our site and on blogs throughout the world, webmasters link to specific verses so the reader can see the verse in the context of the rest of the chapter. Another Bible site has separate HTML pages for each verse individually and tends to outrank us because of this (and possibly some other reasons) for long-tail chapter/verse queries. However, our tests indicated that the current version is preferred by users. We have a sitemap ready to publish which includes a URL for every chapter/verse; we hope this will improve indexing of some of the more popular verses. However, Googlebot is going to see some duplicate content as it crawls that sitemap! So the question is: is the sitemap a good idea, given that we can't revert to including each chapter/verse on its own unique page? We are also going to recommend creating unique titles for each of the verses and passing a portion of the verse text into the meta description. Will this be enough to satisfy Googlebot that the pages are in fact unique? They certainly are from a user perspective. (A canonical-tag sketch follows this entry.) Thanks all for taking the time!

    | LDS-SEO
    0
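
    One option to weigh, sketched as a possibility rather than a recommendation: if the verse-highlight URLs should consolidate to their chapter, each could declare the chapter as canonical, at the cost of the verse URLs themselves dropping out of the index:

        <!-- On /scriptures/nt/james/1.5 and /scriptures/nt/james/1.5-10 -->
        <link rel="canonical" href="http://lds.org/scriptures/nt/james/1" />

    That resolves the duplicate-content signal but works against ranking individual verses, which is exactly the trade-off the question describes.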

  • I have separate sites for my blog and main website. I'd like to link them in a way that enables the blog to boost my main site's SEO. Is there an easy way to do this? Thanks in advance for any advice...

    | matt-14567
    0

  • Hi, I have recently started working in-house for a company, and one site development was started and completed just as I joined. A new area of the site has been developed, but the developers built this new section in PHP, which they tell me cannot be hosted on the Windows server the site is running on (is this correct?). They want to add the new section as a subdomain, http://newarea.example.co.uk/, whereas I would have preferred the section added as a subfolder. I plan to ensure that future developments do not have this problem, but is the best solution to work with the subdomain (in this instance it may not be too bad, as it is a niche area of the site), or can I redirect the pages hosted on the sub-domain to a subfolder, and is this recommended? Thanks for your time.

    | LSLPS
    0

  • As of a day ago, the Google SERPs are showing our listing with no meta description at all and an incorrect title; the title also varies based on the keywords searched. Background: something I just had done was to have the multiple versions of the home page (duplicate content, about 40 URLs or so) 301 redirected to the appropriate place. I think they accidentally did 302s (a quick way to check is sketched after this entry). Anyone seen this before? Thanks

    | poolguy
    0
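
    Verifying whether a given redirect is a 301 or a 302 takes one header request per URL; for example (the duplicate home page URL is a placeholder):

        # -I fetches headers only; look for "HTTP/1.1 301" versus "HTTP/1.1 302"
        curl -I http://www.example.com/duplicate-home-page-url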

  • I want to move a site of mine that ranks #1 for many keywords to a new domain name. I have already redirected many smaller, less important sites to the new domain, but have held off on my most popular site. If I redirect the entire site with a 301 redirect (a sketch follows this entry), what can I expect for my number-one ranking, particularly for coveted search terms? Thanks for the input.

    | insync
    0
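
    A minimal sketch of a whole-site, path-preserving 301, assuming Apache and placeholder domains:

        RewriteEngine On
        RewriteCond %{HTTP_HOST} ^(www\.)?old-example\.com$ [NC]
        # Carry every path over to the same URL on the new domain
        RewriteRule ^(.*)$ http://www.new-example.com/$1 [R=301,L]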

  • Using the .htaccess file, how do I rewrite a URL from www.exampleurl.com/index.php?page=example to www.exampleurl.com/example, removing index.php?page=? (A sketch follows this entry.) Any help is much appreciated.

    | CraigAddyman
    0
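
    A minimal sketch, assuming Apache with mod_rewrite. Two rules are involved: an external 301 that bounces the old query-string URLs to the clean form, and an internal rewrite that quietly maps the clean form back onto index.php:

        RewriteEngine On

        # 1) 301 real client requests for /index.php?page=example to /example
        #    (THE_REQUEST is checked so the internal rewrite below cannot loop)
        RewriteCond %{THE_REQUEST} \s/index\.php\?page=([^&\s]+) [NC]
        RewriteRule ^index\.php$ /%1? [R=301,L]

        # 2) Internally serve /example from index.php?page=example
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteRule ^([^/]+)/?$ index.php?page=$1 [L]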

  • Hello, I would like to have expert input on the best way to manage temporary content. In my case, I have a page (e.g. mydomain.com/agenda) with listings of temporary articles, some with a lifetime of 1 month to 6 months. The articles also have specific URLs, e.g. mydomain.com/agenda/12-02-2011/thenameofmyarticle/. As you can guess, I've got hundreds of 404s 😞 I'm already using the canonical tag; should I use a  in the listing page? I'm a bit lost here...

    | Alexandre_
    0

  • I am in the process of redoing my site and would like to know the top three CMS providers you suggest. The more information, the better. Thanks

    | IntegrisLoans
    0

  • We have an old web site which currently has good traffic and search rankings. However, the old design is not helping us convert traffic into customers, and we have decided to redesign the web site. Due to challenges resolving 4XX issues in the current setup, we will be moving the site to a new CMS and hosting provider; the domain will remain the same. The plan is to create exactly the same pages in the new CMS as we have today, and to use the same URLs for each page. Content will remain the same in step one; we will only apply a new layout and design. Besides keeping the URLs the same as in the old system, what else should we be aware of when doing a web site migration that might impact our search ranking?

    | petersen
    0

  • I am curious to find out what the blogging platform of choice is for enterprise-level companies (more than 500 employees, revenue of more than $150M). What would be the best solution from an SEO point of view? I have used WordPress in the past for small companies and feel that it is the best. We are currently using Telligent. Is anybody else using it?

    | Amjath
    0

  • Hi all, I have a very weird question about external links to our site from our own domain. According to GWMT we have 603,404,378 links from our own domain to our domain (see screen 1). When we drilled down, we noticed that these come from disabled sub-domains like m.jump.co.za. In the past we used to redirect all traffic from sub-domains to our primary www domain, but it seems that for some time Google had access to crawl some of our sub-domains. In December 2010 we fixed this so that all sub-domain traffic redirects (301) to our primary domain; for example, http://m.jump.co.za/search/ipod/ redirected to http://www.jump.co.za/search/ipod/ (a sketch of this kind of redirect, plus one caveat, follows this entry). The weird part is that the number of external links kept on growing and is now sitting at a massive number. On 8 April 2011 we took a different approach: we created a landing page for m.jump.co.za, and all other requests generated 404 errors. We added all the directories to robots.txt, and we also manually removed all the directories from GWMT. Now, 3 weeks later, the number of external links just keeps on growing. Here are some stats:

    11-Apr-11: 543,747,534
    12-Apr-11: 554,066,716
    13-Apr-11: 554,066,716
    14-Apr-11: 554,066,716
    15-Apr-11: 521,528,014
    16-Apr-11: 515,098,895
    17-Apr-11: 515,098,895
    18-Apr-11: 515,098,895
    19-Apr-11: 520,404,181
    20-Apr-11: 520,404,181
    21-Apr-11: 520,404,181
    26-Apr-11: 520,404,181
    27-Apr-11: 520,404,181
    28-Apr-11: 603,404,378

    I am now thinking of cleaning up robots.txt, re-including all the excluded directories in GWMT, and seeing whether Google can get rid of all these links. What do you think is the best solution for getting rid of all these invalid pages?

    | JacoRoux
    0
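
    For reference, a minimal sketch of the path-preserving sub-domain 301 described above, assuming Apache and that all sub-domains resolve to the same document root:

        RewriteEngine On
        # Redirect any host other than www.jump.co.za to the same path on www
        RewriteCond %{HTTP_HOST} !^www\.jump\.co\.za$ [NC]
        RewriteRule ^(.*)$ http://www.jump.co.za/$1 [R=301,L]

    One caveat worth flagging: blocking the old sub-domain directories in robots.txt at the same time prevents Google from recrawling those URLs and ever seeing either the 301s or the 404s, which could explain why the reported link count is not draining.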

  • Hi all, I have a question on copied content and syndicated content. Obviously, copying content directly from another website is a big no-no, but I wanted to know how Google views syndicated content and whether it views this differently. If you have syndicated content on your website, can you be penalised by the latest Panda update, and is there a viable solution to address this (one candidate is sketched after this entry)? Many thanks, Simon

    | simonsw
    0
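
    One commonly suggested mitigation, offered as an assumption rather than a guarantee: Google supports a cross-domain canonical, so a syndicated copy can point back at the original it came from (the URL is a placeholder):

        <!-- Placed in the <head> of the syndicated copy, pointing at the source -->
        <link rel="canonical" href="http://www.original-example.com/original-article" />

    This asks Google to consolidate signals on the original rather than treating the copy as competing duplicate content.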

  • Hi, in an effort to prepare our site for Panda, we dramatically reduced the number of pages that can be indexed (from 100k down to 4k), and all the remaining pages are being equipped with unique and valuable content. We still have the other pages around, since they represent searches with filter combinations which we deem less interesting to the majority of users (hence they are not indexed). So I am wondering if we should mask links to these non-indexed pages with JS, so that link juice doesn't get lost to them. Currently the targeted pages are de-indexed via "noindex, follow"; we might de-index them with robots.txt instead, though, if the "site:" query doesn't show improvements. Thanks, Sebastian

    | derderko
    0

  • The Symptoms: About a year ago, our site EZWatch-Security-Cameras.com had a PageRank of 5. Several months ago it sank to a 4, and we were a little worried, but it wasn't anything to really sweat over. At the end of January we noticed it had dropped again, to PR3, and we were a little more worried. When the Farmer update hit, we suddenly dropped to PR1, but our traffic wasn't seriously affected, and in March most of the pages regained their PageRank. I noticed this morning that our homepage has once again dropped to PR1. I am waiting to see if there has been any significant drop in traffic, but I haven't spotted anything that stands out, aside from an increase of about 5% in the average cost for our paid search account.

    The Problems We've Spotted: Keep in mind that our current website is fairly old (2005) and we are ready to launch a new one. The current website runs on X-Cart, and we have a few modules added on. Problem 1: one module handles a custom kit builder; this area has not been restricted from crawlers, and it could be generating a large number of needless page crawls. Problem 2: another module provides "SEO-friendly URLs", according to the developer, but what actually happens is that a visitor can type in any-url-they-like-for-product-id-p-11111.html, where the leading section is any character string (or none at all), followed by a product or category indicator and the ID for that item. This causes a massive number of virtual page duplications, and the module is encrypted, so we aren't able to modify it to include rel="canonical" tags. Obviously this creates massive amounts of seemingly duplicate content. Problem 3: in addition to the regular URL duplication, we recently acquired the domain EZWatch.com (our brand name, easier to remember). That domain responds with the content from our regular website, and it will be the primary domain name when we change shopping carts. With the second domain name, the content could also be considered duplication.

    The Solutions We're Working On: The website we use was designed in 2005, and we believe it has reached the end of its useful life. Over the past several months we have been working on an entirely new shopping cart platform, designed from the ground up to be more efficient operationally speaking and to provide more SEO control. The new site will be ready to launch within days, and we will start using the new domain name at the same time. We are planning page-to-page 301 redirects for all pages with at least 1 visit within the past 180+ days, according to our Google Analytics reports. We are also including rel="canonical" on all pages, and we will be restricting dynamic sections of our website via the robots.txt file.

    So What More Can We Do? With your collective SEO experience, what other factors could be contributing to this decline?

    | EZWatchPro
    0

  • I have a site that ranks high on the first page for its main keyword on both Bing and Yahoo, but horribly on Google. It's a domain I recently acquired and am in the process of optimizing. My goal is to improve the site's relevancy in Google so that it shows up better for its main keyword. With that said, I've been working on building valuable links to the page, and I would like some opinions on why the homepage is not ranking for the main keyword; instead, a junky content page is ranking for the term. So, in the event that you have an exact-match domain showing up very high in Bing and Yahoo but not in Google for the homepage, what factors would you look at? Add in the complexity that a page other than the homepage is gaining ground on the exact-match keyword, having moved from "not in the top 100" to the 50s. What's my best solution for ranking the homepage? The site is well optimized, and most inbound links predominantly point to the homepage.

    | DotCar
    0

  • Google has indexed mysite.com/ and mysite.com/%5C (no idea why). If you click on the /%5C URL, it takes you to mysite.com//. I have a rel=canonical tag on it that points to mysite.com/, but I was wondering if there is another way to correct the issue.

    | BryanPhelps-BigLeapWeb
    0

  • Hello guys! We are in the process of buying a new domain. How can we be sure that this domain is not blacklisted, and are there any steps to take to be sure that whatever we are buying is actually in "good shape"? Thanks much!

    | echo1
    0

  • I'm a novice... I've just run my crawl diagnostics, and I wonder how important it is to a) have meta descriptions on every page, and b) have all titles less than 70 characters? Thanks in advance. Dan.

    | danfk
    0

  • I'm looking to install a WordPress caching plugin to help speed up my site. My question is: which plugin or method is best practice to ensure my rankings are not hurt? Thanks

    | mmaes
    0

  • We recently removed /img and /imgp from our robots.txt file, thus allowing Googlebot to crawl our image folders. Not sure why we had these blocked in the first place, but we opened them up in response to an email from Google Product Search about not being able to crawl images, which can hurt (and has hurt) our traffic from Google Shopping. My question is: will allowing Google to crawl our image files eat up our 'crawl allowance'? We wouldn't want Google to skip crawling/indexing certain pages, and ding our organic traffic, because more of our allotted crawl bandwidth is getting chewed up crawling image files. Outside of the non-detailed crawl stat graphs in Webmaster Tools, what's the best way to check how frequently and how deeply our site is getting crawled? (A log-based check is sketched after this entry.) Thanks all!

    | evoNick
    0
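
    Server access logs answer this more precisely than the Webmaster Tools graphs. A minimal sketch, assuming an Apache-style combined log at a placeholder path:

        # Count Googlebot hits per day
        grep "Googlebot" /var/log/apache2/access.log | awk '{print $4}' | cut -d: -f1 | sort | uniq -c

        # See which paths (e.g. /img) are consuming the crawl budget
        grep "Googlebot" /var/log/apache2/access.log | awk '{print $7}' | sort | uniq -c | sort -rn | head -20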

  • In my crawl diagnostics, it is showing a rel=canonical error on almost every page. I'm using WordPress. Is there a default WordPress behavior that would cause this?

    | mmaes
    0

  • Here's a question I can't seem to find an answer to: does web hosting within a targeted city make a difference in the engines? For example, a site targeting the Denver area, with web hosting in Denver. Will this boost the ranking, or is geo-targeting limited to countries? Thanks!

    | mkoster
    0

  • I'm being docked for too many on-page links on every page of the site, and I believe it is because the drop-down nav has about 130 links in it. That's because we have a few levels of drop-downs, so you can get to any page from the main page. The site is here: http://www.ibethel.org/. Is what I'm doing just bad practice, and should the drop-downs give less information? Or is there something different I should do with the links? Maybe a nofollow on the last tier of the drop-down?

    | BethelMedia
    0
