Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we’re not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies. More details here.
Category: White Hat / Black Hat SEO
Dig into white hat and black hat SEO trends.
-
Is it wrong to have the same page represented twice in the Nav?
Hi Mozzers, I have a client that has 3 pages represented twice in the nav. They are not duplicates, since they both point to the same URL. It seems odd to have this situation, but I guess it makes sense for my client to have those represented twice, since these pages could fall into multiple categories. Is it bad practice for SEO, or is it a waste to have those in the nav? Should I ask them to eliminate the extras? Thanks!
| Ideas-Money-Art0 -
Server down - What will happen to the SERP?
Hi everybody, we have a lot of websites (about 100) on one server in Italy. This server crashed 5 days ago and should be back online now (I hope!). What will happen to the SERPs? What shall I do to recover the ranking of every keyword? New links, new content, just wait... what? Thanks 😉
| Sognando0 -
Avoiding the "sorry we have no imagery here" G-maps error
Hi there, we recently did a redesign on a big site and added Google Maps locations to almost every page, since we are in Real Estate - listings, detail pages and search results all have a map embedded. While looking at GWT I found that the top content keywords on our site (which is in Spanish) are the following: "have", "here", "imagery", "sorry". After a quick search I found out this is a Google Maps bug: when Googlebot accesses the pages, the map throws out an error with this text repeated several times. If you do a search for "sorry we have no imagery here" you will see lots of sites with this issue. My question is: is this affecting the overall SEO, since bots are actually crawling and indexing this (hence it being reported by GWT)? Should I cloak this to robots? Has anyone noticed this or been able to fix it? Thanks in advance!
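One hedged approach sometimes used for this: load the map embed only on user interaction, so crawlers never render the widget or its fallback text. A minimal sketch - the element IDs, button label and data-embed-src value are placeholders, not the poster's actual markup:

```html
<!-- Placeholder rendered instead of the map; data-embed-src holds your usual embed URL -->
<div id="map-placeholder" data-embed-src="https://www.google.com/maps/embed?pb=PLACEHOLDER">
  <button id="load-map">Show map</button>
</div>

<script>
  // Inject the map iframe only after a real click, so crawlers never render
  // the widget (or its "sorry we have no imagery here" fallback text).
  document.getElementById('load-map').addEventListener('click', function () {
    var holder = document.getElementById('map-placeholder');
    var iframe = document.createElement('iframe');
    iframe.width = '600';
    iframe.height = '400';
    iframe.src = holder.getAttribute('data-embed-src');
    holder.appendChild(iframe);
  });
</script>
```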
| makote0 -
Black Hat SEO Case Study - Private Link Network - How is this still working?
I have been studying my competitors' link building strategies, and one guy (an affiliate) in particular really caught my attention. He has been using a strategy that has been working really well for the past six months or so. How well? He owns about 80% of search results for highly competitive keywords, in multiple industries, that add up to about 200,000 searches per month in total. As far as I can tell it's a private link network. Using Ahrefs and Open Site Explorer, I found out that he owns thousands of bought domains, all linking to his sites. Recently, all he's been doing is essentially buying high-PR domains, redesigning the sites and adding new content to rank for his keywords. I reported his link-wheel scheme to Google and posted a message on the webmaster forum - no luck there. So I'm wondering: how is he getting away with this? Isn't Google's algorithm sophisticated enough to catch something as obvious as this? Everyone preaches about White Hat SEO, but how can honest marketers/SEOs compete with guys like him? Any thoughts would be very helpful. I can include some of the reports I've gathered if anyone is interested in studying this further. Thanks!
| howardd0 -
Can I 301 redirect old URLs to staging URLs (ex. staging.newdomain.com) for testing?
I will temporarily remove a few pages from my old website and redirect them to the new domain, but on its staging subdomain. Once the redirection is confirmed to work, I will remove the redirection rules from my .htaccess and bring the removed pages back live. Thanks in advance!
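For reference, a minimal .htaccess sketch of the kind of temporary rule described, assuming Apache with mod_rewrite; the paths and the staging hostname are placeholders. Worth noting that a 302 is generally safer than a 301 for a redirect you intend to remove, since 301s are treated as permanent and cached by browsers and crawlers:

```apache
# Temporary rules in the old site's .htaccess: send a few pages to the staging copy
RewriteEngine On
RewriteRule ^old-page-one/?$ https://staging.newdomain.com/new-page-one [R=301,L]
RewriteRule ^old-page-two/?$ https://staging.newdomain.com/new-page-two [R=301,L]
```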
| esiow20130 -
Tags on WordPress Sites, Good or bad?
My main concern is with the overall tags strategy. I first saw the whole concept on WordPress, where it seems to bring positive results to sites, and now there are even plugins that auto-generate tags. Can someone detail the pros and cons of tags? I was under the impression that Google does not want thousands of pages auto-generated just because of a simple tag keyword, each showing the content relevant to that specific tag. Usually these are just like search results pages... how are tag pages beneficial? Is there something going on behind the scenes with WordPress tags that actually brings benefits to these WP blogs? Setting up a custom-coded tag feature on a custom site just seems to create numerous spammy pages. I understand these pages may be good from a user perspective, but what about from an SEO perspective - getting indexed and driving traffic? Indexed and driving traffic is my main concern here, so as a recap I'd like to understand the pros and cons of tags on WP vs custom-coded sites, and the correct way to set these up for SEO purposes.
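One common middle ground, shown as a rough sketch only: keep tag archives for users but ask engines not to index them. The snippet below uses WordPress's standard is_tag() check and wp_head hook in a theme's functions.php; most SEO plugins expose the same toggle without code:

```php
<?php
// functions.php: mark tag archives noindex,follow so they can still pass
// link equity to the posts they list without creating thin indexed pages.
add_action( 'wp_head', function () {
    if ( is_tag() ) {
        echo '<meta name="robots" content="noindex,follow">' . "\n";
    }
} );
```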
| WebServiceConsulting.com1 -
Website not listed in Google - Screaming Frog shows a 500 error? What could the issue be?
Hey, http://www.interconnect.org.uk/ - the site seems to load fine, but for some reason it is not getting indexed. I tried running the site through Screaming Frog, and it gives a 500 error code, which suggests it can't access the site. I'm guessing this is the same problem Google is having. Do you have any ideas as to why this may be and how I can rectify it? Thanks, Andrew
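A quick way to compare what a browser-like request and a crawler-like request get back, assuming curl is available (the user-agent string is just an example):

```bash
# Headers only; compare the status codes the two requests receive
curl -I "http://www.interconnect.org.uk/"
curl -I -A "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)" "http://www.interconnect.org.uk/"
```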
| Heehaw0 -
Asynchronous loading of product prices bad for SEO?
We are currently looking into improving our TTFB on our ecommerce site. A huge improvement would be to asynchronously load the product prices on the product list pages. The product detail page - on which the product is ordered - will be left untouched. The idea is that all content such as product data, images and other static content is sent to the browser first (first byte). The product prices depend on a set of user variables like delivery location, VAT inclusive/exclusive, etc., so they would be requested via an Ajax call to reduce the TTFB. My question is whether Google considers this black hat SEO or not.
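For clarity, a rough sketch of the pattern being described - static product markup is served first and the user-specific prices are filled in afterwards by an Ajax call. The /api/price endpoint, markup and field names are invented for illustration:

```html
<!-- Server-rendered product card: everything except the price -->
<div class="product" data-sku="ABC123">
  <h2>Product name</h2>
  <span class="price">Loading price…</span>
</div>

<script>
  // After the initial HTML arrives, fetch prices that depend on user variables
  // (delivery location, VAT inclusive/exclusive, ...) and fill them in.
  document.querySelectorAll('.product').forEach(function (card) {
    fetch('/api/price?sku=' + encodeURIComponent(card.dataset.sku))
      .then(function (res) { return res.json(); })
      .then(function (data) {
        card.querySelector('.price').textContent = data.formattedPrice;
      });
  });
</script>
```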
| jef22200 -
Indexing content behind a login
Hi, I manage a website within the pharmaceutical industry where only healthcare professionals are allowed to access the content. For this reason most of the content is behind a login. My challenge is that we have a massive amount of interesting and unique content available on the site and I want healthcare professionals to find it via Google! At the moment, if a user tries to access this content they are prompted to register / log in. My question is: if I look for the Googlebot user agent and allow it to access and index the content, will this be classed as cloaking? I'm assuming that it will. If so, how can I get around this? We have a number of open landing pages, but we're limited in what indexable content we can have on these pages! I look forward to all of your suggestions as I'm struggling for ideas now! Thanks Steve
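For what it's worth, Google later documented a structured-data route for gated content (its paywalled-content markup) that signals the gating is deliberate rather than cloaking; whether it fits a strict professionals-only gate is a judgment call. A minimal sketch with placeholder values:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Example gated article",
  "isAccessibleForFree": "False",
  "hasPart": {
    "@type": "WebPageElement",
    "isAccessibleForFree": "False",
    "cssSelector": ".gated-content"
  }
}
</script>
```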
| stever9990 -
What is the difference between Redirect 301 and RedirectMatch 301 in .htaccess?
Thanks in advance!
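In short: Redirect matches a literal URL path prefix, while RedirectMatch takes a regular expression. A small .htaccess sketch with placeholder paths:

```apache
# Redirect: simple prefix match, one old path to one new URL
Redirect 301 /old-page.html http://www.example.com/new-page.html

# RedirectMatch: regular-expression match, e.g. a whole folder of .php files
RedirectMatch 301 ^/blog/(.*)\.php$ http://www.example.com/blog/$1/
```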
| esiow20130 -
Multiple stores for the same niche
I started developing a new niche of products in my country about 3 years ago. That's when I opened my first store. Everything went fine until a year ago, when someone I thought was a friend secretly stole my idea and made his own competing store. I was pretty upset when I caught him and decided to make it as difficult as possible for him, so I made another 4 stores, trying to push him as low as possible in the search results. The new sites have similar products (although not 100% identical), slightly different titles, images and prices. They look different and are built on different e-commerce platforms. They are all hosted on the same server, have roughly the same backlinks, use the same Google account for Analytics, have the same support phone numbers, etc. I didn't think I was doing anything fishy, so I didn't try to hide anything. Trouble is that those sites, after doing fine for a few months, dropped like bricks in the search results, almost to the point that they can't be found at all. At the moment, the only site that ranks relatively well is the original one, plus a couple of unimportant secondary pages from one of the other sites. How did this happen? Does Google have something against this practice? Did they take action by themselves when they realized that I was trying to monopolize this niche, or did my competitor report me for some kind of webspam? And more importantly, what do I do now? Do I shut down all but my original site and 301 redirect users to it from the others? Can I report my competitor for engaging in the same practice? (He fought back and now he has 3-4 sites, some of which still rank kind of OK-ish; also, he has no idea about web development, SEO or marketing, he just crudely copies what I do and is slowly but surely starting to do better than me.)
| pandronic0 -
Are link directories still effective? Is there a risk?
We've contracted a traditional SEO firm, mostly for link building. As part of their plan they want to submit our site to a large list of link directories, and we're not sure if that's a good option. As far as we know, those directories have been ineffective for a long time now, and we're wondering if there is a chance of getting penalized by Google. When I asked the agency their opinion about that, they gave me the following answer: updated and optimized by us - we are partnered with these sites and control their quality; unique Class C IP addresses - links from unique referring Class C IPs play a very important role in SEO; powered by high-PR backlinks; Domain Authority (DA) score of over 20; these directories are well categorized. So they actually control those directories themselves, which we think is even worse. I'm wondering what the Moz community thinks about link directory submission - is there still something to be gained there, is there any risk involved, etc. Thanks!
| binpress0 -
Does posting on EzineArticles still do any good?
In the past, many of the articles we posted on our blog we also posted on EzineArticles. After Penguin, does it still make any sense to post on Ezine? Can posts on Ezine do any harm or good to our rankings? What kind of tactics are you guys using to promote the articles/posts on your blog?
| Felip30 -
What's up with Google scrapping keyword metrics?
I've done a bit of reading on Google now "scrapping" the keyword metrics from Analytics. I am trying to understand why on earth they would do that. To force people to run multiple AdWords campaigns to set up different keyword scenarios? It just doesn't make sense to me... If I am a blogger or I run an ecommerce site, and I get a lot of visits to a particular post through a keyword people clicked on organically, why would Google want to hide this from us? It's great data for us to carry on writing relevant content that appeals to people and therefore serves the needs of those same people. There is the idea of doing White Hat SEO and focusing on getting strong links and great content etc... How do we know we have great content if we are not seeing what is appealing to people in terms of keywords and how they found us organically? Is Google trying to squash SEO as a profession? What do you guys think?
| theseolab0 -
A Branded Local Search Strategy utilizing Microsites?
Howdy Moz, Over and over we hear of folks using microsites in addition to their main brand for targeting keyword-specific niches. The main point of concern most folks have is either duplicate content or being penalized by Google, which is also our concern. However, in one of our niches we notice a lot of competitors have set up secondary websites to rank in addition to the main website (basically taking up more room on the SERPs). They are currently utilizing different domains, on different IPs, on different servers, etc. We verified because we called and they all rang through to the same competitors. So our thought was: why not take the fight to them (so to speak), but with a branding and content strategy? The company has many good content pieces that we can utilize, like company mottos, mission statements, special projects and community outreach, that can be turned into microsites with unique content. Our strategy idea is to take a company called "ACME Plumbing" and brand for specific keywords with locations, like sacramentoplumberwarranty.com, where the site's content revolves around plumber warranty info, measures of a good warranty, plumbing warranty news (newsworthy issues), blogs, RCS - you get the idea... and sends both referral traffic and links to the main site. The idea is to then repeat the process with another company aspect, like napaplumbingprojects.com, where the content of the site is focused on cool projects, images, RCS, etc. Again, referring traffic and link juice to the main site. We realize that this adds to the amount of RCS that needs to be done, but that's exactly why we're here. Also, any thoughts on intentionally tying the brand to the location so you get URLs like acmeplumbingsacramento.com?
| AaronHenry1 -
Macrae's Blue Book Directory Listing
Does anyone know more information about this directory? Is it a good quality directory that I should pay to get listed on?
| EcomLkwd0 -
Hiding content or links in responsive design
Hi, I found a lot of information about responsive design and SEO, mostly theories and no real experiments, and I'd like to find a clear answer if someone has tested this. Google says: "Sites that use responsive web design, i.e. sites that serve all devices on the same set of URLs, with each URL serving the same HTML to all devices and using just CSS to change how the page is rendered on the device" (https://developers.google.com/webmasters/smartphone-sites/details). For usability reasons you sometimes need to hide content or links completely (not accessible at all by the visitor) on your page at small resolutions (mobile) using CSS ("visibility:hidden" or "display:none"). Is this counted as hidden content and could it penalize your site, or not? What do you guys do when you create responsive design websites? Thanks! GaB
| NurunMTL0 -
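For reference, a tiny sketch of the pattern asked about above: the markup is always present in the HTML served to every device, and only a CSS media query hides it on small viewports (the class name and breakpoint are placeholders):

```css
/* Secondary links visible by default (desktop/tablet) */
.secondary-nav { display: block; }

/* Hidden entirely on small viewports */
@media (max-width: 480px) {
  .secondary-nav { display: none; }
}
```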
Noindexing Thin Content Pages: Good or Bad?
If you have a massive number of pages with super-thin content (such as pagination pages) and you noindex them, once they are removed from Google's index (and if these pages aren't viewable to the user and/or don't get any traffic), is it smart to completely remove them (404), or is there any valid reason they should be kept? If you noindex them, should you keep all the URLs in the sitemap so that Google will recrawl them and notice the noindex tag? If you noindex them and then remove them from the sitemap, can Google still recrawl and recognize the noindex tag on its own?
| WebServiceConsulting.com0 -
A site is using their competitors' names in their Meta Keywords and Descriptions
I can't imagine this is a White Hat SEO technique, but they don't seem to be punished for it by Google - yet. How does Google treat the use of your competitors' names in your meta keywords/descriptions? Is it a good idea?
| PeterConnor0 -
Should I Do a Social Bookmarking Campaign and Tier 2 Linking?
I don't see anything bad in manually creating links on different (about 50) social bookmarking services. Is this method labeled as White Hat? I was wondering if it would be fine to create Tier 2 linking (probably blog comments) for indexing of the social bookmarking links? Please share your thoughts on the topic.
| zorsto0 -
Can one business operate under more than one website?
Is it possible for a business to rank organically for the same keyword multiple times with different web addresses? Say I sell car keys and I want to rank for "buy new car keys", and I set up two different websites, say ibuycarkeys.com and carkeycity.com, and then operate under both of these - would Google frown upon this?
| steve2150 -
Benefit of using 410 Gone over 404?
It seems like it takes Google Webmaster Tools forever to realize that some pages, well, are just gone. Truth is, the 30k-plus pages showing 404 errors were due to a big site URL architecture change. I wonder, is there any benefit to using 410 Gone as a temporary measure to speed things up in this case? Or, when would you use a 410 Gone? Thanks
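For reference, a minimal .htaccess sketch (Apache mod_alias) of how a 410 is typically returned - the paths are placeholders:

```apache
# Tell crawlers these URLs are gone on purpose, not just temporarily missing
Redirect gone /old-section/discontinued-page.html
RedirectMatch gone ^/old-architecture/.*$
```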
| bjs20100 -
What does a "back door link" mean? Please explain and I will give you credit
Someone is asking me to do back door links to each other's sites - what does it mean?
| Joseph-Green-SEO0 -
Does IP Blacklist cause SEO issues?
Hi, Our IP was recently blacklisted - we had a malicious script sending out bulk mail in a Joomla installation. Does it hurt our SEO if we have a domain hosted on that IP? Any solid evidence? Thanks.
| bjs20100 -
How to Get Backlinks to a Coupon Code Website
Hello Guys, I run a coupon code website, which by its very nature does not contain the most compelling of content. As you can probably understand, not many people are going to want to link to a page which lists a number of coupons relating to a specific online retailer. I am really struggling to come up with new and innovative ways of attracting links and wondered if anybody was in a similar position to me or could offer some advice. Would love to get some feedback. Thanks!
| Marc-FIMA1 -
Footer Link in International Parent Company Websites Causing Penalty?
Still waiting to look at the analytics for the timeframe, but we do know that the top keyword dropped on or about April 23, 2012 from the #1 ranking in Google - something they had held for years - and traffic dropped over 15% that month, with further slips since. Just looked at Google Webmaster Tools and see over 2.3MM backlinks from "sister" companies from their footers. One has over 700,000, the rest about 50,000 on average, all going to the home page, and all using the same anchor text, which is both a branded keyword and a generic keyword - the same one they ranked #1 for. They are all "nofollows", but we are trying to confirm if the nofollow was added before or after they got hit; regardless, Google has found them. To also add, most of the links are from their international sites, so .de, .pl, .es, .nl and other European country extensions. Based on this, I would assume the footer links and the timing were a result of the Penguin update and spam. The one issue is that the other US "sister" companies listed in the same footer did not see a drop; in fact some had increased traffic. And one of them has the same issue with the brand name, where it is both a brand name and a generic keyword. The only note I will make about the other domains is that they do not drive the traffic this one used to - there is at least a 100,000+ visitor difference between the main site and these additional sister sites also listed in the footer. I think I'm on the right track with the footer links, even though the other sites that have the same footer links do not seem to be suffering as much, but wanted to see if anyone else had a different opinion or theory. Thanks! Jen Davis
| LeverSEO0 -
Rel author and duplicate content
I have a question: on a site where I am the only author, will my site end up with duplicate content between the blog posts and the author archive pages, since they are the same? What is your suggestion in that case? Thanks
| maestrosonrisas0 -
"NOINDEX,FOLLOW" same as "NOINDEX, FOLLOW" ?
Notice the space between them - I am trying to debug my application and sometimes it puts in a space. Will this small difference matter to the bots?
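For reference, the two variants side by side. Crawlers parse the content attribute as a comma-separated list and ignore surrounding whitespace and case, so both forms below should be treated identically:

```html
<meta name="robots" content="NOINDEX,FOLLOW">
<meta name="robots" content="NOINDEX, FOLLOW">
```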
| bjs20100 -
Starting every page title with the keyword
I've read everywhere that it's vital to get your target keyword to the front of the title you're writing. Taking into account that Google likes things looking natural, I wanted to check whether writing titles like this, for example: "Photographers Miami - Find the Right Equipment and Accessories", repeated for every page (maybe a page on photography in Miami, one on videography in Orlando, etc.) is a smart way to write titles, or whether clearly stacking keywords at the front of every title won't be as beneficial as other ways of doing it?
| xcyte0 -
href="#" and href="javascript:void(0)" links. Is there a difference SEO-wise?
I am currently working on a site redesign and we are looking at whether href="#" and href="javascript:void(0)" links have an impact on the site. We were initially looking at getting the number of links per page down, but I am thinking that rel="nofollow" is the best method for this. Anyone had any experience with this? Thanks in advance
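A small illustration of the variants being compared, plus the rel="nofollow" option mentioned; openPanel() is a hypothetical handler, not a real API:

```html
<!-- href="#" jumps to the top of the page unless the handler cancels the default -->
<a href="#" onclick="openPanel(); return false;">Open panel</a>

<!-- javascript:void(0) does nothing by itself; the link relies entirely on the handler -->
<a href="javascript:void(0);" onclick="openPanel();">Open panel</a>

<!-- A normal crawlable URL where passing equity is discouraged -->
<a href="/login" rel="nofollow">Log in</a>
```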
| clickermediainc0 -
Cross linking websites of the same company - is it a good idea?
As a user I think it is beneficial, because those websites are segmented to answer each customer's needs, so I wonder if I should continue to do it or avoid it as much as possible if it damages rankings...
| mcany0 -
Hover texts for hyperlinks
I've seen certain websites where, upon hovering your mouse over a hyperlink, text is displayed. The concept is similar to the IMG ALT tag. Do you think there is value when it comes to hyperlinks? It's already an anchor link. Thoughts?
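The hover text in question is usually just the anchor's title attribute - a tiny example with an invented URL and text. It is generally treated as a usability aid rather than a ranking signal, so the anchor text itself matters far more:

```html
<a href="/flow-cytometry-guide"
   title="Beginner's guide to flow cytometry instrumentation">
  Flow cytometry guide
</a>
```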
| Bio-RadAbs0 -
Google Places vs. a position-one ranking above the Places listings
Hi Guys, Will creating a new Google Places listing for a business have any effect on their current position-one spot for their major geo-location keyword? E.g. "restaurants perth" - say they are ranking no. 1 above all the Places listings: if they set up a Places listing, would they lose that position and merge with all the other Places accounts? Or would they keep that organic listing as well as the Places listing? I have been advised it could be detrimental to set up the Places account; if this is the case, does anyone know any ways around this issue, as the business really needs a Places page for Google Maps etc.? Appreciate some guidance. Thanks. BC
| Bodie0 -
Would having a + plus sign between keywords in meta title have an effect on SEO?
I have seen one of my clients' competitors do this in their meta title and it got me a little intrigued... I understand that google uses the + sign as an operator in adwords, and to a certain extent, as a search tool, but would it help or make any difference to the SEO in the meta title/data (eg. 'SEO+Marketing+Services')? Thanks
| LexisClick10 -
Deny visitors by referrer in .htaccess to clean up spammy links?
I want to lead off by saying that I do not recommend trying this. My gut tells me that this is a bad idea, but I want to start a conversation about why. Since Penguin a few weeks ago, one of the most common topics of conversation in almost every SEO/webmaster forum is "how to remove spammy links". As Ryan Kent pointed out, it is almost impossible to remove all of these links, as these webmasters and previous link builders rarely respond. This is particularly concerning given that he also points out that Google is very adamant that ALL of these links are removed. After a handful of sleepless nights and some research, I found out that you can block traffic from specific referring sites using your .htaccess file. My thinking is that by blocking traffic from the domains with the spammy links, you could prevent Google from crawling from those sites to yours, thus indicating that you do not want to take credit for the link. I think there are two parts to the conversation... Would this work? Google would still see the link on the offending domain, but by blocking that domain are you preventing any strength or penalty associated with that domain from impacting your site? If for whatever reason this would not work, would a tweak in the algorithm by Google to allow this practice be beneficial to both Google and the SEO community? This would certainly save those of us tasked with cleaning up previous work by shoddy link builders a lot of time and allow us to focus on what Google wants in creating high quality sites. Thoughts?
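For context, a minimal sketch of the kind of referrer blocking being discussed, assuming Apache with mod_rewrite (the domains are placeholders). It only turns away visitors who click through; it does not remove the link from the other site, and there is no evidence it changes how Google credits that link:

```apache
RewriteEngine On
# Return 403 Forbidden to traffic referred by known spammy domains
RewriteCond %{HTTP_REFERER} spammy-domain-one\.com [NC,OR]
RewriteCond %{HTTP_REFERER} spammy-domain-two\.net [NC]
RewriteRule .* - [F]
```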
| highlyrelevant0 -
Disavow - Broken links
I have a client who dealt with an SEO that created some not-so-great links for their site: http://www.golfamigos.co.uk/. When I drilled down in Open Site Explorer there were quite a few links where the sites do not exist anymore - so I thought I could test out Disavow on them... maybe just about 6 - and then we are building good quality links to try and tackle this problem with a more positive approach. I just wondered what the consensus was?
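For reference, the disavow file itself is just a plain-text list uploaded through Google's disavow tool - comments start with #, and dead sites like the ones described are usually listed with the domain: prefix (the domains below are placeholders):

```text
# Spammy directories that no longer resolve
domain:spammy-directory-one.com
domain:spammy-directory-two.net
# A single page rather than a whole site
http://spam-blog-example.com/spun-golf-article
```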
| lauratagdigital0 -
Backlinks for the same IP address
Hi Everyone, I've been doing a backlink clean-up as my site has dropped quite a lot in the search engine results over the last 4 months. While doing the backlink clean-up I came across 20 different domains, all based in the Washington/VA area, all with the same IP address. To make matters worse, the content and the links to my site are all duplicated. Is this seen as bad practice from Google's perspective, i.e. a link network? I look forward to hearing your comments. Many thanks, Jonathan
| JonnytheB0 -
Why do websites use different URLs for mobile and desktop?
Although Google and Bing have recommended that the same URL be used for serving desktop and mobile websites, portals like Airbnb are using different URLs to serve mobile and desktop users. Does anyone know why this is done even though it is not good for SEO?
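When separate mobile URLs are used, Google's documented way of tying the two versions together is a pair of annotations - rel="alternate" on the desktop page pointing to the mobile URL, and rel="canonical" on the mobile page pointing back. A minimal sketch with placeholder URLs:

```html
<!-- On the desktop page, e.g. http://www.example.com/listing-123 -->
<link rel="alternate" media="only screen and (max-width: 640px)"
      href="http://m.example.com/listing-123">

<!-- On the mobile page, e.g. http://m.example.com/listing-123 -->
<link rel="canonical" href="http://www.example.com/listing-123">
```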
| razasaeed0 -
Site dropped suddenly. Is it due to .htaccess?
I had a new site that was ranking on the first page for 5 keywords. My site was hacked recently and I went through a lot of trouble to restore it. Last night, I discovered that my site was nowhere to be found, but when I searched site:mysite.com it was still showing, which means it was not penalized. I discovered the issue to be a .htaccess problem and it has been resolved. My question is: now that the .htaccess issue is resolved, will my site be restored back to the first page? Are there additional things that I should do? I have notified Google by submitting my site.
| semoney0 -
Off-page SEO and link building
Hi everyone! I work for a marketing company; for one of our clients' sites, we are working with an independent SEO consultant for on-page help (it's a large site) as well as off-page SEO. Following a meeting with the consultant, I had a few red flags about his off-page practices - however, I'm not sure if I'm just inexperienced and this is just "how it works", or if we should shy away from these methods. He plans to: guest blog, do press release marketing, and comment on blogs. He does not plan to consult with us in advance regarding the content that is produced, or where it is posted. In addition, he doesn't plan on producing a report of what was posted where. When I asked about these things, he told me they haven't encountered any problems before. I'm not saying it was spammy, but I'm not sure if these methods are leaning in the direction of "growing out of date", or the direction of "black hat, run away, dude". Any thoughts on this would be crazy appreciated! Thanks, Casey
| CaseyDaline0 -
Closing down site and redirecting its traffic to another
OK - so we currently own two websites that are in the same industry. Site A is our main site, which hosts real estate listings and rentals in Canada and the US. Site B hosts rentals in Canada only. We are shutting down Site B to concentrate solely on Site A, and will be looking to redirect all traffic from Site B to Site A; i.e. a user lands on the Toronto Rentals page on Site B and we forward them to the Toronto Rentals page on Site A, and so on. Site A has all the same locations and property types as Site B. On to the question: we are trying to figure out the best method of doing this that will appease both users and the Google machine. Here's what we've come up with (2 options). When a user hits Site B via Google/bookmark/whatever, do we: 1. Automatically/instantly (301) redirect them to the applicable page on Site A? 2. Present them with a splash page of sorts ("This page has been moved to Site A. Please click the following link [anchor-text-rich URL here] to visit the new page.")? We're worried that option #1 might confuse some users, and we're not sure how crawlers might react to thousands of instant redirects like that. Option #2 would be most beneficial to the end user (we're thinking) as they're being notified, on page, of what's going on. Crawlers would still be able to follow the URL that is presented within the splash write-up. Thoughts? We've never done this before. It's basically like one site acquiring another site; however, in this case, we already owned both sites. We just don't have time to take care of Site B any longer due to the massive growth of Site A. Thanks for any/all help. Marc
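If option #1 is chosen, the usual mechanism is a page-to-page 301 in Site B's .htaccess so each rentals page maps to its equivalent rather than everything hitting Site A's homepage. A sketch assuming Apache with mod_rewrite; the hostnames and paths are placeholders:

```apache
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?site-b\.com$ [NC]
# Like-for-like mapping where the equivalent page lives at a different path
RewriteRule ^rentals/toronto/?$ http://www.site-a.com/toronto-rentals [R=301,L]
# Everything else falls back to the matching path on Site A
RewriteRule ^(.*)$ http://www.site-a.com/$1 [R=301,L]
```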
| THB0 -
Does posting on Craigslist damage our SEO or reputation?
We have a website that's a single-person barbershop. She has been promoting on Craigslist, and that is outranking the website in the SERPs. However, the Craigslist results showing up are actually expired and don't link to anything; they just seem to be cached by Craigslist. My question is: is Craigslist generally considered not to be a good avenue for directing inbound links to services on your site? Or is it a good strategy to use Craigslist to build link traffic for service businesses? I get mixed responses when I search for this. Thanks
| smallpotatoes0 -
How Is Your Approach Towards Adult SEO?
I would like to know how SEOmoz community members approach adult SEO. How do you approach a project when you get one (if you do them, that is)? If you don't do adult SEO, why not? Is it because it's much more difficult than normal SEO, or do you not want to associate yourself with that industry?
| ConversionChamp0 -
Niche Directories
Hello, I like the concept that a directory is good only if you would still want the link if it passed no link juice. DMOZ and Best of the Web fall into that category. And so do some niche directories. How do you determine whether to go with a niche directory or not? Also, what's the ratio of how many niche directory links you'd want compared to other types of links (being very safe) 3:1? 4:1? 5:1? Does it matter what your other links are? Thanks.
| BobGW1 -
Rel Noindex Nofollow tag vs meta noindex nofollow
Hi Mozzers, I have a bit of a thing I was pondering this morning and would love to hear your opinion on it. We had a bit of an issue on our client's website at the beginning of the year. I tried to find a way around it by using wildcards in my robots.txt, but because different search engines treat wildcards differently it didn't work out so well, and only some search engines understood what I was trying to do. So here goes: we have a ?filter parameter on a big number of URLs on the website, pushed from the database - we use filters on the site so users can find what they are looking for much more easily, which results in database-driven ?filter URLs (those ugly &^% URLs we all hate so much). So what we're looking to do is implement nofollow,noindex on all the internal links pointing to the ?filter parameter URLs. However, my SEO sense is telling me that the noindex,nofollow should rather be in the individual ?filter parameter URLs' meta robots instead of on all the internal links pointing to the parameter URLs. Am I right in thinking this way? (The reason we want to put it on the internal links at the moment is that the development company states they don't have control over the metadata of these database-driven parameter URLs.) If I am not mistaken, noindex,nofollow on the internal links could be seen as PageRank sculpting, whereas on-page meta robots noindex,nofollow is more of a command, like your robots.txt. Has anyone tested this before, or does anyone have more knowledge on the small detail of noindex,nofollow? PS: canonical tags are also not doable at this point because we're still in the process of cleaning out all the parameter URLs, so +-70% of the URLs don't have an SEO-friendly URL yet to be canonicalized to. Would love to hear your thoughts on this. Thanks, Chris Captivate.
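One way to get noindex onto the ?filter URLs without touching page metadata or the internal links is to send it as an HTTP header from the server config. A sketch assuming Apache with mod_rewrite and mod_headers; the parameter name follows the question, everything else is a placeholder:

```apache
RewriteEngine On
# Flag any request whose query string contains the filter parameter
RewriteCond %{QUERY_STRING} (^|&)filter= [NC]
RewriteRule .* - [E=NOINDEX_FILTER:1]

# Send the robots directive as a header for flagged requests only
Header set X-Robots-Tag "noindex, follow" env=NOINDEX_FILTER
```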
| DROIDSTERS0 -
Rollover design & SEO
After reading this article http://www.seomoz.org/blog/designing-for-seo some questions came up from my developers. In the article it says: "One potential solution to this problem is a mouse-over. Initially when viewed, the panel will look as it does on the left hand side (exactly as the designer wants it), yet when a user rolls over the image the panel changes into what you see on the right hand side (exactly what the SEO wants)." My developers say: "Having text in the rollovers is almost like hiding text, and everyone knows in SEO that you should never hide text." In the article he explains that it is not hidden text since it's visible and readable by the engines. What are everyone's thoughts on this? Completely acceptable or iffy? Thanks
| DCochrane0 -
Site being targeted by hardcore porn links
We noticed recently a huge amount of referral traffic coming to a client's site from various hardcore porn sites. One of the sites has become the 4th largest referrer, and there are maybe 20 other sites sending traffic. I did a WHOIS lookup on some of the sites and they're all registered to various people and companies, most of them pretty shady-looking. I don't know if the sites have been hacked or are deliberately sending traffic to my client's site, but it's obviously a concern. The client's site was compromised a few months ago and had a bunch of spam links inserted into the homepage code. Has anyone else seen this before? Any ideas why someone would do this, what the risks are and how we fix it? All help and suggestions greatly appreciated, many thanks in advance. MB.
| MattBarker0 -
Why would links that were deleted by me 3 months ago still show up in reports?
I inadvertently created a mini link farm some time back by linking all of my parked domains (2000 plus) to some of my live websites (I was green and didn't think linking between same-owner sites/domains was an issue). These websites were doing well until Penguin, and although I did not get any "bad link" notices from Google I figure I was hit by Penguin. So about 3 or 4 months ago I painstakingly deleted ALL links from all of those domains that I still own (only 500 or so - the others were allowed to lapse). None of those domains have any links linking out at all, but old links from those domains are still showing up in WMT, in SEOmoz and in every other link tracking report I have run. So why are these links still reported? How long do old links stay in the internet archives? This may sound like a strange question, but do links "remain with a domain for a given period of time regardless"? Are links archived before being "thrown out" of the web? I know Google keeps archives of data that has expired, been deleted, websites that have closed, etc., for about 3 years or so (?). In an effort to correct a situation I have spent countless hours manually deleting thousands of links, but they won't go away. Looking for some insight here please. Cheers, Mike
| shags380 -
A Straight Answer to Outsourcing Backlinking, Directory Submission and Social Bookmarking
Hey SEOmoz Community! I've spent a bit of time now reading about SEO in books as well as online here within the SEOmoz community. However, I've still struggled to find a straight answer to whether or not directory submissions to non-penalized websites is acceptable.I suspect the reason I haven't found a straight YES or NO answer is because it isn't so straightforward and I respect that. My dilemma is as follows: I want to raise the domain authority for a few websites that I optimize for. I've submitted and gotten listed a bunch of excellent backlinks, however it still is a painfully slow process. My clients understandably want to see results faster, and because they have virtually no past outsourced link-building campaigns, I am beginning to think that I can invest some money for outsourcing directory submissions. I see more and more people talking about the latest Penguin updates, and how many of these sites are now penalized. BUT, is there any harm to submitting to directories such as the ones on SEOmoz's spreadsheet that aren't penalized? My concern is that in the future these will be penalized anyways, and is there a chance then that my site will also be de-listed from Google? At what point does Google completely 'blacklist' your site from its engine? Furthermore, I don't understand how Google can penalize a website to the point of de-listing it, because what would prevent other competitors from sending mass spammy back-links to another? What it all comes down to: At this point, are verified mass directory submissions through outsourcing still much more beneficial than detrimental to the ranking of a website? Thanks SEOmoz community, Sheldon
| swzhai0 -
Are backlinks to expired GoDaddy domains already reset by Google?
When does Google actually reset the backlinks to a domain? If I am buying expired domains from GoDaddy auctions for linking purposes, am I wasting my time? Also, if that's the case, what's the point of buying expired domains with many links pointing to them? If the backlinks to the expired domain still show up in my Google Webmaster Tools account, does that mean Google counts those links?
| Ryguy870