Toxic Link Removal
-
Greetings Moz Community:
Recently I received a site audit from a Moz-certified SEO firm.
The audit concluded that, technically, the site did not have major problems (unique content, good architecture), but it identified a high number of toxic links. Out of roughly 1,300 links, approximately 40% were classified as suspicious, 55% as toxic, and 5% as healthy.
After identifying the specific toxic links, the SEO firm wants to submit a Google disavow request, then manually request that the links be removed, and then submit a final disavow request to Google for the remaining bad links. They believe they can get about 60% of the bad links removed.
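From what I can tell, the disavow request itself is just a plain-text file uploaded through Google's Disavow Links tool, with one entry per line and # for comments. A minimal sketch of what the firm would submit might look like this (the domains below are made up for illustration):

```text
# Draft disavow file (hypothetical domains)
# Lines beginning with # are comments and are ignored.

# Disavow every link from an entire domain:
domain:spammy-directory-example.com
domain:link-farm-example.net

# Disavow one specific linking URL:
http://article-network-example.org/some-page.html
```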
Only after the removal process is complete do they think it would be appropriate to start building new links.
Is there a risk that this strategy will result in a drop of traffic with so many links removed (even if they are bad)?
For me (and I am a novice) it would seem more prudent to build links at the same time that toxic links are being removed. According to the SEO firm, the value of the new links in Google's eyes would be reduced while there are many toxic links pointing to the site, so building them now would be a waste of resources.
While I want to move forward efficiently, I absolutely want to avoid the risk of a drop in traffic.
I might add that I have not received any messages from Google regarding bad links, but my firm did engage in link building in several instances, and our traffic did drop after the Penguin update of April 2012.
Also, is there value in having a professional SEO firm remove the links and build new ones? Or is this something I can do on my own? I like the idea of having a pro take care of this, but the costs (audit, coding, design, content strategy, local SEO, link removal, link building, copywriting) are really adding up.
Any thoughts???
THANKS,
Alan -
Hi Jen:
That is very helpful, thanks!!
The one point I did not understand is the last one, regarding checking to see if the C-blocks are varied. Could you please elaborate?
Also, do you think it would be risky for me, as an amateur, to do this on my own, and that link removal would be better left in the hands of a professional? I am working with a reputable SEO firm, but they are requesting almost $3,800 to identify and remove approximately 225 domains that have toxic links to my site. If I use a professional SEO firm, I would probably want to conserve my resources for link building ($2,500/month). But I don't want to be penny wise and pound foolish. So do you think I could disavow bad links on my own?
Also, would you suggest any software or tools for doing so?
Thanks so much.
Alan -
I think Jen gave a great response and you should read it twice!
A couple of things you might consider if you want to do this on your own: RMOOV.com is an amazing tool for contacting webmasters and requesting/tracking link removal, and Link Detox is another great, affordable tool for evaluating links. If you still have a relationship with the firm you used to buy links, see if they can remove those links for you. The reality is that most webmasters won't respond to your requests to remove links, so if you can get the people who created them to remove them, you will have more success.
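If you're comfortable with a little scripting, a lot of the do-it-yourself work is really just list wrangling. Here's a rough Python sketch of the idea; the CSV column names ("source_url", "classification") are assumptions, so adjust them to whatever your audit or Link Detox export actually contains:

```python
# Rough sketch: turn a link-audit export into an outreach list and a
# draft disavow file. Column names are assumptions about your export.
import csv
from urllib.parse import urlparse

def build_lists(audit_csv_path):
    toxic_domains = set()
    with open(audit_csv_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            if row["classification"].strip().lower() in ("toxic", "suspicious"):
                domain = urlparse(row["source_url"]).netloc.lower()
                if domain:
                    toxic_domains.add(domain)

    # Outreach list: one domain per line, for webmaster contact and tracking.
    with open("outreach_list.txt", "w", encoding="utf-8") as f:
        f.write("\n".join(sorted(toxic_domains)) + "\n")

    # Draft disavow file for whatever never gets removed.
    with open("disavow_draft.txt", "w", encoding="utf-8") as f:
        f.write("# Draft disavow file -- review every line before uploading\n")
        for domain in sorted(toxic_domains):
            f.write(f"domain:{domain}\n")

    return toxic_domains

# Example usage (hypothetical file name):
# build_lists("link_audit_export.csv")
```

Nothing here replaces the human review; it just saves you from copying domains around by hand.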
I don't see why it would be bad to build good, real links in the meantime, or at any time! It's hard to believe that would be the advice a Moz-recommended firm gave you. Maybe they were trying to explain that great content is what matters?
Good luck!
-
Hi Alan,
Hmm, I don't see their logic in saying the value of good links would be reduced. It's true that the toxic links may be harming your rankings (even if you don't have a manual penalty), so you might not see the effect of the good links straight away. But once the bad links are sorted, you will.
You could do the disavows yourself as long as you're confident about what makes a good or bad link. When we're cleaning links up, we:
- Check to see if the linking domain is listed in Google – if not, we disavow;
- Check what the website actually is – if it is low quality, a spam directory, adult-themed, sites with viruses, gambling, etc, we disavow (this is to protect your brand as well as to clean up your link profile);
- Check to see if there are a huge number of links from a single website – in some cases we disavow;
- Check to ensure the link is within Google's webmaster guidelines – for example, genuine recommendations in forums, genuine blog comment links on relevant articles, or genuine reviews are fine; and finally,
- We look at the links' C-blocks before the process begins and then check again after the disavow to see if they are varied or if anything's been missed (there's a rough sketch below of how you might check this yourself).
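Since you asked about C-blocks: a C-block is just the first three octets of a site's IPv4 address (the /24 network). If a lot of your linking domains share the same C-block, they're probably hosted together and may be part of one network. A minimal sketch of how you could check this yourself (the domain names in the usage example are hypothetical, and it only looks at IPv4):

```python
# Minimal sketch: resolve each linking domain to an IP and group by the
# first three octets (the "C-block", i.e. the /24 network). Many domains
# collapsing into a few C-blocks can indicate a link network.
import socket
from collections import defaultdict

def c_block_report(domains):
    blocks = defaultdict(list)
    for domain in domains:
        try:
            ip = socket.gethostbyname(domain)
        except socket.gaierror:
            continue  # domain no longer resolves; skip it
        c_block = ".".join(ip.split(".")[:3])  # e.g. "203.0.113"
        blocks[c_block].append(domain)

    # Print the busiest C-blocks first.
    for c_block, members in sorted(blocks.items(), key=lambda kv: -len(kv[1])):
        print(f"{c_block}.x -> {len(members)} linking domain(s): {', '.join(members)}")
    return blocks

# Hypothetical usage:
# c_block_report(["example-blog-1.com", "example-blog-2.com", "example-directory.net"])
```

A shared C-block isn't automatically bad (plenty of legitimate sites sit on shared hosting) – it's just one more signal to weigh before you disavow.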
Hope that helps.
Jen