Remember the 16th of October, 2012; it's somewhat of a monumental date for SEOs. While speaking at PubCon yesterday, Matt Cutts announced that Google would be releasing the long-awaited disavow tool. The tool is meant to assist webmasters in discounting suspicious or spammy links to their sites, links that may have been penalizing them or confining them to Google's sandbox. This means webmasters have suddenly been given a lot more control over their link profiles and how their sites are perceived by Google's crawlers.
With the recent changes to Google's search algorithms, link building tactics which may have once been considered white hat are now entering the realm of the gray and black. This concern has become so widespread that the directories and sites which SEOs once flocked to for link submissions have begun charging fees to remove those very same links (twilight zone, I know!). In response, webmasters have been nagging Google on every front in pursuit of a disavow tool. But now that it's finally here, what are the implications? We'll walk you through using the tool and then address some of those questions. If you're already familiar with how the tool works, you can go ahead and skip to the second heading.
It's important to note that the tool by itself will not be enough to remove penalties already placed on sites affected by updates like Panda and Penguin. Webmasters will also need to supplement their disavowed links with a reconsideration request. With that, let's move on to the good stuff.
How to Use the Disavow Tool
Google has launched a dedicated page for the tool, where users can upload their disavowed links. In order to access the page, ensure that you are logged into the webmaster account under which the site you are trying to disavow links for has been added. Once you’ve reached the disavow page you’ll have the option to select the site in question.
Google's working hard to limit the tool's use, though, so you'll have to select disavow one more time! On the following screen you will be able to upload your links.
The link formatting is actually quite simple, and the links should be submitted as a plain text file. You will have two options: disavow entire domains, or disavow individual links. Users can add comments to their files using the # (hash) character, for the purposes of documenting or elaborating on the domains listed. The following example provided by Google illustrates how these commands would look.
# Contacted owner of spamdomain1.com on 7/1/2012 to
# ask for link removal but got no response
domain:spamdomain1.com

# Owner of spamdomain2.com removed most links, but missed these
http://www.spamdomain2.com/contentA.html
http://www.spamdomain2.com/contentB.html
http://www.spamdomain2.com/contentC.html
In this example, the user is disavowing all links from the domain "spamdomain1.com"; this is signified by the "domain:" operator in front of the URL. Following this are three absolute URLs with no operators; these are individual pages on www.spamdomain2.com which are linking to the user's site. Lines prefaced with # are treated as comments and are solely for the purposes of documentation. Simple enough, right?
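To avoid hand-formatting mistakes, a file in this shape can also be generated programmatically. Here's a minimal Python sketch; the helper function is my own illustration (not any official API), and the example domains and URLs are the placeholders from Google's sample above:

```python
# Sketch: build a disavow file in the format described above.
# One directive per line: "domain:example.com" disavows a whole domain,
# a full URL disavows a single page, and lines starting with "#" are comments.

def build_disavow_file(domains, urls, comment=None):
    """Return the text of a disavow file for the given domains and URLs."""
    lines = []
    if comment:
        # Every comment line must be prefixed with '#'
        lines.extend("# " + part for part in comment.splitlines())
    lines.extend("domain:" + d for d in domains)
    lines.extend(urls)
    return "\n".join(lines) + "\n"

text = build_disavow_file(
    domains=["spamdomain1.com"],
    urls=["http://www.spamdomain2.com/contentA.html"],
    comment="Contacted owner on 7/1/2012 but got no response.",
)
print(text)
```

The resulting text is exactly what you would save as a plain text file and upload on the disavow page.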
When to Use the Disavow Tool
Using the tool is simple, really; the real question is when to use it. According to Cutts, the disavow tool should mainly be used by webmasters who have received an unnatural link warning. In case you've never gotten such a warning (that's a good thing), it would look something like this:
But Cutts never explicitly says that it should only be used by those who have received this warning; it's more of a suggestion, and there's an important reason for this. "Power users" who understand the effect that certain types of links can have on a site will be able to stop those links in their tracks. So what exactly does that mean? Let's take a look at a case where a link from a highly authoritative site, one which would typically never register as an unnatural link, became cause for concern.
In July of 2012 I began an SEO campaign on a brand-new travel and hospitality site. As a (small) part of my efforts to establish an online social presence for this site, I began targeting highly authoritative, relevant blogs to leave quality comments on. One such blog was the Smith Travel Blog (STB), which had some pretty decent OSE metrics:
I left a comment on one of their posts, documented the link, subscribed to the comment feed and moved on with my campaign. Over the course of the next few weeks I saw my link profile skyrocket from 0 to around 1,400 links. As you can imagine, this was certainly alarming; I knew that I had obtained some links over that time frame, but nowhere near this amount.
Logging into Ahrefs, I examined the new and lost links for the week and found that the Smith Travel Blog accounted for some 90% of those links. My comment had clearly been aggregated across a majority of the site's automated comment feeds and was showing up as a sitewide link. Three problems here:
- Sitewide links for a comment slug with little or no relevance to any other pages on the site.
- An abrupt and rapid increase in inbound links.
- The fallout from the equally abrupt and rapid decrease in links as new comments are published.
You can see the resulting link chart below: over a period of two months, more than 1,400 links were registered and then lost. There was a noticeable effect on the site, as it was held in the sandbox for nearly three months before ranking in the top 100 for any of its target keywords. Thankfully, the site has since recovered as more natural, quality links have been added to the profile.
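The warning sign in that chart is the velocity of the change, not the raw count. A simple sketch of how one might flag that kind of spike in weekly link totals; the counts are made-up numbers loosely modeled on the 0-to-1,400 jump described above, and in practice this data would come from a backlink tool export (e.g. Ahrefs):

```python
# Sketch: flag weeks where the inbound-link count jumps abruptly.

def flag_spikes(weekly_counts, ratio=3.0, floor=50):
    """Return indexes of weeks whose link count is more than `ratio`
    times the previous week's, ignoring tiny baselines below `floor`."""
    spikes = []
    for i in range(1, len(weekly_counts)):
        prev, cur = weekly_counts[i - 1], weekly_counts[i]
        if cur >= floor and cur > ratio * max(prev, 1):
            spikes.append(i)
    return spikes

# Hypothetical weekly totals for a new site hit by a sitewide comment link
counts = [0, 12, 30, 1400, 1350, 200, 180]
print(flag_spikes(counts))  # the week of the ~1,400-link jump is flagged
```

A spike like this is exactly the kind of pattern worth investigating in the new/lost links report before Google's crawlers draw their own conclusions.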
The important point here, however, is that even though this site never received an unnatural link warning, it could have benefited tremendously from the disavow tool. Had the tool been available, I would have been able to add the following lines to my file:
# Posted blog comment which was aggregated sitewide on domain.com;
# Tried reaching out to webmaster but no response.
domain:domain.com
This would have effectively discounted any links from the STB site and would have likely kept me out of the sandbox.
The Tool and its Implications
While most SEOs will be elated at the release of the disavow tool, there is also some cause for caution. The tool enables webmasters to discount spammy links to their sites, but it also opens the floodgates for experimentation, a potential stepping stone to a new wave of web spam.
At its initial launch, the tool will comb disavowed links algorithmically. In other words, there won't be a human factor in determining if a link should be disavowed. This means that webmasters will be able to test link tactics on a variety of sites and measure the impact of those tactics on search rankings. If a specific tactic triggers negative signals, that user can simply disavow the experimental links. This shouldn't be misconstrued as a criticism of the tool itself; everyone in the office here is definitely excited about it. But there is also a consensus that the potential for abuse is significant.
Let’s take a look at an example of how a webmaster might exploit disavow. Please note that we do not condone such tests and that any tests that you do run are at your own risk. Suppose we buy 3 established domains: domain1.com, domain2.com and domain3.com.
On domain1.com one might add site-wide links in the footer to an external site.
If the site we are linking to experiences an immediate (1-2 day) boost in rankings, we can determine that this tactic has a short-term effect. But if rankings retreat back to their original levels, or drop further, in the days that follow, we can take this as a negative indicator.
But the site we are linking to will have other variables influencing its rankings. Another way to measure this tactic is to look at the drop in rankings of the site hosting the footer link.
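Putting rough numbers on that: one crude way to quantify the before/after effect is to compare average daily rank on either side of the day the footer link went live, for both the target site and the hosting site. The daily rank figures below are invented purely for illustration (lower numbers mean better positions):

```python
# Sketch: compare average rank before vs. after a link tactic goes live.

def avg(xs):
    return sum(xs) / len(xs)

def rank_delta(daily_ranks, change_day):
    """Average rank after the change minus average rank before it.
    Negative means the site moved up; positive means it dropped."""
    before = daily_ranks[:change_day]
    after = daily_ranks[change_day:]
    return avg(after) - avg(before)

# Hypothetical daily ranks; the footer link goes live on day 3
target_site = [42, 41, 43, 30, 28, 35, 44, 45]  # brief boost, then a slide
host_site = [12, 12, 13, 14, 16, 19, 22, 25]    # steady decline after linking

print(rank_delta(target_site, change_day=3))  # negative: short-term gain
print(rank_delta(host_site, change_day=3))    # positive: hosting site paid for it
```

This is obviously noisy with only one site per tactic, which is exactly why a webmaster running these tests would lean on the disavow tool to undo any experiment that goes wrong.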
For domain2.com, one might create a subdomain on sites like Squidoo or Webs and regularly publish anchor-rich content linking back to domain2.
Likewise, if we see that regular publication of anchor-rich content is bringing rankings down, we can determine this is a negative trigger. On the other hand, we could also experiment solely with posting high-quality articles to these sites with no anchor text, but including a bio link. Again, the idea here is to observe and measure.
On domain3.com, one might create backlinks to the site from a list of similar directories (with similar domain names, OSE metrics, PageRank, and geographical location).
This will enable web spammers to test which directory sites pass PageRank, and the benefit here is obvious: avoid low-quality directories that either penalize or have no impact on your site, and submit to sites that have demonstrated a transferal of site or page authority.
In all cases we are able to measure the impact on search rankings and then react appropriately: either disavow links that we have seen negatively impact rankings, or amplify tactics that seem to work until we reach a breaking point, the threshold at which we know we are overusing the tactic.
So, What’s Wrong with Testing?
As an SEO I'd like to say "nothing"; in my personal opinion, observation and experimentation are what separate the pros from the amateurs. But the potential for abuse here is rampant, and many folks will be experimenting with tools like SENuke or Xrumer again, blasting their content across the web to determine what still works and what doesn't.
We’re going to have to rely on Google to keep those types of characters under wraps, and they likely will, but probably not during the initial launch phase.
Of course, the implications of the disavow tool for web spam are far more wide-reaching, but seeing as I'm trying to keep up with Google's freshness algorithm, this needs to be published ASAP.
So I want to ask the readers to chime in here: How will disavow impact the nature of spam and the web, beyond SEOs' ability to manage their link profiles? How do you think Google will respond to abuse of the tool? I'd love to hear your thoughts and suggestions in the comments below.
See other great posts on the disavow tool at: