Last week, Google’s Head of Webspam, Matt Cutts, released a highly important video to shed more light on using the Google Disavow Tool to negate the effects of low quality links on a domain. This new video detailed whether the Disavow Tool should be used even if a site hasn’t received a manual penalty warning in Webmaster Tools:
If you are unable to watch the video above, the short answer is yes – although the tool should be used as a last resort. This is in line with what Cutts has previously said about the tool; however, in earlier videos on the topic, he suggested that it should only be used alongside a reinclusion request:
This change opens up a whole new avenue for webmasters who have had penalty issues due to low-quality link building, negative SEO from competitors, or even scraper sites linking to them.
The cynics among us may think this new video from Cutts is simply Google opening the floodgates to gather as much information on low-quality links as possible, so it can further improve its algorithm and hit penalised sites harder. That does not seem to be the case, however: it correlates with improvements we have seen for a client who recently joined Return On Digital, whose lower-quality links were disavowed despite the site never having received a manual penalty. Whilst previous Google algorithm updates had brought traffic drops, the graph below highlights the traffic improvement from the latest Penguin update (signified by the red line) after we had submitted a disavow file containing every link that went against Google’s guidelines.
Be warned: submitting a disavow file for a site without a manual penalty is not a quick fix. From our work, we have seen that disavow lists are taken into account when Google updates its algorithm, and as the gap between the Penguin 2.0 and 2.1 updates was just under four months, it can take a while for a site to begin to recover.
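For anyone preparing their own submission, the disavow file itself is just a plain-text file uploaded through the Disavow Tool in Webmaster Tools: lines beginning with # are comments, a domain: entry disavows all links from that domain, and a bare URL disavows an individual page. A minimal sketch (the domains and URLs below are hypothetical examples, not real sites):

```text
# Contacted site owner on 01/10/2013 requesting link removal – no reply
domain:spamdirectory.example

# Individual pages we could not get removed
http://www.scrapersite.example/copied-article.html
http://www.linkfarm.example/listings/page2.html
```

Note that disavowing at the domain: level is generally safer for obvious spam sources, as it catches any further links that domain creates, whereas individual URLs are better suited to otherwise legitimate sites with one or two bad pages.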
So, whilst Matt Cutts’ latest video has been met with little fanfare compared to his others, it offers a genuine lifeline – both to those who were worried their sites would eventually be hit and to webmasters who thought their websites were already lost.