Whilst working on a client site last week, we noticed crawl errors in Google Webmaster Tools: over 100 404s had been reported.
Initially there was confusion, because the style of URL was one the blog did not produce – all of its URLs are SEO friendly and keyword rich. Whilst the pages obviously did not exist, the bigger issue was that the links were not, and to our knowledge never had been, on the pages Google reported them to be from. When a link is reported on the 14th, you check on the 15th with no sign of it, and you know no changes have been made to the site, then something is amiss.
As 404s shouldn’t affect a site’s performance, and as we couldn’t find any sign of how the pages were being picked up, the errors were marked as fixed – only for them to come back a few days later!
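One way to dig into phantom 404s like these is to check the server's own access logs for the requests, which also reveal the referer and user agent behind each hit. The sketch below is a minimal, hypothetical example that assumes the common Apache/Nginx "combined" log format; the sample log lines and paths are made up for illustration:

```python
import re

# Combined log format:
# IP - user [date] "METHOD path HTTP/x" status size "referer" "user-agent"
LOG_RE = re.compile(
    r'^(?P<ip>\S+) \S+ \S+ \[(?P<date>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) [^"]*" '
    r'(?P<status>\d{3}) \S+ "(?P<referer>[^"]*)" "(?P<agent>[^"]*)"'
)

def find_404s(lines):
    """Return (path, referer, user-agent) for every 404 in the log lines."""
    hits = []
    for line in lines:
        m = LOG_RE.match(line)
        if m and m.group("status") == "404":
            hits.append((m.group("path"), m.group("referer"), m.group("agent")))
    return hits

# Hypothetical sample entries, not real log data
sample = [
    '66.249.66.1 - - [14/May/2012:10:00:00 +0000] '
    '"GET /a-real-post HTTP/1.1" 200 512 "-" "Googlebot/2.1"',
    '66.249.66.1 - - [14/May/2012:10:00:05 +0000] '
    '"GET /9999 HTTP/1.1" 404 312 "http://example.com/some-post" "Googlebot/2.1"',
]
print(find_404s(sample))
```

Filtering the output by a Googlebot user agent would show whether Google really is requesting those URLs and which pages it claims to have found them on.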
There is more on this issue on SEO Round Table, which includes a comment from Disqus who basically say it’s Google’s fault – but hopefully the two can work together to resolve it.