Up until now, no aspect of the search engine ranking algorithm has rewarded a site that loads quickly over one that takes a while. However, at PubCon Matt Cutts revealed that this could well change in 2010, as Google looks to make faster sites appear higher in the rankings.
For me personally this is a good thing – not because we host our sites on fast servers, but because I’ve always been one for ensuring a site’s pages conform to standards and produce valid HTML. Not that you’d notice, but a page written with valid HTML will load marginally faster than one littered with errors and warnings – blink and you’ll miss the difference, though.
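To give a flavour of the sort of markup errors a validator catches, here’s a minimal sketch in Python using the standard library’s `html.parser`. The `TagChecker` class is my own invention for illustration – it only spots misnested or unclosed tags, which is nothing like the full checking the W3C validator does:

```python
from html.parser import HTMLParser

# Void elements never take a closing tag, so they stay off the stack
VOID = {"area", "base", "br", "col", "embed", "hr", "img", "input",
        "link", "meta", "param", "source", "track", "wbr"}

class TagChecker(HTMLParser):
    """Rough well-formedness check: flags tags closed out of order
    or left open. Not a substitute for a real validator."""
    def __init__(self):
        super().__init__()
        self.stack = []     # currently open tags
        self.problems = []  # human-readable issues found

    def handle_starttag(self, tag, attrs):
        if tag not in VOID:
            self.stack.append(tag)

    def handle_endtag(self, tag):
        # A closing tag should match the most recently opened tag
        if self.stack and self.stack[-1] == tag:
            self.stack.pop()
        else:
            self.problems.append(f"unexpected </{tag}>")

    def check(self, html):
        self.feed(html)
        # Anything still on the stack was never closed
        self.problems.extend(f"<{t}> never closed" for t in self.stack)
        return self.problems

checker = TagChecker()
# Misnested list items and an unclosed paragraph
print(checker.check("<ul><li>one<li>two</ul><p>done"))
```

Browsers quietly recover from markup like that, which is exactly why the errors pile up unnoticed – the page still renders, it just makes the parser work harder.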
As a web developer in a previous life/job, with a keen eye and attention to detail when it comes to building sites, I’ve always aimed to produce websites with valid HTML, and I’m hoping that one day sites with fewer errors will be favoured over those littered with “bad” HTML – for me it’s another aspect of good search engine optimisation.
I’m not holding my breath though: even the Google home page, in all its simplicity, has 41 errors and 2 warnings.