

19th July 2011 |  Written by Dave Ashworth

Is valid HTML and CSS set to become a ranking factor?

As a web developer turned SEO type, I’m always happy to see Google using technical factors within their ranking algorithm.

When page speed came in, I immediately set out to make my clients’ sites run faster than their competitors’, and in pretty much every instance this was achieved – basically because you’ll more than likely find a lot of SEOs, for whatever reason, don’t address the technical side of a website when it comes to optimising.  More than likely the reason is that the developer is far too busy on World of Warcraft to be concerned with things like download speed and the speeding up of…

We also saw some evidence, in the form of increased rankings, to show that taking the time to do so was worth the effort.

So, it again “excites” me in some ways to see Google’s webmaster team talking about validation of CSS and HTML as a measurement of code quality, mainly because they inform us of their “goal of always shipping perfectly valid code”.

This in no way means that the validity of HTML and CSS is, or ever will be, a ranking factor, but if it’s important for Google to ensure its own pages validate, then maybe somewhere down the line they will reward sites that attempt to do the same – it’s certainly something I’ve always hoped they would do.

For me, I will always endeavour to get valid HTML/CSS on all my sites – at least the bits that I code.  I wouldn’t suggest you obsess over 100% validation, because very often third-party plugins etc. will come along and spoil the party.

It’s easy enough to check, too: just use the Web Developer plugin for Chrome/Firefox, which lets you validate pages both live and locally.

The number one issue you are more than likely to face is the use of a raw “&” within URLs and page titles – simply replace it with &amp;
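For example (an illustrative link, not taken from any real site):

```html
<!-- Invalid: raw "&" in a query string -->
<a href="/products?cat=shoes&sort=price">Shoes</a>

<!-- Valid: the ampersand escaped as an entity -->
<a href="/products?cat=shoes&amp;sort=price">Shoes</a>
```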

Another common issue I find is that YouTube’s embed code doesn’t validate – but with a tweak or two it’s easily remedied.
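As a sketch of the sort of tweak I mean (VIDEO_ID is a placeholder, and the exact attributes you need depend on your doctype):

```html
<!-- YouTube's default embed: frameborder is obsolete, and the bare
     boolean attribute fails stricter (XHTML) validation -->
<iframe width="560" height="315" src="http://www.youtube.com/embed/VIDEO_ID"
        frameborder="0" allowfullscreen></iframe>

<!-- Tweaked: border handled in CSS, boolean attribute given a value -->
<iframe width="560" height="315" src="http://www.youtube.com/embed/VIDEO_ID"
        style="border:0" allowfullscreen="allowfullscreen"></iframe>
```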

For me, this is best practice stuff that I do anyway, but this post certainly gives you something to think about moving forward – I think the more factors introduced which aren’t as open to black hat methods the better (why would you try to manipulate code to be anything other than valid?).

But as I said, I wouldn’t obsess about it just yet – at least not until Google get their own house in order.

And yes, I am aware this page and the rest of the blog don’t validate – I’ll blame that on WordPress and the social media plugins…

The rest of the site validates just fine.

UPDATE

This article has provoked a little discussion in my Twittersphere:

“As long as it's not inaccessible code it has nothing to do with quality of content.” – Neil Yeomans (@neil_yeomans)

Let me point out, I don’t think that page speed and valid HTML should be more important ranking factors than quality content, links and so on – nor do I think a site should be penalised for not having valid code. But I do think that moving forward, particularly with mobile search growing rapidly, sites which aim to make mobile searching a better experience for the user – which valid HTML and page speed both do – should see some reward for the effort.

And if you’re still not convinced, then maybe this post about valid code on mobiles published on the Google Webmaster blog will get you thinking otherwise.

Your Comments

  1. I know it is unwise to question Google, but their measurement of validity seems too tight to me… the 0 bracket seems to house too low a figure.

    Any web dev knows that a server-side loop can repeat an HTML error, so although it’s only made once, it can be flagged 10+ times.
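    A hypothetical sketch of the point – one unescaped ampersand, written once in a template loop, shows up in every rendered row:

    ```python
    # Hypothetical template loop: the unescaped "&" appears once in the
    # source, but the validator flags it in every rendered row.
    rows = [f'<a href="/page?id={i}&view=full">Item {i}</a>' for i in range(10)]
    page = "\n".join(rows)

    # One source-level mistake, ten validator errors:
    print(page.count("&view"))  # prints 10
    ```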

    As for CSS, forget it. With the syntax of many CSS3 properties still not settled, to use them you need vendor prefixes, all of which are flagged as errors (even when validating against CSS3). For example, using just one gradient can give you five errors.
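    To illustrate, here’s a typical gradient stack of the moment (exact prefixes vary, and every prefixed line gets flagged by the CSS validator):

    ```css
    /* One gradient, five vendor-prefixed declarations, five "errors" */
    .header {
      background: -moz-linear-gradient(top, #fff, #ccc);    /* Firefox */
      background: -webkit-gradient(linear, 0 0, 0 100%,
                    from(#fff), to(#ccc));                  /* old WebKit */
      background: -webkit-linear-gradient(top, #fff, #ccc); /* Chrome/Safari */
      background: -o-linear-gradient(top, #fff, #ccc);      /* Opera */
      background: -ms-linear-gradient(top, #fff, #ccc);     /* IE10 previews */
    }
    ```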

    To categorise any site with more than 10 errors as the worst just seems ludicrous. As the above demonstrates, it doesn’t take much for a perfectly well-built site to spiral into 30+ errors. The 10+ bracket should house the very worst sites, sites that are tearing up the web with 250+ errors… not the sites described above.

    The page-speed inclusion was a good move, but for HTML & CSS validation to become a ranking factor, Google would either have to take a more lenient stance or just punish everyone. Whilst HTML can always do with a good tidy, the issues with CSS3 essentially mean that you’d be punishing the people trying to push the industry forward.

  2. Yep, I agree that it is a tough ranking factor to use fairly – and you’re spot on re: CSS. I spent years adding in “invalid” code just to get IE and Mozilla to both display min-height correctly; there’d have to be a level playing field across all browsers with regard to how they render HTML and CSS.
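    The sort of thing I mean, from memory (the underscore hack is deliberately invalid CSS that only IE6 parses):

    ```css
    /* Cross-browser min-height, circa IE6 – invalid on purpose */
    .panel {
      min-height: 200px;        /* standards browsers */
      height: auto !important;  /* ignored by IE6 */
      _height: 200px;           /* underscore hack: IE6 treats height as min-height */
    }
    ```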

    Though with the increase in mobile search, serving valid HTML that renders correctly will only become more important – I just think, why not: if two sites have great content, great links etc. and are on a level playing field, why not rank the one with “more valid” HTML higher?

    But as I said, I’ve a technical background with years of HTML and CSS under my belt, so I am biased towards that way of thinking.
