There's been quite a response to Internet Explorer 8's proposed system of Version Targeting. If you haven't heard of this, here's a quick overview: the IE team is proposing that there be a metadata tag recognizable by IE 8 that will instruct it how to render a page (as IE 6, IE 7, or IE 8). If the tag isn't present, then the default will be IE 7.
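Based on the proposal as published, the opt-in is an http-equiv meta element in the page's head. A sketch of what it might look like (the exact attribute values are taken from the announced proposal, which could still change):

```html
<!-- Opt this page into IE 8's rendering engine.
     Without this element, IE 8 falls back to IE 7 rendering. -->
<meta http-equiv="X-UA-Compatible" content="IE=8" />

<!-- Or, per the proposal, always request the newest engine available: -->
<meta http-equiv="X-UA-Compatible" content="IE=edge" />
```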

Why is this needed? Because the IE developers have a huge dilemma: backwards compatibility. There are loads and loads of sites designed specifically for IE 6 (for various reasons, some nefarious; I won't go there), so they had to make many compromises when designing IE 7 to keep those sites working, all while receiving a lot of flak from the web design community that IE 7 didn't do enough to address standards compliance. With IE 8 having just passed the Acid2 test, it will probably contain a lot of breaking changes that could harm tons of existing sites.

This seems to me like an elegant solution. They can effectively freeze existing sites to the IE 7 rendering (since those sites don't specify the meta tag, they get IE 7 by default) while allowing web designers to explicitly take advantage of the more advanced features and better standards compliance of IE 8.

From a developer perspective, this seems natural. I'm used to writing .NET code that's bound to a specific version of the .NET runtime, just as Java developers are with the JRE, PHP developers with the PHP runtime, and so on. Rendering differences in HTML, CSS, and JavaScript across browsers are brutal for developers, and even though practices like progressive enhancement can help, it seems that version targeting (if widely supported) could make developers' lives a lot easier. Software engineering is hard enough.

Instinctively, however, this may give off a bad Microsoft anti-competitive, IE-only vibe. It reminds me a little of pages detecting your browser and denying access if you're not running IE. Browser sniffing has a fairly unreliable history as a technique, and browser makers have often been misleading, to put it mildly. In the past, Opera placed "MSIE" in its user agent string, and at one point Safari had the phrase "like Gecko" in its (with unintended consequences that you can probably imagine if you've ever had to parse a string).
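To see why that "like Gecko" phrase caused trouble, consider the kind of naive substring sniff that was common at the time. This is a hypothetical sketch (the function name and the abbreviated Safari user agent string are illustrative, though Safari's user agents really did contain "like Gecko"):

```javascript
// Naive sniff: assume any user agent containing "Gecko" is a
// Gecko-based browser such as Firefox.
function looksLikeGecko(userAgent) {
  return userAgent.indexOf("Gecko") !== -1;
}

// Safari's historical user agent included the phrase "like Gecko",
// so the substring check above misidentifies it as a Gecko browser.
var safariUA =
  "Mozilla/5.0 (Macintosh; U; PPC Mac OS X; en) " +
  "AppleWebKit/125.2 (KHTML, like Gecko) Safari/125.8";

looksLikeGecko(safariUA); // true, even though Safari is WebKit-based
```

The page then serves Gecko-targeted markup to a WebKit browser, which is exactly the kind of silent breakage that made sniffing so unreliable.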

This is not like browser detection, though; it goes the other way. The metadata signal allows a web page to establish a contract between itself and the browser, rather than the browser guessing what the page expects.

From what I've read, there seem to be a lot of negative perspectives on this issue, but here are a few voices I've run across that take a more practical view of the situation: