I've been meaning to post some version of this forever, and it's been getting in the way of me blogging more. So ... here goes... incomplete and warty as this post is.
I've come to think that many of these differences might stem from an "implementation" vs. "specification" view, but I'll have to say more about that later....
The ongoing battle for future control over HTML is dominated not only by the usual forces ("whose technology wins?") but also by some very polarized views of what standards are, what they should be, how standards should work and so forth. The debate over these principles has really slowed down the development of web standards. Many of these issues were also presented at the November 2010 W3C Technical Plenary in the "HTML Next" session.
I've written down some of these polarized viewpoints, as an extreme position and a counterposition.
Matching Reality:
- Standards should be written to "match reality": the standard should follow what (some, all, most, the important, the open source) systems have implemented (or are willing to implement in the very near future).
- Standards should try to "lead reality": The standard should try to move things in directions that improve modularity, reliability, and other values.
Of course, having standards that do not "match reality" in the long run is not a good situation, but the question is whether backward compatibility with (admittedly buggy) implementations should dominate the discussion of "where standards should go". If new standards always match the "reality" of existing content and systems, then you could never add any features at all. But if you're willing to add new features, why not also try to 'fix' things that are misimplemented or done badly? There does need to be a transition plan (how to make changes in a way that doesn't break existing content or viewers), but that's often feasible.
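A common transition mechanism for exactly this situation is feature detection: content checks for a new or "fixed" capability and falls back to established behavior when it is absent, so a change to the standard doesn't break existing viewers. A minimal sketch (the `fancyQuery` method is a hypothetical new API, used purely for illustration, not anything from an actual spec):

```javascript
// Feature detection: prefer a new capability when the host provides it,
// and fall back to the older, widely deployed behavior otherwise.
// `fancyQuery` is a hypothetical new API used only for illustration.
function findElements(host, selector) {
  if (typeof host.fancyQuery === "function") {
    return host.fancyQuery(selector); // new-feature path
  }
  return host.oldQuery(selector);     // fallback path for older hosts
}

// One host that implements the new feature, and one that does not:
const newHost = {
  fancyQuery: (s) => ["via-new:" + s],
  oldQuery: (s) => ["via-old:" + s],
};
const oldHost = {
  oldQuery: (s) => ["via-old:" + s],
};
```

The point is that a standard can deprecate or repair misimplemented behavior while still giving authors a well-defined way to serve both old and new implementations during the transition.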
Precision:
- Standards should precisely specify behavior, and give sufficient details for how to implement something "compatible" with what is currently deployed, sufficiently that no user will complain that some implementation doesn't work "the same". Such behavior MUST be mandated by the standard.
- Standards should minimize the compliance requirements to allow widest possible range of implementations; "interoperability" doesn't necessarily mean that even badly written web pages must be supported. Conformance ("MUST") should be used very sparingly.
Personally, I'm more on the second (minimal-conformance) side: the more precisely behavior is specified, the narrower the applicability of the standard. There's a tradeoff, but it seems better to err on the side of under- rather than over-specifying, if a standard is going to have long-term value. If a subset of implementations wants a more precise guideline, that could go in a separate implementation guide or profile.
Leading:
- Standards should lead the community and add exciting new features. New features should ideally appear first in the standard.
- Standards should follow innovative practice only after wide experience with technology. Sample implementations should be widely reviewed and tested; only after wide experience with technology should it be added to the standard.
In general, standards should follow innovation. Refinements during the standardization phase might be seen as "leading", in order to satisfy the broader requirements brought to bear as the standard gets reviewed. There's a compromise, but looking for innovation from a committee... well, we all know about "design by committee".
Extensibility:
- Non-standard extensions should be avoided. Ideally, we should eliminate any non-standard extensions; everyone's experience should be the same.
- Non-standard extensions are valuable. Innovations have (and will continue to) come from competing (non-standard) extensions, including plugins. Not all plugins are universally deployed; sites can choose to use non-standard extensions if they want.
In the past, plugins and other non-standard extensions have fueled new features; why should this trend stop? There are trade-offs, but moves to eliminate non-standard extensions or make them less viable are counter-productive.
Modularity:
- Modularity is disruptive. Independent evolution of components leads to divergence and confusion. Independent committees go their own way. Subsets just mean unwanted choices and chaos.
- Modularity is valuable. Specifying technology into smaller separate parts is beneficial: the ability to choose subsets extends the range of applications; modules can evolve independently.
Modularity is important, but it has to be done "right". Architecture recapitulates organizational structure; separate committees with independent specs require a great deal of good-faith effort to coordinate, and there's not a lot of "good faith" going around.
Timely:
- Standards take too long; move faster. Implementing and shipping the latest proposal is a good way to validate proposed standards and get technology in the hands of users. Standards that take years aren't interesting.
- Encouraging users to deploy experimental extensions before they are completed will cause fragmentation, because not all experiments succeed.
The community can see innovation pretty quickly, but good standards take time. I'd rather see experimental features presented as "proposals" than misleadingly passed around as "the standard".
Web Content Authors Ignore Standards:
- Web authors don't care about standards. Most individual authors, designers, developers and content providers ignore standards anyway, so any effort based on assuming authors will change isn't helpful.
- Influencing authors is possible. Authors can and will adopt standards if popular browsers tie new features to standard-conforming content.
I'm not convinced that influencing content authors is impossible. Doing so requires some agreement from "leading implementors" to give authors sufficient feedback to make them care, but this isn't impossible. It's happened in other standards when it was important.
Versionless Standards and Always On Committee:
- Standards committees should be chartered to work forever, because the technology needs to evolve continuously. A stable "standard" is just a meaningless snapshot. Standard committees should be "always on", to allow for rapid evolution. The notion of "version numbers" for standards is obsolete in a world where there are continual improvements.
- Standards should be stable. Continual innovation is good for technology suppliers, but bad for standards; evolution should be handled by allowing individual technology providers to innovate, and then to bring these innovations into standards in specific versions.
We shouldn't guarantee "lifetime employment for standards writers". A stable document should have a long lifetime, not subject to constant revision. If we're not ready to settle on a feature, it should likely move into a separate document and be designed as a (perhaps proprietary) extension. An "always on" committee is more likely to concentrate power in the few who can afford to commit resources, independently of how deeply they are affected by changes.
Open Source:
- Standards should always have an open source implementation. Allowing any company or software developer to provide their own private extensions is harmful; a content standard should be managed by the group of major (or major open source) implementors, so that any "standard" extension is available to all.
- Open source is useful but unnecessary. Proprietary extensions and capabilities (originally from a single source or a consortium) have benefited the web in the past and will continue to be sources of innovation. While "open source" may be beneficial, not everything will or can be open source.
Working on open source implementations can go hand in hand with working on standards. However, a standard is very different from open source software. In the end, users care about compatibility across a wide variety of implementations.
The "Web" is defined by "What Browsers Do":
- The web is first and foremost “what browsers do”, and secondly a source of "web applications" technology (browser technology used for installable applications).
- Other needs can dominate browser needs. Web technologies extend to the widest range of Internet applications, including email, instant messaging, news distribution, syndication and aggregation, help systems, and electronic publishing; the requirements of these applications should have equal weight, even when those requirements are meaningless for what “browsers” are used for.
Royalty Free:
- Avoid all patented technology. Every component of a browser MUST be implementable without any restriction based on patents or copyright (although creation tools, search engines, analysis, translation gateways, and traffic analysis may not be).
- Patented technology has a place. In some cases, patented technology cannot be avoided, or is so widespread that “royalty free” is just one more requirement among many tradeoffs.
Forking:
- Forking a spec allows innovation. Having multiple specifications which offer different definitions of the same thing (such as HTML) allows leading features to be widely known and implemented, and allows groups to work around organizational bottlenecks.
- Forking a spec is harmful. Multiple specifications which claim to define the same thing is a power trip, causing confusion.
Accessibility:
- Accessibility is just one of many requirements. Accessibility is an important requirement for the web platform, but only one of many sets of requirements, to be traded off against the requirements of other user communities when developing standards.
- Accessibility is not an option. Ensuring that those who deploy products implementing W3C standards allow building accessible content is necessary before W3C can endorse or recommend that standard.
Architecture:
- Architecture is mainly theoretical; it is not a very useful concern. Invoking "architecture" is mainly a way of adding requirements that serve no practical purpose.
- Architecture and consistency are crucial. Consistency between components of the web architecture, and guidelines for consistency and orthogonality, are important enough that existing work should slow down to ensure architectural consistency.
And a few other topics I ran out of time to elaborate:
Digital Rights Management: DRM is Evil? DRM is an Important feature?
Privacy: Up to browsers? Mandated in specs?
Voice: Integrated? Separate spec?
Applications: Great? A misuse of the browser?
JavaScript: Essential, stable? Fundamentally broken?
Nice overview. Leading, extensibility and timeliness seem to me to overlap to some extent. Or at the very least, they affect each other significantly. Another possible area of contention is whose consensus is required to alter the specification.
Was it really necessary to sprinkle ‘font color="…"’, ‘em’ and ‘strong’ throughout the post, though?
I think all of these interact to some degree, but not necessarily. I'd like to see more focus on teasing out what the real issues are... is it necessary to shut down 3rd party extensibility in order to ensure that the "standard" is leading?
I'm pretty sure that "consensus" is the result of "compromise" and even then, it's necessary to have "rough consensus" with some way of figuring out winners and losers. And there is lots of gaming around "consensus" -- minor details of process have big effects on who can play.
I'm sorry about the post formatting, it wound up in blogspot through an awkward process.
I would guess that yes, if the goal is to always have the standard leading, then extensibility is necessarily disallowed.
Perhaps “consensus” was a poor choice of words. I was referring to who’s got the final say (and how many entities form that who) on what goes into the spec.
Re: the formatting, ah, I see. I will soldier on regardless. ;-)
Phenomenal. A balanced approach almost always ends up being the best approach, though it can be hard to assess your own beliefs without an external measuring stick.
This post is a reminder that many of the things my gut tells me are "right" actually stray into ideological extremes.
Ah yes, I remember this presentation. I disagree with a couple of your conclusions (e.g. I still prefer standards to be relatively small and modular, and relatively precise) but overall this is a nice way to think about the differences we see, which helps explain *some* of the politics and the gaps in understanding.
I like small modular standards too. I thought XHTML modularization was a great step forward.
As for 'precision', I think the word 'precision' is used incorrectly (or at least inappropriately) to describe why some specification writers want to give mandatory-to-implement algorithms. That is, the standard specifies behavior in ways that are unnecessary for interoperability. I don't really think that accomplishes 'precision', although that's what the proponents claim as justification.