W3C's credibility rocked by the failure of "Do Not Track" and its embrace of DRM on the Web
The W3C's tracking preference working group has been seeking to develop a specification for a standard called "Do Not Track" or DNT. The concept behind this standard (which I've written about in detail elsewhere) was to specify how a website or advertiser should respond to a notification expressed by a user (typically through a browser setting) that they do not wish to be tracked online.
A few hard-working consumer representatives – notably privacy advocate Jonathan Mayer of Stanford University – invested enormous effort in injecting public interest considerations into the development of the standard. Exasperated by the lack of progress, Mayer quit the group this August, followed shortly afterwards by the Digital Advertising Alliance (DAA), which declared the process a colossal failure. Although it's not yet official, the working group is essentially dead.
Why? Because this is not the sort of standard that can be developed through an industry-led process, such as those to which the W3C is suited. The W3C is, as its name implies, essentially an industry consortium, with annual membership fees that start in the thousands and run into the tens of thousands of dollars (does this sound "grassroots" to you?). The ultimate decision-making power in the organisation lies with its Director, who has shown his willingness to override community views in favour of those of corporate members. Another example of this is his unpopular decision, confirmed last week, to allow the W3C to add support for DRM-protected content to the official specification for the World Wide Web.
The W3C's process is simply unsuitable for making any progress on a technical standard that involves disputed public policy issues, particularly those that impact broader community interests. These public policy issues have to be resolved first – generally at a political level, and preferably through a more structured multi-stakeholder process – before it becomes possible to develop a technical standard on the basis of those decisions. This was the mistake of the European Commission and the US Federal Trade Commission (FTC) in abdicating responsibility for regulating online tracking, in favour of the W3C process.
By and large, an industry body will not adopt public interest considerations unless forced to do so by a firm regulatory hand. About ten years ago, as an IT lawyer in Australia, I chaired what was described as a "co-regulatory" industry panel that was responsible for developing a new policy for the Internet industry. Co-regulation is essentially a form of self-regulation that occurs under the oversight of a regulator. In the Australian model, the regulator can direct an industry member to comply with a registered co-regulatory code, even if that member did not individually agree to its terms.
The concept is that industry will develop a strong code in the knowledge that if the regulator isn't satisfied with it, it will face direct regulation instead. The practice, as I can attest from my experience as chair and from subsequent work as a consumer advocate, is that industry will grudgingly accept a code that doesn't significantly impact its existing operations, but will actively obstruct anything stronger.
Continue reading in "Web Consortium's failures shows the limits of self-regulation" at Digital News Asia.
This work is licensed under an Attribution-ShareAlike Creative Commons license.