2.2.2. Criticisms

If standards organisations ought to aim, as the IETF does, to achieve technical excellence; prior implementation and testing; clear, concise and easily understood documentation; openness and fairness; and timeliness in their specifications,[1] then the grounds upon which such organisations can be, and frequently are, criticised follow naturally: technical mediocrity, lack of field implementation or testing, obscure documentation, closed or partial procedures, and delay. There are standards organisations and processes that have been accused of all of these things.[2]

However, rather than examining each of these failings, this subsection of the thesis will focus on three specific areas of criticism common to the IETF and W3C that are of particular relevance to the practice of Internet governance. These are criticisms of whether private standards bodies can make decisions by consensus within their membership; if so, whether they do make decisions by such a process; and if so, whether they should make decisions by such a process: in short, criticisms of their effectiveness, their inclusiveness, and their legitimacy.

2.2.2.1. Effectiveness

A weakness of the standards processes of both the IETF and the W3C is the ease with which they can be disrupted by those who, whether because they have a proprietary specification of their own to push or for some other reason, are able to stymie the achievement of consensus on the acceptance of a competing standard. This has been observed in the case of S/MIME and OpenPGP, and in that of SPF and SenderID; in both cases the outcome was to fragment the standards landscape into two competing segments, neither of which might ever reach the status of a full Internet standard.

Although this is a criticism of the IETF and W3C processes, in a sense it reveals no fault in those processes. After all, they produced exactly the outcome they were intended to produce: that in the absence of consensus, there should be no standard. It is considered better not to specify a standard at all than to release a so-called standard that a segment of the affected Internet community refuses to implement. To the extent that this policy can be criticised, so too can that of any other organisation that operates by consensus.

As Chapter 4 will discuss in more detail,[3] the answer to this criticism, such as it is, is that when consensus fails, another mechanism of governance will determine the dominant specification: typically, this mechanism will be markets (though it could also be rules). Once this mechanism has run its course, the specification most successful in the marketplace (or that which has been mandated by law) can be returned to the standards body to be formalised as a standard.

2.2.2.2. Inclusiveness

On the other hand, not every failure of the Internet standards development processes of the IETF or W3C can be attributed to differences between stakeholders. On other occasions, those bodies’ failure to produce a standard can be attributed to deficits in the design or implementation of their processes, which have prompted the development of competition from other standards bodies, or in some cases from other mechanisms of governance altogether.[4]

At the root of these procedural deficiencies is a lack of inclusiveness in the standards development process. For example, in 2004 the Web Hypertext Application Technology Working Group (WHATWG),[5] a rival to the W3C with no membership fees, was formed in response to concerns about “the W3C’s direction with XHTML, lack of interest in HTML and apparent disregard for the needs of real-world authors.”[6]

Similarly, in 2006 the W3C was publicly accused of failing to acknowledge or respond to comments on a specification, even from one of its own staff, leading another long-time commentator to allege, “Beholden to its corporate paymasters who alone can afford membership, the W3C seems increasingly detached from ordinary designers and developers.”[7] In response to such criticisms, in 2007 the W3C relaunched an HTML working group designed to facilitate the active participation of some of its critics.[8]

As for the IETF, whilst its membership may be more open than that of the W3C in theory, in practice it is a meritocracy that can be quite impenetrable to non-technical stakeholders.[9] A self-critical RFC from 2004 frankly acknowledged this problem:

The IETF is unsure who its stakeholders are. Consequently, certain groups of stakeholder, who could otherwise provide important input to the process, have been more or less sidelined because it has seemed to these stakeholders that the organization does not give due weight to their input.[10]

2.2.2.3. Legitimacy

This leads to the third main criticism, mirroring a similar criticism made of ICANN: that the IETF and W3C have strayed into areas of public policy without being legitimately entitled to do so, whether by virtue of carrying a democratic mandate to develop policy or of having established a broad community consensus. Whilst they do establish a consensus within the Internet technical community in support of the specifications they standardise, as noted above this consensus is in general neither broad nor community-based.

For example, in 1995 the W3C developed a specification called PICS, or Platform for Internet Content Selection,[11] which provided Web publishers with the ability to mark their pages with computer-readable metatags rating the content of the page. It was envisioned that this would enable parents and teachers to proactively restrict children’s access to certain Internet content, without the need for that content to be censored altogether.

The W3C’s press release about PICS[12] proudly announced that it had received input from 23 companies and organisations, most of which were ISPs, media or software companies, and only one of which, the Center for Democracy and Technology (CDT), claimed to represent users. It should not therefore have come as a surprise to the W3C to find that opposition to the technology began to mount from quarters it had not consulted during its development.[13] These critics maintained that PICS could be used just as easily by a paternalistic ISP or a repressive government as by parents or teachers, to filter out PICS-rated content automatically and without any input from the end user.[14]
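By way of illustration, the logic that such filtering requires is trivially simple. The following sketch in Python is hypothetical: the rating categories, numeric scales, thresholds and function names are illustrative only, and are not drawn from any actual PICS rating service. It shows the kind of automated check that a parent’s filtering software, an ISP’s proxy or a national gateway could apply to a PICS-rated page, in each case without any input from the end user.

    # Hypothetical sketch of PICS-style filtering. The same threshold check can
    # be run by a parent's browser add-on, an ISP's proxy or a national gateway;
    # the category names and numeric scales below are illustrative only, not
    # those of any real PICS rating service.

    RATING_THRESHOLDS = {"violence": 1, "sex": 0, "nudity": 0, "language": 1}

    def is_blocked(page_ratings: dict) -> bool:
        """Return True if any rated category exceeds its configured threshold."""
        return any(page_ratings.get(category, 0) > limit
                   for category, limit in RATING_THRESHOLDS.items())

    # Whether the thresholds are set by a parent for one household or by a
    # government for an entire country, the check itself is identical; only the
    # party configuring it, and the end user's ability to override it, differ.
    print(is_blocked({"violence": 2, "sex": 0}))       # True: exceeds threshold
    print(is_blocked({"violence": 0, "language": 1}))  # False: within thresholds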

A lesson that the W3C might have drawn is that altering the architecture of the Internet so as to compromise its inherent values, such as interactivity, openness, egalitarianism and resilience,[15] is to tinker with its very foundations. To do so essentially for public policy reasons, with input from only one representative of users and none from governments, was brash to say the least. Lessig states of PICS, “Given that [the consortium] is a pretty powerful organization, it should be more open. If they want to do policy, they have to accept the constraints on a policy-making body, such as openness in who can participate.”[16]

The IETF placed itself in a similar position to the W3C when making a policy decision not to include support for wire-tapping in the protocols it develops, despite the fact that national legislation or policy might require wire-tapping to be conducted on networks utilising those protocols.[17] This decision was less publicly controversial than the introduction of PICS, perhaps because it was more congruent with the Internet’s underlying values (or perhaps because it could be naïvely characterised as an abstention from action on the public policy issues in question).

Even so, the decision was one with considerable public policy implications, made without consultation outside the IETF’s membership. This raises questions about the democratic legitimacy of the process, questions that will be revisited in the conclusion to this chapter and in Chapter 3.[18]

For now, it may at least be concluded that in standards development, as in technical coordination, public policy issues are inherently engaged, and that standards development bodies cannot abdicate responsibility for policy development by denying or ignoring that this is so.

Notes

[1] IETF, The Internet Standards Process—Revision 3 (1996).

[2] Waclawsky, John G, Closed Architectures, Closed Systems And Closed Minds (2004).

[3] See Section 4.4.4.

[4] This process is also described at Section 4.2.4.4.

[5] See http://www.whatwg.org/.

[6] WHATWG, The WHATWG and HTML 5 FAQ (2006).

[7] Zeldman, Jeffrey, An Angry Fix (2006); and another, “The process is stacked in favour of multinationals with expense accounts who can afford to talk on the phone for two hours a week and jet to world capitals for meetings”: Clarke, Joe, To Hell with WCAG 2 (2006).

[8] W3C, W3C Relaunches HTML Activity (2007).

[9] See Section 4.2.3.1.

[10] IETF, IETF Problem Statement (2004).

[11] See http://www.w3.org/PICS/.

[12] W3C, Industry and Academia Join Forces to Develop Platform for Internet Content Selection (PICS) (1995).

[13] ACLU, Fahrenheit 451.2: Is Cyberspace Burning? (2002).

[14] Graham, Irene, Will PICS Torch Free Speech on the Internet? (1998).

[15] See Section 1.3.1.

[16] Attributed to Lessig in Garfinkel, Simpson L, The Web’s Unelected Government (2002), 4 (brackets in original).

[17] IETF, IETF Policy on Wiretapping (2000).

[18] See Section 3.4.1.