2.2.1. Standards bodies

Although Internet standards are not the sole province of the IETF, as the body responsible for developing the large majority of such standards it is unquestionably the Internet’s pre-eminent standards development body, and will be the focus of this section. It will not be possible to describe all of the dozens of other standards organisations that have played a part in Internet standards development, but two other particularly significant bodies, the W3C and the ITU, will also be discussed.

Amongst the standards groups that will not be described in detail, but should be briefly mentioned here, are the International Organization for Standardization (ISO),[1] the IEEE (the Institute of Electrical and Electronics Engineers) and ETSI (the European Telecommunications Standards Institute).

The ISO, formed in 1947, is a network of generalised national standards institutes, such as Standards Australia[2] and the American National Standards Institute (ANSI),[3] coordinated by a Central Secretariat in Geneva; it also cooperates with regional bodies such as the European Committee for Standardization (CEN). Some of its members are governmental organisations, but ISO membership is equally open to private sector national standards groups that are the most representative of standardisation efforts in their country.

The ISO does not usually initiate the development of specifications, but rather receives those that have already been approved as standards by one or more of its members or by other international standards organisations. From this point, a specification progresses towards recognition as an ISO standard within a Technical Committee, of which there are presently 192 (some of which are inactive).

As far as Internet standards development is concerned, the relevant Technical Committee is JTC 1, which is unique in being the ISO’s only joint committee, convened with the International Electrotechnical Commission (IEC),[4] a body which in fact predates the ISO. JTC 1 has 18 SubCommittees grouped into 11 “Technical Directions,” and each SubCommittee may in turn have a number of Working Groups. For example, the Moving Picture Experts Group, which is responsible for the MPEG family of video and audio compression standards, is Working Group 11 within SubCommittee 29 of JTC 1.

Examples of Internet standards developed by the ISO/IEC are SGML (Standard Generalized Markup Language), which formed the inspiration for the W3C’s simpler HTML and later XML, and the image format commonly known as JPEG,[5] whose namesake, the Joint Photographic Experts Group, is a joint ISO/IEC and ITU-T committee.

The contributions of the IEEE and ETSI to standards development include the IEEE 802.11 wireless networking standard, and the ETSI GSM (Global System for Mobile Communications) standard for digital mobile telephony networks. ICANN’s Technical Liaison Group (TLG) includes ETSI amongst its members, along with the IAB, the W3C and the ITU.

2.2.1.1. IETF

The IETF has no formal membership; it provides an inclusive technical forum for anyone who wishes to participate in Internet standards development. At each stage in the development of a proposed standard, the specification is discussed and debated on public electronic mailing lists and at three open meetings held each year. Whilst fees are payable by those who attend meetings, no fee is required to participate on the public mailing lists where most of the IETF’s work takes place. Those participating in the IETF do so in their capacity as individuals, not as representatives of their employers.

A macroscopic view of the unique structure of the IETF has already been given above.[6] Here the internal operations of the organisation will be described in more detail.

The IETF is currently divided into eight technical Areas. Work in each Area is managed by an Area Director, who is appointed to the position for two years by the IETF’s Nominating Committee (NomCom). The Area Directors and the Chair of the IETF make up the IESG, which bears overall responsibility for the technical management of the IETF’s activities.

Within each of the Areas are numerous short-term Working Groups established to work on specific projects, usually the development of specifications for a proposed Internet standard. Each Working Group has a Chair, and may have a number of subcommittees known as “design teams” which often perform the bulk of the work in drawing up the specification.

The charter of a Working Group, detailing its preliminary goals and schedules, is developed before its formation at a BOF (“Birds of a Feather”) meeting, which is called by the relevant Area Director upon the application of interested parties. If the BOF so resolves, the Area Director will be asked to recommend to the IESG that the Working Group be formally established. Each Working Group establishes its own operating procedures, which are generally not legalistic, and may vary its own charter as circumstances require.

The outcomes of a Working Group’s deliberations are usually published, eventually, in the form of one or more RFCs. This is not to say, however, that all RFCs are destined to become Internet standards. In part this is because the position of RFC Editor is not exclusively an IETF function, being overseen by the IAB and predating the IETF by almost two decades. In fact most RFCs are simply informational, and are identified as such in their document header and by an identifying “FYI” code.[7] Amongst the informational RFCs are documents on the IETF itself, such as RFC 3233, which provides a definition of the IETF, and RFC 3716, which is a report on its recent administrative restructuring.[8]

RFCs that are intended to become Internet standards develop out of documents known as Internet drafts, which are normally generated by the relevant Working Group (although an individual outside a Working Group may also submit one). To progress an Internet draft towards promotion as a standard, the Working Group, through its Area Director, may recommend to the IESG that it be accepted as a “Proposed Standard.” The IESG will do so if it considers that the specification has undergone the requisite community review, and is generally stable, well understood and useful.

A six-month discussion period on the new Proposed Standard follows, at the conclusion of which it may be reconsidered by the IESG to determine whether it should be promoted to the status of a “Draft Standard.” A Draft Standard must be sufficiently stable and unambiguous that applications can be developed by reference to it. At this point the specification is expected to undergo only minimal revision, and there should also be at least two complete, independent and interoperable implementations of the standard in software.

In practice, few specifications progress further than this. However, the IETF standards process does allow for those that have become very stable and widely used to be promoted by the IESG from Draft Standard to full Internet Standard after a further four months of discussion. The general criteria for acceptance of an RFC as an Internet Standard have been described as “competence, constituency, coherence and consensus.”[9] Consensus is required not only within the Working Group, or even the technical Area from which the specification originated, but across the IETF as a whole, which includes anyone who subscribes to its public mailing lists.[10]

The IESG can decline to progress an otherwise technically competent and useful specification towards Internet Standard status if it determines that the specification has not gained the requisite degree of consensus. A recent example is provided by the SPF[11] and the competing Sender ID[12] Internet Drafts, both intended to address the problem of spam emanating from forged addresses. The two specifications, the first a community-developed document and the second based on a Microsoft proposal, each provide a facility for recipients to verify that email bearing a certain domain name came from a source authorised to send that domain’s email.
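In outline, both schemes have the owner of a domain publish an authorisation policy in the DNS, which a receiving mail server queries when a message arrives claiming to come from that domain. The following minimal sketch of that first step uses the third-party dnspython library (version 2.x is assumed, and the domain is a placeholder); a real SPF check would go on to evaluate the record’s directives against the sending host’s address:

```python
import dns.resolver

def find_spf_policy(domain):
    """Return the SPF policy string published in the domain's TXT records, if any."""
    try:
        answers = dns.resolver.resolve(domain, "TXT")
    except (dns.resolver.NXDOMAIN, dns.resolver.NoAnswer):
        return None
    for rdata in answers:
        # TXT record data arrives as a tuple of byte strings.
        text = b"".join(rdata.strings).decode("ascii", errors="replace")
        if text.startswith("v=spf1"):
            return text  # e.g. "v=spf1 mx ip4:192.0.2.0/24 -all"
    return None

print(find_spf_policy("example.org"))  # placeholder domain
```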

The IETF formed a Working Group intended to reconcile the two drafts and produce a standards-track specification. However, owing to each side’s intransigence, the compromises required to reconcile either draft with the other could not be made, and the Working Group was eventually disbanded without reaching consensus. The result is that each specification has been approved to proceed only as an Experimental RFC, and neither is likely to gain Internet Standard status.

A more successful recent example of the practical operation of the Internet standards development process in the IETF is that of DNSSEC.[13] DNSSEC (DNS Security Extensions) adds the facility for DNS information to be authenticated through the use of digital signatures. The importance of this is that the DNS as originally specified does not certify the authenticity of responses received to DNS queries. In practical terms, this means that an Internet user who accesses a certain domain cannot be certain that the Web site that appears in response actually belongs to the registered owner of that domain, rather than an imposter.
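As an illustration only, the following sketch uses the third-party dnspython library (2.x assumed; the zone and resolver address are placeholders) to fetch a zone’s DNSKEY records together with their RRSIG signatures and verify that the key set is validly self-signed. A real validator would go further, building a chain of trust from a trusted root key down to the zone:

```python
import dns.dnssec
import dns.message
import dns.name
import dns.query
import dns.rdataclass
import dns.rdatatype

def check_zone_keys(zone, resolver_ip):
    """Verify that a zone's DNSKEY RRset is signed by one of its own keys."""
    name = dns.name.from_text(zone)
    # Ask for DNSSEC records (RRSIGs) to be included in the response.
    request = dns.message.make_query(name, dns.rdatatype.DNSKEY, want_dnssec=True)
    response = dns.query.udp(request, resolver_ip, timeout=5)
    keys = response.find_rrset(response.answer, name,
                               dns.rdataclass.IN, dns.rdatatype.DNSKEY)
    sigs = response.find_rrset(response.answer, name,
                               dns.rdataclass.IN, dns.rdatatype.RRSIG,
                               dns.rdatatype.DNSKEY)
    # Raises dns.dnssec.ValidationFailure if the signatures do not verify.
    dns.dnssec.validate(keys, sigs, {name: keys})

check_zone_keys("se.", "192.0.2.53")  # placeholder zone and resolver address
```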

The technical Area of the IETF dealing with DNS is the Internet Area. A DNS Working Group already existed within that Area when DNSSEC was first proposed in 1995, so in this instance it was not necessary to go through the process of forming one. It nonetheless took two years until the first Internet draft developed by the Working Group was published as an RFC, to which the IESG allotted the status of a Proposed Standard.[14]

Two years later again, in 1999, the specification was refined into a new RFC[15] which obsoleted the earlier one while retaining its Proposed Standard status. A new version of the most popular DNS software, BIND (Berkeley Internet Name Domain),[16] supporting the new DNSSEC specification was released that same year. This implementation of DNSSEC revealed practical problems that required an addition to the specification.

For the publication of this addition, the specification was divided into three Internet drafts. These became RFCs in March 2005,[17] still retaining Proposed Standard status. By May 2005 there was a second implementation of the latest specification,[18] bringing the RFCs closer to progression to Draft Standard status, though this has yet to occur. The first ccTLD to employ DNSSEC in its operations, using the latest version of BIND, was se (Sweden), in October 2005.

The deployment of DNSSEC within the global DNS root is likely to take somewhat longer, since it raises the political question of whether ICANN, the NTIA, or some more broadly-based body ought to possess authority to sign the root zone.[19] If DNSSEC is eventually to be accepted as a full Internet Standard, this will likely occur only once this political issue has been resolved and the signed root zone has been in successful operation for a number of years.

2.2.1.2. W3C

The World Wide Web Consortium,[20] or W3C, is an unincorporated body formed in 1994 by Tim Berners-Lee, the software engineer who designed the protocols that define the Web. The W3C develops standards for the World Wide Web that are known as W3C Recommendations. The IETF’s relationship with the W3C is a cooperative one, in which the IETF has formally ceded control over standards development in the Web space to the W3C.[21]

The main distinction between the W3C and the IETF is that the W3C has from its inception been a paid membership-based organisation, with a sliding membership fee which, as at 2008, ranges from USD$953 for small corporate, non-profit or governmental members in developing countries, up to USD$65,000 for full corporate membership in developed countries. This funding supports a full-time staff who assist in administration, research, and the design and development of software conforming to the specifications developed by the organisation.[22]

This difference aside—and it is not a small difference—the organisation operates in a similar manner to the IETF in that members are expected to collaborate, through a variety of Working Groups, on the development of open technical specifications to support and enhance the infrastructure and features of the World Wide Web.[23]

As the IETF’s Working Groups work within a number of Areas, so the W3C’s Working Groups work within defined Activities, of which there are presently 24. A new Activity or Working Group is usually formed following the successful conclusion of a Workshop on the relevant topic (similar in principle to an IETF BOF), typically arranged by the W3C’s staff (its “Team”) in response to a member’s submission.

Unlike in the IETF, Working Group membership is not open to the public, save that invited experts not affiliated with any W3C member may be co-opted to a group by its Chair. The first release of a proposed Web standard by a Working Group is known as a “Working Draft” (though, as with RFCs, there are also some Working Drafts that are not intended to become Recommendations). Comments on the Working Draft are solicited from both within and outside the W3C for a minimum period of three weeks. Once these comments have been addressed in writing, the specification may be progressed to the stage of a Candidate Recommendation.

A Candidate Recommendation is required to be implemented in software, preferably in two interoperable forms, before it may progress to a Proposed Recommendation. Comments on a Proposed Recommendation are received for a minimum period of four weeks. The specification finally reaches the status of a W3C Recommendation once it has been endorsed by the W3C Director and the members at large, through an Advisory Committee to which each W3C member appoints a representative and which meets in person twice each year.

The W3C’s Working Groups are guided by an Advisory Board on issues of strategy, management, legal matters, process, and conflict resolution. The Board’s nine ordinary members are elected for two-year terms by the Advisory Committee. The Board’s Chair is appointed by the Team.

The Working Groups are also guided on technical issues related to Web architecture by a Technical Architecture Group (TAG). Five of the TAG’s eight ordinary members are elected by the Advisory Committee for two-year terms, with the balance of its members, and the Chair, being appointed by the W3C Director.

The Director, Tim Berners-Lee, hears appeals from the decisions of Working Group Chairs. He is also responsible for assessing the consensus of the Advisory Committee, for example as to a proposal for the creation of a new Activity. The role of Director is not an elected one, with Berners-Lee essentially holding the position in perpetuity as the W3C’s benevolent dictator.

2.2.1.3. ITU

The International Telecommunication Union[24] was established in 1865, originally as the International Telegraph Union, to regulate international telegraph transmissions. It became an agency of the United Nations in 1947. The ITU is now divided into three sectors: the Radiocommunication Sector or ITU-R, the Telecommunication Standardization Sector or ITU-T, and the Telecommunication Development Sector or ITU-D. Unless otherwise noted, references to the ITU in this thesis are to the ITU-T.

Broadly, the ITU’s equivalents to Areas or Activities are its Study Groups, of which there are presently thirteen, and its equivalents to ad hoc Working Groups are Working Parties (which delegate the actual technical work still further, to so-called Rapporteur Groups). Both Study Groups and Working Parties meet face-to-face on a variable schedule, and their meetings are not open to the public. A World Telecommunication Standardization Assembly (WTSA), held at least every four years, approves the structure and work programme of the Study Groups and the draft Recommendations that they produce.

Until quite recently this meant that a telecommunications standard could not be developed in fewer than four years. Since 2000, however, a faster Alternative Approval Process (AAP) has been available, and self-organised Focus Groups have been introduced as an alternative to the Working Parties established by Study Groups, enabling some Recommendations to be finalised more quickly. The use of the AAP is restricted to Recommendations that do not have policy or regulatory implications, and which therefore do not require formal consultation with Member States.

A Telecommunication Standardization Advisory Group, constituted by representatives of the ITU membership and convening between WTSA meetings, performs a role akin to that of the W3C’s Advisory Board in reviewing and coordinating the activities of the Study Groups. The General Secretariat is the staff of the ITU which manages its administrative and financial affairs, headed by a Secretary-General and a Deputy Secretary-General. Within the General Secretariat is the Telecommunication Standardization Bureau (TSB), which exercises oversight over the ITU-T process at large, and whose Director is elected by the members.

The ITU’s membership comprises governments, which join as Member States, and, since 1994, private organisations, which join as Sector Members. In 2007–2008, membership fees ranged from CHF 19,875 for developing Member States and CHF 31,800 for Sector Members, up to CHF 12.72 million and CHF 2.54 million respectively. Up to 25% of the Member States form the Council of the ITU, which spans all three sectors and guides the policy of the Union between the four-yearly Plenipotentiary Conferences at which all members meet.

Until they are released, ITU Recommendations are not open for public comment (though a Study Group may request permission to open its email mailing lists or FTP area to outsiders). Indeed, even once released, copies of ITU Recommendations must ordinarily be purchased. In response to criticism of this policy, since 2001 registered users have been offered three free electronic copies of Recommendations per year. The ITU’s definition of an “open standard” does not preclude charging for the provision of a specification, nor for the use of intellectual property embodied in it.[25]

Historically, the ITU has had little involvement in Internet standards development. Its experience lies in the tightly-regulated, hierarchically managed world of circuit switched telecommunications. But well aware of the advance of the packet switched technology underlying the Internet’s TCP/IP protocol suite, and of the incipient convergence of IP and traditional telephony, the ITU has lately attempted to enter the Internet standards space, relying on the breadth of the definition of “telecommunications” in its Constitution for its mandate to do so.[26]

Its first significant entrée into the world of data networking came as early as 1982, when the ITU introduced the OSI (Open Systems Interconnection) suite of network protocols, building on its earlier X.25 suite, which it intended as computer networking standards.[27] OSI had much going for it, not least the backing of the ISO, which approved the OSI specifications as official standards. The IETF even established an Area devoted to the integration of OSI with the Internet’s protocols.[28] Yet OSI has been a resounding failure.[29]

The poor reception of the ITU’s networking standards is often attributed to the fact that they are complex, generally hierarchical in design and, compared to Internet standards, over-engineered. For example, like their predecessor X.25, the OSI protocols placed Postal, Telegraph and Telephone (PTT) authorities firmly at the top of a hierarchy, and assumed that computer owners would interconnect with those authorities’ networks rather than directly with each other.[30] In comparison, Internet standards are generally much simpler, more likely to be decentralised in design, and more amenable to implementation in multiple interoperable forms.

For example, the ITU’s X.400 standard for email is broadly equivalent to the IETF Internet standard SMTP (Simple Mail Transfer Protocol),[31] though its specification is very much larger and more complex.[32] It was assumed that X.400 mail servers would be operated by centralised PTTs; the standard specified, for example, automated procedures for X.400 messages to be transferred to facsimile, telex and postal mail services. An individual or business wishing to send X.400 email to a third party had to pay for carrier network access; in Australia’s case, Telstra charged $20 per hour for access to its X.400 network.[33]
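The contrast is visible even at the level of application code. An SMTP transaction is a short plain-text dialogue that any client can speak, as this sketch using Python’s standard smtplib module shows (the host and addresses are placeholders):

```python
import smtplib
from email.message import EmailMessage

msg = EmailMessage()
msg["From"] = "alice@example.org"  # placeholder addresses
msg["To"] = "bob@example.net"
msg["Subject"] = "Greetings"
msg.set_content("SMTP is a short, human-readable dialogue.")

# send_message() conducts the whole EHLO / MAIL FROM / RCPT TO / DATA
# exchange; set_debuglevel(1) prints that plain-text dialogue as it runs.
with smtplib.SMTP("mail.example.org", 25) as server:  # placeholder host
    server.set_debuglevel(1)
    server.send_message(msg)
```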

As the ITU’s standards are complex, hierarchical and over-engineered, so too the organisation that produced them is complex, hierarchical and highly bureaucratised. In the same way that the open, transparent architecture of the Internet reflects the culture of its founders, so too elements of the ITU’s more closed, opaque culture can be discerned in the standards that the ITU develops. It should therefore come as no surprise that the ITU’s Recommendations have failed to gain purchase on the Internet, since they are technically, and the processes by which they are developed are culturally, antithetical to the Internet’s architecture.

There are nevertheless a few instances in which ITU Recommendations have been deployed on the Internet, mostly where it borders the telephone network, for example in the technologies by which users connect to their ISPs. Four other examples can be given:

- ENUM, which maps telephone numbers from the ITU’s E.164 numbering plan into the DNS, and was developed jointly with the IETF;[34]

- the ASN.1 notation for describing data structures, which is used in IETF standards such as SNMP (Simple Network Management Protocol);[35]

- H.323, a suite of protocols for voice and video calls over IP networks, although it has been losing ground to the IETF’s simpler SIP (Session Initiation Protocol);[36] and

- X.509, a format for public key certificates,[37] which, notwithstanding the IETF’s more decentralised OpenPGP alternative,[38] achieved dominance in secure email through Microsoft’s and Netscape’s support for the S/MIME standard,[39] illustrating that the Internet’s architecture is not immune from reshaping by commercial forces.[40][41]

Even so, these remain isolated successes, and in general the ITU has been relegated to a subsidiary role in Internet standards development, participating in the IETF process on an equal footing with all other participants.

Having failed to make significant inroads into the standards development sphere of Internet governance, the ITU instead sought a role in technical coordination and public policy governance, through its adoption of Resolution 102 at its Plenipotentiary Conference in 2002, by which it undertook to “contribute to policy development related to the management of Internet domain names and addresses.”[42] The resolution also directs the ITU-D:

to organize international and regional forums, in conjunction with appropriate entities, for the period 2002–2006, to discuss policy, operational and technical issues on the Internet in general and the management of Internet domain names and addresses in particular for the benefit of Member States, especially for least developed countries ...

In pursuit of this directive, the ITU has held several joint workshops with ICANN on ccTLD management and the int gTLD since 2003, hosted fora on various other Internet governance issues such as spam and cybersecurity since 2004, and most significantly established the WSIS.[43]

Notes

[1] See http://www.iso.org/.
[2] See http://www.standards.org.au/.
[3] See http://www.ansi.org/.
[4] See http://www.iec.ch/.
[5] More formally known as ISO/IEC 10918-1 | ITU-T T.81; see http://www.jpeg.org/.
[6] At Section 2.1.2.
[7] Other document codes are “BCP,” which is assigned to policy documents intended to represent “Best Current Practice,” and “STD,” for specifications which have reached the final stage of standardisation. Experimental and Historical RFCs are also categorised separately.
[8] The references for this section of the thesis are those RFCs: IETF, Defining the IETF (2002) and IETF, The IETF in the Large: Administration and Execution (2004), along with IETF, The Internet Activities Board (1990), IETF, The Internet Standards Process—Revision 3 (1996), IETF, The Tao of IETF: A Novice’s Guide to the Internet Engineering Task Force (1991) and IETF, The Organizations Involved in the IETF Standards Process (1996).
[9] Crocker, D, Making Standards the IETF Way (1993).
[10] See also Section 4.4.3.3.
[11] IETF, Sender Policy Framework (SPF) for Authorizing Use of Domains in E-Mail (2005).
[12] IETF, Sender ID: Authenticating E-Mail (2005).
[13] See http://www.dnssec.net/.
[14] IETF, Domain Name System Security Extensions (1997).
[15] IETF, Domain Name System Security Extensions (1999).
[16] See http://www.isc.org/sw/bind/.
[17] IETF, DNS Security Introduction and Requirements (2005), IETF, Resource Records for the DNS Security Extensions (2005) and IETF, Protocol Modifications for the DNS Security Extensions (2005).
[18] See http://www.nlnetlabs.nl/nsd/.
[19] Kuerbis, Brenden & Mueller, Milton, Securing the Root: A Proposal for Distributing Signing Authority (2007).
[20] See http://www.w3c.org/.
[21] IETF, The “text/html” Media Type (2000).
[22] See generally Berners-Lee, Tim & Fischetti, Mark, Weaving the Web (1999), especially at 100–101.
[23] See generally W3C, Process Document (2005).
[24] See http://www.itu.int/, and specifically ITU, ITU-T Guide for Beginners (2005).
[25] See ITU, TSB Director’s Ad Hoc IPR Group Definition of “Open Standards” (2005).
[26] “Any transmission, emission or reception of signs, signals, writing, images and sounds or intelligence of any nature by wire, radio, optical or other electromagnetic means”: International Telecommunication Union (ITU) Constitution; Convention; Optional Protocol on the Compulsory Settlement of Disputes relating to the ITU Constitution, to the ITU Convention and the Administrative Regulations, 22 Dec 1992, 1994 ATS No 28 (entry into force for Australia 29 Sep 1994), Annex para 1012.
[27] See generally Larmouth, John, Understanding OSI (1996).
[28] See http://www.ietf.org/html.charters/OLD/oim-charter.html.
[29] Huston, Geoff, ICANN, the ITU, WSIS, and Internet Governance (2005).
[30] Franda, Marcus F, Governing the Internet: The Emergence of an International Regime (2001), 26.
[31] IETF, Simple Mail Transfer Protocol (1982).
[32] The X.400 specification runs to the size of several large books, whereas the basic SMTP protocol is specified in an RFC of 68 pages.
[33] Known as Keylink: Garrett, Paula, What Can the Internet Do for You?: Join the Revolution (1997).
[34] IETF, E.164 Number and DNS (2000).
[35] IETF, A Simple Network Management Protocol (SNMP) (1990).
[36] IETF, SIP: Session Initiation Protocol (2002).
[37] X.509 relies on a hierarchy of Certification Authorities (CAs) to certify the identity claimed by an applicant for the issue of a cryptographic key signed by that CA. The most successful commercial CA happens to be VeriSign, which also operates the com and net gTLD registries.
[38] For example, OpenPGP does not rely on a small number of corporate CAs to certify the identities of the parties to a transaction, but allows those parties to choose any other third parties whom they trust to fulfil that role: see IETF, OpenPGP Message Format (1998).
[39] Microsoft and Netscape were then firmly locked in the “browser wars,” in which each company matched the other feature for feature in a frenzied series of new product releases. Both released new email clients to accompany their Web browsers, boasting support for S/MIME, within months of each other in 1997. The combined market share of Microsoft’s and Netscape’s browsers was then as high as 98%: Thompson, Maryann J, Behind the Numbers: Browser Market Share (1998).
[40] See Lessig, Lawrence, Code and Other Laws of Cyberspace (1999), 39.
[41] Lessig’s explanation for this phenomenon is that the architecture of the Internet is vulnerable to being manipulated by corporations (see Lessig, Lawrence, Code and Other Laws of Cyberspace (1999) at 34 and 52) and governments (at 43–44) to their own ends. As true as this is, it does not demonstrate an inherent weakness of the Internet’s architecture, so much as its potential vulnerability to a strong opposing force of architecture, or to a strong alternative mechanism of governance altogether, such as that of markets.
[42] ITU, Management of Internet Domain Names and Addresses (2002).
[43] See Section 5.1.