4.2.4. Hybrid models

This does, however, suggest the way forward: a hybrid between hierarchical ordering in the form of meritocracy, and a more participatory form of anarchistic, democratic or consensual ordering, to fill the normative holes in the hierarchical option while retaining many of its benefits (such as the greater efficiency of a smaller governance body). Such a mixed system of governance is in fact precisely what Aristotle recommended.[1] It is also widely seen in Internet governance. ICANN, most notably, has been described as a “semi-democracy,”[2] combining hierarchical and democratic elements through the composition of its board, which is drawn partly from the meritocratic Supporting Organisations and partly from the At Large community.[3] The same idea is found in other organisations in which a standing committee is appointed alongside elected members, for example in the Wikimedia Foundation and the W3C.

4.2.4.1. Co-regulation

Another example of an effective hybrid of hierarchical and participatory forms, as foreshadowed at the close of the discussion of anarchism, is the case of co-regulation.

Co-regulation illustrates a possible compromise between anarchistic forms of ordering (by norms, markets and architecture) and governance by rules, in which decentralised collective action is guided or directed by government (or, to generalise this case, by some other hierarchical authority). To be more specific, co-regulation is the process by which an industry or industry segment is permitted to draft its own code of conduct on a particular issue, which, if acceptable to the executive agency responsible for regulating that issue area, will be “registered” by it to serve in lieu of government regulation. Once registered, the code applies to the entire industry sector in question, so that even those who are not signatories to it can be directed by the agency to comply with it.

There are numerous possible variations of this model along a continuum between pure hierarchical ordering and pure decentralised collective action (or between “command and control” and self-regulation, in simpler if less precise terms),[4] and these are sometimes known by other names such as “enforced self-regulation”[5] and “policy co-ordination,”[6] but the name and description given reflect the dominant practice in Australia.

Examples of co-regulatory regimes already in place in Australia include the various codes on topics such as billing and customer complaints developed by Communications Alliance Ltd for the telecommunications industry under the Telecommunications Act 1997 (Cth), the Internet content regulation regime established under the Broadcasting Services Act 1992 (Cth) and drafted by the IIA for the Internet industry, and two codes under the Spam Act 2003 (Cth), one of which was drafted by a committee of the IIA for the Internet industry and the other by the Australian Direct Marketing Association (ADMA) for the direct marketing industry.[7] In all of these cases, the government agency responsible for the registration of the codes is ACMA.[8]

The benefits of co-regulation can be described by comparison to either of the pure forms of which it is a hybrid. Over pure hierarchical organisational forms, it offers many of the same benefits as self-regulation: greater speed and lower expense than traditional governmental regulation, the ability of industry to develop or modify codes swiftly in response to environmental stimuli, and the pull towards voluntary compliance that is associated with governance by norms.[9]

As for the benefits of co-regulation over anarchistic forms of ordering, the ability for compliance with a co-regulatory code to be independently enforced addresses the limited effectiveness of anarchistic ordering that results from its voluntary nature.[10] Although a registered co-regulatory code does not have the full force of law, a member of an industry covered by a code can, pursuant to section 121 of the Telecommunications Act 1997 (Cth), be directed by ACMA to comply with its provisions. It is an offence to fail to comply with such a direction.

The substantive content of the code is also more likely to reflect public policy concerns, rather than serving only the interests of its drafters as is often found in cases of pure self-regulation.[11] This is achieved in much the same way as in the case of directives of the European Union: the government regulator specifies certain minimum outcomes that the code is required to achieve, but not how those outcomes are to be achieved, which is left to the discretion of the industry.[12]

The problems of accountability and transparency associated with anarchistic ordering can also be addressed in co-regulatory structures, by establishing systems for the regulator to monitor compliance and for complaints to be independently heard. For example, clause 12 of the Internet Industry Spam Code of Practice drafted by the IIA provides that consumers may make complaints about an ISP’s breach of the code to ACMA, which will refer them to the IIA or the Telecommunications Industry Ombudsman (TIO) for determination.[13]

Since these are all benefits to government more so than to industry, it is a misapprehension to consider that phenomena such as co-regulation represent a loss of power by states to the private sector. Rather, the sharing of state authority with private actors is a process for which states are largely responsible, and which serves their own ends first and foremost.[14]

However, whilst addressing some of the shortcomings of each of the pure regulatory forms, the co-regulatory form does introduce or exacerbate certain other problems. These include the risk of regulatory capture,[15] and the inherent incentive for industry to “cheat,” for example by writing loopholes into its codes.[16]

These dangers underline the need for broadly-based oversight of co-regulatory arrangements, from civil society as well as government.[17] For example, section 117 of the Telecommunications Act requires codes registered under that Act to be subjected to an open process of public consultation. All codes registered to date have also been subject to regular review, with the first review of the Spam Code, for example, taking place one year after its registration.

4.2.4.2. Hybrid models in Internet governance

The model of domestic co-regulation could in principle be extended to the international arena, since self-regulatory arrangements are naturally extensible transnationally, as in the case of the International Bar Association’s[18] International Code of Ethics.[19] In practice, however, this is complicated by the limited choice of international authorities to assume the regulator’s role. Although there may already be an appropriate regulator in some issue areas, such as the WTO (which with the assistance of its members could transform international commercial arbitration into a co-regulatory regime), in other issue areas such as Internet governance new intergovernmental agreements may be required to establish a regulatory framework.

For this reason there are few existing international or transnational examples analogous to domestic co-regulation, but the European Union’s CE mark found on consumer and industrial goods offers one. The requirement for goods sold within the European Union to conform to EU standards and to carry the CE mark is mandated by EU resolution, but a product’s conformity to those EU standards is self-assessed by or on behalf of the product’s manufacturers, who must create a test report and declaration of conformity to support their assessment.[20]

Hybrid regulatory models are also found in the context of Internet governance. Most significantly, ICANN remains contracted to the NTIA until at least 2009, under an arrangement that allows ICANN to manage the DNS essentially independently, while the NTIA retains ultimate authority over the DNS root.

auDA provides another good example. The process by which control of the au ccTLD passed from a pure self-regulatory regime, under Robert Elz and later ADNA, to auDA has already been described.[21] In particular, it was noted that this transition was facilitated by NOIE, a Commonwealth government agency, and that the Commonwealth reserved authority to itself under the Telecommunications Act 1997 to take over from auDA in the event that auDA ceased to act effectively.

In the context of the IGF, the scope for a co-regulatory approach can be found in the fact that one of the concessions made by governments in the Tunis Agenda was that the issues of DNS management and IP address allocation would be left outside the IGF’s mandate, and remain under the private management of the ICANN regime. There is no reason why the governmental stakeholders in the IGF could not similarly agree to leave other issues to be regulated through the decentralised collective action of the stakeholders at large, whilst retaining ultimate authority to intervene on a domestic or intergovernmental level should decentralised collective action fail to adequately address the issues in question.[22]

4.2.4.3. Governments as a proxy for the meritocracy

Would an IGF structured in such a manner, as a hybrid between the hierarchical power of governments and the anarchistic ordering of all other stakeholders, still amount to a governance network as it has been described in this thesis? It is not exactly the hybrid between meritocracy and decentralised collective action that was previously considered, as it substitutes governments for a meritocratic elite drawn from amongst all stakeholders. This is in one way indefensible, in that it privileges one stakeholder group over the others: a stakeholder group, moreover, that we have already found lacks the legitimacy to exercise authority over transnational public policy issues.

Yet in another way, it could be argued that if it is necessary to concede to hierarchical ordering in order to address some of the identified limitations of anarchistic ordering, governments are in a better practical position to hold this elevated position than any of the other stakeholder groups. After all, it is they who can most effectively wield the coercive power of rules. And to allow governments to wield hierarchical power would neatly side-step the dilemma of how to select a meritocratic elite to do so. Whilst it was suggested above, if only vaguely, that such an elite could be selected through democratic or consensual means, most governments can be presumed already to have been selected by such means (though admittedly not in respect of transnational issues). Why then should it be necessary to reinvent the wheel? Reflecting this view, former ICANN President and CEO M Stuart Lynn has argued:

Although governments vary around the world, for better or worse they are the most evolved and best legitimated representatives of their populations—that is, of the public interest. As such, their greater participation in general, and in particular their collective selection of outstanding non-governmental individuals to fill a certain portion of ICANN Trustee seats, could better fill the need for public accountability without the serious practical and resource problems of global elections in which only a relatively few self-selected voters are likely to participate.[23]

If this view were to prevail, it would be that all stakeholders are equal within the IGF, but that some are more equal than others. Perhaps, however, this is the only practical outcome. The following discussion of hierarchy within open source software development may provide an insight into that suggestion.

4.2.4.4. Hierarchy and open source software

Although the burgeoning success of open source software and the philosophy underpinning it have often been described as the “open source revolution,”[24] open source software is actually nothing new; in fact, it is older than proprietary software. Levy describes how even in the late 1950s and early 1960s, software for the first generation of minicomputers was made available “for anyone to access, look at, and rewrite as they saw fit.”[25]

Another common observation is that it is no coincidence that the rise of open source software has coincided with that of the Internet.[26] As never before, the Internet facilitated the development of open source software en masse by geographically distributed groups of hackers. But the relationship goes back still further, as the technical infrastructure of the Internet was itself largely built on open source software—even before it was known by that name. Prior to the term “open source” being coined in 1998,[27] it was more commonly known simply as “free software.”[28]

However, the software is free in more than one sense. Free or open source software[29] is, in the FSF’s words, not only free in the sense of “free beer,” but also in the sense of “freedom,” encompassing the freedom to run the program for any purpose; the freedom to study how the program works, and adapt it to one’s needs; the freedom to redistribute copies; and the freedom to improve the program, and release those improvements to the public.[30]

Although it is not required in order to satisfy this definition, certain open source software licences, most notably the GNU General Public License (GPL), which is used by a majority of all open source software (see FSF, GNU General Public License (1991)), require any work copied or derived from software covered by the GPL to be distributed under the same licence terms. This characteristic is referred to by the FSF as “copyleft,” as a play on “copyright,” in that it requires those who base their own works on copyleft-licensed software to forgo the exclusive rights that copyright law gives them to copy and modify their works, and to share those rights freely with the community.
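To make the copyleft mechanism concrete, the FSF recommends that each source file of a GPL-licensed program carry a standard notice. The following minimal sketch shows such a notice at the head of a hypothetical Python source file; the file name, author and function are placeholders, and the wording follows the FSF’s recommended notice for version 2 of the GPL:

    # frobnicate.py - a hypothetical module, shown only to illustrate how
    # the GPL's terms attach to a source file.
    # Copyright (C) 2008  A. N. Author
    #
    # This program is free software; you can redistribute it and/or modify
    # it under the terms of the GNU General Public License as published by
    # the Free Software Foundation; either version 2 of the License, or
    # (at your option) any later version.
    #
    # This program is distributed in the hope that it will be useful,
    # but WITHOUT ANY WARRANTY; without even the implied warranty of
    # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
    # GNU General Public License for more details.

    def frobnicate(text):
        # A trivial stand-in function: anyone who copies or adapts it into
        # their own program must, if they distribute that program, license
        # the whole under these same GPL terms (the copyleft obligation
        # described above).
        return text[::-1]

Any developer who incorporates such a file into a larger work thereby undertakes that the combined work, if distributed, will also be licensed under the GPL.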

More significant than the freedoms associated with open source software are the larger cultural and organisational consequences to which their exercise gives rise. These include the widespread voluntary service that members of the open source community provide in coding and documenting the software projects to which they contribute,[31] and the typically high quality, timeliness and innovation of their output.[32]

Eric Raymond, a hacker himself, has famously described the difference between the development methodology for proprietary software and that for open source software as that between “the cathedral and the bazaar,” in his essay of that name. To be built like a cathedral, in that context, is to be “carefully crafted by individual wizards or small bands of mages working in splendid isolation, with no beta to be released before its time,” whereas the bazaar style of development was epitomised by the Linux kernel development process, which

seemed to resemble a great babbling bazaar of differing agendas and approaches (aptly symbolized by the Linux archive sites, who’d take submissions from anyone) out of which a coherent and stable system could seemingly emerge only by a succession of miracles.[33]

The same phenomenon of “peer production” has begun to propagate beyond software development into other fields. It has already been observed in the hours that hundreds of contributors devote each week to the Wikipedia project, producing the most comprehensive encyclopædia ever written. The licensing model employed by Wikipedia is equivalent to that of open source software, although the material licensed may be more accurately described as “open content,” and the licence employed is the GNU Free Documentation License (GFDL).[34]

There are, of course, other open content licences. Creative Commons is a project to draft and promote licences suitable for the release of all manner of literary, musical, artistic and dramatic works as open content.[35] The Creative Commons Web site makes some of this content available, though Creative Commons licensed content is also found on many other sites including the Internet Archive[36] and the OpenCourseWare project,[37] inaugurated by MIT and since extended to other institutions[38] for the publication of course materials.

The success of the open source development methodology is often explained by economic sociologists in terms of the low transaction costs associated with communication between developers,[39] and the network effects which increase the value of the open source “commons” to all as more people become involved.[40] Although these scholars are puzzled as to what individual incentives developers have to build up this open source commons voluntarily,[41] they posit that it is a barter or gift exchange system in which developers exchange their labour for such goods as feedback from users and an enhanced reputation amongst their peers,[42] or that it is a means of improving their future employment prospects.[43]
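The network effects claim admits of a rough formalisation, though it is offered here only by way of illustration and is not one the sources just cited themselves employ. Under Metcalfe’s law, the value of a commons with n participants scales with the number of distinct pairwise exchanges (of code, bug reports or feedback) that those participants can form:

    V(n) \propto \binom{n}{2} = \frac{n(n-1)}{2}

so that each new contributor adds value not only through their own labour, but by enlarging every existing participant’s pool of potential collaborators.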

To developers such as Raymond the question is less of a mystery: they do it because it is fun.[44]

Linus Torvalds, original author of the Linux operating system kernel, concurs with this view in his autobiography (which is suitably enough titled Just For Fun),[45] as does Levy in his history of the hacker community.[46] Software development is only one application of the open source ethic, but the fun extends to publishers of other forms of open content too: Jimmy Wales of Wikipedia, for example, unpretentiously states, “The goal of Wikipedia is fun for the contributors.”[47]

The same motivation also extends to projects small enough to be pursued by a single developer. Whilst these might not be thought of as organisations, lacking as they do a community of developers, they are still aimed at a community of users or readers[48] and thus fulfil similar social needs to more structured virtual communities.[49] Take the example of blogs (“Web logs”): self-published online journals numbering over 100 million as at 2008.[50] Tim Wu observes that “in general, bloggers writing for fun—or out of single-minded obsession—can thump reporters trying to get home by 6pm.”[51]

But what underlies the fun? It might be argued that it is inherent in the creative process, but that only raises the further question: what underlies that?

At least to some extent, the answer is empowerment: the power to independently create or achieve something one perceives to be of value. The desire for such power is known by psychologists as a mastery, competence or achievement motive,[52] and Maslow placed it at the pinnacle of his hierarchy of human needs, naming it the need for self-actualisation.[53] Sociologists as far back as Weber came to the same realisation that increasing bureaucratic rationalisation of work could be dehumanising; Weber described this trend as an “iron cage” in which humanity was destined to be trapped.[54] Scholars of organisational behaviour have inherited this insight, and proposed strategies by which employees can be empowered (and thus made happier and more productive) by increasing their autonomy at work.[55]

Although the emergence of the open source methodology has been quite orthogonal to this scholarship, it is an exemplar of its programme in the extent to which it empowers the members of the open source community to pursue their own objectives, in their own way, in a manner that is not possible within an hierarchical bureaucracy.

It follows that the licence under which open source software is released, as important as it may be to the success of the software and to the movement as a whole, is not the most critical factor in its success as a software development methodology; rather, it is the empowerment of its contributors that is central. The licence is simply the means by which hackers have institutionalised in law (or rules) the ethic that “all information should be free”[56] in respect of open source software and open content, as they embedded it in the architecture of the Internet in respect of data communications.

On this basis, the egalitarianism of the open source software development model can be seen as reflecting that of the Internet itself. Both are models of anarchistic ordering largely of hackers’ own creation.[57] Thus, as already observed, it is no coincidence that the Internet is an enabling force for the open source paradigm, levelling the playing field between media juggernauts and software powerhouses on the one hand, and teenagers writing or coding in their attics on the other.[58] Freed of the hegemony of hierarchy, hackers and others pursuing their need for self-actualisation become more empowered, fulfilled and happy.

However, to characterise the open source software development model as purely anarchistic is simplistic. In most projects, anarchy is balanced with hierarchical control.[59]

It is in fact common for open source software development projects to be governed by a “benevolent dictator for life” (or BDFL).[60] These are found in projects ranging from the Linux operating system kernel itself, of which Linus Torvalds is the BDFL,[61] through Linux-based operating system distributions such as Ubuntu, led by Mark Shuttleworth,[62] and application software such as the Samba networking suite, coordinated by Andrew Tridgell,[63] to programming languages such as Perl,[64] PHP[65] and Python,[66] in which Larry Wall, Rasmus Lerdorf and Guido van Rossum respectively act as project leaders in perpetuity.[67]

In the case of the Linux kernel, Torvalds, who is perhaps the archetype of a BDFL, possesses ultimate authority to decide which contributions (“patches”) to the kernel should be accepted and which should be refused. Torvalds no longer personally manages the whole of the kernel, having delegated authority to a number of trusted associates to manage particular subsystems and hardware architectures, but it remains his prerogative to appoint these so-called “lieutenants” and to supervise their work. A document distributed with the Linux kernel source code that is subtitled “Care And Operation Of Your Linus Torvalds” describes him as “the final arbiter of all changes accepted into the Linux kernel.”[68]

Thus, contrary to what might be assumed from Raymond’s claim about “the Linux archive sites, who’d take submissions from anyone,” the Linux kernel development process is neither anarchistic nor consensual: if Torvalds does not like a patch, it does not go into the kernel.[69] This has often antagonised other kernel developers, one of them commencing a long-running thread on the kernel development mailing list by saying:

Linus doesn’t scale, and his current way of coping is to silently drop the vast majority of patches submitted to him onto the floor. Most of the time there is no judgement involved when this code gets dropped. Patches that fix compile errors get dropped. Code from subsystem maintainers that Linus himself designated gets dropped. A build of the tree now spits out numerous easily fixable warnings, when at one time it was warning-free. Finished code regularly goes unintegrated for months at a time, being repeatedly resynced and re-diffed against new trees until the code’s maintainer gets sick of it. This is extremely frustrating to developers, users, and vendors, and is burning out the maintainers. It is a huge source of unnecessary work. The situation needs to be resolved. Fast.[70]

Torvalds’ initially unapologetic response[71] recalls another classic example of his sardonic view of his position as BDFL, from his announcement of the selection of a penguin logo for Linux. Acknowledging the comments of those who had expressed reservations about it, Torvalds concluded with the quip, “If you still don’t like it, that’s ok: that’s why I’m boss. I simply know better than you do.”[72]

The Mozilla[73] and OpenOffice.org[74] projects provide a slightly different example of hierarchical ordering in open source software development.[75] In these cases, the authority is not that of an individual, but a corporation: originally Netscape Communications in the case of Mozilla,[76] and Sun Microsystems in the case of OpenOffice.org.[77]

This kind of collective hierarchical control over an open source software project can also be exercised by a civil society organisation. The non-profit Mozilla Foundation, for example, succeeded to the rights of Netscape, such as the trademark and rights under the Netscape Public License.[78] Membership of its governing body (or “staff”) is by invitation only. Another example of such an organisation, also taken from one of the most prominent and successful open source projects, is the Apache Software Foundation (ASF),[79] best known for the Apache HTTP Server, which powers the majority of Web sites on the Internet.[80]

The case of the ASF also illustrates that there are various strata of developers beneath the BDFL. One study has categorised these into core members (or maintainers), active developers, peripheral developers, bug reporters, readers and passive users,[81] and confirmed previous findings that the core developers are generally the smallest group but write the majority of the project’s code.[82] Whilst developers in the lower strata are mostly self-selected,[83] in many projects, including those of the ASF, the core developers are selected by the BDFL, applying stringent meritocratic standards.[84]

In fact, all of the examples given of open source projects in which a significant hierarchical structure exists or has existed—the Linux kernel, Mozilla, OpenOffice.org and Apache, as well as Samba and Ubuntu mentioned earlier—are among the most widely-used open source projects of their class, and have large and active communities of developers. How can this be reconciled with the earlier hypothesis that it was the very lack of hierarchy that empowered developers and attracted them to volunteer their services to open source projects?

The answer, despite the fact that its significance to developers had earlier been downplayed, is found in the open source licence. It is the open source licence that enforces benevolence upon the dictator. It does this by ensuring that for any open source project there is always relatively costless freedom of exit, in that any developers who feel they are being oppressed by a project leader can simply cease participating in the project, take its source code, and use it as the base for a new project of their own (known as a “fork” of the original project). This “exit-based empowerment”[85] enjoyed by developers mitigates the power of the project leaders.

As Torvalds has put it,

I am a dictator, but it’s the right kind of dictatorship. I can’t really do anything that screws people over. The benevolence is built in. I can’t be nasty. If my baser instincts took hold, they wouldn’t trust me, and they wouldn’t work with me anymore. I’m not so much a leader, I’m more of a shepherd.[86]

The Linux kernel has, indeed, been forked numerous times. One prominent fork was that maintained by Red Hat Linux developer Alan Cox, who released a series of kernel source trees that contained patches not yet accepted by Torvalds.[87] However, since 2002 a technical solution to Torvalds’ backlog has been found in the use of specialised revision control software,[88] which has placated many of Torvalds’ critics and resulted in the obsolescence of many former forks of the kernel.

Both Mozilla’s Firefox browser and the OpenOffice.org office suite have also been forked. The Debian project, for example, has replaced Firefox in its distribution with a forked version called Iceweasel, to escape the onerous trademark licence conditions imposed by the Mozilla Foundation for the use of the Firefox name and logo.[89] As for OpenOffice.org, a prominent fork called NeoOffice[90] has been customised to integrate more smoothly with the Mac OS X operating system. Debian itself has also spawned a number of derivative distributions, Ubuntu being one.[91]

Admittedly, forking an open source project is not costless. Usually the most significant cost is that the new project leader will need to establish a community of users and developers to support the project in the long term. For economic sociologists, this is the cost of developing social capital.[92] Thus, the more successful the parent project (and the more cohesive its communities of developers and users), the higher its social capital will be, the higher the transaction costs of a fork, and the more effectively that fork will have to differentiate itself from its parent in order to overcome those costs.

This is illustrated by the case of Samba-TNG, which forked from the highly successful Samba project in 1999,[93] seeking to differentiate itself by being the first to offer the facility to replace a Microsoft Windows server as the Primary Domain Controller for an office network. However, it struggled to build a development community comparable in size and expertise to that of its parent project, which in the meantime implemented its own version of Samba-TNG’s differentiating feature. In comparison, less dominant and stable projects have been forked more often and more successfully.[94]

This characteristic of the transaction costs associated with migration from one open source project to another provides a cohesive force against the unnecessary fragmentation of open source projects, which will only be overcome if enough developers become sufficiently dissatisfied to form a viable competing project (which the project leaders have an incentive not to allow to happen, lest they lose their base of developers). In comparison, developers within Microsoft Corporation face much higher transaction costs in replicating their work and their communities elsewhere if they are dissatisfied, if indeed it is possible for them to do so at all.
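The reasoning of the last two paragraphs can be summarised in a simple decision inequality, offered as a sketch of the present argument rather than as a model drawn from the cited literature. Writing S for the parent project’s social capital, C(S) for the transaction cost of a fork (which grows with S), and B for the benefit that dissatisfied developers expect their fork’s differentiation to yield, a fork is only viable where

    B > C(S), \quad \text{with} \quad \frac{\mathrm{d}C}{\mathrm{d}S} > 0

so the more successful and cohesive the parent project, the greater the differentiation required before exit becomes worthwhile, and the stronger the cohesive force against fragmentation.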

Thus it is from the unexpected source of the open source licence that a solution is found to the problem of maintaining an organisation under an hierarchical structure to address the limitations of anarchistic ordering: the licence provides an implicit, ongoing consensual check on the power of the authority, which side-steps the difficult task of objectively assessing the authority’s merit antecedently.

Notes

[1]

Aristotle, Politics (1943), 195

[2]

Palfrey Jr, John G, The End of the Experiment: How ICANN’s Foray into Global Internet Democracy Failed (2004)

[3]

Weinberg, Jonathan, Geeks and Greeks (2001), 329

[4]

Sinclair, Darren, Self-Regulation Versus Command and Control?: Beyond False Dichotomies (1997), 544

[5]

Braithwaite, John, Enforced Self-Regulation: A New Strategy for Corporate Crime Control (1982)

[6]

Kleinwächter, Wolfgang, Global Governance in the Information Age: GBDe and ICANN as “Pilot Projects” for Co-regulation and a New Trilateral Policy? (2001), 20

[7]

Malcolm, Jeremy M, Australia’s Stand on Spam (2004)

[8]

See http://www.acma.gov.au/WEB/STANDARD/pc=PC_2525.

[9]

Weber, Rolf H, Regulatory Models for the Online World (2002), 80, 83–84

[10]

Braithwaite, John, Enforced Self-Regulation: A New Strategy for Corporate Crime Control (1982), 1470

[11]

Page, A C, Self Regulation and Codes of Practice (1980)

[12]

Gunningham, Neil & Rees, Joseph, Industry Self-Regulation: An Institutional Perspective (1997), 401

[13]

IIA, Internet Industry Spam Code of Practice (2005)

[14]

Sassen, Saskia, The State and Globalization (2002)

[15]

Braithwaite, John, Enforced Self-Regulation: A New Strategy for Corporate Crime Control (1982), 1492

[16]

Braithwaite, John, Enforced Self-Regulation: A New Strategy for Corporate Crime Control (1982), 1495–1496

[17]

Gunningham, Neil & Rees, Joseph, Industry Self-Regulation: An Institutional Perspective (1997), 402–405

[18]

See http://www.ibanet.org/.

[19]

International Bar Association, International Code of Ethics (1988)

[20]

European Commission, Guide to the Implementation of Directives Based on the New Approach and the Global Approach (2000)

[21]

See Section 2.1.2.3.

[22]

That may be the practical effect of the prevailing hegemony of states in any case; that is, provided that a public policy issue is technically amenable to being addressed by rules, there would be nothing to stop governments or intergovernmental authorities from trumping the IGF’s recommendations even if the IGF were not structured in such a manner as to facilitate their doing so. The distinction though, formal as it may be, is between a multi-stakeholder governance forum structured to include a role for formal intergovernmental oversight, and one in which policy development is undertaken in the shadow of the exogenous power of states to intervene in and override the process.

[23]

Lynn, M S, President’s Report: ICANN—The Case for Reform (2002)

[24]

DiBona, Chris, Ockman, Sam, & Stone, Mark, Open Sources: Voices from the Open Source Revolution (1999)

[25]

Levy, Steven, Hackers: Heroes of the Computer Revolution (2001), 65

[26]

Raymond, Eric S, The Cathedral and the Bazaar: Musings on Linux and Open Source by an Accidental Revolutionary (2001), 51

[27]

Raymond, Eric S, Goodbye, “Free Software”; Hello, “Open Source” (1998)

[28]

It is still so known by many, notably including the Free Software Foundation; see http://www.fsf.org/.

[29]

Both appellations are encompassed by the acronym FOSS or F/OSS; FLOSS is also sometimes seen, adding the French libre.

[30]

Stallman, Richard M, The Free Software Definition (1998). A similar but more comprehensive list of ten requirements of open source software was first published by the Open Source Initiative in 1998 in its Open Source Definition (see http://www.opensource.org/docs/osd).

[31]

Hertel, Guido, Niedner, Sven, & Herrmann, Stefanie, Motivation of Software Developers in Open Source Projects: An Internet-based Survey of Contributors to the Linux Kernel (2003)

[32]

Feller, Joseph & Fitzgerald, Brian, Understanding Open Source Software Development (2002), 131

[33]

Raymond, Eric S, The Cathedral and the Bazaar: Musings on Linux and Open Source by an Accidental Revolutionary (2001), 21–22

[34]

FSF, GNU Free Documentation License (2002)

[35]

See http://creativecommons.org/, though for criticism of the openness of the Creative Commons licences see Hill, Benjamin M, Towards a Standard of Freedom: Creative Commons and the Free Software Movement (2005).

[36]

See http://www.archive.org/.

[37]

See http://ocw.mit.edu/.

[38]

See http://www.ocwconsortium.org/.

[39]

Benkler, Yochai, Coase’s Penguin, or, Linux and The Nature of the Firm (2002)

[40]

von Hippel, Eric, Democratizing Innovation (2005)

[41]

Lerner, Josh & Tirole, Jean, The Economics of Technology Sharing: Open Source and Beyond (2004), 7

[42]

Ghosh, Rishab A, Cooking Pot Markets: An Economic Model for the Trade in Free Goods and Services on the Internet (1998)

[43]

Lerner, Josh & Tirole, Jean, The Economics of Technology Sharing: Open Source and Beyond (2004), 8

[44]

Raymond, Eric S, The Cathedral and the Bazaar: Musings on Linux and Open Source by an Accidental Revolutionary (2001), 60

[45]

Torvalds, Linus & Diamond, David, Just For Fun: the Story of an Accidental Revolutionary (2001), 248

[46]

Levy, Steven, Hackers: Heroes of the Computer Revolution (2001), 46

[47]

Poe, Marshall, The Hive (2006).

[48]

Davies, William, You Don’t Know Me, But... Social Capital and Social Software (2003), 32

[49]

Rheingold, H, The Virtual Community (1993)

[50]

According to blog analysis firm Technorati; see http://www.technorati.com/about/.

[51]

Wu, Tim, The Power of Fun (2006)

[52]

Matthews, Gerald, Deary, Ian J, & Whiteman, Martha C, Personality Traits (2003), 128

[53]

Maslow, Abraham, Motivation and Personality (1987)

[54]

Weber, Max, The Protestant Ethic and the Spirit of Capitalism (2003), 181

[55]

Fragoso, Heloisa, An Overview of Employee Empowerment: Do’s And Don’ts (2000)

[56]

Levy, Steven, Hackers: Heroes of the Computer Revolution (2001), 40

[57]

Imhorst, Christian, Anarchy and Source Code—What Does the Free Software Movement Have to Do With Anarchism? (2005)

[58]

Reynolds, Glenn, An Army of Davids: How Markets and Technology Empower Ordinary People to Beat Big Media, Big Government, and Other Goliaths (2006)

[59]

Holck, Jesper & Jørgensen, Niels, Do Not Check In On Red: Control Meets Anarchy in Two Open Source Projects (2005)

[60]

Reagle, Joseph, Why the Internet is Good: Community Governance That Works Well (1999)

[61]

See http://www.kernel.org/.

[62]

Ubuntu, founded in 2004 (see http://www.ubuntu.com/), is based on an earlier Linux distribution called Debian GNU/Linux, founded in 1993. The Debian project is the more egalitarian of the two; for example, its elected Project Leader is directed by clause 5.3 of its constitution to “attempt to make decisions which are consistent with the consensus of the opinions of the Developers” and to “avoid overemphasizing their own point of view when making decisions in their capacity as Leader”: Debian Project, Debian Constitution (2006). In contrast, Mark Shuttleworth, who founded Ubuntu and termed himself its SABDFL (self-appointed benevolent dictator for life), appoints the members of both of its main decision-making bodies (the Technical Board and the Ubuntu Community Council) and exercises a casting vote in those bodies.

A prominent former Debian Developer who resigned in 2006 compared the Debian and Ubuntu distributions by saying, “There’s a balance to be struck between organisational freedom and organisational effectiveness. I’m not convinced that Debian has that balance right as far as forming a working community goes. In that respect, Ubuntu’s an experiment—does a more rigid structure and a greater willingness to enforce certain social standards result in a more workable community?” (quoted in Byfield, Bruce, Maintainer’s Resignation Highlights Problems in Debian Project (2006), which links to the original source).

[63]

See http://www.samba.org/.

[64]

See http://www.perl.org/.

[65]

See http://www.php.net/.

[66]

See http://www.python.org/.

[67]

The position of BDFL normally falls to the developer who initiated a project, though in the case of multiple original core developers, the phenomenon of a benevolent oligarchy for life is not unknown (for example Matt Mullenweg and Ryan Boren for the WordPress blog engine at http://wordpress.com/).

[68]

See Documentation/SubmittingPatches within the kernel source tree which can be downloaded from http://www.kernel.org/.

[69]

For a more detailed case study of Linux kernel development see Schach, S, Jin, B, Wright, D, Heller, G, & Offut, A, Maintainability of the Linux Kernel (2002).

[70]

Landley, Rob, A Modest Proposal—We Need a Patch Penguin (2002)

[71]

See http://www.cs.helsinki.fi/linux/linux-kernel/2002-04/0389.html.

[72]

Originally published on Usenet at news:4sv02t$j8g@linux.cs.Helsinki.FI, now archived at http://groups.google.com/group/comp.os.linux.advocacy/msg/ee350cc97f7d0e69.

[73]

See http://www.mozilla.com/.

[74]

See http://www.openoffice.org/.

[75]

For more detailed case studies of these projects see Holck, Jesper & Jørgensen, Niels, Do Not Check In On Red: Control Meets Anarchy in Two Open Source Projects (2005) and Mockus, A, Fielding, R T, & Herbsleb, J D, Two Case Studies of Open Source Software Development: Apache and Mozilla (2002) for Mozilla, and Strba, Fridrich, From TrainedMonkey to Google SoC Mentor (2006) for OpenOffice.org.

[76]

As well as leading development, Netscape originally held the “Mozilla” trademark (as Linus Torvalds does for “Linux” in various jurisdictions: see http://www.linuxmark.org/), and until 2001 required modifications to its source code to be licensed under terms that exclusively exempted it from the copyleft provisions applicable to other users: see http://www.mozilla.org/MPL/FAQ.html in its description of the Netscape Public License.

[77]

Sun requires contributors to the OpenOffice.org project to assign joint copyright in their work to it: see http://www.openoffice.org/licenses/jca.pdf.

[78]

See http://www.mozilla.org/foundation/.

[79]

See http://www.apache.org/. The Apache Software Foundation is a non-profit corporation governed by a board of nine directors who are elected by the Foundation’s members for one-year terms, and who in turn appoint a number of officers (66, in 2008) to oversee its day-to-day operations. As of 2008 there are 249 members of the ASF, each of whom was invited to join on the basis of their previous contributions to ASF projects, and whose invitation was extended by a majority vote of the existing members.

[80]

See http://news.netcraft.com/archives/web_server_survey.html.

[81]

Ye, Yunwen, Nakakoji, Kumiyo, Yamamoto, Yasuhiro, & Kishida, Kouichi, The Co-Evolution of Systems and Communities in Free and Open Source Software Development (2005)

[82]

Mockus, A, Fielding, R T, & Herbsleb, J D, Two Case Studies of Open Source Software Development: Apache and Mozilla (2002)

[83]

Ye, Yunwen, Nakakoji, Kumiyo, Yamamoto, Yasuhiro, & Kishida, Kouichi, The Co-Evolution of Systems and Communities in Free and Open Source Software Development (2005), 64

[84]

For a more detailed case study of Apache see Mockus, A, Fielding, R T, & Herbsleb, J D, Two Case Studies of Open Source Software Development: Apache and Mozilla (2002).

[85]

Warren, Mark E, Controlling Corruption Through Democratic Empowerment: Market-Style Accountability Revisited (2006), 2

[86]

Hamm, Steve, Linus Torvalds’ Benevolent Dictatorship (2004)

[87]

Corbet, Jonathan, Where Does Kernel Development Stand? (2001)

[88]

Originally, ironically, a proprietary product called BitKeeper, and subsequently an open source equivalent called Git written by Torvalds himself: see http://git.or.cz/.

[89]

Corbet, Jonathan, Debian and Mozilla—A Study in Trademarks (2005)

[90]

See http://www.neooffice.org/.

[91]

The same phenomenon is found in other open content development communities. For example in 2002, Spanish Wikipedians who were dissatisfied with the Wikipedia project created their own fork, Enciclopedia Libre (“free encyclopædia”), as permitted by the GNU Free Documentation License under which Wikipedia’s content is licensed: see http://enciclopedia.us.es/. More recently Larry Sanger has attempted to do the same, creating “a responsible, expert-managed fork of Wikipedia” titled Citizendium: see http://www.citizendium.org/.

[92]

Uphoff, N, Understanding Social Capital: Learning from the Analysis and Experience of Participation (1999). Social capital can be formally defined as “the value of those aspects of the social structure to actors, as resources that can be used by the actors to realize their interests”: Coleman, J, Foundations of Social Theory (1990), 305.

[93]

See http://www.samba-tng.org/.

[94]

For example, the oft-criticised PHP-Nuke content management system: see http://phpnuke.org/ and Corbet, Jonathan, PHP Nuke Remains Vulnerable (2001). These forks include Post-Nuke at http://www.postnuke.com/, Envolution at http://sourceforge.net/projects/envolution, MyPHPNuke at http://sourceforge.net/projects/myphpnuke and Xoops at http://www.xoops.org/.