The practices visible in "open source" software development were important in bringing this phenomenon to general awareness. In these projects it was clear policy that project contributors would routinely and systematically freely reveal code they had developed at private expense (Raymond 1999). However, free revealing of product innovations has a history that began long before the advent of open source software. Allen, in his 1983 study of the eighteenth-century iron industry, was probably the first to consider the phenomenon systematically. Later, Nuvolari (2004) discussed free revealing in the early history of mine pumping engines. Contemporary free revealing by users has been documented by von Hippel and Finkelstein (1979) for medical equipment, by Lim (2000) for semiconductor process equipment, by Morrison, Roberts, and von Hippel (2000) for library information systems, and by Franke and Shah (2003) for sporting equipment. Henkel (2003) has documented free revealing among manufacturers in the case of embedded Linux software.
Contributors to the many open source software projects extant (more than 83,000 were listed on SourceForge.net in 2004) also routinely make the new code they have written public. Well-known open source software products include the Linux operating system and the Apache web server. Some conditions are attached to open source code licensing to ensure that the code remains available to all as an information commons. Because of these added protections, open source code does not quite fit the definition of free revealing given earlier in this chapter. (The licensing of open source software will be discussed in detail in chapter 7.)
Henkel (2003) showed that free revealing is sometimes practiced by directly competing manufacturers. He studied manufacturers that were competitors and that had all built improvements and extensions to a type of software known as embedded Linux. (Such software is "embedded in" and used to operate equipment ranging from cameras to chemical plants.) He found that these manufacturers freely revealed improvements to the common software platform that they all shared and, with a lag, also revealed much of the equipment-specific code they had written.
A variation of this argument applies to the free revealing among competing manufacturers documented by Henkel (2003). Competing developers of embedded Linux systems were creating software that was specifically designed to run the hardware products of their specific clients. Each manufacturer could freely reveal this equipment-specific code without fear of direct competitive repercussions: it was applicable mainly to specific products made by a manufacturer's client, and it was less valuable to others. At the same time, all would jointly benefit from free revealing of improvements to the underlying embedded Linux code base, upon which they all built their proprietary products. After all, the competitive advantages of all their products depended on this code base's being equal to or better than the proprietary software code used by other manufacturers of similar products. Additionally, Linux software was a complement to hardware that many of the manufacturers in Henkel's sample also sold. Improved Linux software would likely increase sales of their complementary hardware products. (Complement suppliers' incentives to innovate have been modeled by Harhoff (1996).)
Interestingly, successful open source software projects do not appear to follow any of the guidelines for successful collective action projects just described. With respect to project recruitment, goal statements provided by successful open source software projects vary from technical and narrow to ideological and broad, and from precise to vague and emergent (for examples, see goal statements posted by projects hosted on Sourceforge.net).【8 As a specific example of a project with an emergent goal, consider the beginnings of the Linux open source software project. In 1991, Linus Torvalds, a student in Finland, wanted a Unix operating system that could be run on his PC, which was equipped with a 386 processor. Minix, the only such software available at the time, was commercial, closed source, and sold for US$150. Torvalds found this too expensive and started development of a Posix-compatible operating system, later known as Linux. Torvalds did not immediately publicize a very broad and ambitious goal, nor did he attempt to recruit contributors. He simply expressed his private motivation in a message he posted on July 3, 1991, to the USENET newsgroup comp.os.minix (Wayner 2000): Hello netlanders, Due to a project I'm working on (in minix), I'm interested in the posix standard definition. [Posix is a standard for UNIX designers. Software using POSIX is compatible with other UNIX-based software.] Could somebody please point me to a (preferably) machine-readable format of the latest posix-rules? Ftp-sites would be nice. In response, Torvalds got several return messages with Posix rules and people expressing a general interest in the project. By early 1992, several skilled programmers were contributing to Linux, and the number of users increased by the day. Today, Linux is the largest open source development project extant in terms of number of developers.
】 Further, such projects may engage in no active recruiting beyond simply posting their intended goals and access address on a general public website customarily used for this purpose (for examples, see the Freshmeat.net website). Also, projects have shown by example that they can be successful even if large groups---perhaps thousands---of contributors are involved. Finally, open source software projects seem to expend no effort to discourage free riding. Anyone is free to download code or seek help from project websites, and no apparent form of moral pressure is applied to make a compensating contribution (e.g., "If you benefit from this code, please also contribute . . .").
Interesting examples also exist regarding the impact a commons can have on the value of intellectual property innovators seek to hold apart from it. Weber (2004) recounts the following anecdote: In 1998, Linux developers were building new graphical interfaces for their open source software. One of the most promising of these, KDE, was offered under the General Public License (GPL). However, Matthias Ettrich, its developer, had built KDE using a proprietary graphical library called Qt. He felt at the time that this could be an acceptable solution because Qt was of good quality and Troll Tech, owner of Qt, licensed Qt at no charge under some circumstances. However, Troll Tech did require that a developer's fee be paid under other circumstances, and some Linux developers were concerned about having code not licensed under the GPL as part of their code. They tried to convince Troll Tech to change the Qt license so that it would be under the GPL when used in free software. But Troll Tech, as was fully within its rights, refused to do this. Linux developers then, as was fully within their rights, began to develop open source alternatives to Qt that could be licensed under the GPL. As those projects moved toward success, Troll Tech recognized that Qt might be surpassed and effectively shut out of the Linux market. In 2000 the company therefore decided to license Qt under the GPL.
Democratization of the opportunity to create is important beyond giving more users the ability to make exactly right products for themselves. As we saw in a previous chapter, the joy and the learning associated with creativity and membership in creative communities are also important, and these experiences too are made more widely available as innovation is democratized. The aforementioned Chris Hanson, a Principal Research Scientist at MIT and a maintainer in the Debian Linux community, speaks eloquently of this in describing the joy and value he finds in participating in an open source software community:
Many user innovations require or benefit from complementary products or services, and manufacturers can often supply these at a profit. For example, IBM profits from user innovation in open source software by selling the complement of computer hardware. Specifically, it sells computer servers with open source software pre-installed, and as the popularity of that software goes up, so do server sales and profits. A firm named Red Hat distributes a version of the open source software computer operating system Linux, and also sells the complementary service of Linux technical support to users. Opportunities to provide profitable complements are not necessarily obvious at first glance, and providers often reap benefits without being aware of the user innovation for which they are providing a complement. Hospital emergency rooms, for example, certainly gain considerable business from providing medical care to the users and user-developers of physically demanding sports, but may not be aware of this.
Henkel, J. 2003. "Software Development in Embedded Linux: Informal Collaboration of Competing Firms." In W. Uhr, W. Esswein, and E. Schoop, eds., Proceedings der 6. Internationalen Tagung Wirtschaftsinformatik 2003, volume 2. Physica.
Hertel, G., S. Niedner, and S. Herrmann. 2003. "Motivation of Software Developers in Open Source Projects: An Internet-Based Survey of Contributors to the Linux Kernel." Research Policy 32, no. 7: 1159--1177.
Finally, we could try to excuse this piracy with the argument that the piracy actually helps the copyright owner. When the Chinese "steal" Windows, that makes the Chinese dependent on Microsoft. Microsoft loses the value of the software that was taken. But it gains users who are used to life in the Microsoft world. Over time, as the nation grows more wealthy, more and more people will buy software rather than steal it. And hence over time, because that buying will benefit Microsoft, Microsoft benefits from the piracy. If instead of pirating Microsoft Windows, the Chinese used the free GNU/Linux operating system, then these Chinese users would not eventually be buying Microsoft. Without piracy, then, Microsoft would lose.
Still, the argument is not terribly persuasive. We don't give the alcoholic a defense when he steals his first beer, merely because that will make it more likely that he will buy the next three. Instead, we ordinarily allow businesses to decide for themselves when it is best to give their product away. If Microsoft fears the competition of GNU/Linux, then Microsoft can give its product away, as it did, for example, with Internet Explorer to fight Netscape. A property right means giving the property owner the right to say who gets access to what - at least ordinarily. And if the law properly balances the rights of the copyright owner with the rights of access, then violating the law is still wrong.
In the Supreme Court, the briefs on our side were about as diverse as it gets. They included an extraordinary historical brief by the Free Software Foundation (home of the GNU project that made GNU/Linux possible). They included a powerful brief about the costs of uncertainty by Intel. There were two law professors' briefs, one by copyright scholars and one by First Amendment scholars. There was an exhaustive and uncontroverted brief by the world's experts in the history of the Progress Clause. And of course, there was a new brief by Eagle Forum, repeating and strengthening its arguments.
I don't mean to enter that debate here. It is important only to make clear that the distinction is not between commercial and noncommercial software. There are many important companies that depend fundamentally upon open source and free software, IBM being the most prominent. IBM is increasingly shifting its focus to the GNU/Linux operating system, the most famous bit of "free software" - and IBM is emphatically a commercial entity. Thus, to support "open source and free software" is not to oppose commercial entities. It is, instead, to support a mode of software development that is different from Microsoft's.【202 Microsoft's position about free and open source software is more sophisticated. As it has repeatedly asserted, it has no problem with "open source" software or software in the public domain. Microsoft's principal opposition is to "free software" licensed under a "copyleft" license, meaning a license that requires the licensee to adopt the same terms on any derivative work. See Bradford L. Smith, "The Future of Software: Enabling the Marketplace to Decide," Government Policy Toward Open Source Software (Washington, D.C.: AEI-Brookings Joint Center for Regulatory Studies, American Enterprise Institute for Public Policy Research, 2002), 69, available at link #62. See also Craig Mundie, Microsoft senior vice president, The Commercial Software Model, discussion at New York University Stern School of Business (3 May 2001), available at link #63. 】
Therefore, in 1984, Stallman began a project to build a free operating system, so that at least a strain of free software would survive. That was the birth of the GNU project, into which Linus Torvalds's "Linux" kernel was added to produce the GNU/Linux operating system.
Free For All - How Linux and the Free Software Movement Undercut the High-Tech Titans
The list should also include the dozens of journalists at places like Slashdot.org, LinuxWorld, Linux magazine, Linux Weekly News, Kernel Traffic, Salon, and the New York Times. I should specifically mention the work of Joe Barr, Jeff Bates, Janelle Brown, Zack Brown, Jonathan Corbet, Elizabeth Coolbaugh, Amy Harmon, Andrew Leonard, Rob Malda, John Markoff, Mark Nielsen, Nicholas Petreley, Harald Radke, and Dave Whitinger. They wrote wonderful pieces that will make a great first draft of the history of the open source movement. Only a few of the pieces are cited directly in the footnotes, largely for practical reasons. The entire body of websites like Slashdot, Linux Journal, Linux World, Kernel Notes, or Linux Weekly News should be required reading for anyone interested in the free software movement.
There are hundreds of folks at Linux trade shows who took the time to show me their products, T-shirts, or, in one case, cooler filled with beer. Almost everyone I met at the conferences was happy to speak about their experiences with open source software. They were all a great source of information, and I don't even know most of their names.
See ‹http://www.wayner.org/books/ffa/› for the first PDF edition. Page layout for this and the original paper edition designed by William Ruoto. Not printed on acid-free paper.

Library of Congress Cataloging-in-Publication Data
Wayner, Peter, 1964-
Free for all : how Linux and the free software movement undercut the high-tech titans / Peter Wayner.
p. cm.
ISBN 0-06-662050-3
1. Linux. 2. Operating systems (Computers) 3. Free computer software. I. Title.
QA76.76.O63 W394 2000
005.4'469 dc21 00-023919
The last competitor, though, was the most surprising to everyone. Schmalensee saw Linux, a program given away for free, as a big potential competitor. When he said Linux, he really meant an entire collection of programs known as "open source" software. These were written by a loose-knit group of programmers who shared all of the source code to the software over the Internet.
Schmalensee didn't mention that most people thought of Linux as a strange tool created and used by hackers in dark rooms lit by computer monitors. He didn't mention that many people had trouble getting Linux to work with their computers. He forgot to mention that Linux manuals came with subheads like "Disk Druid-like 'fstab editor' available." He didn't delve into the fact that for many of the developers, Linux was just a hobby they dabbled with when there was nothing interesting on television. And he certainly didn't mention that most people thought the whole Linux project was the work of a mad genius and his weirdo disciples who still hadn't caught on to the fact that the Soviet Union had already failed big-time. The Linux folks actually thought sharing would make the world a better place. Fat-cat programmers who spent their stock-option riches on Porsches and balsamic vinegar laughed at moments like this.
Schmalensee didn't mention these facts. He just offered Linux as an alternative to Windows and said that computer manufacturers might switch to it at any time. Poof. Therefore, Microsoft had competitors. At the trial, the discourse quickly broke down into an argument over what is really a worthy competitor and what isn't. Were there enough applications available for Linux or the Mac? What qualifies as "enough"? Were these really worthy?
Under cross-examination, Schmalensee explained that he wasn't holding up the Mac, BeOS, or Linux as competitors who were going to take over 50 percent of the marketplace. He merely argued that their existence proved that the barriers produced by the so-called Microsoft monopoly weren't that strong. If rational people were investing in creating companies like BeOS, then Microsoft's power wasn't absolute.
Afterward, most people quickly made up their minds. Everyone had heard about the Macintosh and knew that back then conventional wisdom dictated that it would soon fail. But most people didn't know anything about BeOS or Linux. How could a company be a competitor if no one had heard of it? Apple and Microsoft had TV commercials. BeOS, at least, had a charismatic chairman. There was no Linux pitchman, no Linux jingle, and no Linux 30-second spot in major media. At the time, only the best-funded projects in the Linux community had enough money to buy spots on late-night community-access cable television. How could someone without money compete with a company that hired the Rolling Stones to pump excitement into a product launch?
When people heard that Microsoft was offering a free product as a worthy competitor, they began to laugh even louder at the company's chutzpah. Wasn't money the whole reason the country was having a trial? Weren't computer programmers in such demand that many companies couldn't hire as many as they needed, no matter how high the salary? How could Microsoft believe that anyone would buy the supposition that a bunch of pseudo-communist nerds living in their weird techno-utopia where all the software was free would ever come up with software that could compete with the richest company on earth? At first glance, it looked as if Microsoft's case was sinking so low that it had to resort to laughable strategies. It was as if General Motors were to tell the world "We shouldn't have to worry about fixing cars that pollute because a collective of hippies in Ithaca, New York, is refurbishing old bicycles and giving them away for free." It was as if Exxon waved away the problems of sinking oil tankers by explaining that folksingers had written a really neat ballad for teaching birds and otters to lick themselves clean after an oil spill. If no one charged money for Linux, then it was probably because it wasn't worth buying.
But as everyone began looking a bit deeper, they began to see that Linux was being taken seriously in some parts of the world. Many web servers, it turned out, were already running on Linux or another free cousin known as FreeBSD. A free web-serving tool known as Apache had controlled more than 50 percent of the web servers for some time, and it was gradually beating out Microsoft products that cost thousands of dollars. Many of the web servers ran Apache on top of a Linux or a FreeBSD machine and got the job done. The software worked well, and the nonexistent price made it easy to choose.
Linux was also winning over some of the world's most serious physicists, weapons designers, biologists, and hard-core scientists. Some of the nation's top labs had wired together clusters of cheap PCs and turned them into supercomputers that were highly competitive with the best machines on the market. One upstart company started offering "supercomputers" for $3,000. These machines used Linux to keep the data flowing while the racks of computers plugged and chugged their way for hours on complicated simulations.
There were other indications. Linux users bragged that their system rarely crashed. Some claimed to have machines that had been running for a year or more without a problem. Microsoft (and Apple) users, on the other hand, had grown used to frequent crashes. The "Blue Screen of Death" that appears on Windows users' monitors when something goes irretrievably wrong is the butt of many jokes.
Linux users also bragged about the quality of their desktop interface. Most of the uninitiated thought of Linux as a hacker's system built for nerds. Yet recently two very good desktop environments called GNOME and KDE had taken hold. Both offered the user an environment that looked just like Windows but was better. Linux hackers started bragging that they were able to equip their girlfriends, mothers, and friends with Linux boxes without grief. Some people with little computer experience were adopting Linux with little trouble.
To the rest of the world, this urge to putter and fiddle with machines is more than a source of marital comedy. Cox is one of the great threats to the continued dominance of Microsoft, despite the fact that he found a way to weld spaghetti to a nonstick pan. He is one of the core developers who help maintain the Linux kernel. In other words, he's one of the group of programmers who helps guide the development of the Linux operating system, the one Richard Schmalensee feels is such a threat to Microsoft. Cox is one of the few people whom Linus Torvalds, the creator of Linux, trusts to make important decisions about future directions. Cox is an expert on the networking guts of the system and is responsible for making sure that most of the new ideas that people suggest for Linux are considered carefully and integrated correctly. Torvalds defers to Cox on many matters about how Linux-based computers talk with other computers over a network. Cox works long and hard to find efficient ways for Linux to juggle multiple connections without slowing down or deadlocking.
The group that works with Cox and Torvalds operates with no official structure. Millions of people use Linux to keep their computers running, and all of them have copies of the source code. In the 1980s, most companies began keeping the source code to their software as private as possible because they worried that a competitor might come along and steal the ideas the source spelled out. The source code, which is written in languages like C, Java, FORTRAN, BASIC, or Pascal, is meant to be read by programmers. Most companies didn't want other programmers understanding too much about the guts of their software. Information is power, and the companies instinctively played their cards close to their chests.
When Linus Torvalds first started writing Linux in 1991, however, he decided to give away the operating system for free. He included all the source code because he wanted others to read it, comment upon it, and perhaps improve it. His decision was as much a radical break from standard programming procedure as a practical decision. He was a poor student at the time, and this operating system was merely a hobby. If he had tried to sell it, he wouldn't have gotten anything for it. He certainly had no money to build a company that could polish the software and market it. So he just sent out copies over the Internet.
Today, about a thousand people regularly work with people like Alan Cox on the development of the Linux kernel, the official name for the part of the operating system that Torvalds started writing back in 1991. That may not be an accurate estimate because many people check in for a few weeks when a project requires their participation. Some follow everything, but most people are just interested in little corners. Many other programmers have contributed various pieces of software such as word processors or spreadsheets. All of these are bundled together into packages that are often called plain Linux or GNU/Linux and shipped by companies like Red Hat or more ad hoc groups like Debian.【1 Linux Weekly News keeps a complete list of distributors. These range from the small, one- or two-man operations to the biggest, most corporate ones like Red Hat: Alzza Linux, Apokalypse, Armed Linux, Bad Penguin Linux, Bastille Linux, Best Linux (Finnish/Swedish), Bifrost, Black Cat Linux (Ukrainian/Russian), Caldera OpenLinux, CCLinux, Chinese Linux Extension, Complete Linux, Conectiva Linux (Brazilian), Debian GNU/Linux, Definite Linux, DemoLinux, DLD, DLite, DLX, DragonLinux, easyLinux, Enoch, Eridani Star System, Eonova Linux, e-smith server and gateway, Eurielec Linux (Spanish), eXecutive Linux, floppyfw, Floppix, Green Frog Linux, hal91, Hard Hat Linux, Immunix, Independence, Jurix, Kha0s Linux, KRUD, KSI-Linux, Laetos, LEM, Linux Cyrillic Edition, LinuxGT, Linux-Kheops (French), Linux MLD (Japanese), LinuxOne OS, LinuxPPC, LinuxPPP (Mexican), Linux Pro Plus, Linux Router Project, LOAF, LSD, Mandrake, Mastodon, MicroLinux, MkLinux, muLinux, nanoLinux II, NoMad Linux, OpenClassroom, Peanut Linux, Plamo Linux, PLD, Project Ballantain, PROSA, QuadLinux, Red Hat, Rock Linux, RunOnCD, ShareTheNet, Skygate, Slackware, Small Linux, Stampede, Stataboware, Storm Linux, SuSE, Tomsrtbt, Trinux, TurboLinux, uClinux, Vine Linux, WinLinux 2000, Xdenu, XTeamLinux, and Yellow Dog Linux. 
】 While Torvalds only wrote the core kernel, people use his name, Linux, to stand for a whole body of software written by thousands of others. It's not exactly fair, but most let it slide. If there hadn't been the Linux kernel, the users wouldn't have the ability to run software on a completely free system. The free software would need to interact with something from Microsoft, Apple, or IBM. Of course, if it weren't for all of the other free software from Berkeley, the GNU project, and thousands of other garages around the world, there would be little for the Linux kernel to do.
All of these people work at their own pace. Some work in their homes, like Alan Cox. Some work in university labs. Others work for businesses that use Linux and encourage their programmers to plug away so it serves their needs.
The team is united by mailing lists. The Linux Kernel mailing list hooks up Cox in Britain, Torvalds in Silicon Valley, and the others around the globe. They post notes to the list and discuss ideas. Sometimes verbal fights break out, and sometimes everyone agrees. Sometimes people light a candle by actually writing new code to make the kernel better, and other times they just curse the darkness.
Cox is now one of several people responsible for coordinating the addition of new code. He tests it for compatibility and guides Linux authors to make sure they're working together optimally. In essence, he tests every piece of incoming software to make sure all of the gauges work with the right system of measurement so there will be no glitches. He tries to remove the incompatibilities that marred Zorro.
Other features are not so popular, and they're tackled by the people who need the features. Some people want to hook their Linux boxes up to Macintoshes. Doing that smoothly can require some work in the kernel. Others may want to add special code to enable a special device like a high-speed camera or a strange type of disk drive. These groups often work on their own but coordinate their solutions with the main crowd. Ideally, they'll be able to come up with some patches that solve their problem without breaking some other part of the system.
Each day, Cox and his virtual colleagues pore through the lists trying to figure out how to make Linux better, faster, and more usable. Sometimes they skip out to watch a movie. Sometimes they go for hikes. But one thing they don't do is spend months huddled in conference rooms trying to come up with legal arguments. Until recently, the Linux folks didn't have money for lawyers, and that means they didn't get sidetracked by figuring out how to get big and powerful people like Richard Schmalensee to tell a court that there's no monopoly in the computer operating system business.
The battle between Linux and Microsoft is lining up to be the classic fight between the people like Schmalensee and the people like Cox. On one side are the armies of lawyers, lobbyists, salesmen, and expensive executives who are armed with patents, lawsuits, and legislation. They are skilled at moving the levers of power until the gears line up just right and billions of dollars pour into their pockets. They know how to schmooze, toady, beg, or even threaten until they wear the mantle of authority and command the piety and devotion of the world. People buy Microsoft because it's "the standard." No one decreed this, but somehow it has come to be.
That's an idyllic picture, and the early success of Linux, FreeBSD, and other free packages makes it tempting to think that the success will build. Today, open source software powers more than 50 percent of the web servers on the Internet, and that is no small accomplishment. Getting thousands, if not millions, of programmers to work together is quite amazing given how quirky programmers can be. The ease of copying makes it possible to think that Alan Cox could get up late and still move the world.
Right now, the free software movement stands at a crucial moment in its history. In the past, a culture of giving and wide-open sharing let thousands of programmers build a great operating system that was, in many ways, better than anything coming from the best companies. Many folks began working on Linux, FreeBSD, and thousands of other projects as hobbies, but now they're waking up to find IBM, Hewlett-Packard, Apple, and all the other big boys pounding on their door. If the kids could create something as nice as Linux, everyone began to wonder whether these kids really had enough good stuff to go the distance and last nine innings against the greatest power hitters around.
Linus Torvalds may be on the cover of magazines, but he can't do anything with the wave of a hand. He must charm and cajole the thousands of folks on the Linux mailing list to make a change. Many of the free software projects may generate great code, but they have to beg for computers. The programmers might even surprise him and come up with an even better solution. They've done it in the past. But no money means that no one has to do what anyone says.
But shows that are charming and fresh in a barn can become thin and weak on a big stage on Broadway. The glitches and raw functionality of Linux and free software don't seem too bad if you know that they're built by kids in their spare time. Building real tools for real companies, moms, police stations, and serious users everywhere is another matter. Everyone may be hoping that sharing, caring, and curiosity are enough, but no one knows for certain. Maybe capital will end up winning. Maybe it won't. It's freedom versus assurance; it's wide-open sharing versus stock options; it's cooperation versus intimidation; it's the geeks versus the suits, all in one knockdown, hack-till-you-drop, winner-take-everything fight.
FreeBSD is a close cousin to the Linux kernel and one that predates it in some ways. It descends from a long tradition of research and development of operating systems at the University of California at Berkeley. The name BSD stands for "Berkeley Software Distribution," the name given to one of the first releases of operating system source code that Berkeley made for the world. That small package grew, morphed, and absorbed many other contributions over the years.
Calling Linux and FreeBSD cousins is apt because they share much of the same source code in the same way that cousins share some of the same genes. Both borrow source code and ideas from each other. If you buy a disk with FreeBSD, which you can do from companies like Walnut Creek, you may get many of the same software packages that you get from a disk from Red Hat Linux. Both include, for instance, some of the GNU compilers that turn source code into something that can be understood by computers.
On that January 14, a new member of the WINE list was learning just how volunteering works. The guy posted a note to the list that described his Diamond RIO portable music device that lets you listen to MP3 files whenever you want. "I think the WINE development team should drop everything and work on getting this program to work as it doesn't seem like Diamond wants to release a Linux utility for the Rio," he wrote.
The WINE clone of the Win32 API is a fascinating example of how open source starts slowly and picks up steam. Bob Amstadt started the project in 1993, but soon turned it over to Alexandre Julliard, who has been the main force behind it. The project, although still far from finished, has produced some dramatic accomplishments, making it possible to run major programs like Microsoft Word or Microsoft Excel on a Linux box without using Windows. In essence, the WINE software is doing a good enough job acting like Windows that it's fooling Excel and Word. If you can trick the cousins, that's not too bad.
The WINE home page (www.winehq.com) estimates that more than 90,000 people use WINE regularly to run programs for Microsoft Windows without buying Windows. About 140 or more people regularly contribute to the project by writing code or fixing bugs. Many are hobbyists who want the thrill of getting their software to run without Windows, but some are corporate programmers. The corporate programmers want to sell their software to the broadest possible marketplace, but they don't want to take the time to rewrite everything. If they can get their software working well with WINE, then people who use Linux or BSD can use the software that was written for Microsoft Windows.
The new user who wanted to get his RIO player working with his Linux computer soon got a rude awakening. Andreas Mohr, a German programmer, wrote back.
Mohr's suggestion was to file a bug report that ranks the usability of the software so the programmers working on WINE can tweak it. This is just the first step in the free software experience. Someone has to notice the problem and fix it. In this case, someone needs to hook up their Diamond RIO MP3 player to a Linux box and try to move MP3 files with the software written for Windows. Ideally, the software will work perfectly, and now all Linux users will be able to use RIO players. In reality, there might be problems or glitches. Some of the graphics on the screen might be wrong. The software might not download anything at all. The first step is for someone to test the product and write up a detailed report about what works and what doesn't.
WINE can't pay anyone, and that means that great ideas sometimes get ignored. The free software community, however, doesn't necessarily see this as a limitation. If the RIO player were truly important, someone else would come along and pick up the project. Someone else would do the work and file a bug report so everyone could use the software. If there's no one else, then maybe the RIO software isn't that important to the Linux community. Work gets done when someone really cares enough to do it.
Some contributors are fairly described as "ragtag," but many aren't. Many are corporate droids who work in cubicle farms during the day and create free software projects at night. Some work at banks. Some work on databases for human resource departments. Some build websites. Everyone has a day job, and many keep themselves clean and ready to be promoted to the next level. Bruce Perens, one of the leaders of the Debian group, used to work at the Silicon Valley glitz factory Pixar and helped write some of the software that created the hit Toy Story.
Still, he told me, "At the time Toy Story was coming out, there was a space shuttle flying with the Debian GNU/Linux distribution on it controlling a biological experiment. People would say 'Are you proud of working at Pixar?' and then I would say my hobby software was running on the space shuttle now. That was a turnaround point when I realized that Linux might become my career."
In fact, it's a bad idea to see the free software revolution as having much to do with Microsoft. Even if Linux, FreeBSD, and other free software packages win, Microsoft will probably continue to fly along quite happily in much the same way that IBM continues to thrive even after losing the belt of the Heavyweight Computing Champion of the World to Microsoft. Anyone who spends his or her time focused on the image of a ragtag band of ruffians and orphans battling the Microsoft leviathan is bound to miss the real story.
Anyone who tunes in to the battle between Microsoft and the world expecting to see a good old-fashioned fight for marketplace domination is going to miss the real excitement. Sure, Linux, FreeBSD, OpenBSD, NetBSD, Mach, and the thousands of other free software projects are going to come out swinging. Microsoft is going to counterpunch with thousands of patents defended by armies of lawyers. Some of the programmers might even be a bit weird, and a few will be entitled to wear the adjective "ragtag." But the real revolution has nothing to do with whether Bill Gates keeps his title as King of the Hill. It has nothing to do with whether the programmers stay up late and work in the nude. It has nothing to do with poor grooming, extravagant beards, Coke-bottle glasses, black trench coats, or any of the other stereotypes that fuel the media's image.
Meanwhile, on the other coast, the lawsuit tied up Berkeley and the BSD project for several years, and the project lost valuable time and energy to the legal fight. In the meantime, several other completely free software projects started springing up around the globe. These began in basements and depended on machines that the programmers owned. One of these projects was started by Linus Torvalds and would eventually grow to become Linux, the unstoppable engine of hype and glory. He didn't have the money of the Berkeley computer science department, and he didn't have the latest machines that corporations gave Berkeley. But he had freedom and the pile of source code that came from unaffiliated, free projects like GNU that refused to compromise and cut intellectual corners. Although Torvalds might not have realized it at the time, freedom turned out to be the most valuable of all.
One of the people who wanted UNIX was the Finnish student Linus Torvalds, who couldn't afford this tithe. He was far from the first one, and the conflict began long before he started to write Linux in 1991.
The first move to separate Berkeley's version of UNIX from AT&T's control wasn't really a revolution. No one was starting a civil war by firing shots at Fort Sumter or starting a revolution by dropping tea in the harbor. In fact, it started long before the lawsuit and Linux. In 1989, some people wanted to start hooking their PCs and other devices up to the Internet, and they didn't want to use UNIX.
While news traveled quickly to some corners, it didn't reach Finland. Network Release 2 came in June 1991, right around the same time that Linus Torvalds was poking around looking for a high-grade OS to use in experiments. Jolitz's 386BSD came out about six months later as Torvalds began to dig into creating the OS he would later call Linux. Soon afterward, Jolitz lost interest in the project and let it lie, but others came along. In fact, two groups called NetBSD and FreeBSD sprang up to carry the torch.
Any grown-up should take one look at this battle and understand just how the free software movement got so far. While the Berkeley folks were meeting with lawyers and worrying about whether the judges were going to choose the right side, Linus Torvalds was creating his own kernel. He started Linux on his own, and that made him a free man.
In June 1991, soon after Torvalds (everyone in the community, including many who don't know him, refers to him by his first name; the rules of style prevent me from using it in something as proper as a book) started his little science project, the Computer Systems Research Group at Berkeley released what they thought was their completely unencumbered version of BSD UNIX known as Network Release 2. Several projects emerged to port this to the 386, and the project evolved to become the FreeBSD and NetBSD versions of today. Torvalds has often said that he might never have started Linux if he had known that he could just download a more complete OS from Berkeley.
The core of an OS is often called the "kernel," which is one of the strange words floating around the world of computers. When people are being proper, they note that Linus Torvalds was creating the Linux kernel in 1991. Most of the other software, like the desktop, the utilities, the editors, the web browsers, the games, the compilers, and practically everything else, was written by other folks. If you measure this in disk space, more than 95 percent of the code in an average distribution lies outside the kernel. If you measure it by user interaction, most people using Linux or BSD don't even know that there's a kernel in there. The buttons they click, the websites they visit, and the printing they do are all controlled by other programs that do the work.
In 1991, Torvalds had a short list of features he wanted to add to the kernel. The Internet was still a small network linking universities and some advanced labs, and so networking was a small concern. He was only aiming at the 386, so he could rely on some of the special features that weren't available on other chips. High-end graphics hardware cards were still pretty expensive, so he concentrated on a text-only interface. He would later fix all of these problems with the help of the people on the Linux kernel mailing list, but for now he could avoid them.
When Torvalds started crafting the Linux kernel, he decided he was going to create a bigger, more integrated version that he called a "monolithic kernel." This was something of a bold move because the academic community was entranced with what they called "microkernels." The difference is partly semantic and partly real, but it can be summarized by analogy with businesses. Some companies try to build large, smoothly integrated operations that control all the steps of production. Others try to create smaller operations that subcontract much of the production work to other companies. One is big, monolithic, and all-encompassing, while the other is smaller, fragmented, and heterogeneous. It's not uncommon to find two companies in the same industry taking different approaches and thinking they're doing the right thing.
By the beginning of 1992, Linux was no longer a Finnish student's part-time hobby. Several influential programmers became interested in the code. It was free and relatively usable. It ran much of the GNU code, and that made it a neat, inexpensive way to experiment with some excellent tools. More and more people downloaded the system, and a significant fraction started reporting bugs and suggestions to Torvalds. He rolled them back in and the project snowballed.
This talent for organizing the work of others is a rare commodity, and Torvalds had a knack for it. He was gracious about sharing his system with the world and he never lorded it over anyone. His messages were filled with jokes and self-deprecating humor, most of which were carefully marked with smiley faces (:-)) to make sure that the message was clear. If he wrote something pointed, he would apologize for being a "hothead." He was always gracious in giving credit to others and noted that much of Linux was just a clone of UNIX. All of this made him easy to read and thus influential.
His greatest trick, though, was his decision to avoid the mantle of power. He wrote in 1992, "Here's my standing on 'keeping control,' in 2 words (three?): I won't. The only control I've effectively been keeping on Linux is that I know it better than anybody else."
He made it clear that people could vote to depose him at any time. "If people feel I do a bad job, they can do it themselves." They could just take all of his Linux code and start their own version using Torvalds's work as a foundation.
Torvalds's burgeoning kernel dovetailed nicely with the tools that the GNU project created. All of the work by Stallman and his disciples could be easily ported to work with the operating system core that Torvalds was now calling Linux. This was the power of freely distributable source code. Anyone could make a connection, and someone invariably did. Soon, much of the GNU code began running on Linux. These tools made it easier to create more new programs, and the snowball began to roll.
This freedom also attracted others to the party. They knew that Linux would always be theirs, too. They could write neat features and plug them into the Linux kernel without worrying that Torvalds would yank the rug out from under them. The GPL was a contract that lasted long into the future. It was a promise that bound them together.
The Linux kernel also succeeded because it was written from the ground up for the PC platform. When the Berkeley UNIX hackers were porting BSD to the PC platform, they weren't able to make it fit perfectly. They were taking a piece of software crafted for older computers like the VAX, and shaving off corners and rewriting sections until it ran on the PC.
During the early months of Torvalds's work, the BSD group was stuck in a legal swamp. While the BSD team was involved with secret settlement talks and secret depositions, Linus Torvalds was happily writing code and sharing it with the world on the Net. His life wasn't all peaches and cream, but all of his hassles were open. Professor Andy Tanenbaum, a fairly well-respected and famous computer scientist, got into a long, extended debate with Torvalds over the structure of Linux. He looked down at Linux and claimed it would have been worth two F's in his class because of its design. This led to a big flame war that was every bit as nasty as the fight between Berkeley and AT&T's USL. In fact, to the average observer it was even nastier. Torvalds returned Tanenbaum's fire with strong words like "fiasco," "brain-damages," and "suck." He brushed off the bad grades by pointing out that Albert Einstein supposedly got bad grades in math and physics. The high-priced lawyers working for AT&T and Berkeley probably used very expensive and polite words to try and hide the shivs they were trying to stick in each other's backs. Torvalds and Tanenbaum pulled out each other's virtual hair like a squawkfest on the Jerry Springer show.
But Torvalds's flame war with Tanenbaum occurred in the open in an Internet newsgroup. Other folks could read it, think about it, add their two cents' worth, and even take sides. It was a wide-open debate that uncovered many flaws in the original versions of Linux and Tanenbaum's Minix. The arguments forced Torvalds to think deeply about what he wanted to do with Linux and consider its flaws. He had to listen to the arguments of a critic and a number of his peers on the Net and then come up with arguments as to why his Linux kernel didn't suck too badly.
The fight between Torvalds and Tanenbaum, however, drew people into the project. Other programmers like David Miller, Ted T'so, and Peter da Silva chimed in with their opinions. At the time, they were just interested bystanders. In time, they became part of the Linux brain trust. Soon they were contributing source code that ran on Linux. The argument's excitement forced them to look at Torvalds's toy OS and try to decide whether his defense made any sense. Today, David Miller is one of the biggest contributors to the Linux kernel. Many of the original debaters became major contributors to the foundations of Linux.
To this day, all of the devotees of the various BSDs grit their teeth when they hear about Linux. They think that FreeBSD, NetBSD, and OpenBSD are better, and they have good reasons for these beliefs. They know they were out the door first with a complete running system. But Linux is on the cover of the magazines. All of the great technically unwashed are now starting to use "Linux" as a synonym for free software. If AT&T had never sued, the BSD teams would be the ones reaping the glory. They would be the ones to whom Microsoft turned when it needed a plausible competitor. They would be more famous.
McKusick says, "If you plot the installation base of Linux and BSD over the last five years, you'll see that they're both in exponential growth. But BSD's about eighteen to twenty months behind. That's about how long it took between Net Release 2 and the unencumbered 4.4BSD-Lite. That's about how long it took for the court system to do its job."
Through the 1990s, the little toy operating system grew slowly and quietly as more and more programmers were drawn into the vortex. At the beginning, the OS wasn't rich with features. You could run several different programs at once, but you couldn't do much with the programs. The system's interface was just text. Still, this was often good enough for a few folks in labs around the world. Some just enjoyed playing with computers. Getting Linux running on their PC was a challenge, not unlike bolting an aftermarket supercharger onto a Honda Civic. But others took the project more seriously because they had serious jobs that couldn't be solved with a proprietary operating system that came from Microsoft or others.
In time, more people started using the system and started contributing their additions to the pot. Someone figured out how to make MIT's free X Window System run on Linux so everyone could have a graphical interface. Someone else discovered how to roll in technology for interfacing with the Internet. That made a big difference because everyone could hack, tweak, and fiddle with the code and then just upload the new versions to the Net.
It goes without saying that all the cool software coming out of Stallman's Free Software Foundation found its way to Linux. Some were simple toys like GNU Chess, but others were serious tools that were essential to the growth of the project. By 1991, the FSF was offering what might be argued were the best text editor and compiler in the world. Others might have been close, but Stallman's were free. These were crucial tools that made it possible for Linux to grow quickly from a tiny experimental kernel into a full-featured OS for doing everything a programmer might want to do.
James Lewis-Moss, one of the many programmers who devote some time to Linux, says that GCC made it possible for programmers to create, revise, and extend the kernel. "GCC is integral to the success of Linux," he says, and points out that this may be one of the most important reasons why "it's polite to refer to it as GNU/Linux."
Lewis-Moss points out one of the smoldering controversies in the world of free software: all of the tools and games that came from the GNU project started becoming part of what people simply thought of as plain "Linux." The name for the small kernel of the operating system soon grew to apply to almost all the free software that ran with it. This angered Stallman, who first argued that a better name would be "Lignux." When that failed to take hold, he moved to "GNU/Linux." Some ignored his pleas and simply used "Linux," which is still a bit unfair. Some feel that "GNU/Linux" is too much of a mouthful and, for better or worse, just plain Linux is an appropriate shortcut. Some, like Lewis-Moss, hold firm to GNU/Linux.
Soon some people were bundling together CD-ROMs with all this software in one batch. The group would try to work out as many glitches as possible so that the purchaser's life would be easier. All boasted strange names like Yggdrasil, Slackware, SuSE, Debian, or Red Hat. Many were just garage projects that never made much money, but that was okay. Making money wasn't really the point. People just wanted to play with the source. Plus, few thought that much money could be made. The GPL, for instance, made it difficult to differentiate the product because it required everyone to share their source code with the world. If Slackware came up with a neat fix that made their version of Linux better, then Debian and SuSE could grab it. The GPL prevented anyone from constraining the growth of Linux.
But only greedy businessmen see sharing and competition as negatives. In practice, the free flow of information enhanced the market for Linux by ensuring that it was stable and freely available. If one key CD-ROM developer gets a new girlfriend and stops spending enough time programming, another distribution will pick up the slack. If a hurricane flattens Raleigh, North Carolina, the home of Red Hat, another supplier will still be around. A proprietary OS like Windows is like a set of manacles. An earthquake in Redmond, Washington, could cause a serious disruption for everyone.
The competition and the GPL meant that the users would never feel bound to one OS. If problems arose, anyone could always just start a splinter group and take Linux in that direction. And they did. All the major systems began as splinter groups, and some picked up enough steam and energy to dominate. In time, the best splinter groups spun off their own splinter groups and the process grew terribly complicated.
Hall remembers well the moment he discovered Linux. He told Linux Today,
I didn't even know I was involved with Linux at first. I got a copy of Dr. Dobb's Journal, and in there was an advertisement for "get a UNIX operating system, all the source code, and run it on your PC." And I think it was $99. And I go, "Oh, wow, that's pretty cool. For $99, I can do that." So I sent away for it, got the CD. The only trouble was that I didn't have a PC to run it on. So I put it on my Ultrix system, took a look at the man pages, directory structure and stuff, and said, "Hey, that looks pretty cool." Then I put it away in the filing cabinet. That was probably around January of 1994.
At the meeting, Torvalds helped Hall and his boss set up a PC with Linux. This was the first time that Hall actually saw Linux run, and he was pleasantly surprised. He said, "By that time I had been using UNIX for probably about fifteen years. I had used System V, I had used Berkeley, and all sorts of stuff, and this really felt like UNIX. You know . . . I mean, it's kind of like playing the piano. You can play the piano, even if it's a crappy piano. But when it's a really good piano, your fingers just fly over the keys. That's the way this felt. It felt good, and I was really impressed."
This experience turned Hall into a true convert and he went back to Digital convinced that the Linux project was more than just some kids playing with a toy OS. These so-called amateurs with no centralized system or corporate backing had produced a very, very impressive system that was almost as good as the big commercial systems. Hall was an instant devotee. Many involved in the project recall their day of conversion with the same strength. A bolt of lightning peeled the haze away from their eyes, and they saw.
Hall set out trying to get Torvalds to rewrite Linux so it would work well on the Alpha. This was not a simple task, but it was one that helped the operating system grow a bit more. The original version included some software that assumed the computer was designed like the Intel 386. This was fine when Linux only ran on Intel machines, but removing these assumptions made it possible for the software to run well on all types of machines.
Hall went sailing with Torvalds to talk about the guts of the Linux OS. Hall told me, "I took him out on the Mississippi River, went up and down the Mississippi in the river boat, drinking Hurricanes, and I said to him, 'Linus, did you ever think about porting Linux to a 64-bit processor, like the Alpha?' He said, 'Well, I thought about doing that, but the Helsinki office has been having problems getting me a system, so I guess I'll have to do the PowerPC instead.'
"I knew that was the wrong answer, so I came back to Digital (at the time), and got a friend of mine, named Bill Jackson, to send out a system to Linus, and he received it about a couple weeks after that. Then I found some people inside Digital who were also thinking about porting Linux to an Alpha. I got the two groups together, and after that, we started on the Alpha Linux project."
Hall also helped start a group called Linux International, which works to make the corporate world safe for Linux. "We help vendors understand the Linux marketplace," Hall told me. "There's a lot of confusion about what the GPL means. Less now, but still there's a lot of confusion. We helped them find the markets."
Today, Linux International helps control the trademark on the name Linux and ensures that it is used in an open way. "When someone wanted to call themselves something like 'Linux University,' we said that's bad because there's going to be more than one. 'Linux University of North Carolina' is okay. It opens up the space."
In the beginning, Torvalds depended heavily on the kindness of strangers like Hall. He didn't have much money, and the Linux project wasn't generating a huge salary for him. Of course, poverty also made it easier for people like Hall to justify giving him a machine. Torvalds wasn't rich monetarily, but he became rich in machines.
By 1994, when Hall met Torvalds, Linux was already far from just a one-man science project. The floppy disks and CD-ROMs holding a version of the OS were already on the market, and this distribution mechanism was one of the crucial unifying forces. Someone could just plunk down a few dollars and get a version that was more or less ready to run. Many simply downloaded their versions for free from the Internet.
In 1994, getting Linux to run was never really as simple as putting the CD-ROM in the drive and pressing a button. Many of the programs didn't work with certain video cards. Some modems didn't talk to Linux. Not all of the printers communicated correctly. Yet most of the software worked together on many standard machines. It often took a bit of tweaking, but most people could get the OS up and running on their computers.
This was a major advance for the Linux OS because most people could quickly install a new version without spending too much time downloading the new code or debugging it. Even programmers who understood exactly what was happening felt that installing a new version was a long, often painful slog through technical details. These CD-ROMs not only helped programmers, they also encouraged casual users to experiment with the system.
Today, there are many different kinds of volunteers putting together these packages. The Debian group, for instance, is one of the best known and most devoted to true open source principles. It was started by Ian Murdock, who named it after himself and his girlfriend, Debra. The Debian group, which now includes hundreds of official members, checks to make sure that the software is both technically sound and politically correct. That is, they check the licenses to make sure that the software can be freely distributed by all users. Their guidelines later morphed into the official definition of open source software.
Other CD-ROM groups became more commercial. Debian sold its disks to pay for Internet connection fees and other expenses, but they were largely a garage operation. So were groups with names like Slackware, FreeBSD, and OpenBSD. Other groups like Red Hat actually set out to create a burgeoning business, and to a large extent, they succeeded. They took the money and used it to pay programmers who wrote more software to make Linux easier to use.
In the beginning, there wasn't much difference between the commercially minded groups like Red Hat and the more idealistic collectives like Debian. The marketplace was small, fragmented, and tribal. But by 1998, Red Hat had attracted major funding from companies like Intel, and it plowed more and more money into making the package as presentable and easy to use as possible. This investment paid off because more users turned instinctively to Red Hat, whose CD-ROM sales then exploded.
Slowly but surely, more and more people became aware of Linux, the GNU project, and its cousins like FreeBSD. No one was making much money off the stuff, but the word of mouth was spreading very quickly. The disks were priced reasonably, and people were curious. The GPL encouraged people to share. People began borrowing disks from their friends. Some companies even manufactured cheap rip-off copies of the CD-ROMs, an act that the GPL encouraged.
At the top of the pyramid was Linus Torvalds. Many Linux developers treated him like the king of all he surveyed, but he was like a monarch stripped of real power by a popular constitutional democracy. He had always focused on building a fast, stable kernel, and that was what he continued to do. The rest of the excitement, the packaging, the features, and the toys, were the dominion of the volunteers and contributors.
Torvalds moved to Silicon Valley and took a job with the very secret company Transmeta in order to help design the next generation of computer chips. He worked out a special deal with the company that allowed him to work on Linux in his spare time. He felt that working for one of the companies like Red Hat would give that one version of Linux a special imprimatur, and he wanted to avoid that. Plus, Transmeta was doing cool things.
In January 1999, the world caught up with the pioneers. Schmalensee mentioned Linux on the witness stand during the trial and served official notice to the world that Microsoft was worried about the growth of Linux. The system had been on the company's radar screen for some time. In October 1998, an internal memo from Microsoft describing the threat made its way to the press. Some thought it was just Microsoft's way of currying favor during the antitrust investigation. Others thought it was a serious treatment of a topic that was difficult for the company to understand.
The media followed Schmalensee's lead. Everyone wanted to know about Linux, GNU, open source software, and the magical effects of widespread, unconditional sharing. The questions came in tidal waves, and Torvalds tried to answer them again and again. Was he sorry he gave it all away? No. If he charged anything, no one would have bought his toy and no one would have contributed anything. Was he a communist? No, he was rather apolitical. Don't programmers have to eat? Yes, but they will make their money selling a service instead of getting rich off bad proprietary code. Was Linux going to overtake Microsoft? Yes, if he had his way. "World Domination Soon" became the motto.
But there were also difficult questions. How would the Linux world resist the embrace of big companies like IBM, Apple, Hewlett-Packard, and maybe even Microsoft? These were massive companies with paid programmers and schedules to meet. All the open source software was just as free to them as anyone else. Would these companies use their strength to monopolize Linux?
Many wanted to know when Linux would become easier to use for nonprogrammers. Programmers built the OS to be easy to take apart and put back together again. That's a great feature if you like hacking the inside of a kernel, but that doesn't excite the average computer user. How was the open source community going to get the programmers to donate their time to fix the mundane, everyday glitches that confused and infuriated the nonprogrammers? Was the Linux community going to be able to produce something that a nonprogrammer could even understand?
Others wondered if the Linux world could ever agree enough to create a software package with some coherence. Today, Microsoft users and programmers pull their hair out trying to keep Windows 95, Windows 98, and Windows NT straight. Little idiosyncrasies cause games to crash and programs to fail. Microsoft has hundreds of quality assurance engineers and thousands of support personnel. Still, the little details drive everyone crazy.
New versions of Linux appear as often as daily. People often create their own versions to solve particular problems. Many of these changes won't affect anyone, but they can add up. Is there enough consistency to make the tools easy enough to use?
Many wondered if Linux was right for world domination. Programmers might love playing with source code, but the rest of the world just wants something that delivers the e-mail on time. More important, the latter are willing to pay for this efficiency.
During this time, the relationship between AT&T and the universities was cordial. AT&T owned the commercial market for UNIX and Berkeley supplied many of the versions used in universities. While the universities got BSD for free, they still needed to negotiate a license with AT&T, and companies paid a fortune. This wasn't too much of a problem because universities are often terribly myopic. If they share their work with other universities and professors, they usually consider their sharing done. There may be folks out there without university appointments, but those folks are usually viewed as cranks who can be safely ignored. Occasionally, those cranks write their own OS that grows up to be Linux. The BSD version of freedom was still a far cry from Stallman's, but then Stallman hadn't articulated it yet. His manifesto was still a few years off.
Daniel is basically correct. The BSD code has evolved, or forked, into many different versions with names like FreeBSD, OpenBSD, and NetBSD, while the Linux kernel released under Stallman's GPL remains one fairly coherent package. Still, there is plenty of cross-pollination between the different versions of BSD UNIX. Both NetBSD 1.0 and FreeBSD 2.0, for instance, borrowed code from 4.4BSD-Lite. Also, many versions of Linux come with tools and utilities that came from the BSD project.
But Daniel's point is also clouded with semantics. There are dozens if not hundreds of different Linux distributions available from different vendors. Many differ in subtle points, but some are markedly different. While these differences are often as great as the ones between the various flavors of BSD, the groups do not consider them psychologically separate. They haven't forked politically even though they've split off their code.
Sam Ockman, a Linux enthusiast and the founder of Penguin Computing, remembers the day of the meeting just before Netscape announced it was freeing its source code. "Eric Raymond came into town because of the Netscape thing. Netscape was going to free their software, so we drove down to Transmeta and had a meeting so we could advise Netscape," he said.
The definition of what was open source grew out of the Debian project, one of the different groups that banded together to press CD-ROMs of stable Linux releases. Groups like these often get into debates about what software to include on the disks. Some wanted to be very pure and only include GPL'ed software. In a small way, that would force others to contribute back to the project because they wouldn't get their software distributed by the group unless it was GPL'ed. Others wanted less stringent requirements that might include quasi-commercial projects that still came with their source code. There were some cool projects out there that weren't protected by the GPL, and it could be awfully hard to pass up the chance to integrate them into a package.
Over time, one of the leaders of the Debian group, Bruce Perens, came to create a definition of what was acceptable and what wasn't. This definition would be large enough to include the GNU General Public License, the BSD-style licenses, and a few others like MIT's X Consortium license and the Artistic license. The X Consortium license covers the X Window System, a graphical windowing interface that began at MIT and was also freely distributed with BSD-like freedom. The Artistic license applies to the Perl programming language, a tool that is frequently used to process text and transform files. The Debian meta-definition would embrace all of these.
The official definition of what was acceptable to Debian leaned toward more freedom and fewer restrictions on the use of software. Of course, that's the only way that anyone could come up with a definition that included both GNU and the much less restrictive BSD. But this was also the intent of the open source group. Perens and Eric Raymond felt that Stallman still sounded too quasi-communist for "conservative businessmen," and they wanted the open source definition to avoid insisting upon the sort of forced sharing that Stallman's GNU virus provided.
Still, the definition borrowed heavily from Stallman's concept of GNU, and Perens credits him by saying that many of the Debian guidelines are derived from the GPL. An official open source license for a product must provide the programmer with source code that is human-readable. It can't restrict what modifications are made to the software or how it is sold or given away.
Stallman saw this secrecy as a great crime. Computer users should be able to share the source code so they can share ways to make it better. This trade should lead to more information-trading in a great feedback loop. Some folks even used the word "bloom" to describe the explosion of interest and cross-feedback. They're using the word the way biologists use it to describe the way algae can just burst into existence, overwhelming a region of the ocean. Clever insights, brilliant bug fixes, and wonderful new features just appear out of nowhere as human curiosity is amplified by human generosity in a grand explosion of intellectual synergy. The only thing missing from the picture is a bunch of furry Ewoks dancing around a campfire.[8]

[8. Linux does have many marketing opportunities. Torvalds chose a penguin named Tux as the mascot, and several companies actually manufacture and sell stuffed penguins to the Linux realm. The BSD world has embraced a cute demon, a visual pun on the fact that BSD UNIX uses the word "daemon" to refer to some of the faceless background programs in the OS.]
Raymond pointed out that the free source world can do a great job with these nasty bugs. He summed up the idea with the phrase "Given enough eyeballs, all bugs are shallow," which he dubbed "Linus's Law" after Linus Torvalds. That is, eventually some programmer would start printing and using the Internet at the same time. After the system crashed a few times, some programmer would care enough about the problem to dig into the free source, poke around, and spot it. Eventually somebody would come along with the time, the energy, and the commitment to diagnose the problem. Raymond is a great admirer of Torvalds and thinks that Torvalds's true genius was organizing an army to work on Linux. The coding itself was a distant second.
The comparison to software was simple. Corporations gathered the tithes, employed a central architect with a grand vision, managed the team of programmers, and shipped a product every once in a while. The Linux world, however, let everyone touch the Source. People would try to fix things or add new features. The best solutions would be adopted by others and the mediocre would fall by the wayside. Many different Linux versions would proliferate, but over time the marketplace of software would coalesce around the best standard version.
Linus Torvalds changed this model by increasing the speed of sharing, an approach Raymond characterized with the rule of "release early and often, delegate everything you can, be open to the point of promiscuity." Torvalds ran Linux as openly as possible, and this eventually attracted some good contributors. The FSF, by contrast, had been much more careful about what it embraced and brought into the GNU project. Torvalds took many things into his distributions, and they mutated as often as daily. Occasionally, new versions came out twice a day.
Raymond mixed this experience with his time watching Torvalds's team push the Linux kernel and used the combination as the basis for his essay on distributing the Source. "Mostly I was trying to pull some factors that I had observed as unconscious folklore so people could take them out and reason about them," he said.
There is a good empirical reason for the faith in the strength of free source. After all, a group of folks who rarely saw each other had assembled a great pile of source code that was kicking Microsoft's butt in some corners of the computer world. Linux servers were common on the Internet and growing more common every day. The desktop was waiting to be conquered. They had done this without stock options, without corporate jets, without secret contracts, and without potentially illegal alliances with computer manufacturers. The success of the software from the GNU and Linux world was really quite impressive.
Part of this problem is the success of Raymond's metaphor. He said he just wanted to give the community some tools to understand the success of Linux and reason about it. But his two visions of a cathedral and a bazaar had such clarity that people concentrated on dividing the world into cathedrals and bazaars. In reality, there's a great deal of blending in between. The most efficient bazaars today are the suburban malls that have one management company building the site, leasing the stores, and creating a unified experience. Downtown shopping areas often failed because there was always one shop owner who could ruin an entire block by putting in a store that sold pornography. On the other side, religion has always been something of a bazaar. Martin Luther effectively split apart Christianity by introducing competition. Even within denominations, different parishes fight for the hearts and souls of people.
The same blurring holds true for the world of open source software. The Linux kernel, for instance, contains hundreds of thousands of lines of source code; some put the number at 500,000. A few talented folks like Alan Cox or Linus Torvalds know all of it, but most are only familiar with the corners of it that they need to know. These folks, who may number in the thousands, are far outnumbered by the millions who use the Linux OS daily.
Second, no one really knows who actually reads the Linux source code, for the opposite reason. The GNU/Linux source is widely available and frequently downloaded, but that doesn't mean it's read or studied. The Red Hat distribution comes with one CD full of precompiled binaries and a second full of source code. Who knows who ever pops that second CD-ROM into a computer? Everyone is free to do so in the privacy of their own cubicle, so no records are kept.
If I were to bet, I would guess that the ratios of cognoscenti to uninformed users in the Linux and Microsoft worlds are pretty close. Reading the Source just takes too much time and too much effort for many in the Linux world to take advantage of the huge river of information available to them.
The average population, however, is aging quickly. As the software becomes better, it is easier for working stiffs to bring it into the corporate environments. Many folks brag about sneaking Linux into their office and replacing Microsoft on some hidden server. As more and more users find a way to make money with the free software, more and more older people (i.e., over 25) are able to devote some time to the revolution.
I suppose I would like to report that there's a healthy contingent of women taking part in the free source world, but I can't. It would be nice to isolate the free software community from the criticism that usually finds any group of men. By some definition or legal reasoning, these guys must be practicing some de facto discrimination. Somebody will probably try to sue someone someday. Still, the women are scarce and it's impossible to use many of the standard explanations. The software is, after all, free. It runs well on machines that are several generations old and available from corporate scrap heaps for several hundred dollars. Torvalds started writing Linux because he couldn't afford a real version of UNIX. Lack of money or the parsimony of evil, gender-nasty parents who refuse to buy their daughters a computer can hardly be blamed.
This may change in the future if organizations like LinuxChix (www.linuxchix.org) have their way. They run a site devoted to celebrating women who enjoy the open source world, and they've been trying to start up chapters around the world. The site gives members a chance to post their names and biographical details. Of course, several of the members are men and one is a man turning into a woman. The member writes, "I'm transsexual (male-to-female, pre-op), and at the moment still legally married to my wife, which means that if we stay together we'll eventually have a legal same-sex marriage."
Racial politics, however, are more complicated. Much of the Linux community is spread throughout the globe. While many members come from the United States, major contributors can be found in most countries. Linus Torvalds, of course, came from Finland, one of the more technically advanced countries in the world. Miguel de Icaza, the lead developer of the GNOME desktop, comes from Mexico, a country perceived as technically underdeveloped by many in the United States.
In general, the free source revolution is worldwide and rarely encumbered by racial and national barricades. Europe is just as filled with Linux developers as America, and the Third World is rapidly skipping over costly Microsoft and into inexpensive Linux. Interest in Linux is booming in China and India. English is, of course, the default language, but other languages continue to live thanks to automatic translation mechanisms like Babelfish.
When Linux began to take off, Torvalds moved to Silicon Valley and took a job with the supersecret research firm Transmeta. At Comdex in November 1999, Torvalds announced that Transmeta was working on a low-power computing chip with the nickname "Crusoe."
There are, of course, some conspiracy theories. Transmeta is funded by a number of big investors including Microsoft cofounder Paul Allen. The fact that they chose to employ Torvalds may be part of a plan, some think, to distract him from Linux development. After all, version 2.2 of the kernel took longer than many expected, although it may have been because its goals were too ambitious. When Microsoft needed a coherent threat to offer up to the Department of Justice, Transmeta courteously made Torvalds available to the world. Few seriously believe this theory, but it is constantly whispered as a nervous joke.
This freedom also extended to programmers at work. In many companies, the computer managers are doctrinaire and officious. They often quickly develop knee-jerk reactions to technologies and use these stereotypes to make technical decisions. Free software like Linux was frequently rejected out of hand by the gatekeepers, who thought something must be wrong with the software if no one was charging for it. These attitudes couldn't stop the engineers who wanted to experiment with the free software, however, because it had no purchase order that needed approval.
The people on the side of the BSD-style license, on the other hand, seem pragmatic, organized, and focused. There are three major free versions of BSD UNIX alone, and they're notable because each has a centrally administered collection of files. The GPL-protected Linux, by contrast, can be purchased from at least six major groups that bundle it together, and each of them includes packages and pieces of software they find all over the Net.
The BSD-license folks are also less cultish. The big poster boys, Torvalds and Stallman, are both GPL men. The free versions of BSD, which helped give Linux much of its foundation, are largely ignored by the press for all the wrong reasons. The BSD teams appear fragmented because they are separate political organizations with no formal ties. And with so many contributors, BSD has no single charismatic leader with a story as compelling as that of Linus Torvalds.
The Apache web server is protected by a BSD-style license that permits commercial reuse of the software without sharing the source code. It is a separate program, however, and many Linux users run the software on Linux boxes. Of course, this devotion to business and relatively quiet disposition isn't always true. Theo de Raadt, the leader of the OpenBSD faction, is fond of making bold proclamations. In his interview with me, he dismissed the Free Software Foundation as terribly misnamed because you weren't truly free to do whatever you wanted with the software.
Someone might point out that Alan Cox, one of the steadfast keepers of the GPL-protected Linux kernels, is neither particularly flashy nor given to writing long manifestos on the Net. Others might say that Brian Behlendorf has been a great defender of the Apache project. He certainly hasn't avoided defending the BSD license, although not in the way that Stallman might have liked. He was, after all, one of the members of the Apache team who helped convince IBM that they could use the Apache web server without danger.
The three BSD projects are well known for keeping control of all the source code for all the software in the distribution. They're very centrally managed and brag about keeping all the source code together in one build tree. The Linux distributions, on the other hand, include software from many different sources. Some include the KDE desktop. Others choose GNOME. Many include both.
Some of the groups have carefully delineated jobs. The Debian group elects a president and puts individuals in charge of particular sections of the distribution. Or perhaps more correctly, the individuals nominate themselves for jobs they can accomplish. The group is as close to a government as exists in the open software world. Many of the Open Source Initiative guidelines on what fits the definition of "open source" evolved from the earlier rules drafted by the Debian group to help define what could and couldn't be included in an official Debian distribution. The OpenBSD group, by contrast, opens up much of the source tree to everyone on the team. Anyone can make changes. Core areas, though, are still controlled by leaders.
Some groups have become very effective marketing forces. Red Hat is a well-run company with marketing teams that sell people on upgrading their software and engineering teams whose job is to write improved code for future versions. Red Hat packages their distribution in boxes that are sold through normal sales channels like bookstores and catalogs. They have a big presence at trade shows like LinuxExpo, in part because they help organize them.
In many cases, there is no clear spectrum defined between order and anarchy. The groups just have their own brands of order. OpenBSD brags about stopping security leaks and going two years without a root-level intrusion, but some of its artwork is a bit scruffy. Red Hat, on the other hand, has been carefully working to make Linux easy for everyone to use, but they're not as focused on security details.
This disorder is changing a bit now that serious companies like Red Hat and VA Linux are entering the arena. These companies pay full-time programmers to ensure that their products are bug-free and easy to use. If their management does a good job, the open source software world may grow more ordered and actually anticipate more problems instead of waiting for the right person to come along with the time and the inclination to solve them.
Most people quickly become keenly aware of this competition. Each of the teams creating distributions touts theirs as the best, the most up-to-date, the easiest to install, and the most plush. The licenses mean that each group is free to grab stuff from the others, and this ensures that no one builds an unstoppable lead like Microsoft did in the proprietary OS world. Sure, Red Hat has a large chunk of the mindshare and people think their brand name is synonymous with Linux, but anyone can grab their latest distribution and start making improvements on it. It takes little time at all.
But Stallman is right to distance himself from Soviet-style communism because there are few similarities. There's little central control in Stallman's empire. All Stallman can do to enforce the GNU General Public License is sue someone in court. He, like the Pope, has no great armies ready to keep people in line. None of the Linux companies have much power to force people to do anything. The GNU General Public License is like a vast disarmament treaty. Everyone is free to do what they want with the software, and there are no legal cudgels to stop them. The only way to violate the license is to publish the software and not release the source code.
Many people who approach the free software world for the first time see only communism. Bob Metcalfe, an entrepreneur, has proved himself several times over by starting companies like 3Com and inventing the Ethernet. Yet he looked at the free software world and condemned it with a derisive essay entitled "Linux's 60's technology, open-sores ideology won't beat W2K, but what will?"
The essay makes more confounding points, equating Richard Stallman to Karl Marx for his writing and Linus Torvalds to Vladimir Lenin because of his aim to dominate the software world with his OS. For grins, he compares Eric Raymond to "Trotsky waiting for The People's ice pick" for no clear reason. Before this gets out of hand, he backpedals a bit and claims, "OK, communism is too harsh on Linux. Lenin too harsh on Torvalds [sic]." Then he sets off comparing the world of open source to the tree-hugging, back-to-the-earth movement.
"How about Linux as organic software grown in utopia by spiritualists?" he wonders. "If North America actually went back to the earth, close to 250 million people would die of starvation before you could say agribusiness. When they bring organic fruit to market, you pay extra for small apples with open sores--the Open Sores Movement."
But numbers like this can't really capture the depth of the gift. Linus Torvalds always likes to say that he started writing Linux because he couldn't afford a decent OS for his machine so he could do some experiments. Who knows how many kids, grown-ups, and even retired people are hacking Linux now and doing some sophisticated computer science experiments because they can? How do we count this beneficence?
Free source code has none of these inefficiencies. Websites like Slashdot, Freshmeat, Linux Weekly News, LinuxWorld, KernelTraffic, and hundreds of other Linux or project-specific portals do a great job moving the software to the people who can use its value. People write the code and then other folks discover the value in it. Bad or unneeded code isn't foisted on anyone.
Of course, there are some open source charities. Richard Stallman's Free Software Foundation is a tax-exempt 501(c)(3) charity that raises money and solicits tax-deductible donations. This money is used to pay for computers, overhead, and the salaries of young programmers who have great ideas for free software. The Debian Project also has a charitable arm known as Software in the Public Interest that raises money and computer equipment to support the creation of more free software.
Their budgets are pretty manageable as well. Perens notes that Debian's budget is about $10,000 a year, and this is spent largely on distributing the software. Servers that support plenty of traffic cost a fair amount of money, but the group does get donations of hardware and bandwidth. The group also presses a large number of CD-ROMs with the software.
The comparison does offer some insight into life in the free software community. Some conventions like LinuxExpo and the hundreds of install-fests are sort of like parties. One company at a LinuxExpo was serving beer in its booth to attract attention. Of course, Netscape celebrated its decision to launch the Mozilla project with a big party. They then threw another one at the project's first birthday.
But the giving goes beyond the parties and the conferences. Giving great software packages creates social standing in much the same way that giving a lavish feast will establish you as a major member of the tribe. There is a sort of pecking order, and the coders of great systems like Perl or Linux are near the top. The folks at the top of the pyramid often have better luck calling on other programmers for help, making it possible for them to get their jobs done a little better. Many managers justify letting their employees contribute to the free software community because they build up a social network that they can tap to finish their official jobs.
The free source world, on the other hand, is a big free-for-all in both senses of the phrase. The code circulates for everyone to grab, and only those who need it dig in. There's no great connection between programmer and user. People grab software and take it without really knowing to whom they owe any debt. I only know a few of the big names who wrote the code running the Linux box on my desk, and I know that there are thousands of people who also contributed. It would be impossible for me to pay back any of these people because it's hard to keep them straight.
Of course, there's also a certain element of selfishness to the charity. The social prestige that comes from writing good free software is worth a fair amount in the job market. People like to list accomplishments like "wrote a driver" or "contributed code to Linux kernel 2.2" on their résumé. Giving to the right project is a badge of honor because serious folks doing serious work embraced the gift. That's often more valuable and more telling than a plaque or an award from a traditional boss.
Newberry is also a Linux fan. He reads the Kernel list but rarely contributes much to it. He runs various versions of Linux around the house, and none of them were working as well as he wanted with his Macintosh. So he poked around in the software, fixed it, and sent his code off to Alan Cox, who watches over the part of the kernel where his fixes belonged.
"I contributed some changes to the Appletalk stack that's in the Linux Kernel that make it easier for a Linux machine to offer dial-in services for Macintosh users," he said in an article published in Salon. "As it stands, Mac users have always been able to dial into a Linux box and use IP protocols, but if they wanted to use Appletalk over PPP, the support wasn't really there."
Of course, all of this justification and rationalization isn't the main reason why Newberry spends so much of his time hacking on Linux. Sure, it may help his company's bottom line. Sure, it might beef up his résumé by letting him brag that he got some code into the Linux kernel. But he also sees this as a bit of charity.
"I get a certain amount of satisfaction from the work . . . but I get a certain amount of satisfaction out of helping people. Improving Linux and especially its integration with Macs has been a pet project of mine for some time," he says. Still, he sums up his real motivation by saying, "I write software because I just love doing it." Perhaps we're just lucky that so many people love writing open source software and giving it away.
It's not hard to find bad stories about people who write good code. One person at a Linux conference told me, "The strange thing about Linus Torvalds is that he hasn't really offended anyone yet. All of the other leaders have managed to piss off someone at one time or another. It's hard to find someone who isn't hated by someone else." While he meant it as a compliment for Torvalds, he sounded as if he wouldn't be surprised if Torvalds did a snotty, selfish, petulant thing. It would just be par for the course.
Occasionally, the fights get interesting. Eric Raymond and Bruce Perens are both great contributors to the open source movement. In fact, both worked together to try to define the meaning of the term. Perens worked with the community that creates the Debian distribution of Linux to come up with a definition of what was acceptable for the community. This definition morphed into a more official version used by the Open Source Initiative. When they got a definition they liked, they published it and tried to trademark the term "open source" in order to make sure it was applied with some consistency. It should be no surprise that all of that hard work pushed them further apart.
The free software world, of course, removes these barriers. If the Hotmail folks had joined the Linux team instead of Microsoft, they would be free to do whatever they wanted with their website even if it annoyed Linus Torvalds, Richard Stallman, and the pope. They wouldn't be rich, but there's always a price.
This love also has a more traditional effect on the hackers who create the free source code. They do it because they love what they're doing. Many of the people in the free source movement are motivated by writing great software, and they judge their success by the recognition they get from equally talented peers. A "nice job" from the right person--like Richard Stallman, Alan Cox, or Linus Torvalds--can be worth more than $100,000 for some folks. It's a strange way to keep score, but for most of the programmers in the free source world it's more of a challenge than money. Any schmoe in Silicon Valley can make a couple of million dollars, but only a few select folks can rewrite the network interface code of the Linux kernel to improve the throughput of the Apache server by 20 percent.
And of course there are thousands of free software projects that are going to get left behind hanging out at the same old pizza joint. There were always going to be thousands left behind. People get excited about new projects, better protocols, and neater code all the time. The old code just sort of withers away. Occasionally someone rediscovers it, but it is usually just forgotten and superseded. But this natural evolution wasn't painful until the successful projects started ending up on the covers of magazines and generating million-dollar deals with venture capitalists. People will always be wondering why their project isn't as big as Linux.
This decision to welcome money into the fold didn't wreck free software. If anything, it made it possible for companies like Red Hat to emerge and sell easier-to-use versions of the free software. The companies competed to put out the best distributions and didn't use copyright and other intellectual property laws to constrain each other. This helped attract more good programmers to the realm because most folks would rather spend their time writing code than juggling drivers on their machine. Good distributions like Red Hat, Slackware, Debian, FreeBSD, and SuSE made it possible for everyone to get their machines up and running faster.
This split is already growing. Red Hat Software employs some of the major Linux contributors like Alan Cox. They get a salary while the rest of the contributors get nothing. Sun, Apple, and IBM employees get salaries, but folks who work on Apache or the open versions of BSD get nothing but the opportunity to hack cool code.
Jeff Bates, an editor at Slashdot, says that Mozilla may have suffered because Netscape was so successful. The Netscape browser was already available for free for Linux. "There wasn't a big itch to scratch," he says. "We already had Netscape, which was fine for most people. This project interested a smaller group than if we'd not had Netscape--hence why it didn't get as much attention."
In most cases, the flow is not particularly novel. The companies just choose FreeBSD or some version of Linux for their machines like any normal human being. Many web companies use a free OS like Linux or FreeBSD because they're both cheap and reliable. This is going to grow much more common as companies realize they can save a substantial amount of money over buying seat licenses from companies like Microsoft.
What happens if a bug emerges in some version of the Linux kernel and it makes it into several distributions? It's not really the fault of the distribution creators, because they were just shipping the latest version of the kernel. And it's not really the kernel creators' fault, because they weren't marketing the kernel as ready for everyone to run. They were just floating some cool software on the Net for free. Who's responsible for the bug? Who gets sued?
The free OS also puts Intel's lion's share of the market up for grabs. Linux runs well on Intel chips, but it also runs on chips made by IBM, Motorola, Compaq, and many others. The NetBSD team loves to brag that its software runs on almost all platforms available and is dedicated to porting it to as many as possible. Someone using Linux or NetBSD doesn't care who made the chip inside because the OS behaves similarly on all of them.
This threat shows that the emergence of the free OSs ensures that hardware companies will also face increased competitive pressure. Sure, they may be able to get Microsoft off their back, but Linux may make things a bit worse.
He has a point. Linux is a lot of fun to play with and it is now a very stable OS, but it took a fair number of years to get to this point. Many folks in the free source world like to say things like, "It used to be that the most fun in Linux was just getting it to work." Companies like Morgan Stanley, Schwab, American Airlines, and most others live and die on the quality of their computer systems. They're quite willing to pay money if it helps ensure that things don't go wrong.
Red Hat has managed to sell enough CD-ROMs to fund the development of new projects. They've created a good selection of installation tools that make it relatively easy for people to use Linux. They also help pay salaries for people like Alan Cox who contribute a great deal to the evolution of the kernel. They do all of this while others are free to copy their distribution disks verbatim.
McVoy doesn't argue with these facts, but feels that they're just a temporary occurrence. The huge growth of interest in Linux means that many new folks are exploring the operating system. There's a great demand for the hand-holding and packaging that Red Hat offers. In time, though, everyone will figure out how to use the product and the revenue stream should disappear as competition drives out the ability to charge $50 for each disk.
CoSource says that it will try to put together the bounties of many small groups and allow people to pay them with credit cards. It uses the example of a group of Linux developers who would gather together to fund the creation of an open source version of their favorite game. They would each chip in $10, $20, or $50 and when the pot got big enough, someone would step forward. Creating a cohesive political group that could effectively offer a large bounty is a great job for these sites.
On the other hand, forking can hurt the community by duplicating efforts, splitting alliances, and sowing confusion in the minds of users. If Bob starts writing and publishing his own version of Linux out of his house, then he's taking some energy away from the main version. People start wondering if the version they're running is the Missouri Synod version of Emacs or the Christian Baptist version. Where do they send bug fixes? Who's in charge? Distribution groups like Debian or Red Hat have to spend a few moments trying to decide whether they want to include one version or the other. If they include both, they have to choose one as the default. Sometimes they just throw up their hands and forget about both. It's a civil war, and those are always worse than a plain old war.
Of course, good software can have anti-forking effects. Linus Torvalds said in one interview, "Actually, I have never even checked 386BSD out; when I started on Linux it wasn't available (although Bill Jolitz's series on it in Dr. Dobb's Journal had started and were interesting), and when 386BSD finally came out, Linux was already in a state where it was so usable that I never really thought about switching. If 386BSD had been available when I started on Linux, Linux would probably never have happened." So if 386BSD had been easier to find on the Net and better supported, Linux might never have begun.
While the three forks of BSD may cooperate more than they compete, the Linux world still likes to look at the BSD world with a bit of contempt. All of the forks look somewhat messy, even if having the freedom to fork is what Stallman and GNU are ostensibly fighting to achieve. The Linux enthusiasts seem to think, "We've got our ducks in a single row. What's your problem?" It's sort of like the Army mentality. If it's green, uniform, and the same everywhere, then it must be good.
The BSD world lacks the monomaniacal cohesion of Linux, and this seems to hurt its image. The BSD community has always felt that Linux is stealing the limelight that should be shared at least equally between the groups. Linux is really built around a cult of Linus Torvalds, and that makes great press. It's very easy for the press to take photos of one man and put him on the cover of a magazine. It's simple, clean, neat, and perfectly amenable to a 30-second sound bite. Explaining that there's FreeBSD, NetBSD, OpenBSD, and who knows what smaller versions waiting in the wings just isn't as manageable.
Eric Raymond, a true disciple of Linus Torvalds and Linux, sees it in technical terms. The BSD community is proud of the fact that each distribution is built out of one big source tree. They get all the source code for all the parts of the kernel, the utilities, the editors, and whatnot together in one place. Then they push the compile button and let the whole system build at once. This is a crisp, effective, well-managed approach to the project.
The Linux groups, however, are nowhere near as coordinated. Torvalds only really worries about the kernel, which is his baby. Someone else worries about GCC. Everyone comes up with their own source trees for the parts. The distribution companies like Red Hat worry about gluing the mess together. It's not unusual to find version 2.0 of the kernel in one distribution while another is sporting version 2.2.
"In BSD, you can do a unified make. They're fairly proud of that," says Raymond. "But this creates rigidities that give people incentives to fork. The BSD things that are built that way develop new spin-off groups each week, while Linux, which is more loosely coupled, doesn't fork."
But this distinction may be semantic. Forking does occur in the Linux realm, but it happens as small diversions that get explained away with other words. Red Hat may choose to use GNOME, while another distribution like SuSE might choose KDE. The users will see a big difference because both tools create virtual desktop environments. You can't miss them. But people won't label this a fork. Both distributions are using the same Linux kernel and no one has gone off and said, "To hell with Linus, I'm going to build my own version of Linux." Everyone's technically still calling themselves Linux, even if they're building something that looks fairly different on the surface.
Jason Wright, one of the developers on the OpenBSD team, sees the organization as a good thing. "The one thing that all of the BSDs have over Linux is a unified source tree. We don't have Joe Blow's tree or Bob's tree," he says. In other words, when they fork, they do it officially, with great ceremony, and make sure the world knows of their separate creations. They make a clear break, and this makes it easier for developers.
Wright says that this single source tree made it much easier for them to turn OpenBSD into a very secure OS. "We've got the security over Linux. They've recently been doing a security audit for Linux, but they're going to have a lot more trouble. There's not one place to go for the source code."
To extend this to political terms, the Linux world is like the 1980s when Ronald Reagan ran the Republican party with the maxim that no one should ever criticize another Republican. Sure, people argued internally about taxes, abortion, crime, and the usual controversies, but they displayed a rare public cohesion. No one criticizes Torvalds, and everyone is careful to pay lip service to the importance of Linux cohesion even as they're essentially forking by choosing different packages.
John Gilmore, one of the founders of the free software company Cygnus and a firm believer in the advantages of the GNU General Public License, says, "In Linux, each package has a maintainer, and patches from all distributions go back through that maintainer. There is a sense of cohesion. People at each distribution work to reduce their differences from the version released by the maintainer. In the BSD world, each tree thinks they own each program--they don't send changes back to a central place because that violates the ego model."
Jordan Hubbard, the leader of FreeBSD, is critical of Raymond's characterization of the BSD world. "I've always had a special place in my heart for that paper because he painted positions that didn't exist," Hubbard said of Raymond's piece "The Cathedral and the Bazaar." "You could point to just the Linux community and decide which part was cathedral-oriented and which part was bazaar-oriented."
When it comes right down to it, there's even plenty of forking going on about the definition of a fork. When some of the Linux team point at the BSD world and start making fun of the forks, the BSD team gets defensive. The BSD guys always get defensive because their founder isn't on the cover of all the magazines. The Linux team hints that maybe, if they weren't forking, they would have someone with a name in lights, too.
Hubbard is right. Linux forks just as much; people just call it a distribution or an experimental kernel or a patch kit. No one has the chutzpah to spin off their own rival political organization. No one has the political clout.
The most prevalent form of government in these communities is the benign dictatorship. Richard Stallman wrote some of the most important code in the GNU pantheon, and he continues to write new code and help maintain the old software. The world of the Linux kernel is dominated by Linus Torvalds. The original founders always seem to hold a strong sway over the group. Most of the code in the Linux kernel is written by others and checked out by a tight circle of friends, but Torvalds still has the final word on many changes.
The two of them are, of course, benign dictators, and they don't really have any other choice. Both have seemingly absolute power, but this power is based on a mixture of personal affection and technical respect. There are no legal bounds that keep all of the developers in line. There are no rules about intellectual property or non-disclosure. Anyone can grab all of the Linux kernel or GNU source code, run off, and start making whatever changes they want. They could rename it FU, Bobux, Fredux, or Meganux and no one could stop them. The old threats of lawyers, guns, and money aren't anywhere to be seen.
19.1 Debian's Core Team
The Debian group has a wonderful pedigree and many praise it as the purest version of Linux around, but it began as a bunch of outlaws who cried mutiny and tossed Richard Stallman overboard. Well, it wasn't really so dramatic. In fact, "mutiny" isn't really the right word when everyone is free to use the source code however they want.
Bruce Perens remembers the split occurred less than a year after the project began and says, "Debian had already started. The FSF had been funding Ian Murdock for a few months. Richard at that time wanted us to make all of the executables unstripped."
Still, Stallman insisted it was a good idea. Debian resisted and said it took up too much space and raised duplication costs. Eventually, the debate ended as the Debian group went their own way. Although Stallman paid Murdock and wrote much of the GNU code on the disk, the GPL prevented him from doing much. The project continued. The source code lived on. And the Debian disks kept shipping. Stallman was no longer the titular leader of Debian.
The rift between the groups has largely healed. Perens now praises Stallman and says that the two of them are still very close philosophically on the most important issues in the free software world. Stallman, for his part, uses Debian on his machines because he feels the closest kinship with it.
This army is a diverse bunch. At a recent Linux conference, Jeff Bates, one of the editors of the influential website Slashdot (www.slashdot.org), pointed me toward the Debian booth, which was next to theirs. "If you look in the booth, you can see that map. They put a pushpin in the board for every developer and project leader they have around the world. China, Netherlands, Somalia, there are people coming from all over."
James Lewis-Moss, one of the members, just happened to be in the Debian booth next door. He lives in Asheville, North Carolina, which is four hours west of the Convention Center in downtown Raleigh. The Debian group normally relies upon local volunteers to staff the booth, answer questions, distribute CD-ROMs, and keep people interested in the project.
The packager's job is to download the latest software from the programmer and make sure that it runs well with the latest version of the other software to go in the Debian distribution. This crucial task is why groups like Debian are so necessary. If Lewis-Moss does his job well, someone who installs Debian on his computer will not have any trouble using XEmacs.
Lewis-Moss's job isn't exactly programming, but it's close. He has to download the source code, compile the program, run it, and make sure that the latest version of the source works correctly with the latest version of the Linux kernel and the other parts of the OS that keep a system running. The packager must also ensure that the program works well with the Debian-specific tools that make installation easier. If there are obvious bugs, he'll fix them himself. Otherwise, he'll work with the author on tracking down and fixing the problems.
He's quite modest about this effort and says, "Most Debian developers don't write a whole lot of code for Debian. We just test things to make sure it works well together. It would be offensive to some of the actual programmers to hear that some of the Debian folks are writing the programs when they're actually not."
Lewis-Moss ended up with this job in the time-honored tradition of committees and volunteer organizations everywhere. "I reported a bug in XEmacs to Debian. The guy who had the package at that time said, 'I don't want this anymore. Do you want it?' I guess it was random. It was sort of an accident. I didn't intend to become involved in it, but it was something I was interested in. I figured 'Hell, might as well.'"
The Linux development effort moves slowly forward with thousands of stories like Lewis-Moss's. Folks come along, check out the code, and toss in a few contributions that make it a bit better for themselves. The mailing list debates some of the changes if they're controversial or if they'll affect many people. It's a very efficient system in many ways, if you can stand the heat of the debates.
While the mailing list looks like an idealized notion of a congress for the Linux kernel development, it is not as perfect as it may seem. Not all comments are weighted equally, because friendships and political alliances have evolved over time. The Debian group elected a president to make the crucial decisions that can't be settled by long argument and consensus. Beyond that, the president doesn't have many other powers.
While the Linux and GNU worlds are dominated by their one great Sun King, many other open source projects have adopted a more modern government structure that is more like Debian. The groups are still fairly ad hoc and unofficial, but they are more democratic. There's less idolatry and less dependence on one person.
The Debian group is a good example of a very loose-knit structure with less reliance on the central leader. In the beginning, Ian Murdock started the distribution and did much of the coordination. In time, the mailing list grew and attracted other developers like Bruce Perens. As Murdock grew busier, he started handing off work to others. Eventually, he handed off central control to Perens, who slowly delegated more of the control until there was no key maintainer left. If someone dies in a bus crash, the group will live on.
Now a large group of people act as maintainers for the different packages. Anyone who wants to work on the project can take responsibility for a particular package. This might be a small tool like a game or a bigger tool like the C compiler. In most cases, the maintainer isn't the author of the software or even a hard-core programmer. The maintainer's job is to make sure that the particular package continues to work with all the rest. In many cases, this is a pretty easy job. Most changes in the system don't affect simple programs. But in some cases it's a real challenge and the maintainer must act as a liaison between Debian and the original programmer. Sometimes the maintainers fix the bugs themselves. Sometimes they just report them. But in either case, the maintainer must make sure that the code works.
Every so often, Debian takes the latest stable kernel from Torvalds's team and mixes it together with all of the other packages. The maintainers check out their packages, and when everything works well, Debian presses another CD-ROM and places the pile of code on the Net. This "freeze" gives the Debian group a stable platform that people can always turn to.
"Making a whole OS with just a crew of volunteers and no money is a pretty big achievement. You can never discount that. It's easy for Red Hat to do it. They're all getting paid. The fact is that Debian makes a good system and still continues to do so. I don't think that there've been that many unpaid, collaborative projects that complex before," says Perens.
When Perens took over at Debian he brought about two major changes. The first was to create a nonprofit corporation called Software in the Public Interest and arrange for the IRS to recognize it as a bona fide charitable organization. People and companies who donate money and equipment can deduct the donations from their taxes.
Perens says that the group's budget is about $10,000 a year. "We pay for hardware sometimes. Although a lot of our hardware is donated. We fly people to conferences so they can promote Debian. We have a trade show booth. In general we get the trade show space from the show for free or severely discounted. We also have the conventional PO boxes, accounting, phone calls. The project doesn't have a ton of money, but it doesn't spend a lot, either."
The Debian group also wrote the first guidelines for acceptable open source software during Perens's time in charge. These eventually mutated to become the definition endorsed by the Open Source Initiative. This isn't too surprising, since Perens was one of the founders of the Open Source Initiative.
Debian's success has inspired many others. Red Hat, for instance, borrowed a significant amount of work done by Debian when they put together their distribution, and Debian borrows some of Red Hat's tools. When Red Hat went public, it arranged for Debian members to get a chance to buy some of the company's stock reserved for friends and family members. They recognized that Debian's team of package maintainers helped get their job done.
Debian's constitution and strong political structure have also inspired Sun, which is trying to unite its Java and Jini customers into a community. The company is framing its efforts to support customers as the creation of a community that's protected by a constitution. The old paradigm of customer support is being replaced by a more active world of customer participation and representation.
Of course, Sun is keeping a tight grip on all of these changes. They protect their source code with a Community Source License that places crucial restrictions on the ability of these community members to stray. There's no real freedom to fork. Sun's not willing to follow Debian's lead on that point, in part because they say they're afraid that Microsoft will use that freedom to scuttle Java.
This seriousness and corporatization are probably the only possible steps that the Apache group could take. They've always been devoted to advancing the members' interests. Many of the other open source projects like Linux were hobbies that became serious. The Apache project was always filled with people who were in the business of building the web. While some might miss the small-town kind of feel of the early years, the corporate structure is bringing more certainty and predictability to the realm. Incorporating doesn't mean the people have to start wearing suits. It just ensures that tough decisions will be made at a predictable pace.
The next induction ceremony for this pantheon should include Robert Young, the CEO of Red Hat Software, who helped the Linux and the open source world immeasurably by finding a way to charge people for something they could get for free. This discovery made the man rich, which isn't exactly what the free software world is supposed to do. But his company also contributed a sense of stability and certainty to the Linux marketplace, and that was sorely needed. Many hard-core programmers, who know enough to get all of the software for free, are willing to pay $70 to Red Hat just because it is easier. While some may be forever jealous of the millions of dollars in Young's pocket, everyone should realize that bringing Linux to a larger world of computer illiterates requires good packaging and hand-holding. Free software wouldn't be anywhere if someone couldn't find a good way to charge for it.
The best way to understand why Young ranks with the folks who discovered how to sell sugar water is to go to a conference like LinuxExpo. In the center of the floor is the booth manned by Red Hat Software, the company Young started in Raleigh, North Carolina, after he got through working in the computer-leasing business. Young is in his fifties now and manages to survive despite the fact that most of his company's devotees are much closer to 13. Red Hat bundles together some of the free software made by the community and distributed over the Net and puts it on one relatively easy-to-use CD-ROM. Anyone who wants to install Linux or some of its packages can simply buy a disk from Red Hat and push a bunch of keys. All of the information is on one CD-ROM, and it's relatively tested and pretty much ready to go. If things go wrong, Red Hat promises to answer questions by e-mail or telephone to help people get the product working.
To make matters worse for Red Hat, the potential competitors don't have to go out onto the Net and reassemble the collection of software for themselves. The GPL specifically forbids people from placing limitations on redistributing the source code. That means that a potential competitor doesn't have to do much more than buy a copy of Red Hat's disk and send it off to the CD-ROM pressing plant. People do this all the time. One company at the exposition was selling copies of the major free OS distributions like Red Hat, Slackware, and OpenBSD for about $3 per disk. If you bought in bulk, you could get 11 disks for $25. Not a bad deal if you're a consumer.
Red Hat also added a custom installation utility to make life easier for people who want to add Red Hat to their computer. (Er, I mean to say "add Linux" or "add GNU/Linux." "Red Hat" is now one of the synonyms for free software.) They could have made this package installation tool proprietary. After all, Red Hat programmers wrote the tool on company time. But Young released it with the GNU General Public License, recognizing that the political value of giving something back was worth much more than the price they could charge for the tool.
This is part of a deliberate political strategy to build goodwill among the programmers who distribute their software. Many Linux users compare the different companies putting together free source software CD-ROMs and test their commitment to the free software ideals. Debian, for instance, isn't run like a business and doesn't have a business's attitude; it is a largely volunteer project that is careful to include only certified free source software on its CD-ROMs. This volunteer effort and enlightened pursuit of the essence of free software make the Debian distribution popular among the purists.
Distributors like Caldera, on the other hand, include nonfree software with their disks. You pay $29.95 to $149.95 for a CD-ROM and get some nonfree software like a word processor tossed in as a bonus. This is a great deal if you're only going to install the software once, but the copyright on the nonfree software prevents you from distributing the CD-ROM to friends. Caldera is hoping that the extras it throws in will steer people toward its disk and get them to choose Caldera's version of Linux. Many of the purists, like Richard Stallman, hate this practice and see it as a not-so-subtle way to privatize the free software. If the average user isn't free to redistribute all the code, then there's something evil afoot. Of course, neither Stallman nor any of the other software authors can do anything about this, because they made their software freely distributable.
Several companies are already making PCs with Linux software installed at the factory. While they could simply download the software from the Net themselves and create their own package, several have chosen to bundle Red Hat's version with their machines. Sam Ockman, the president of Penguin Computing, runs one of those companies.
Ockman is a recent Stanford graduate in his early twenties and a strong devotee of the Linux and GPL world. He says he started his company to prove that Linux could deliver solid, dependable servers that could compete with the best that Sun and Microsoft have to offer.
Ockman has mixed feelings about life at Stanford. While he fondly remembers the "golf course-like campus," he says the classes were too easy. He graduated with two majors despite spending plenty of time playing around with the Linux kernel. He says that the computer science department's hobbled curriculum drove him to Linux. "Their whole CS community is using a stupid compiler for C on the Macintosh," he says. "Why don't they start you off on Linux? By the time you get to [course] 248, you could hack on the Linux kernel or your own replacement kernel. It's just a tragedy that you're sitting there writing virtual kernels on a Sun system that you're not allowed to reboot."
When Ockman had to choose a version of Linux for his Penguin computers, he chose Red Hat. Bob Young's company made the sale because it was playing by the rules of the game and giving software back with a GPL. Ockman says, "We actually buy the box set for every single one. Partially because the customers like to get the books, but also to support Red Hat. That's also why we picked Red Hat. They're the most free of all of the distributions."
Debian, Ockman concedes, is also very free and politically interesting, but says that his company is too small to support multiple distributions. "We only do Red Hat. That was a very strategic decision on our part. All of the distributions are pretty much the same, but there are slight differences in this and that. We could have a twelve-person Debian group, but it would just be a nightmare for us to support all of these different versions of Linux."
At the LinuxExpo, Red Hat was selling T-shirts, too. One slick number retailing for $19 just said "The Revolution of Choice" in Red Hat's signature old typewriter font. Others for sale at the company's site routinely run for $15 or more. They sucked me in. When I ordered my first Red Hat disk from them, I bought an extra T-shirt to go with the mix.
Many of the other groups are part of the game. The OpenBSD project sold out of their very fashionable T-shirts with wireframe versions of its little daemon logo soon after the beginning of the LinuxExpo. They continue to sell more T-shirts from their website. Users can also buy CD-ROMs from OpenBSD.
The most expensive T-shirt at the show came with a logo that imitated one of the early marketing images of the first Star Wars movie. The shirt showed Torvalds and Stallman instead of Han Solo and Luke Skywalker under a banner headline of "OS Wars." The shirt cost $100, but "came with free admission to the upcoming Linux convention in Atlanta."
Ockman looks at this market competition for T-shirts and sees a genius. He says, "I think Bob Young's absolutely brilliant. Suddenly he realized that there's no future in releasing mainframes. He made a jump after finding college kids in Carolina [using Linux]. For him to make that jump is just amazing. He's a marketing guy. He sat down and figured it out."
Young's plan to brand the OS with a veneer of cool produced more success than anyone could imagine. Red Hat is by far the market leader in providing Linux to the masses, despite the fact that many can and do "steal" a low-cost version. Of course, "steal" isn't the right word, because Red Hat did the same thing. "Borrow" isn't right, "grab" is a bit casual, and "join in everlasting communion with the great free software continuum" is just too enthusiastic to be cool.
The GPL is a powerful force that prevents Red Hat from making many unilateral decisions. There are plenty of distributions that would like to take over the mantle of the most popular version of Linux. It's not hard. The source code is all there.
There are parts of this conspiracy theory that are already true. Red Hat does dominate the United States market for Linux and it controls a great deal of the mindshare. Their careful growth supported by an influx of cash ensured a strong position in the marketplace.
Can they squeeze their partners by charging different rates for Linux, the way Microsoft is known to offer lower Windows prices to its friends? This is unlikely. Anyone can just buy a single Red Hat CD-ROM from a duplicator like CheapBytes. This power play won't work.
Can they force people to pay a "Red Hat tax" just to upgrade to the latest software? Not likely. Red Hat is going to be a service company, and they're going to compete on having the best service for their customers. Their real competitor will be companies that sell support contracts like LinuxCare. Service industries are hard work. Every customer needs perfect care or they'll go somewhere else next time. Red Hat's honeymoon with the IPO cash will only last so long. Eventually, they're going to have to earn the money to get a return on the investment. They're going to be answering a lot of phone calls and e-mails.
Still, most users including the best programmers end up paying a company like Red Hat, Caldera, or a group like OpenBSD to do some of the basic research in building a Linux system. All of the distribution companies charge for a copy of their software and throw in some support. While the software is technically free, you pay for help to get it to work.
Of course, the cost of this is debatable. TiVo, for instance, is a company that makes a set-top box for recording television content on an internal hard disk. The average user just sees a fancy, easy-to-use front end, but underneath, the entire system runs on the Linux operating system. TiVo released a copy of the stripped-down version of Linux it ships on its machines on its website, fulfilling its obligation to the GNU GPL. The only problem I've discovered is that the web page (www.tivo.com/linux/) is not particularly easy to find from the home page. If I hadn't known it was there, I wouldn't have found it.
Free source folks are just as free to share ideas. Many of the rival Linux and BSD distributions often borrow code from each other. While they compete for the hearts and minds of buyers, they're forced by the free source rules to share the code. If someone writes one device driver for one platform, it is quickly modified for another.
The Linux movement isn't really about nations and it's not really about war in the old-fashioned sense. It's about nerds building software and letting other nerds see how cool their code is. It's about empowering the world of programmers and cutting out the corporate suits. It's about spending all night coding on wonderful, magnificent software with massive colonnades, endless plazas, big brass bells, and huge steam whistles without asking a boss "Mother, may I?" It's very individualistic and peaceful.
That stirring romantic vision may be moving the boys in the trenches, but the side effects are beginning to be felt in the world of global politics. Every time Linux, FreeBSD, or OpenBSD is installed, several dollars don't go flowing to Seattle. There's a little bit less available for the Microsoft crowd to spend on mega-mansions, SUVs, and local taxes. The local library, the local police force, and the local schools are going to have a bit less local wealth to tax. In essence, the Linux boys are sacking Seattle without getting out of their chairs or breaking a sweat. You won't see this battle retold on those cable channels that traffic in war documentaries, but it's unfolding as we speak.
The difference in treatment probably did not result from any secret love for Linux or OpenBSD lurking in the hearts of the regulators in the Bureau of Export Administration at the Department of Commerce. The regulators are probably more afraid of losing a lawsuit brought by Daniel Bernstein. In the latest decision, released in May 1999, two out of three judges on an appeals panel concluded that the U.S. government's encryption regulations violated Bernstein's rights of free speech. The government argued that source code is a device, not speech. The case is currently being appealed. The new regulations seem targeted to specifically address the problems the court found with the current regulations.
Most folks in the free source world may not have big bank accounts. Those are just numbers in a computer anyway, and everyone who can program knows how easy it is to fill a computer with numbers. But the free source world has good software and the source code that goes along with it. How many times a day must Bill Gates look at the blue screen of death that splashes across a Windows computer monitor when the Windows software crashes? How many times does Torvalds watch Linux crash? Who's better off? Who's wealthier?
There's no question that people like Stallman love life with source code. A deeper question is whether the free source realm offers a wealthier lifestyle for the average computer user. Most people aren't programmers, and most programmers aren't even the hard-core hackers who love to fiddle with the UNIX kernel. I've rarely used the source code to Linux, Emacs, or any of the neat tools on the Net, and many times I've simply recompiled the source code without looking at it. Is this community still a better deal?
Some grouse that comparing features like this isn't fair to the Mac or Windows world. The GNOME toolkit, they point out, didn't come out of years of research and development. The start button and the toolbar look the same because the GNOME developers were merely copying. The GNU/Linux world didn't create its own OS; it merely cloned all of the hard commercial research that produced UNIX. It's always easier to catch up, but pulling ahead is hard. The folks who want to stay on the cutting edge need to be in the commercial world. It's easy to come up with a list of commercial products and tools that haven't been cloned by an open source dude at the time of this writing: streaming video, vector animation, the full Java API, speech recognition, three-dimensional CAD programs, speech synthesis, and so forth. The list goes on and on. The hottest innovations will always come from well-capitalized start-ups driven by the carrot of wealth.
Another way to approach the question is to look at people's behavior. Some argue that companies like Red Hat or organizations like Debian prove that people need and want some of the commercial world's handholding. They can't afford to simply download the code and fiddle with it. Most people aren't high school students doing time for being young. They've got jobs, families, and hobbies. They pay because paying brings continuity, form, structure, and order to the free source world. Ultimately, these Red Hat users aren't Stallman disciples, they're commercial sheep who are just as dependent on Red Hat as the Windows customers are on Microsoft.
Most Linux users don't need to rewrite the source, but they can still benefit from the freedom. If everyone has the freedom, then someone will come along with the ability to do it and if the problem is big enough, someone probably will. In other words, only one person has to fly the X-wing fighter down the trench and blow up the Death Star.
Which is a better world? A polished Disneyland where every action is scripted, or a pile of Lego blocks waiting for us to give them form? Do we want to be entertained or do we want to interact? Many free software folks would point out that free software doesn't preclude you from settling into the bosom of some corporation for a long winter's nap. Companies like Caldera and Linuxcare are quite willing to hold your hand and give you the source code. Many other corporations are coming around to the same notion. Netscape led the way, and many companies like Apple and Sun will follow along. Microsoft may even do the same thing by the time you read this.
While Stallman didn't have monetary capital, he did have plenty of intellectual capital. By 1991, his GNU project had built many well respected tools that were among the best in their class. Torvalds had a great example of what the GPL could do before he chose to protect his Linux kernel with the license. He also had a great set of tools that the GNU project created.
Stallman's reputation also can be worth more than money when it opens the right doors. He continues to be blessed by the implicit support of MIT, and many young programmers are proud to contribute their work to his projects. It's a badge of honor to be associated with either Linux or the Free Software Foundation. Programmers often list these details on their résumés, and the facts have weight.
Of course, companies like Red Hat lie in a middle ground. The company charges money for support and plows this money back into improving the product. It pays several engineers to devote their time to improving the entire Linux product. It markets its work well and is able to charge a premium for what people are able to get for free.
He's done his part. The open source movement thrives on the GCC compiler, and Cygnus managed to find a way to make money on the process of keeping the compiler up to date. The free operating systems like Linux or FreeBSD are great alternatives for people today. They're small, fast, and very stable, unlike the best offerings of Microsoft or Apple. If the open software movement continues to succeed and grow, his child could grow up into a world where the blue screen of death that terrorizes Microsoft users is as foreign to them as manual typewriters.
One group that is locked out of the fray is the Linux community. While software for playing DVD movies exists for Macintoshes and PCs, there's none for Linux. DeCSS should not be seen as a hacker's tool, but merely a device that allows Linux users to watch the legitimate copies of the DVDs that they bought. Locking out Linux is like locking in Apple and Microsoft.
The battle between the motion picture community and the Linux world is just heating up as I write this. There will be more lawsuits and perhaps more jail time ahead for the developers who produced DeCSS and the people who shared it through their websites.
Most of the battles are not so dramatic. They're largely technical, and the free source world should win these easily. Open source solutions haven't had the same sophisticated graphical interface as Apple or Windows products. Most of the programmers who enjoy Linux or the various versions of BSD don't need the graphical interface and may not care about it. The good news is that projects like KDE and GNOME are great tools already. The open source world must continue to tackle this area and fight to produce something that the average guy can use.
Microsoft's greatest asset is the installed base of Windows, and it will try to use this to the best of its ability to defeat Linux. At this writing, Microsoft is rolling out a new version of its Domain Name System (DNS) server, the software that acts like a telephone book for the Internet. In the past, many of the DNS machines were UNIX boxes because UNIX helped define the Internet. Windows 2000 includes new extensions to DNS that practically force offices to switch over to Windows machines to run DNS. Windows 2000 just won't work as well with an old Linux or UNIX box running DNS.
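The "telephone book" analogy can be seen directly in code: a resolver call hands in a name and gets back a numeric address. This is a minimal sketch using Python's standard `socket` module; the hostname is just an example.

```python
import socket

def lookup(hostname):
    """Ask the configured DNS resolver to translate a hostname
    into an IPv4 address -- the 'telephone book' lookup."""
    return socket.gethostbyname(hostname)

# "localhost" resolves locally, without touching the network
print(lookup("localhost"))
```

Any machine that answers these queries for an office is a DNS server, which is why vendor-specific extensions to the protocol matter so much.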
This is a typical strategy for Microsoft and one that is difficult, but not impossible, for open source projects to thwart. If the cost of these new servers is great enough, some group of managers is going to create its own open source clone of the modified DNS server. This has happened time and time again, but not always with great success. Linux boxes come with Samba, a program that lets Linux machines act as file servers. It works well and is widely used. Another project, WINE, started with the grand design of cloning all of the much more complicated Windows API used by programmers. It is a wonderful project, but it is far from finished. The size and complexity make a big difference.
The first cracks are already obvious. Microsoft lost the server market to Apache and Linux on the basis of price and performance. Web server managers are educated computer users who can make their own decisions without having to worry about the need to train others. Hidden computers like this are easy targets, and the free software world will gobble many of them up. More users mean more bug fixes and propagations of better code.
Of course, free software really isn't free. A variety of companies offering Linux support need to charge something to pay their bills. Distributions like Red Hat or FreeBSD may not cost much, but they often need some customization and hand-holding. Is a business just trading one bill for another? Won't Linux support end up costing the same thing as Microsoft's product?
Of course, there are also hard numbers. An article in Wired by Andrew Leonard comes with numbers originally developed by the Gartner Group. A 25-person office would cost $21,453 to outfit with Microsoft products and $5,544.70 to outfit with Linux. This estimate is a bit conservative. Most of the Linux cost is debatable because it includes almost $3,000 for 10 service calls to a Linux consultant and about $2,500 for Applixware, an office suite that does much of the same job as Microsoft Office. A truly cheap and technically hip office could make do with the editor built into Netscape and one of the free spreadsheets available for Linux. It's not hard to imagine someone doing the same job for about $3, which is the cost of a cheap knockoff of Red Hat's latest distribution.
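The arithmetic behind those figures is easy to check. A quick sketch using only the numbers quoted above (the consultant and Applixware line items are the approximate figures given in the text):

```python
# Gartner Group figures for outfitting a 25-person office,
# as quoted by Andrew Leonard in Wired
microsoft_total = 21_453.00
linux_total = 5_544.70

savings = microsoft_total - linux_total   # difference between the two bids
per_seat = linux_total / 25               # Linux cost spread across the office

# Most of the Linux figure is debatable overhead (approximations from the text):
consultant = 3_000.00    # ~10 service calls to a Linux consultant
applixware = 2_500.00    # office suite comparable to Microsoft Office
bare_bones = linux_total - consultant - applixware

print(f"savings: ${savings:,.2f}, bare-bones Linux: ${bare_bones:,.2f}")
```

Strip out the two debatable items and the estimate collapses toward the price of the media itself, which is how the text arrives at its roughly $3 knockoff-CD figure.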
Of course, it's important to realize that free software still costs money to support. But so does Microsoft's. The proprietary software companies also charge to answer questions and provide reliable information. It's not clear that Linux support is any more expensive to offer.
Also, many offices large and small keep computer technicians on hand. There's no reason to believe that Linux technicians will be any more or less expensive than Microsoft technicians. Both answer questions. Both keep the systems running. At least the Linux tech can look at the source code.
These users will be the most loyal to Microsoft because they will find it harder than anyone else to move. They can't afford to hire their own Linux gurus to redo the office, and they don't have the time to teach themselves.
These are the main weaknesses for Microsoft, and the company is already taking them seriously. I think many underestimate how bloody the battle is about to become. If free source software is able to stop and even reverse revenue growth for Microsoft, there are going to be some very rich people with deep pockets who feel threatened. Microsoft is probably going to turn to the same legal system that gave it such grief and find some wedge to drive into the Linux community. Their biggest weapon will be patents and copyright to stop the cloners.
Suddenly, brands like Hewlett-Packard or IBM can mean something when they're slapped on a PC. Any goofball in a garage can put a circuit board in a box and slap on Microsoft Windows. A big company like HP or IBM could do extra work to make sure the Linux distribution on the box worked well with the components and provided a glitch-free existence for the user.
Despite these gifts, free software will continue to grow on the campuses. Students often have little cash and Microsoft doesn't get any great tax deduction by giving gifts to individual students (that's income). The smartest kids in the dorms will continue to run Linux. Many labs do cutting-edge work that requires customized software. These groups will naturally be attracted to free source code because it makes their life easier. It will be difficult for Microsoft to counteract the very real attraction of free software.
If things go perfectly for Microsoft, the company will be able to pull out one or two patents from its huge portfolio and use these to sue Red Hat, Walnut Creek, and a few of the other major distributors. Ideally, this patent would cover some crucial part of the Linux or BSD operating system. After the first few legal bills started arriving on the desk of the Red Hat or Walnut Creek CEO, the companies would have to settle by quitting the business. Eventually, all of the distributors of Linux would crumble and return to the small camps in the hills to lick their wounds. At least, that's probably the dream of some of Microsoft's greatest legal soldiers.
This maneuver is far from a lock for Microsoft because the free software world has a number of good defenses. The first is that the Linux and BSD world do a good job of publicizing their advances. Any patent holder must file the patent before someone else publishes their ideas. The Linux discussion groups and source distributions are a pretty good public forum. The ideas and patches often circulate publicly long before they make their way into a stable version of the kernel. That means that the patent holders will need to be much farther ahead than the free software world.
Linux and the free software world are often the cradle of new ideas. University students use open source software all the time. It's much easier to do way cool things if you've got access to the source. Sure, Microsoft has some smart researchers with great funding, but can they compete with all the students?
The second defense is adaptability. The free software distributions can simply strip out the offending code. The Linux and BSD disks are very modular because they come from a variety of different sources. The different layers and tools come from different authors, so they are not highly integrated. This makes it possible to remove one part without ruining the entire system.
It will be pretty difficult for a company like Microsoft to find a patent that will allow it to deal a fatal blow to either the Linux or BSD distributions. The groups will just clip out the offending code and then work around it.
Microsoft's greatest hope is to lock up the next generation of computing with patents. New technologies like streaming multimedia or Internet audio are still up for grabs. While people have been studying these topics in universities for some time, the Linux community is further behind. Microsoft will try to dominate these areas with crucial patents that affect how operating systems deal with this kind of data. Their success at this is hard to predict. In any event, while they may be able to cripple the adoption of some new technologies like streaming multimedia, they won't be able to smash the entire world.
This does not preclude the free software world from using some ideas or software. There's no reason why Linux can't run proprietary application software that costs money. Perhaps people will sell licenses for some distributions and patches. Still, the users must shift mental gears when they encounter these packages.
One of the biggest challenges for the free software community will be developing the leadership to undertake these battles. It is one thing to mess around in a garage with your buddies and hang out in some virtual he-man/Microsoft-haters clubhouse cooking up neat code. It's a very different challenge to actually achieve the world domination that the Linux world muses about. When I started writing the book, I thought that an anthem for the free software movement might be Spinal Tap's "Flower People." Now I think it's going to be Buffalo Springfield's "For What It's Worth," which warns, "There's something happening here / What it is ain't exactly clear."
The strength of the free price shouldn't be underestimated. While the cost isn't really nothing after you add up the price of paying Red Hat, Slackware, SuSE, Debian, or someone else to provide support, it's still much cheaper than the proprietary solutions on the market. Price isn't the only thing on people's minds, but it will always be an important one.
Debian Free Software Guidelines See Open Source. (www.debian.org)
driver Most computers are designed to work with optional devices like modems, disk drives, printers, cameras, and keyboards. A driver is a piece of software that translates the signals sent by the device into a set of signals that can be understood by the operating system. Most operating systems are designed to be modular, so these drivers can be added as an afterthought whenever a user connects a new device. They are usually designed to have a standard structure so other software will work with them. The driver for each mouse, for instance, translates the signals from the mouse into a standard description that includes the position of the mouse and its direction. Drivers are an important point of debate in the free software community because volunteers must often create them. Most manufacturers write the drivers for Windows computers because those customers make up the bulk of their sales. The manufacturers often avoid creating drivers for Linux or BSD systems because they perceive the market to be small. Some manufacturers also cite the GNU GPL as an impediment because they feel that releasing the source code to their drivers publishes important competitive information.
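The "standard structure" a driver exposes can be sketched abstractly. This is a hypothetical illustration in Python (real kernel drivers are written in C against interfaces such as Linux's file_operations table); every name here is invented for the example.

```python
from abc import ABC, abstractmethod

class Driver(ABC):
    """Hypothetical uniform interface: the operating system talks to
    every device through the same read/write calls."""
    @abstractmethod
    def write(self, data): ...
    @abstractmethod
    def read(self, nbytes): ...

class MouseDriver(Driver):
    """Translates raw movement deltas into a standard position report."""
    def __init__(self):
        self.x = self.y = 0
    def write(self, data):
        dx, dy = data        # raw signals from the hardware
        self.x += dx
        self.y += dy
    def read(self, nbytes):
        # Any program can consume this without knowing the mouse model
        return f"pos=({self.x},{self.y})".encode()[:nbytes]

m = MouseDriver()
m.write((3, 4))
print(m.read(64))
```

Because every driver presents the same read/write shape, the rest of the system never needs to know which manufacturer's hardware is underneath, which is exactly why a missing Linux driver locks a device out entirely.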
GNOME The GNU Network Object Model Environment, which might be summarized as "All of the functionality of Microsoft Windows for Linux." It's actually more. There are many enhancements that make the tool easier to use and more flexible than the prototype from Redmond. See also KDE, another package that accomplishes much of the same. (www.gnome.org)
GNU/Linux The name some people use for Linux as a way of giving credit to the GNU project for its leadership and contribution of code.
Linux The name given to the core of the operating system started by Linus Torvalds in 1991. The word is now generally used to refer to an entire bundle of free software packages that work together. Red Hat Linux, for instance, is a large bundle of software including packages written by many other unrelated projects.
OpenBSD One of the three major versions of BSD available. The development team, led by Theo de Raadt, aims to provide the best possible security by examining the source code in detail and looking for potential holes. (www.openbsd.org)
open source A broad term used by the Open Source Initiative (www.opensource.org) to embrace software developed and released under the GNU General Public License, the BSD license, the Artistic License, the X Consortium license, and the Netscape License. It includes software licenses that put few restrictions on the redistribution of source code. The Open Source Initiative's definition was adapted from the Debian Free Software Guidelines. The OSI's definition includes 10 criteria, which range from insisting that the software and the source code must be freely redistributable to insisting that the license not discriminate.
Ananian, C. Scott. "A Linux Lament: As Red Hat Prepares to Go Public, One Linux Hacker's Dreams of IPO Glory Are Crushed by the Man." Salon magazine, July 30, 1999.
"Questions Not to Ask on Linux-Kernel." May 1998.
Brown, Zack. "The 'Linux' vs. 'GNU/Linux' Debate." Kernel Traffic, April 13, 1999.
Chalmers, Rachel. "Challenges Ahead for the Linux Standards Base." LinuxWorld, April 1999.
Coates, James. "A Rebellious Reaction to the Linux Revolution." Chicago Tribune, April 25, 1999.
D'Amico, Mary Lisbeth. "German Division of Microsoft Protests 'Where Do You Want to Go Tomorrow' Slogan: Linux Site Holds Contest for New Slogan While Case Is Pending." LinuxWorld, April 13, 1999.
Jelinek, Jakub. "Re: Mach64 Problems in UltraPenguin 1.1.9." Linux Weekly News, April 27, 1999.
Kidd, Eric. "Why You Might Want to Use the Library GPL for Your Next Library." Linux Gazette, March 1999.
"Linux Beat Windows NT Handily in an Oracle Performance Benchmark." Linux Weekly News, April 29, 1999.
McMillan, Robert, and Nora Mikes. "After the 'Sweet Sixteen': Linus Torvalds's Take on the State of Linux." LinuxWorld, March 1999.
Metcalfe, Bob. "Linux's '60s Technology: Open-Sores Ideology Won't Beat W2K, but What Will?" June 19, 1999.
Raymond, Eric. The Cathedral and the Bazaar: Musings on Linux and Open Source by an Accidental Revolutionary. Sebastopol, Calif.: O'Reilly, 1999.
Rubini, Alessandro. "Tour of the Linux Kernel Source." Linux Documentation Project.
Rusling, David A. "The Linux Kernel."
Searles, Doc. "It's an Industry." Linux Journal, May 21, 1999.
Slind-Flor, Victoria. "Linux May Alter IP Legal Landscape: Some Predict More Contract Work if Alternative to Windows Catches On." National Law Journal, March 12, 1999.
Torvalds, Linus. "Linus Torvalds: Leader of the Revolution." Transcript of Chat with Linus Torvalds, creator of the Linux OS. ABCNews.com.
"Linux's History." July 31, 1992.
Whitenger, Dave. "Words of a Maddog." Linux Today, April 19, 1999.
"Web and File Server Comparison: Microsoft Windows NT Server 4.0 and Red Hat Linux 5.2 Upgraded to the Linux 2.2.2 Kernel." Mindcraft, April 13, 1999.
Williams, Riley. "Linux Kernel Version History."
Many facts needed correction, but deeper changes were also needed. Williams, a non-programmer, blurred fundamental technical and legal distinctions, such as that between modifying an existing program's code, on the one hand, and implementing some of its ideas in a new program, on the other. Thus, the first edition said that both Gosmacs and GNU Emacs were developed by modifying the original PDP-10 Emacs, which in fact neither one was. Likewise, it mistakenly described Linux as a "version of Minix." SCO later made the same false claim in its infamous lawsuit against IBM, and both Torvalds and Tanenbaum rebutted it.
The first edition overdramatized many events by projecting spurious emotions into them. For instance, it said that I "all but shunned" Linux in 1992, and then made "a dramatic about-face" by deciding in 1993 to sponsor Debian GNU/Linux. Both my interest in 1993 and my lack of interest in 1992 were pragmatic means to pursue the same end: to complete the GNU system. The launch of the GNU Hurd kernel in 1990 was also a pragmatic move directed at that same end.
In this edition, the complete system that combines GNU and Linux is always "GNU/Linux," and "Linux" by itself always refers to Torvalds' kernel, except in quotations where the other usage is marked with "[sic]".
See ‹http://www.gnu.org/gnu/gnu-linux-faq.html› for more explanation of why it is erroneous and unfair to call the whole system "Linux."
In an information economy increasingly dependent on software and increasingly beholden to software standards, the GPL has become the proverbial "big stick." Even companies that once derided it as "software socialism" have come around to recognize the benefits. Linux, the kernel developed by Finnish college student Linus Torvalds in 1991, is licensed under the GPL, as are most parts of the GNU system: GNU Emacs, the GNU Debugger, the GNU C Compiler, etc. Together, these tools form the components of the free software GNU/Linux operating system, developed, nurtured, and owned by the worldwide hacker community. Instead of viewing this community as a threat, high-tech companies like IBM, Hewlett-Packard, and Sun Microsystems have come to rely upon it, selling software applications and services built to ride atop the ever-growing free software infrastructure.【3 Although these applications run on GNU/Linux, it does not follow that they are themselves free software. On the contrary, most of these applications are proprietary software, and respect your freedom no more than Windows does. They may contribute to the success of GNU/Linux, but they don't contribute to the goal of freedom for which it exists. 】
The mutual success of GNU/Linux and Windows over the last 10 years suggests that both sides on this question are sometimes right. However, free software activists such as Stallman think this is a side issue. The real question, they say, isn't whether free or proprietary software will succeed more, it's which one is more ethical.
The crowd is filled with visitors who share Stallman's fashion and grooming tastes. Many come bearing laptop computers and cellular modems, all the better to record and transmit Stallman's words to a waiting Internet audience. The gender ratio is roughly 15 males to 1 female, and 1 of the 7 or 8 females in the room comes in bearing a stuffed penguin, the official Linux mascot, while another carries a stuffed teddy bear.
Maybe that's why most writers, when describing Stallman, tend to go for the religious angle. In a 1998 Salon.com article titled "The Saint of Free Software," Andrew Leonard describes Stallman's green eyes as "radiating the power of an Old Testament prophet."【30 See Andrew Leonard, "The Saint of Free Software," Salon.com (August 1998),
http://www.salon.com/21st/feature/1998/08/cov_31feature.html. 】 A 1999 Wired magazine article describes the Stallman beard as "Rasputin-like,"【31 See Leander Kahney, "Linux's Forgotten Man," Wired News (March 5, 1999),
http://www.wired.com/news/print/0,1294,18291,00.html. 】 while a London Guardian profile describes the Stallman smile as the smile of "a disciple seeing Jesus."【32 See "Programmer on moral high ground; Free software is a moral issue for Richard Stallman, who believes in freedom and free software," London Guardian (November 6, 1999),
These are just a small sampling of the religious comparisons. To date, the most extreme comparison has to go to Linus Torvalds, who, in his autobiography - see Linus Torvalds and David Diamond, Just For Fun: The Story of an Accidental Revolutionary (HarperCollins Publishers, Inc., 2001): 58 - writes, "Richard Stallman is the God of Free Software." Honorable mention goes to Larry Lessig, who, in a footnote description of Stallman in his book - see Larry Lessig, The Future of Ideas (Random House, 2001): 270 - likens Stallman to Moses:...
as with Moses, it was another leader, Linus Torvalds, who finally carried the movement into the promised land by facilitating the development of the final part of the OS puzzle. Like Moses, too, Stallman is both respected and reviled by allies within the movement. He is [an] unforgiving, and hence for many inspiring, leader of a critically important aspect of modern culture. I have deep respect for the principle and commitment of this extraordinary individual, though I also have great respect for those who are courageous enough to question his thinking and then sustain his wrath.
In a final interview with Stallman, I asked him his thoughts about the religious comparisons. "Some people do compare me with an Old Testament prophet, and the reason is Old Testament prophets said certain social practices were wrong. They wouldn't compromise on moral issues. They couldn't be bought off, and they were usually treated with contempt." 】
My own first encounter with the legendary Stallman gaze dates back to the March, 1999, LinuxWorld Convention and Expo in San Jose, California. Billed as a "coming out party" for the "Linux" software community, the convention also stands out as the event that reintroduced Stallman to the technology media. Determined to push for his proper share of credit, Stallman used the event to instruct spectators and reporters alike on the history of the GNU Project and the project's overt political objectives.
As a reporter sent to cover the event, I received my own Stallman tutorial during a press conference announcing the release of GNOME 1.0, a free software graphic user interface. Unwittingly, I push an entire bank of hot buttons when I throw out my very first question to Stallman himself: "Do you think GNOME's maturity will affect the commercial popularity of the Linux operating system?"
"I ask that you please stop calling the operating system Linux," Stallman responds, eyes immediately zeroing in on mine. "The Linux kernel is just a small part of the operating system. Many of the software programs that make up the operating system you call Linux were not developed by Linus Torvalds at all. They were created by GNU Project volunteers, putting in their own personal time so that users might have a free operating system like the one we have today. To not acknowledge the contribution of those programmers is both impolite and a misrepresentation of history. That's why I ask that when you refer to the operating system, please call it by its proper name, GNU/Linux."
Taking the words down in my reporter's notebook, I notice an eerie silence in the crowded room. When I finally look up, I find Stallman's unblinking eyes waiting for me. Timidly, a second reporter throws out a question, making sure to use the term "GNU/Linux" instead of Linux. Miguel de Icaza, leader of the GNOME project, fields the question. It isn't until halfway through de Icaza's answer, however, that Stallman's eyes finally unlock from mine. As soon as they do, a mild shiver rolls down my back. When Stallman starts lecturing another reporter over a perceived error in diction, I feel a guilty tinge of relief. At least he isn't looking at me, I tell myself.
For Stallman, such face-to-face moments would serve their purpose. By the end of the first LinuxWorld show, most reporters know better than to use the term "Linux" in his presence, and Wired.com is running a story comparing Stallman to a pre-Stalinist revolutionary erased from the history books by hackers and entrepreneurs eager to downplay the GNU Project's overtly political objectives.【33 See Leander Kahney (1999). 】 Other articles follow, and while few reporters call the operating system GNU/Linux in print, most are quick to credit Stallman for launching the drive to build a free software operating system 15 years before.
I won't meet Stallman again for another 17 months. During the interim, Stallman will revisit Silicon Valley once more for the August, 1999 LinuxWorld show. Although not invited to speak, Stallman does manage to deliver the event's best line. Accepting the show's Linus Torvalds Award for Community Service - an award named after Linux creator Linus Torvalds - on behalf of the Free Software Foundation, Stallman wisecracks, "Giving the Linus Torvalds Award to the Free Software Foundation is a bit like giving the Han Solo Award to the Rebel Alliance."
This time around, however, the comments fail to make much of a media dent. Midway through the week, Red Hat, Inc., a prominent GNU/Linux vendor, goes public. The news merely confirms what many reporters such as myself already suspect: "Linux" has become a Wall Street buzzword, much like "e-commerce" and "dot-com" before it. With the stock market approaching the Y2K rollover like a hyperbola approaching its vertical asymptote, all talk of free software or open source as a political phenomenon falls by the wayside.
Maybe that's why, when LinuxWorld follows up its first two shows with a third LinuxWorld show in August, 2000, Stallman is conspicuously absent.
My second encounter with Stallman and his trademark gaze comes shortly after that third LinuxWorld show. Hearing that Stallman is going to be in Silicon Valley, I set up a lunch interview in Palo Alto, California. The meeting place seems ironic, not only because of his absence from the show but also because of the overall backdrop. Outside of Redmond, Washington, few cities offer a more direct testament to the economic value of proprietary software. Curious to see how Stallman, a man who has spent the better part of his life railing against our culture's predilection toward greed and selfishness, is coping in a city where even garage-sized bungalows run in the half-million-dollar price range, I make the drive down from Oakland.
Stallman goes back to tapping away at his laptop. The laptop is gray and boxy, not like the sleek, modern laptops that seemed to be a programmer favorite at the recent LinuxWorld show. Above the keyboard rides a smaller, lighter keyboard, a testament to Stallman's aging hands. During the mid 1990s, the pain in Stallman's hands became so unbearable that he had to hire a typist. Today, Stallman relies on a keyboard whose keys require less pressure than a typical computer keyboard.
During the wait, Stallman practices a few dance steps. His moves are tentative but skilled. We discuss current events. Stallman says his only regret about not attending LinuxWorld was missing out on a press conference announcing the launch of the GNOME Foundation. Backed by Sun Microsystems and IBM, the foundation is in many ways a vindication for Stallman, who has long argued that free software and free-market economics need not be mutually exclusive. Nevertheless, Stallman remains dissatisfied with the message that came out.
"The way it was presented, the companies were talking about Linux with no mention of the GNU Project at all," Stallman says.
After a brief sigh, Stallman recovers. The moment gives me a chance to discuss Stallman's reputation vis-à-vis the fairer sex. The reputation is a bit contradictory at times. A number of hackers report Stallman's predilection for greeting females with a kiss on the back of the hand.【37 See Mae Ling Mak, "A Mae Ling Story" (December 17, 1998),
http://crackmonkey.org/pipermail/crackmonkey/1998-December/001777.html. So far, Mak is the only person I've found willing to speak on the record in regard to this practice, although I've heard this from a few other female sources. Mak, despite expressing initial revulsion at it, later managed to put aside her misgivings and dance with Stallman at a 1999 LinuxWorld show. 】 A May 26, 2000 Salon.com article, meanwhile, portrays Stallman as a bit of a hacker lothario. Documenting the free software-free love connection, reporter Annalee Newitz presents Stallman as rejecting traditional family values, telling her, "I believe in love, but not monogamy."【38 See Annalee Newitz, "If Code is Free Why Not Me?", Salon.com (May 26, 2000),
I mention a passage from the 1999 book Open Sources in which Stallman confesses to wanting to name the GNU kernel after a girlfriend at the time. The girlfriend's name was Alix, a name that fit perfectly with the Unix developer convention of putting an "x" at the end of the names of operating systems and kernels - e.g., "Linux." Alix was a Unix system administrator, and had suggested to her friends, "Someone should name a kernel after me." So Stallman decided to name the GNU kernel "Alix" as a surprise for her. The kernel's main developer renamed the kernel "Hurd," but retained the name "Alix" for part of it. One of Alix's friends noticed this part in a source snapshot and told her, and she was touched. A later redesign of the Hurd eliminated that part.【39 See Richard Stallman, "The GNU Operating System and the Free Software Movement," Open Sources (O'Reilly & Associates, Inc., 1999): 65. [RMS: Williams interpreted this vignette as suggesting that I am a hopeless romantic, and that my efforts were meant to impress some as-yet-unidentified woman. No MIT hacker would believe this, since we learned quite young that most women wouldn't notice us, let alone love us, for our programming. We programmed because it was fascinating. Meanwhile, these events were only possible because I had a thoroughly identified girlfriend at the time. If I was a romantic, at the time I was neither a hopeless romantic nor a hopeful romantic, but rather temporarily a successful one. On the strength of that naive interpretation, Williams went on to compare me to Don Quijote. For completeness' sake, here's a somewhat inarticulate quote from the first edition: "I wasn't really trying to be romantic. It was more of a teasing thing. I mean, it was romantic, but it was also teasing, you know? It would have been a delightful surprise."] 】
I decide to bring up the outcast issue again, wondering if Stallman's teenage years conditioned him to take unpopular stands, most notably his uphill battle since 1994 to get computer users and the media to replace the popular term "Linux" with "GNU/Linux."
Many criticize Stallman for rejecting handy political alliances; some psychologize this and describe it as a character trait. In the case of his well-publicized distaste for the term "open source," the unwillingness to participate in recent coalition-building projects seems understandable. As a man who has spent the last two decades stumping on behalf of free software, Stallman's political capital is deeply invested in the term. Still, comments such as the "Han Solo" comparison at the 1999 LinuxWorld have only reinforced Stallman's reputation, amongst those who believe virtue consists of following the crowd, as a disgruntled mossback unwilling to roll with political or marketing trends.
[RMS: The term "friends" only partly fits people such as Young, and companies such as Red Hat. It applies to some of what they did, and do: for instance, Red Hat contributes to development of free software, including some GNU programs. But Red Hat does other things that work against the free software movement's goals - for instance, its versions of GNU/Linux contain non-free software. Turning from deeds to words, referring to the whole system as "Linux" is unfriendly treatment of the GNU Project, and promoting "open source" instead of "free software" rejects our values. I could work with Young and Red Hat when we were going in the same direction, but that was not often enough to make them possible allies.]
"As far as I know, that book is still sitting on a shelf somewhere, unusable, uncopyable, just taken out of the system," Chassell says. "It was quite a good introduction if I may say so myself. It would have taken maybe three or four months to convert [the book] into a perfectly usable introduction to GNU/Linux today. The whole experience, aside from what I have in my memory, was lost."
As Stallman putters around the front of the room, a few audience members wearing T-shirts with the logo of the Maui FreeBSD Users Group (MFUG) race to set up camera and audio equipment. FreeBSD, a free software offshoot of the Berkeley Software Distribution, the venerable 1970s academic version of Unix, is technically a competitor to the GNU/Linux operating system. Still, in the hacking world, Stallman speeches are documented with a fervor reminiscent of the Grateful Dead and its legendary army of amateur archivists. As the local free software heads, it's up to the MFUG members to make sure fellow programmers in Hamburg, Mumbai, and Novosibirsk don't miss out on the latest pearls of RMS wisdom.
Once again, Stallman quickly segues into the parable of the Xerox laser printer, taking a moment to deliver the same dramatic finger-pointing gestures to the crowd. He also devotes a minute or two to the GNU/Linux name.
For Stallman, the software-patent issue dramatizes the need for eternal hacker vigilance. It also underlines the importance of stressing the political benefits of free software programs over the competitive benefits. Stallman says competitive performance and price, two areas where free software operating systems such as GNU/Linux and FreeBSD already hold a distinct advantage over their proprietary counterparts, are side issues compared to the large issues of user and developer freedom.
This position is controversial within the community: open source advocates emphasize the utilitarian advantages of free software over the political advantages. Rather than stress the political significance of free software programs, open source advocates have chosen to stress the engineering integrity of the hacker development model. Citing the power of peer review, the open source argument paints programs such as GNU/Linux or FreeBSD as better built, better inspected and, by extension, more trustworthy to the average user.
Discussing the St. IGNUcius persona afterward, Stallman says he first came up with it in 1996, long after the creation of Emacs but well before the emergence of the "open source" term and the struggle for hacker-community leadership that precipitated it. At the time, Stallman says, he wanted a way to "poke fun at himself," to remind listeners that, though stubborn, Stallman was not the fanatic some made him out to be. It was only later, Stallman adds, that others seized the persona as a convenient way to play up his reputation as software ideologue, as Eric Raymond did in a 1999 interview with the Linux.com web site:
When I say RMS calibrates what he does, I'm not belittling or accusing him of insincerity. I'm saying that like all good communicators he's got a theatrical streak. Sometimes it's conscious - have you ever seen him in his St. IGNUcius drag, blessing software with a disk platter on his head? Mostly it's unconscious; he's just learned the degree of irritating stimulus that works, that holds attention without (usually) freaking people out.【84 See "Guest Interview: Eric S. Raymond," Linux.com (May 18, 1999),
That said, Stallman does admit to being a ham. "Are you kidding?" he says at one point. "I love being the center of attention." To facilitate that process, Stallman says he once enrolled in Toastmasters, an organization that helps members bolster their public-speaking skills and one Stallman recommends highly to others. He possesses a stage presence that would be the envy of most theatrical performers and feels a link to vaudevillians of years past. A few days after the Maui High Performance Computing Center speech, I allude to the 1999 LinuxWorld performance and ask Stallman if he has a Groucho Marx complex - i.e., the unwillingness to belong to any club that would have him as a member.【85 RMS: Williams misinterprets Groucho's famous remark by treating it as psychological. It was intended as a jab at the overt antisemitism of many clubs, which was why they would refuse him as a member. I did not understand this either until my mother explained it to me. Williams and I grew up when bigotry had gone underground, and there was no need to veil criticism of bigotry in humor as Groucho did. 】 Stallman's response is immediate: "No, but I admire Groucho Marx in a lot of ways and certainly have been in some things I say inspired by him. But then I've also been inspired in some ways by Harpo."
The St. IGNUcius skit ends with a brief inside joke. On most Unix systems and Unix-related offshoots, the primary competitor program to Emacs is vi, pronounced vee-eye, a text-editing program developed by former UC Berkeley student and current Sun Microsystems chief scientist, Bill Joy. Before doffing his "halo," Stallman pokes fun at the rival program. "People sometimes ask me if it is a sin in the Church of Emacs to use vi," he says. "Using a free version of vi is not a sin; it is a penance. So happy hacking."【86 The service of the Church of Emacs has developed further since 2001. Users can now join the Church by reciting the Confession of the Faith: "There is no system but GNU, and Linux is one of its kernels." Stallman sometimes mentions the religious ceremony known as the Foobar Mitzvah, the Great Schism between various rival versions of Emacs, and the cult of the Virgin of Emacs (which refers to any person that has not yet learned to use Emacs). In addition, "vi vi vi" has been identified as the Editor of the Beast. 】
After a brief question-and-answer session, audience members gather around Stallman. A few ask for autographs. "I'll sign this," says Stallman, holding up one woman's print out of the GNU General Public License, "but only if you promise me to use the term GNU/Linux instead of Linux" (when referring to the system), "and tell all your friends to do likewise."
By the end of the 1980s, the GPL was beginning to exert a gravitational effect on the free software community. A program didn't have to carry the GPL to qualify as free software - witness the case of the BSD network utilities - but putting a program under the GPL sent a definite message. "I think the very existence of the GPL inspired people to think through whether they were making free software, and how they would license it," says Bruce Perens, creator of Electric Fence, a popular Unix utility, and future leader of the Debian GNU/Linux development team. A few years after the release of the GPL, Perens says he decided to discard Electric Fence's homegrown license in favor of Stallman's lawyer-vetted copyright. "It was actually pretty easy to do," Perens recalls.
Interestingly, the GNU system's completion would stem from one of these trips. In April 1991, Stallman paid a visit to the Polytechnic University in Helsinki, Finland. Among the audience members was 21-year-old Linus Torvalds, who was just beginning to develop the Linux kernel - the free software kernel destined to fill the GNU system's main remaining gap.
The posting drew a smattering of responses and within a month, Torvalds had posted a 0.01 version of his kernel - i.e., the earliest possible version fit for outside review - on an Internet FTP site. In the course of doing so, Torvalds had to come up with a name for the new kernel. On his own PC hard drive, Torvalds had saved the program as Linux, a name that paid its respects to the software convention of giving each Unix variant a name that ended with the letter X. Deeming the name too "egotistical," Torvalds changed it to Freax, only to have the FTP site manager change it back.
Initially, Linux was not free software: the license it carried did not qualify as free, because it did not allow commercial distribution. Torvalds was worried that some company would swoop in and take Linux away from him. However, as the growing GNU/Linux combination gained popularity, Torvalds saw that sale of copies would be useful for the community, and began to feel less worried about a possible takeover.【105 Ibid, p. 94-95. 】 This led him to reconsider the licensing of Linux.
Neither compiling Linux with GCC nor running GCC with Linux required him legally to release Linux under the GNU GPL, but Torvalds' use of GCC implied for him a certain obligation to let other users borrow back. As Torvalds would later put it: "I had hoisted myself up on the shoulders of giants."【106 Ibid, p. 95-97. 】 Not surprisingly, he began to think about what would happen when other people looked to him for similar support. A decade after the decision, Torvalds echoes the Free Software Foundation's Robert Chassell when he sums up his thoughts at the time:
You put six months of your life into this thing and you want to make it available and you want to get something out of it, but you don't want people to take advantage of it. I wanted people to be able to see [Linux], and to make changes and improvements to their hearts' content. But I also wanted to make sure that what I got out of it was to see what they were doing. I wanted to always have access to the sources so that if they made improvements, I could make those improvements myself.【107 See Linus Torvalds and David Diamond, Just For Fun: The Story of an Accidental Revolutionary (Harper Collins Publishers, Inc., 2001): 94-95. 】
When it was time to release the 0.12 version of Linux, the first to operate fully with GCC, Torvalds decided to throw his lot in with the free software movement. He discarded the old license of Linux and replaced it with the GPL. Within three years, Linux developers were offering release 1.0 of Linux, the kernel; it worked smoothly with the almost complete GNU system, composed of programs from the GNU Project and elsewhere. In effect, they had completed the GNU operating system by adding Linux to it. The resulting system was basically GNU plus Linux. Torvalds and friends, however, referred to it confusingly as "Linux."
By 1994, the amalgamated system had earned enough respect in the hacker world to make some observers from the business world wonder if Torvalds hadn't given away the farm by switching to the GPL in the project's initial months. In the first issue of Linux Journal, publisher Robert Young sat down with Torvalds for an interview. When Young asked the Finnish programmer if he felt regret at giving up private ownership of the Linux source code, Torvalds said no. "Even with 20/20 hindsight," Torvalds said, he considered the GPL "one of the very best design decisions" made during the early stages of the Linux project.【108 See Robert Young, "Interview with Linus, the Author of Linux," Linux Journal (March 1, 1994),
That the decision had been made with zero appeal or deference to Stallman and the Free Software Foundation speaks to the GPL's growing portability. Although it would take a couple of years to be recognized by Stallman, the explosiveness of Linux development conjured flashbacks of Emacs. This time around, however, the innovation triggering the explosion wasn't a software hack like Control-R but the novelty of running a Unix-like system on the PC architecture. The motives may have been different, but the end result certainly fit the ethical specifications: a fully functional operating system composed entirely of free software.
As his initial email message to the comp.os.minix newsgroup indicates, it would take a few months before Torvalds saw Linux as anything more than a holdover until the GNU developers delivered on the Hurd kernel. As far as Torvalds was concerned, he was simply the latest in a long line of kids taking apart and reassembling things just for fun. Nevertheless, when summing up the runaway success of a project that could have just as easily spent the rest of its days on an abandoned computer hard drive, Torvalds credits his younger self for having the wisdom to give up control and accept the GPL bargain. "I may not have seen the light," writes Torvalds, reflecting on Stallman's 1991 Polytechnic University speech and his subsequent decision to switch to the GPL. "But I guess something from his speech sunk in."【109 See Linus Torvalds and David Diamond, Just For Fun: The Story of an Accidental Revolutionary (Harper Collins Publishers, Inc., 2001): 59. 】
Chapter 10 - GNU/Linux
By 1993, the free software movement was at a crossroads. To the optimistically inclined, all signs pointed toward success for the hacker culture. Wired magazine, a funky, new publication offering stories on data encryption, Usenet, and software freedom, was flying off magazine racks. The Internet, once a slang term used only by hackers and research scientists, had found its way into mainstream lexicon. Even President Clinton was using it. The personal computer, once a hobbyist's toy, had grown to full-scale respectability, giving a whole new generation of computer users access to hacker-built software. And while the GNU Project had not yet reached its goal of a fully intact, free GNU operating system, users could already run the GNU/Linux variant.
Or did they? To the pessimistically inclined, each sign of acceptance carried its own troubling countersign. Sure, being a hacker was suddenly cool, but was cool good for a community that thrived on alienation? Sure, the White House was saying nice things about the Internet, even going so far as to register its own domain name, whitehouse.gov, but it was also meeting with the companies, censorship advocates, and law-enforcement officials looking to tame the Internet's Wild West culture. Sure, PCs were more powerful, but in commoditizing the PC marketplace with its chips, Intel had created a situation in which proprietary software vendors now held the power. For every new user won over to the free software cause via GNU/Linux, hundreds, perhaps thousands, were booting up Microsoft Windows for the first time. GNU/Linux had only rudimentary graphical interfaces, so it was hardly user-friendly. In 1993, only an expert could use it. The GNU Project's first attempt to develop a graphical desktop had been abortive.
Finally, there was the curious nature of GNU/Linux itself. Unrestricted by legal disputes (such as BSD faced), GNU/Linux's high-speed evolution had been so unplanned, its success so accidental, that programmers closest to the software code itself didn't know what to make of it. More compilation album than unified project, it was a hacker medley of greatest hits: everything from GCC, GDB, and glibc (the GNU Project's newly developed C Library) to X (a Unix-based graphic user interface developed by MIT's Laboratory for Computer Science) to BSD-developed tools such as BIND (the Berkeley Internet Name Domain, which lets users substitute easy-to-remember Internet domain names for numeric IP addresses) and TCP/IP. In addition, it contained the Linux kernel - itself designed as a replacement for Minix. Rather than developing a new operating system, Torvalds and his rapidly expanding Linux development team had plugged their work into this matrix. As Torvalds himself would later translate it when describing the secret of his success: "I'm basically a very lazy person who likes to take credit for things other people actually do."【110 Torvalds has offered this quote in many different settings. To date, however, the quote's most notable appearance is in the Eric Raymond essay, "The Cathedral and the Bazaar" (May, 1997),
By late 1993, a growing number of GNU/Linux users had begun to lean toward the latter definition and were brewing private variations on the theme. They developed various "distributions" of GNU/Linux and distributed them, sometimes gratis, sometimes for a price. The results were spotty at best.
"This was back before Red Hat and the other commercial distributions," remembers Ian Murdock, then a computer science student at Purdue University. "You'd flip through Unix magazines and find all these business card-sized ads proclaiming 'Linux.' Most of the companies were fly-by-night operations that saw nothing wrong with slipping a little of their own [proprietary] source code into the mix."
Murdock, a Unix programmer, remembers being "swept away" by GNU/Linux when he first downloaded and installed it on his home PC system. "It was just a lot of fun," he says. "It made me want to get involved." The explosion of poorly built distributions began to dampen his early enthusiasm, however. Deciding that the best way to get involved was to build a version free of additives, Murdock set about putting together a list of the best free software tools available with the intention of folding them into his own distribution. "I wanted something that would live up to the Linux name," Murdock says.
In a bid to "stir up some interest," Murdock posted his intentions on the Internet, including Usenet's comp.os.linux newsgroup. One of the first responding email messages was from ‹firstname.lastname@example.org.› As a hacker, Murdock instantly recognized the address. It was Richard M. Stallman, founder of the GNU Project and a man Murdock knew even back then as "the hacker of hackers." Seeing the address in his mail queue, Murdock was puzzled. Why on Earth would Stallman, a person leading his own operating-system project, care about Murdock's gripes over "Linux" distributions?
"He said the Free Software Foundation was starting to look closely at Linux and that the FSF was interested in possibly doing a Linux [sic] system, too. Basically, it looked to Stallman like our goals were in line with their philosophy."
Not to overdramatize, the message represented a change in strategy on Stallman's part. Until 1993, Stallman had been content to keep his nose out of Linux affairs. After first hearing of the new kernel, Stallman asked a friend to check its suitability. Recalls Stallman, "He reported back that the software was modeled after System V, which was the inferior version of Unix. He also told me it wasn't portable."
The friend's report was correct. Built to run on 386-based machines, Linux was firmly rooted to its low-cost hardware platform. What the friend failed to report, however, was the sizable advantage Linux enjoyed as the only free kernel in the marketplace. In other words, while Stallman spent the next year and a half listening to the Hurd developers report rather slow progress, Torvalds was winning over the programmers who would later uproot and replant Linux and GNU onto new platforms.
By 1993, the GNU Project's failure to deliver a working kernel was leading to problems both within the GNU Project and in the free software movement at large. A March, 1993, Wired magazine article by Simson Garfinkel described the GNU Project as "bogged down" despite the success of the project's many tools.【111 See Simson Garfinkel, "Is Stallman Stalled?" Wired (March, 1993). 】 Those within the project and its nonprofit adjunct, the Free Software Foundation, remember the mood as being even worse than Garfinkel's article let on. "It was very clear, at least to me at the time, that there was a window of opportunity to introduce a new operating system," says Chassell. "And once that window was closed, people would become less interested. Which is in fact exactly what happened."【112 Chassell's concern about there being a 36-month "window" for a new operating system is not unique to the GNU Project. During the early 1990s, free software versions of the Berkeley Software Distribution were held up by Unix System Laboratories' lawsuit restricting the release of BSD-derived software. While many users consider BSD offshoots such as FreeBSD and OpenBSD to be demonstrably superior to GNU/Linux both in terms of performance and security, the number of FreeBSD and OpenBSD users remains a fraction of the total GNU/Linux user population. To view a sample analysis of the relative success of GNU/Linux in relation to other free software operating systems, see the essay by New Zealand hacker, Liam Greenwood, "Why is Linux Successful" (1999),
Over time, the growing success of GNU together with Linux made it clear that the GNU Project should get on the train that was leaving and not wait for the Hurd. Besides, there were weaknesses in the community surrounding GNU/Linux. Sure, Linux had been licensed under the GPL, but as Murdock himself had noted, the desire to treat GNU/Linux as a purely free software operating system was far from unanimous. By late 1993, the total GNU/Linux user population had grown from a dozen or so enthusiasts to somewhere between 20,000 and 100,000.【114 GNU/Linux user-population numbers are sketchy at best, which is why I've provided such a broad range. The 100,000 total comes from the Red Hat "Milestones" site,
http://www.redhat.com/about/corporate/milestones.html. 】 What had once been a hobby was now a marketplace ripe for exploitation, and some developers had no objection to exploiting it with non-free software. Like Winston Churchill watching Soviet troops sweep into Berlin, Stallman felt an understandable set of mixed emotions when it came time to celebrate the GNU/Linux "victory."【115 I wrote this Winston Churchill analogy before Stallman himself sent me his own unsolicited comment on Churchill:
World War II and the determination needed to win it was a very strong memory as I was growing up. Statements such as Churchill's, "We will fight them in the landing zones, we will fight them on the beaches... we will never surrender," have always resonated for me. 】
Although late to the party, Stallman still had clout. As soon as the FSF announced that it would lend its money and moral support to Murdock's software project, other offers of support began rolling in. Murdock dubbed the new project Debian - a compression of his and his wife, Deborah's, names - and within a few weeks was rolling out the first distribution. "[Richard's support] catapulted Debian almost overnight from this interesting little project to something people within the community had to pay attention to," Murdock says.
In January of 1994, Murdock issued the Debian Manifesto. Written in the spirit of Stallman's GNU Manifesto from a decade before, it explained the importance of working closely with the Free Software Foundation. Murdock wrote:
The Free Software Foundation plays an extremely important role in the future of Debian. By the simple fact that they will be distributing it, a message is sent to the world that Linux [sic] is not a commercial product and that it never should be, but that this does not mean that Linux will never be able to compete commercially. For those of you who disagree, I challenge you to rationalize the success of GNU Emacs and GCC, which are not commercial software but which have had quite an impact on the commercial market regardless of that fact.
The time has come to concentrate on the future of Linux [sic] rather than on the destructive goal of enriching oneself at the expense of the entire Linux community and its future. The development and distribution of Debian may not be the answer to the problems that I have outlined in the Manifesto, but I hope that it will at least attract enough attention to these problems to allow them to be solved.【116 See Ian Murdock, A Brief History of Debian, (January 6, 1994): Appendix A, "The Debian Manifesto,"
Shortly after the Manifesto's release, the Free Software Foundation made its first major request. Stallman wanted Murdock to call his distribution "GNU/Linux." At first, Stallman proposed the term "Lignux" - combining the names Linux and GNU - but the initial reaction was very negative, and this convinced Stallman to go with the longer but less criticized GNU/Linux.
Some dismissed Stallman's attempt to add the "GNU" prefix as a belated quest for credit, never mind whether it was due, but Murdock saw it differently. Looking back, Murdock saw it as an attempt to counteract the growing tension between the GNU Project's developers and those who adapted GNU programs to use with the Linux kernel. "There was a split emerging," Murdock recalls. "Richard was concerned."
The programmers who adapted various GNU programs to work with the kernel Linux followed this common path: they considered only their own platform. But when the maintainers-in-charge asked them to help clean up their changes for future maintenance, several of them were not interested. They did not care about doing the correct thing, or about facilitating future maintenance of the GNU packages they had adapted. They cared only about their own versions and were inclined to maintain them as forks.
Now programmers had forked several of the principal GNU packages at once. At first, Stallman says, he considered the forks a product of impatience. In contrast to the fast and informal dynamics of the Linux team, GNU source-code maintainers tended to be slower and more circumspect in making changes that might affect a program's long-term viability. They were also unafraid of harshly critiquing other people's code. Over time, however, reading Linux developers' emails, Stallman began to sense an underlying lack of awareness of the GNU Project and its objectives.
"We discovered that the people who considered themselves 'Linux users' didn't care about the GNU Project," Stallman says. "They said, 'Why should I bother doing these things? I don't care about the GNU Project. It [the program]'s working for me. It's working for us Linux users, and nothing else matters to us.' And that was quite surprising, given that people were essentially using a variant of the GNU system, and they cared so little. They cared less than anybody else about GNU." Fooled by their own practice of calling the combination "Linux," they did not realize that their system was more GNU than Linux.
For the sake of unity, Stallman asked the maintainers-in-charge to do the work that the change authors should normally have done. In most cases this was feasible, but not in glibc. Short for GNU C Library, glibc is the package that all programs use to make "system calls" directed at the kernel, in this case Linux. User programs on a Unix-like system communicate with the kernel only through the C library.
The changes to make glibc work as a communication channel between Linux and all the other programs in the system were major and ad hoc, written without attention to their effect on other platforms. For the glibc maintainer-in-charge, the task of cleaning them up was daunting. Instead, the Free Software Foundation paid him to spend most of a year reimplementing these changes from scratch, so that glibc version 6 would work "straight out of the box" in GNU/Linux.
Murdock says this was the precipitating cause that motivated Stallman to insist on adding the GNU prefix when Debian rolled out its software distribution. "The fork has since converged. Still, at the time, there was a concern that if the Linux community saw itself as a different thing from the GNU community, it might be a force for disunity."
While some viewed it as politically grasping to describe the combination of GNU and Linux as a "variant" of GNU, Murdock, already sympathetic to the free software cause, saw Stallman's request to call Debian's version GNU/Linux as reasonable. "It was more for unity than for credit," he says.
In 1996, Murdock, following his graduation from Purdue, decided to hand over the reins of the growing Debian project. He had already been ceding management duties to Bruce Perens, the hacker best known for his work on Electric Fence, a Unix utility released under the GPL. Perens, like Murdock, was a Unix programmer who had become enamored of GNU/Linux as soon as the operating system's Unix-like abilities became manifest. Like Murdock, Perens sympathized with the political agenda of Stallman and the Free Software Foundation, albeit from afar.
As a prominent Debian developer, however, Perens regarded Murdock's design battles with Stallman with dismay. Upon assuming leadership of the development team, Perens says he made the command decision to distance Debian from the Free Software Foundation. "I decided we did not want Richard's style of micro-management," he says.
According to Perens, Stallman was taken aback by the decision but had the wisdom to roll with it. "He gave it some time to cool off and sent a message that we really needed a relationship. He requested that we call it GNU/Linux and left it at that. I decided that was fine. I made the decision unilaterally. Everybody breathed a sigh of relief."
Over time, Debian would develop a reputation as the hacker's version of GNU/Linux, alongside Slackware, another popular distribution founded during the same 1993-1994 period. However, Slackware contained some non-free programs, and Debian after its separation from GNU began distributing non-free programs too.【118 Debian Buzz in June 1996 contained non-free Netscape 3.01 in its Contrib section. 】 Despite labeling them as "non-free" and saying that they were "not officially part of Debian," proposing these programs to the user implied a kind of endorsement for them. As the GNU Project became aware of these policies, it came to recognize that neither Slackware nor Debian was a GNU/Linux distro it could recommend to the public.
Outside the realm of hacker-oriented systems, however, GNU/Linux was picking up steam in the commercial Unix marketplace. In North Carolina, a Unix company called Red Hat was revamping its business to focus on GNU/Linux. The chief executive officer was Robert Young, the former Linux Journal editor who in 1994 had put the question to Linus Torvalds, asking whether he had any regrets about putting the kernel under the GPL. To Young, Torvalds' response had a "profound" impact on his own view toward GNU/Linux. Instead of looking for a way to corner the GNU/Linux market via traditional software tactics, Young began to consider what might happen if a company adopted the same approach as Debian - i.e., building an operating system completely out of free software parts. Cygnus Solutions, the company founded by Michael Tiemann and John Gilmore in 1990, was already demonstrating the ability to sell free software based on quality and customizability. What if Red Hat took the same approach with GNU/Linux?
"In the western scientific tradition we stand on the shoulders of giants," says Young, echoing both Torvalds and Sir Isaac Newton before him. "In business, this translates to not having to reinvent wheels as we go along. The beauty of [the GPL] model is you put your code into the public domain.【119 Young uses the term "public domain" loosely here. Strictly speaking, it means "not copyrighted." Code released under the GNU GPL cannot be in the public domain, since it must be copyrighted in order for the GNU GPL to apply. 】 If you're an independent software vendor and you're trying to build some application and you need a modem-dialer, well, why reinvent modem dialers? You can just steal PPP off of Red Hat [GNU/]Linux and use that as the core of your modem-dialing tool. If you need a graphic tool set, you don't have to write your own graphic library. Just download GTK. Suddenly you have the ability to reuse the best of what went before. And suddenly your focus as an application vendor is less on software management and more on writing the applications specific to your customer's needs." However, Young was no free software activist, and readily included non-free programs in Red Hat's GNU/Linux system.
Young wasn't the only software executive intrigued by the business efficiencies of free software. By late 1996, most Unix companies were starting to wake up and smell the brewing source code. The GNU/Linux sector was still a good year or two away from full commercial breakout mode, but those close enough to the hacker community could feel it: something big was happening. The Intel 386 chip, the Internet, and the World Wide Web had hit the marketplace like a set of monster waves; free software seemed like the largest wave yet.
For Ian Murdock, the wave seemed both a fitting tribute and a fitting punishment for the man who had spent so much time giving the free software movement an identity. Like many Linux aficionados, Murdock had seen the original postings. He'd seen Torvalds' original admonition that Linux was "just a hobby." He'd also seen Torvalds' admission to Minix creator Andrew Tanenbaum: "If the GNU kernel had been ready last spring, I'd not have bothered to even start my project."【120 This quote is taken from the much publicized Torvalds-Tanenbaum "flame war" following the initial release of Linux. In the process of defending his choice of a non-portable monolithic kernel design, Torvalds says he started working on Linux as a way to learn more about his new 386 PC. "If the GNU kernel had been ready last spring, I'd not have bothered to even start my project." See Chris DiBona et al., Open Sources (O'Reilly & Associates, Inc., 1999): 224. 】 Like many, Murdock knew that some opportunities had been missed. He also knew the excitement of watching new opportunities come seeping out of the very fabric of the Internet.
"Being involved with Linux in those early days was fun," recalls Murdock. "At the same time, it was something to do, something to pass the time. If you go back and read those old [comp.os.minix] exchanges, you'll see the sentiment: this is something we can play with until the Hurd is ready. People were anxious. It's funny, but in a lot of ways, I suspect that Linux would never have happened if the Hurd had come along more quickly."
By the end of 1996, however, such "what if" questions were already moot, because Torvalds' kernel had gained a critical mass of users. The 36-month window had closed, meaning that even if the GNU Project had rolled out its Hurd kernel, chances were slim anybody outside the hard-core hacker community would have noticed. Linux, by filling the GNU system's last gap, had achieved the GNU Project's goal of producing a Unix-like free software operating system. However, most of the users did not recognize what had happened: they thought the whole system was Linux, and that Torvalds had done it all. Most of them installed distributions that came with non-free software; with Torvalds as their ethical guide, they saw no principled reason to reject it. Still, a precarious freedom was available for those that appreciated it.
In November, 1995, Peter Salus, a member of the Free Software Foundation and author of the 1994 book, A Quarter Century of Unix, issued a call for papers to members of the GNU Project's "system-discuss" mailing list. Salus, the conference's scheduled chairman, wanted to tip off fellow hackers about the upcoming Conference on Freely Redistributable Software in Cambridge, Massachusetts. Slated for February, 1996, and sponsored by the Free Software Foundation, the event promised to be the first engineering conference solely dedicated to free software and, in a show of unity with other free software programmers, welcomed papers on "any aspect of GNU, Linux, NetBSD, 386BSD, FreeBSD, Perl, Tcl/tk, and other tools for which the code is accessible and redistributable." Salus wrote:
Despite the falling out, Raymond remained active in the free software community - so much so that when Salus suggested a conference pairing Stallman and Torvalds as keynote speakers, Raymond eagerly seconded the idea. With Stallman representing the older, wiser contingent of ITS/Unix hackers and Torvalds representing the younger, more energetic crop of Linux hackers, the pairing represented a symbolic show of unity that could only be beneficial, especially to ambitious younger (i.e., below 40) hackers such as Raymond. "I sort of had a foot in both camps," Raymond says.
Stallman, for his part, doesn't remember any tension at the 1996 conference; he probably wasn't present when Torvalds made that statement. But he does remember later feeling the sting of Torvalds' celebrated "cheekiness." "There was a thing in the Linux documentation which says print out the GNU coding standards and then tear them up," says Stallman, recalling one example. "When you look closely, what he disagreed with was the least important part of it, the recommendation for how to indent C code."
For Raymond, the warm reception other hackers gave to Torvalds' comments confirmed a suspicion: the dividing line separating Linux developers from GNU developers was largely generational. Many Linux hackers, like Torvalds, had grown up in a world of proprietary software. They had begun contributing to free software without perceiving any injustice in non-free software. For most of them, nothing was at stake beyond convenience. Unless a program was technically inferior, they saw little reason to reject it on licensing issues alone. Some day hackers might develop a free software alternative to PowerPoint. Until then, why criticize PowerPoint or Microsoft; why not use it?
Raymond says the response was enthusiastic, but not nearly as enthusiastic as the one he received during the 1997 Linux Kongress, a gathering of GNU/Linux users in Germany the next spring.
Eventually, Raymond would convert the speech into a paper, also titled "The Cathedral and the Bazaar." The paper drew its name from Raymond's central analogy. Previously, programs were "cathedrals," impressive, centrally planned monuments built to stand the test of time. Linux, on the other hand, was more like "a great babbling bazaar," a software program developed through the loose decentralizing dynamics of the Internet.
Raymond's paper associated the Cathedral style, which he and Stallman and many others had used, specifically with the GNU Project and Stallman, thus casting the contrast between development models as a comparison between Stallman and Torvalds. Where Stallman was his chosen example of the classic cathedral architect - i.e., a programming "wizard" who could disappear for 18 months and return with something like the GNU C Compiler - Torvalds was more like a genial dinner-party host. In letting others lead the Linux design discussion and stepping in only when the entire table needed a referee, Torvalds had created a development model very much reflective of his own laid-back personality. From Torvalds' perspective, the most important managerial task was not imposing control but keeping the ideas flowing.
Summarized Raymond, "I think Linus's cleverest and most consequential hack was not the construction of the Linux kernel itself, but rather his invention of the Linux development model."【124 See Eric Raymond, "The Cathedral and the Bazaar" (1997). 】
When Netscape CEO Jim Barksdale cited Raymond's "Cathedral and the Bazaar" essay as a major influence upon the company's decision, the announcement instantly elevated Raymond to the level of hacker celebrity. Raymond invited a few people to talk: Larry Augustin, founder of VA Research, which sold workstations with the GNU/Linux operating system pre-installed; Tim O'Reilly, founder of the publisher O'Reilly & Associates; and Christine Peterson, president of the Foresight Institute, a Silicon Valley think tank specializing in nanotechnology. The meeting's agenda boiled down to one item: how to take advantage of Netscape's decision so that other companies might follow suit.
Snub or no snub, both O'Reilly and Raymond say the term "open source" won over just enough summit-goers to qualify as a success. The attendees shared ideas and experiences and brainstormed on how to improve free software's image. Of key concern was how to point out the successes of free software, particularly in the realm of Internet infrastructure, as opposed to playing up the GNU/Linux challenge to Microsoft Windows. But like the earlier meeting at VA, the discussion soon turned to the problems associated with the term "free software." O'Reilly, the summit host, remembers a comment from Torvalds, a summit attendee.
In addition, Stallman thought that the ideas of "open source" led people to put too much emphasis on winning the support of business. While such support wasn't necessarily bad in itself, he expected that being too desperate for it would lead to harmful compromises. "Negotiation 101 would teach you that if you are desperate to get someone's agreement, you are asking for a bad deal," he says. "You need to be prepared to say no." Summing up his position at the 1999 LinuxWorld Convention and Expo, an event billed by Torvalds himself as a "coming out party" for the "Linux" community, Stallman implored his fellow hackers to resist the lure of easy compromise.
Even before the LinuxWorld show, however, Stallman was showing an increased willingness to alienate open source supporters. A few months after the Freeware Summit, O'Reilly hosted its second annual Perl Conference. This time around, Stallman was in attendance. During a panel discussion lauding IBM's decision to employ the free software Apache web server in its commercial offerings, Stallman, taking advantage of an audience microphone, made a sharp denunciation of panelist John Ousterhout, creator of the Tcl scripting language. Stallman branded Ousterhout a "parasite" on the free software community for marketing a proprietary version of Tcl via Ousterhout's startup company, Scriptics. Ousterhout had stated that Scriptics would contribute only the barest minimum of its improvements to the free version of Tcl, meaning it would in effect use that small contribution to win community approval for a much larger amount of non-free software development. Stallman rejected this position and denounced Scriptics' plans. "I don't think Scriptics is necessary for the continued existence of Tcl," Stallman said to hisses from fellow audience members.【126 Ibid. 】
Stallman's energies would do little to counteract the public-relations momentum of open source proponents. In August of 1998, when chip-maker Intel purchased a stake in GNU/Linux vendor Red Hat, an accompanying New York Times article described the company as the product of a movement "known alternatively as free software and open source."【128 See Amy Harmon, "For Sale: Free Operating System," New York Times (September 28, 1998), http://www.nytimes.com/library/tech/98/09/biztech/articles/28linux.html. 】 Six months later, a John Markoff article on Apple Computer was proclaiming the company's adoption of the "open source" Apache server in the article headline.【129 See John Markoff, "Apple Adopts 'Open Source' for its Server Computers," New York Times (March 17, 1999). 】
Such momentum would coincide with the growing momentum of companies that actively embraced the "open source" term. By August of 1999, Red Hat, a company that now eagerly billed itself as "open source," was selling shares on Nasdaq. In December, VA Linux - formerly VA Research - was floating its own IPO to historic effect. Opening at $30 per share, the company's stock price exploded past the $300 mark in initial trading, only to settle back down to the $239 level. Shareholders lucky enough to get in at the bottom and stay until the end experienced a 698% increase in paper wealth, a Nasdaq record. Eric Raymond, as a board member, owned shares worth $36 million. However, these high prices were temporary; they tumbled when the dot-com boom ended.
These methods won great success for open source, but not for the ideals of free software. What they had done to "spread the message" was to omit the most important part of it: the idea of freedom as an ethical issue. The effects of this omission are visible today: as of 2009, nearly all GNU/Linux distributions include proprietary programs, Torvalds' version of Linux contains proprietary firmware programs, and the company formerly called VA Linux bases its business on proprietary software. Over half of all the world's web servers run some version of Apache, and the usual version of Apache is free software, but many of those sites run a proprietary modified version distributed by IBM.
Ironically, the success of open source and open source advocates such as Raymond would not diminish Stallman's role as a leader - but it would lead many to misunderstand what he is a leader of. Since the free software movement lacks the corporate and media recognition of open source, most users of GNU/Linux do not hear that it exists, let alone what its views are. They have heard the ideas and values of open source, and they never imagine that Stallman might have different ones. Thus he receives messages thanking him for his advocacy of "open source," and explains in response that he has never been a supporter of that, using the occasion to inform the sender about free software.
Four years after "The Cathedral and the Bazaar," Stallman still chafes over the Raymond critique. He also grumbles over Linus Torvalds' elevation to the role of world's most famous hacker. He recalls a popular T-shirt that began showing up at Linux trade shows around 1999. Designed to mimic the original promotional poster for Star Wars, the shirt depicted Torvalds brandishing a lightsaber like Luke Skywalker, while Stallman's face rode atop R2D2. The shirt still grates on Stallman's nerves not only because it depicts him as Torvalds' sidekick, but also because it elevates Torvalds to the leadership role in the free software community, a role even Torvalds himself is loath to accept. "It's ironic," says Stallman mournfully. "Picking up that sword is exactly what Linus refuses to do. He gets everybody focusing on him as the symbol of the movement, and then he won't fight. What good is it?"
Then again, it is that same unwillingness to "pick up the sword," on Torvalds' part, that has left the door open for Stallman to bolster his reputation as the hacker community's ethical arbiter. Despite his grievances, Stallman has to admit that the last few years have been quite good, both to himself and to his organization. Relegated to the periphery by the ironic success of the GNU/Linux system because users thought of it as "Linux," Stallman has nonetheless successfully recaptured the initiative. His speaking schedule between January 2000 and December 2001 included stops on six continents and visits to countries where the notion of software freedom carries heavy overtones - China and India, for example.
Outside the bully pulpit, Stallman has taken advantage of the leverage of the GNU General Public License (GPL), of which he remains the steward. During the summer of 2000, while the air was rapidly leaking out of the 1999 Linux IPO bubble, Stallman and the Free Software Foundation scored two major victories. In July, 2000, Trolltech, a Norwegian software company and developer of Qt, a graphical interface library that ran on the GNU/Linux operating system, announced it was licensing its software under the GPL. A few weeks later, Sun Microsystems, a company that, until then, had been warily trying to ride the open source bandwagon without actually contributing its code, finally relented and announced that it, too, was dual licensing its new OpenOffice【131 Sun was compelled by a trademark complaint to use the clumsy name "OpenOffice.org." 】 application suite under the GNU Lesser General Public License (LGPL) and the Sun Industry Standards Source License (SISSL).
In the case of Trolltech, this victory was the result of a protracted effort by the GNU Project. The non-freeness of Qt was a serious problem for the free software community because KDE, a free graphical desktop environment that was becoming popular, depended on it. Qt was non-free software, but Trolltech had invited free software projects (such as KDE) to use it gratis. Although KDE itself was free software, users who insisted on freedom couldn't run it, since they had to reject Qt. Stallman recognized that many users would want a graphical desktop on GNU/Linux, and most would not value freedom enough to reject the temptation of KDE, with Qt hiding within. The danger was that GNU/Linux would become a motor for the installation of KDE, and therefore also of non-free Qt. This would undermine the freedom which was the purpose of GNU.
To deal with this danger, Stallman recruited people to launch two parallel counter projects. One was GNOME, the GNU free graphical desktop environment. The other was Harmony, a compatible free replacement for Qt. If GNOME succeeded, KDE would not be necessary; if Harmony succeeded, KDE would not need Qt. Either way, users would be able to have a graphical desktop on GNU/Linux without non-free Qt.
Sun had been reluctant to play according to the Free Software Foundation's conditions. At the 1999 O'Reilly Open Source Conference, Sun Microsystems co-founder and chief scientist Bill Joy defended his company's "community source" license, essentially a watered-down compromise letting users copy and modify Sun-owned software but not sell copies of said software without negotiating a royalty agreement with Sun. (With this restriction, the license did not qualify as free, nor for that matter as open source.) A year after Joy's speech, Sun Microsystems vice president Marco Boerries was appearing on the same stage spelling out the company's new licensing compromise in the case of OpenOffice, an office-application suite designed specifically for the GNU/Linux operating system.
What history says about the GNU Project, twenty years from now, will depend on who wins the battle of freedom to use public knowledge. If we lose, we will be just a footnote. If we win, it is uncertain whether people will know the role of the GNU operating system - if they think the system is "Linux," they will build a false picture of what happened and why.
In an effort to drive that image home, Moglen reflects on a shared moment in the spring of 2000. The success of the VA Linux IPO was still resonating in the business media, and a half dozen issues related to free software were swimming through the news. Surrounded by a swirling hurricane of issues and stories each begging for comment, Moglen recalls sitting down for lunch with Stallman and feeling like a castaway dropped into the eye of the storm. For the next hour, he says, the conversation calmly revolved around a single topic: strengthening the GPL.
The story starts in April, 2000. At the time, I was writing stories for the ill-fated web site BeOpen.com. One of my first assignments was a phone interview with Richard M. Stallman. The interview went well, so well that Slashdot (http://www.slashdot.org), the popular "news for nerds" site owned by VA Software, Inc. (formerly VA Linux Systems and before that, VA Research), gave it a link in its daily list of feature stories. Within hours, the web servers at BeOpen were heating up as readers clicked over to the site.
I read your interview with Richard Stallman on BeOpen
with great interest. I've been intrigued by RMS and his
work for some time now and was delighted to find your
piece which I really think you did a great job of capturing
some of the spirit of what Stallman is trying to do with
GNU-Linux and the Free Software Foundation.
I have to admit, getting Stallman to participate in an e-book project was an afterthought on my part. As a reporter who covered the open source beat, I knew Stallman was a stickler. I'd already received a half dozen emails at that point upbraiding me for the use of "Linux" instead of "GNU/Linux."
During the summer, I began to contemplate turning my interview notes into a magazine article. Ethically, I felt in the clear doing so, since the original interview terms said nothing about traditional print media. To be honest, I also felt a bit more comfortable writing about Stallman after eight months of radio silence. Since our telephone conversation in September, I'd only received two emails from Stallman. Both chastised me for using "Linux" instead of "GNU/Linux" in a pair of articles for the web magazine Upside Today. Aside from that, I had enjoyed the silence. In June, about a week after the New York University speech, I took a crack at writing a 5,000-word magazine-length story about Stallman. This time, the words flowed. The distance had helped restore my lost sense of emotional perspective, I suppose.
For me -- for pretty much every writer -- the big problem isn't piracy, it's obscurity (thanks to Tim O'Reilly for this great aphorism). Of all the people who failed to buy this book today, the majority did so because they never heard of it, not because someone gave them a free copy. Mega-hit best-sellers in science fiction sell half a million copies -- in a world where 175,000 attend the San Diego Comic Con alone, you've got to figure that most of the people who "like science fiction" (and related geeky stuff like comics, games, Linux, and so on) just don't really buy books. I'm more interested in getting more of that wider audience into the tent than making sure that everyone who's in the tent bought a ticket to be there.
Hackers blow through those countermeasures. The Xbox was cracked by a kid from MIT who wrote a best-selling book about it, and then the 360 went down, and then the short-lived Xbox Portable (which we all called the "luggable" -- it weighed three pounds!) succumbed. The Universal was supposed to be totally bulletproof. The high school kids who broke it were Brazilian Linux hackers who lived in a favela -- a kind of squatter's slum.
Once the Brazilians published their crack, we all went nuts on it. Soon there were dozens of alternate operating systems for the Xbox Universal. My favorite was ParanoidXbox, a flavor of Paranoid Linux. Paranoid Linux is an operating system that assumes that its operator is under assault from the government (it was intended for use by Chinese and Syrian dissidents), and it does everything it can to keep your communications and documents a secret. It even throws up a bunch of "chaff" communications that are supposed to disguise the fact that you're doing anything covert. So while you're receiving a political message one character at a time, ParanoidLinux is pretending to surf the Web and fill in questionnaires and flirt in chat-rooms. Meanwhile, one in every five hundred characters you receive is your real message, a needle buried in a huge haystack.
Tonight, I'd make the sacrifice. It took about twenty minutes to get up and running. Not having a TV was the hardest part, but eventually I remembered that I had a little overhead LCD projector that had standard TV RCA connectors on the back. I connected it to the Xbox and shone it on the back of my door and got ParanoidLinux installed.
Now I was up and running, and ParanoidLinux was looking for other Xbox Universals to talk to. Every Xbox Universal comes with built-in wireless for multiplayer gaming. You can connect to your neighbors on the wireless link and to the Internet, if you have a wireless Internet connection. I found three different sets of neighbors in range. Two of them had their Xbox Universals also connected to the Internet. ParanoidXbox loved that configuration: it could siphon off some of my neighbors' Internet connections and use them to get online through the gaming network. The neighbors would never miss the packets: they were paying for flat-rate Internet connections, and they weren't exactly doing a lot of surfing at 2AM.
I held up a laptop Jolu and I had rebuilt the night before, from the ground up. "I trust this machine. Every component in it was laid by our own hands. It's running a fresh out-of-the-box version of ParanoidLinux, booted off of the DVD. If there's a trustworthy computer left anywhere in the world, this might well be it.
I set up the laptop on a dry bit of rock and booted it from the DVD with her watching. "I'm going to reboot it for every person. This is a standard ParanoidLinux disc, though I guess you'd have to take my word for it."
My mailbox overflowed with suggestions from people. They sent me dumps off their phones and their pocket-cameras. Then I got an email from a name I recognized -- Dr Eeevil (three "e"s), one of the prime maintainers of ParanoidLinux.
> Luckily, it's not hard to strip out the signatures, if you care to. There's a utility on the ParanoidLinux distro you're using that does this -- it's called photonomous, and you'll find it in /usr/bin. Just read the man pages for documentation. It's simple though.
I went with her into her office to do the scan. I'd expected a stylish, low-powered computer that fit in with her decor, but instead, her spare-bedroom/office was crammed with top-of-the-line PCs, big flat-panel monitors, and a scanner big enough to lay a whole sheet of newsprint on. She was fast with it all, too. I noted with some approval that she was running ParanoidLinux. This lady took her job seriously.
> Within days of the Xnet launch, we went to work on exploiting ParanoidLinux. The exploits so far have been small and insubstantial, but a break is inevitable. Once we have a zero-day break, you're dead.
> Even if they don't break ParanoidLinux, there are poisoned ParanoidXbox distros floating around. They don't match the checksums, but how many people look at the checksums? Besides me and you? Plenty of kids are already dead, though they don't know it.
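The checksum comparison the passage mentions is mechanical once you bother to do it. A minimal Python sketch, with the file name and contents purely illustrative:

```python
import hashlib

def sha256_of(path: str) -> str:
    """Compute the SHA-256 digest of a file, streaming it in chunks
    so even a multi-gigabyte image fits in constant memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 16), b""):
            h.update(chunk)
    return h.hexdigest()

# Stand-in for a downloaded distro image (name and contents illustrative).
with open("image.iso", "wb") as f:
    f.write(b"not a real distro image\n")

# The digest the project would publish alongside the download.
published = sha256_of("image.iso")

# Verification: recompute locally and compare against the published value.
assert sha256_of("image.iso") == published, "digest mismatch: possibly tampered"
```

A poisoned copy would change the digest, so the comparison fails loudly; the scheme only protects you if the published digest itself comes over a trusted channel.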
Whatever -- I didn't have to have anything to do with it, and I got a desk and an office with a storefront, right there on Valencia Street, where we gave away ParanoidXbox CDs and held workshops on building better WiFi antennas. A surprising number of average people dropped in to make personal donations, both of hardware (you can run ParanoidLinux on just about anything, not just Xbox Universals) and cash money. They loved us.
Linux is subversive. Who would have thought even five years ago (1991) that a world-class operating system could coalesce as if by magic out of part-time hacking by several thousand developers scattered all over the planet, connected only by the tenuous strands of the Internet?
Certainly not I. By the time Linux swam onto my radar screen in early 1993, I had already been involved in Unix and open-source development for ten years. I was one of the first GNU contributors in the mid-1980s. I had released a good deal of open-source software onto the net, developing or co-developing several programs (nethack, Emacs's VC and GUD modes, xlife, and others) that are still in wide use today. I thought I knew how it was done.
Linux overturned much of what I thought I knew. I had been preaching the Unix gospel of small tools, rapid prototyping and evolutionary programming for years. But I also believed there was a certain critical complexity above which a more centralized, a priori approach was required. I believed that the most important software (operating systems and really large tools like the Emacs programming editor) needed to be built like cathedrals, carefully crafted by individual wizards or small bands of mages working in splendid isolation, with no beta to be released before its time.
Linus Torvalds's style of development—release early and often, delegate everything you can, be open to the point of promiscuity—came as a surprise. No quiet, reverent cathedral-building here—rather, the Linux community seemed to resemble a great babbling bazaar of differing agendas and approaches (aptly symbolized by the Linux archive sites, who'd take submissions from anyone) out of which a coherent and stable system could seemingly emerge only by a succession of miracles.
The fact that this bazaar style seemed to work, and work well, came as a distinct shock. As I learned my way around, I worked hard not just at individual projects, but also at trying to understand why the Linux world not only didn't fly apart in confusion but seemed to go from strength to strength at a speed barely imaginable to cathedral-builders.
This is the story of that project. I'll use it to propose some aphorisms about effective open-source development. Not all of these are things I first learned in the Linux world, but we'll see how the Linux world gives them particular point. If I'm correct, they'll help you understand exactly what it is that makes the Linux community such a fountain of good software—and, perhaps, they will help you become more productive yourself.
Perhaps this should have been obvious (it's long been proverbial that "Necessity is the mother of invention") but too often software developers spend their days grinding away for pay at programs they neither need nor love. But not in the Linux world—which may explain why the average quality of software originated in the Linux community is so high.
Linus Torvalds, for example, didn't actually try to write Linux from scratch. Instead, he started by reusing code and ideas from Minix, a tiny Unix-like operating system for PC clones. Eventually all the Minix code went away or was completely rewritten—but while it was there, it provided scaffolding for the infant that would eventually become Linux.
The source-sharing tradition of the Unix world has always been friendly to code reuse (this is why the GNU project chose Unix as a base OS, in spite of serious reservations about the OS itself). The Linux world has taken this tradition nearly to its technological limit; it has terabytes of open sources generally available. So spending time looking for someone else's almost-good-enough is more likely to give you good results in the Linux world than anywhere else.
But I had a more theoretical reason to think switching might be a good idea as well, something I learned long before Linux.
Another strength of the Unix tradition, one that Linux pushes to a happy extreme, is that a lot of users are hackers too. Because source code is available, they can be effective hackers. This can be tremendously useful for shortening debugging time. Given a bit of encouragement, your users will diagnose problems, suggest fixes, and help improve the code far more quickly than you could unaided.
In fact, I think Linus's cleverest and most consequential hack was not the construction of the Linux kernel itself, but rather his invention of the Linux development model. When I expressed this opinion in his presence once, he smiled and quietly repeated something he has often said: "I'm basically a very lazy person who likes to get credit for things other people actually do." Lazy like a fox. Or, as Robert Heinlein famously wrote of one of his characters, too lazy to fail.
In retrospect, one precedent for the methods and success of Linux can be seen in the development of the GNU Emacs Lisp library and Lisp code archives. In contrast to the cathedral-building style of the Emacs C core and most other GNU tools, the evolution of the Lisp code pool was fluid and very user-driven. Ideas and prototype modes were often rewritten three or four times before reaching a stable final form. And loosely-coupled collaborations enabled by the Internet, a la Linux, were frequent.
Indeed, my own most successful single hack previous to fetchmail was probably Emacs VC (version control) mode, a Linux-like collaboration by email with three other people, only one of whom (Richard Stallman, the author of Emacs and founder of the Free Software Foundation) I have met to this day. It was a front-end for SCCS, RCS and later CVS from within Emacs that offered "one-touch" version control operations. It evolved from a tiny, crude sccs.el mode somebody else had written. And the development of VC succeeded because, unlike Emacs itself, Emacs Lisp code could go through release/test/improve generations very quickly.
Early and frequent releases are a critical part of the Linux development model. Most developers (including me) used to believe this was bad policy for larger than trivial projects, because early versions are almost by definition buggy versions and you don't want to wear out the patience of your users.
The most important of these, the Ohio State Emacs Lisp archive, anticipated the spirit and many of the features of today's big Linux archives. But few of us really thought very hard about what we were doing, or about what the very existence of that archive suggested about problems in the FSF's cathedral-building development model. I made one serious attempt around 1992 to get a lot of the Ohio code formally merged into the official Emacs Lisp library. I ran into political trouble and was largely unsuccessful.
But by a year later, as Linux became widely visible, it was clear that something different and much healthier was going on there. Linus's open development policy was the very opposite of cathedral-building. Linux's Internet archives were burgeoning, multiple distributions were being floated. And all of this was driven by an unheard-of frequency of core system releases.
I didn't think so. Granted, Linus is a damn fine hacker. How many of us could engineer an entire production-quality operating system kernel from scratch? But Linux didn't represent any awesome conceptual leap forward. Linus is not (or at least, not yet) an innovative genius of design in the way that, say, Richard Stallman or James Gosling (of NeWS and Java) are. Rather, Linus seems to me to be a genius of engineering and implementation, with a sixth sense for avoiding bugs and development dead-ends and a true knack for finding the minimum-effort path from point A to point B. Indeed, the whole design of Linux breathes this quality and mirrors Linus's essentially conservative and simplifying design approach.
And that's it. That's enough. If "Linus's Law" is false, then any system as complex as the Linux kernel, being hacked over by as many hands as that kernel was, should at some point have collapsed under the weight of unforeseen bad interactions and undiscovered "deep" bugs. If it's true, on the other hand, it is sufficient to explain Linux's relative lack of bugginess and its continuous uptimes spanning months or even years.
One special feature of the Linux situation that clearly helps along the Delphi effect is the fact that the contributors for any given project are self-selected. An early respondent pointed out that contributions are received not from a random sample, but from people who are interested enough to use the software, learn about how it works, attempt to find solutions to problems they encounter, and actually produce an apparently reasonable fix. Anyone who passes all these filters is highly likely to have something useful to contribute.
In practice, the theoretical loss of efficiency due to duplication of work by debuggers almost never seems to be an issue in the Linux world. One effect of a "release early and often" policy is to minimize such duplication by propagating fed-back fixes quickly [JH].
Linus coppers his bets, too. In case there are serious bugs, Linux kernel versions are numbered in such a way that potential users can make a choice either to run the last version designated "stable" or to ride the cutting edge and risk bugs in order to get new features. This tactic is not yet systematically imitated by most Linux hackers, but perhaps it should be; the fact that either choice is available makes both more attractive. [HBS]
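The numbering convention in use at the time (through the kernel's 2.x era) encoded the stable/experimental split in the second component: even meant a stable series (2.0, 2.2, 2.4), odd a development series (2.1, 2.3, 2.5). A minimal sketch of that rule:

```python
def is_stable_series(version: str) -> bool:
    """Classify a kernel version under the historical even/odd scheme:
    an even second component (2.0, 2.2, 2.4) marked a stable series,
    an odd one (2.1, 2.3, 2.5) a development series."""
    minor = int(version.split(".")[1])
    return minor % 2 == 0

# A cautious user tracks 2.4.x; a thrill-seeker runs 2.5.x.
assert is_stable_series("2.4.18")
assert not is_stable_series("2.5.7")
```

The point is not the parity trick itself but that the version string alone tells a potential user which risk profile they are signing up for.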
Linux and fetchmail both went public with strong, attractive basic designs. Many people thinking about the bazaar model as I have presented it have correctly considered this critical, then jumped from that to the conclusion that a high degree of design intuition and cleverness in the project leader is indispensable.
But Linus got his design from Unix. I got mine initially from the ancestral popclient (though it would later change a great deal, much more proportionately speaking than has Linux). So does the leader/coordinator for a bazaar-style effort really have to have exceptional design talent, or can he get by through leveraging the design talent of others?
Both the Linux and fetchmail projects show evidence of this. Linus, while not (as previously discussed) a spectacularly original designer, has displayed a powerful knack for recognizing good design and integrating it into the Linux kernel. And I have already described how the single most powerful design idea in fetchmail (SMTP forwarding) came from somebody else.
So I believe the fetchmail project succeeded partly because I restrained my tendency to be clever; this argues (at least) against design originality being essential for successful bazaar projects. And consider Linux. Suppose Linus Torvalds had been trying to pull off fundamental innovations in operating system design during the development; does it seem at all likely that the resulting kernel would be as stable and successful as what we have?
So it was with Carl Harris and the ancestral popclient, and so with me and fetchmail. But this has been understood for a long time. The interesting point, the point that the histories of Linux and fetchmail seem to demand we focus on, is the next stage—the evolution of software in the presence of a large and active community of users and co-developers.
In The Mythical Man-Month, Fred Brooks observed that programmer time is not fungible; adding developers to a late software project makes it later. As we've seen previously, he argued that the complexity and communication costs of a project rise with the square of the number of developers, while work done only rises linearly. Brooks's Law has been widely regarded as a truism. But we've examined in this essay a number of ways in which the process of open-source development falsifies the assumptions behind it—and, empirically, if Brooks's Law were the whole picture Linux would be impossible.
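Brooks's quadratic cost is just the count of pairwise communication channels among n developers, n(n-1)/2, set against work capacity that grows only as n. A short sketch of the arithmetic:

```python
def channels(n: int) -> int:
    """Pairwise communication paths among n developers: n(n-1)/2."""
    return n * (n - 1) // 2

# Work capacity grows linearly while coordination overhead grows
# quadratically, so the ratio of channels to workers keeps climbing:
for n in (5, 10, 50, 100):
    print(f"{n:>4} developers -> {channels(n):>5} channels")
```

At 5 developers there are 10 channels; at 100 there are 4,950. Raymond's claim is not that this count is wrong, but that cheap communication and self-selected contributors change which of those channels actually have to carry traffic.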
The bazaar method, by harnessing the full power of the "egoless programming" effect, strongly mitigates the effect of Brooks's Law. The principle behind Brooks's Law is not repealed, but given a large developer population and cheap communications its effects can be swamped by competing nonlinearities that are not otherwise visible. This resembles the relationship between Newtonian and Einsteinian physics—the older system is still valid at low energies, but if you push mass and velocity high enough you get surprises like nuclear explosions or Linux.
The history of Unix should have prepared us for what we're learning from Linux (and what I've verified experimentally on a smaller scale by deliberately copying Linus's methods [EGCS]). That is, while coding remains an essentially solitary activity, the really great hacks come from harnessing the attention and brainpower of entire communities. The developer who uses only his or her own brain in a closed project is going to fall behind the developer who knows how to create an open, evolutionary context in which feedback exploring the design space, code contributions, bug-spotting, and other improvements come from hundreds (perhaps thousands) of people.
Linux was the first project for which a conscious and successful effort to use the entire world as its talent pool was made. I don't think it's a coincidence that the gestation period of Linux coincided with the birth of the World Wide Web, and that Linux left its infancy during the same period in 1993–1994 that saw the takeoff of the ISP industry and the explosion of mainstream interest in the Internet. Linus was the first person who learned how to play by the new rules that pervasive Internet access made possible.
While cheap Internet was a necessary condition for the Linux model to evolve, I think it was not by itself a sufficient condition. Another vital factor was the development of a leadership style and set of cooperative customs that could allow developers to attract co-developers and get maximum leverage out of the medium.
The "severe effort of many converging wills" is precisely what a project like Linux requires—and the "principle of command" is effectively impossible to apply among volunteers in the anarchist's paradise we call the Internet. To operate and compete effectively, hackers who want to lead collaborative projects have to learn how to recruit and energize effective communities of interest in the mode vaguely suggested by Kropotkin's "principle of understanding". They must learn to use Linus's Law.[SP]
Earlier I referred to the "Delphi effect" as a possible explanation for Linus's Law. But more powerful analogies to adaptive systems in biology and economics also irresistibly suggest themselves. The Linux world behaves in many respects like a free market or an ecology, a collection of selfish agents attempting to maximize utility which in the process produces a self-correcting spontaneous order more elaborate and efficient than any amount of central planning could have achieved. Here, then, is the place to seek the "principle of understanding".
The "utility function" Linux hackers are maximizing is not classically economic, but is the intangible of their own ego satisfaction and reputation among other hackers. (One may call their motivation "altruistic", but this ignores the fact that altruism is itself a form of ego satisfaction for the altruist). Voluntary cultures that work this way are not actually uncommon; one other in which I have long participated is science fiction fandom, which unlike hackerdom has long explicitly recognized "egoboo" (ego-boosting, or the enhancement of one's reputation among other fans) as the basic drive behind volunteer activity.
Linus, by successfully positioning himself as the gatekeeper of a project in which the development is mostly done by others, and nurturing interest in the project until it became self-sustaining, has shown an acute grasp of Kropotkin's "principle of shared understanding". This quasi-economic view of the Linux world enables us to see how that understanding is applied.
Many people (especially those who politically distrust free markets) would expect a culture of self-directed egoists to be fragmented, territorial, wasteful, secretive, and hostile. But this expectation is clearly falsified by (to give just one example) the stunning variety, quality, and depth of Linux documentation. It is a hallowed given that programmers hate documenting; how is it, then, that Linux hackers generate so much documentation? Evidently Linux's free market in egoboo works better to produce virtuous, other-directed behavior than the massively-funded documentation shops of commercial software producers.
Both the fetchmail and Linux kernel projects show that by properly rewarding the egos of many other hackers, a strong developer/coordinator can use the Internet to capture the benefits of having lots of co-developers without having a project collapse into a chaotic mess. So to Brooks's Law I counter-propose the following:
Perhaps this is not only the future of open-source software. No closed-source developer can match the pool of talent the Linux community can bring to bear on a problem. Very few could afford even to hire the more than 200 (1999: 600, 2000: 800) people who have contributed to fetchmail!
This suggests a reason for questioning the advantages of conventionally-managed software development that is independent of the rest of the arguments over cathedral vs. bazaar mode. If it's possible for GNU Emacs to express a consistent architectural vision over 15 years, or for an operating system like Linux to do the same over 8 years of rapidly changing hardware and platform technology; and if (as is indeed the case) there have been many well-architected open-source projects of more than 5 years duration -- then we are entitled to wonder what, if anything, the tremendous overhead of conventionally-managed development is actually buying us.
Whatever it is, it certainly doesn't include reliable execution by deadline, or on budget, or to all features of the specification; it's a rare "managed" project that meets even one of these goals, let alone all three. It also does not appear to be the ability to adapt to changes in technology and economic context during the project lifetime, either; the open-source community has proven far more effective on that score (as one can readily verify, for example, by comparing the 30-year history of the Internet with the short half-lives of proprietary networking technologies—or the cost of the 16-bit to 32-bit transition in Microsoft Windows with the nearly effortless upward migration of Linux during the same period, not only along the Intel line of development but to more than a dozen other hardware platforms, including the 64-bit Alpha as well).
Relating to your own work process with fear and loathing (even in the displaced, ironic way suggested by hanging up Dilbert cartoons) should therefore be regarded in itself as a sign that the process has failed. Joy, humor, and playfulness are indeed assets; it was not mainly for the alliteration that I wrote of "happy hordes" above, and it is no mere joke that the Linux mascot is a cuddly, neotenous penguin.
In the mean time, however, the open-source idea has scored successes and found backers elsewhere. Since the Netscape release we've seen a tremendous explosion of interest in the open-source development model, a trend both driven by and driving the continuing success of the Linux operating system. The trend Mozilla touched off is continuing at an accelerating rate.
Jon Johansen, a 16-year-old Norwegian, was the unwitting catalyst for one of the most important cases interpreting the DMCA. He and two anonymous helpers wrote a program called DeCSS. Depending on whom you listen to, DeCSS is described either as a way of allowing people who use Linux or other open source operating systems to play DVDs on their computers, or as a tool for piracy that threatened the entire movie industry and violated the DMCA.
Let us return to Mr. Johansen, the 16-year-old Norwegian. He and his two anonymous collaborators claimed that they were affected by another limitation imposed by the CSS licensing body. At that time, there was no way to play DVDs on a computer running Linux, or any other free or open source operating system. (I will talk more about free and open source software later.) Let's say you buy a laptop. A Sony Vaio running Windows, for example. It has a slot in the side for DVDs to slide in and software that comes along with it which allows the DVD reader to decode and play the disk. The people who wrote the software have been licensed by the DVD Copy Control Association and provided with a CSS key. But at the time Mr. Johansen set out to create DeCSS, the licensing body had not licensed keys to any free or open source software developers. Say Mr. Johansen buys the Sony Vaio, but with the Linux operating system on it instead of Windows. The computer is the same. The little slot is still there. Writing an open source program to control the DVD player is trivial. But without the CSS key, there is no way for the player to decode and play the movie. (The licensing authority later did license an open source player, perhaps because they realized its unavailability gave Mr. Johansen a strong defense, perhaps because they feared an antitrust suit, or perhaps because they just got around to it.)
“2600: The Hacker Quarterly has included articles on such topics as how to steal an Internet domain name, how to write more secure ASP code, access other people's e-mail, secure your Linux box, intercept cellular phone calls, how to put Linux on an Xbox, how to remove spyware, and break into the computer systems at Costco stores and Federal Express. One issue contains a guide to the federal criminal justice system for readers charged with computer hacking. In addition, 2600 operates a web site located at 2600.com (http://www.2600.com), which is managed primarily by Mr. Corley and has been in existence since 1995.”
This was the issue in Reimerdes. True, if I cut through the digital fence on a DVD in order to excerpt a small portion in a critical documentary, I would not be violating your copyright, but I would be violating the anticircumvention provisions. And DeCSS seemed to be a tool for doing what the DMCA forbids. By providing links to it, Mr. Corley and 2600 were “trafficking” in a technology that allows others to circumvent a technological protection measure. DeCSS could, of course, be used for purposes that did not violate copyright—to make the DVD play on a computer running Linux, for example. It enabled various noninfringing fair uses. It could also be used to aid illicit copying. But the alleged violation of the DMCA had nothing to do with that. The alleged violation of the DMCA was making the digital wire cutters available in the first place. So one First Amendment problem with the DMCA can be stated quite simply. It appeared to make it illegal to exercise at least some of the limitations and exceptions copyright law needs in order to pass First Amendment scrutiny. Or did it just make it very, very difficult to exercise those rights legally? I could, after all, make a videotape of the DVD playing on my television, and use that grainy, blurry image in my documentary criticizing the filmmaker. The DMCA would not be violated, though my movie might be painful to watch.
Congress could have passed many laws less restrictive than the DMCA. It could have only penalized the use of programs such as DeCSS for an illicit purpose. If it wished to reach those who create the tools as well as use them, it could have required proof that the creator intended them to be used for illegal purposes. Just as we look at the government's intention in creating the law, we could make the intent of the software writer critical for the purposes of assessing whether or not his actions are illegal. If I write a novel detailing a clever way to kill someone and you use it to carry out a real murder, the First Amendment does not allow the state to punish me. If I write a manual on how to be a hit man and sell it to you, it may. First Amendment law is generally skeptical of statutes that impose “strict liability” without a requirement of intent. But Judge Kaplan believed that the DMCA made the motives of Mr. Johansen irrelevant, except insofar as they were relevant to the narrowly tailored exceptions of the DMCA, such as encryption research. In other words, even if Mr. Johansen made DeCSS so that he and his friends could watch DVDs they purchased legally on computers running Linux, they could still be liable for breaking the DMCA.
The legal implementation of this conclusion would be simple. It would be unconstitutional to punish an individual for gaining access in order to make a fair use. However, if they cut down the digital fence to make illicit copies, both the cutting and the copying would be illegal. But what about the prohibition of trafficking in digital wire cutters, technologies such as DeCSS? There the constitutional question is harder. I would argue that the First Amendment requires an interpretation of the antitrafficking provisions that comes closer to the ruling in the Sony case. If Mr. Johansen did indeed make DeCSS to play DVDs on his Linux computer, and if that were indeed a substantial noninfringing use, then it cannot be illegal for him to develop the technology. But I accept that this is a harder line to draw constitutionally. About my first conclusion, though, I think the argument is both strong and clear.
The creators of free and open source software were able to use the fact that software is copyrighted, and that the right attaches automatically upon creation and fixation, to set up new, distributed methods of innovation. For example, free and open source software under the General Public License—such as Linux—is a “commons” to which all are granted access. Anyone may use the software without any restrictions. They are guaranteed access to the human-readable “source code,” rather than just the inscrutable “machine code,” so that they can understand, tinker, and modify. Modifications can be distributed so long as the new creation is licensed under the open terms of the original. This creates a virtuous cycle: each addition builds on the commons and is returned to it. The copyright over the software was the “hook” that allowed software engineers to create a license that gave free access and the right to modify and required future programmers to keep offering those freedoms. Without the copyright, those features of the license would not have been enforceable. For example, someone could have modified the open program and released it without the source code—denying future users the right to understand and modify easily. To use an analogy beloved of free software enthusiasts, the hood of the car would be welded shut. Home repair, tinkering, customization, and redesign become practically impossible.
For anyone interested in the way that networks can enable new collaborative methods of production, the free software movement, and the broader but less political movement that goes under the name of open source software, provide interesting case studies.【216 See Glyn Moody, Rebel Code: Linux and the Open Source Revolution (Cambridge, Mass.: Perseus Pub., 2001); Peter Wayner, Free for All: How Linux and the Free Software Movement Undercut the High-Tech Titans (New York: HarperBusiness, 2000); Eben Moglen, “Anarchism Triumphant: Free Software and the Death of Copyright,” First Monday 4 (1999), ‹http://firstmonday.org/htbin/cgiwrap/bin/ojs/index.php/fm/article/view/684/594› [Ed. note: originally published as ‹http://firstmonday.org/issues/issue4_8/index.html›, the link has changed]. 】 Open source software is released under a series of licenses, the most important being the General Public License (GPL). The GPL specifies that anyone may copy the software, provided the license remains attached and the source code for the software always remains available.【217 Proprietary, or “binary only,” software is generally released only after the source code has been compiled into machine-readable object code, a form that is impenetrable to the user. Even if you were a master programmer, and the provisions of the Copyright Act, the appropriate licenses, and the DMCA did not forbid you from doing so, you would be unable to modify commercial proprietary software to customize it for your needs, remove a bug, or add a feature. Open source programmers say, disdainfully, that it is like buying a car with the hood welded shut. See, e.g., Wayner, Free for All, 264. 】 Users may add to or modify the code, may build on it and incorporate it into their own work, but if they do so, then the new program created is also covered by the GPL. 
Some people refer to this as the “viral” nature of the license; others find the term offensive.【218 See Brian Behlendorf, “Open Source as a Business Strategy,” in Open Sources: Voices from the Open Source Revolution, ed. Chris DiBona et al. (Sebastopol, Calif.: O'Reilly, 1999), 149, 163. 】 The point, however, is that the open quality of the creative enterprise spreads. It is not simply a donation of a program or a work to the public domain, but a continual accretion in which all gain the benefits of the program on pain of agreeing to give their additions and innovations back to the communal project.
Governments have taken notice. The United Kingdom, for example, concluded last year that open source software “will be considered alongside proprietary software and contracts will be awarded on a value-for-money basis.” The Office of Government Commerce said open source software is “a viable desktop alternative for the majority of government users” and “can generate significant savings. . . . These trials have proved that open source software is now a real contender alongside proprietary solutions. If commercial companies and other governments are taking it seriously, then so must we.”【221 “UK Government Report Gives Nod to Open Source,” Desktop Linux (October 28, 2004), available at ‹http://www.desktoplinux.com/news/NS5013620917.html›. 】 Sweden found open source software to be in many cases “equivalent to—or better than—commercial products” and concluded that software procurement “shall evaluate open software as well as commercial solutions, to provide better competition in the market.”【222 “Cases of Official Recognition of Free and Open Source Software,” available at ‹http://ec.europa.eu/information_society/activities/opensource/cases/index_en.htm›. 】
Yochai Benkler and I would argue that these questions are fun to debate but ultimately irrelevant.【226 Benkler's reasoning is characteristically elegant, even formal in its precision, while mine is clunkier. See Yochai Benkler, “Coase's Penguin, or, Linux and the Nature of the Firm,” Yale Law Journal 112 (2002): 369–446. 】 Assume a random distribution of incentive structures in different people, a global network—transmission, information sharing, and copying costs that approach zero—and a modular creation process. With these assumptions, it just does not matter why they do it. In lots of cases, they will do it. One person works for love of the species, another in the hope of a better job, a third for the joy of solving puzzles, and a fourth because he has to solve a particular problem anyway for his own job and loses nothing by making his hack available for all. Each person has their own reserve price, the point at which they say, “Now I will turn off Survivor and go and create something.” But on a global network, there are a lot of people, and with numbers that big and information overhead that small, even relatively hard projects will attract motivated and skilled people whose particular reserve price has been crossed.
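The large-numbers argument above can be restated as a toy probability model. This is my own illustrative construction, not Benkler's formalism: each person draws a random motivation level, and a project finds a contributor if anyone's motivation clears its difficulty.

```python
import random

def p_someone_contributes(n_people: int, difficulty: float,
                          trials: int = 2000, seed: int = 42) -> float:
    """Toy model: each person draws a motivation level uniformly from
    [0, 1); a project attracts at least one contributor in a trial if
    anyone's motivation exceeds the project's difficulty. Returns the
    estimated probability over many trials."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        if any(rng.random() > difficulty for _ in range(n_people)):
            hits += 1
    return hits / trials

# For a hard project (difficulty 0.999), ten people almost never
# suffice, but a network of ten thousand almost always does.
print(p_someone_contributes(10, 0.999))
print(p_someone_contributes(10_000, 0.999))
```

The closed form is 1 - difficulty^n, which goes to 1 as n grows for any difficulty below certainty; that is the whole force of "on a global network, there are a lot of people."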
More conventionally, many people write free software because they are paid to do so. Amazingly, IBM now earns more from what it calls “Linux-related revenues” than it does from traditional patent licensing, and IBM is the largest patent holder in the world.【227 Yochai Benkler, The Wealth of Networks: How Social Production Transforms Markets and Freedom (New Haven, Conn.: Yale University Press, 2006), 46–47. 】 It has decided that the availability of an open platform, to which many firms and individuals contribute, will actually allow it to sell more of its services, and, for that matter, its hardware. A large group of other companies seem to agree. They like the idea of basing their services, hardware, and added value on a widely adopted “commons.” This does not seem like a community in decline.
People used to say that collaborative creation could never produce a quality product. That has been shown to be false. So now they say that collaborative creation cannot be sustained because the governance mechanisms will not survive the success of the project. Professor Epstein conjures up a “central committee” from which insiders will be unable to cash out—a nice mixture of communist and capitalist metaphors. All governance systems—including democracies and corporate boards—have problems. But so far as we can tell, those who are influential in the free software and open source governance communities (there is, alas, no “central committee”) feel that they are doing very well indeed. In the last resort, when they disagree with decisions that are taken, there is always the possibility of “forking the code,” introducing a change to the software that not everyone agrees with, and then letting free choice and market selection converge on the preferred iteration. The free software ecosystem also exhibits diversity. Systems based on GNU-Linux, for example, have distinct “flavors” with names like Ubuntu, Debian, and Slackware, each with passionate adherents and each optimized for a particular concern—beauty, ease of use, technical manipulability. So far, the tradition of “rough consensus and running code” seems to be proving itself empirically as a robust governance system.
The most remarkable and important book on “distributed creativity” and the sharing economy is Yochai Benkler, The Wealth of Networks: How Social Production Transforms Markets and Freedom (New Haven, Conn.: Yale University Press, 2006). Benkler sets the idea of “peer production” alongside other mechanisms of market and political governance and offers a series of powerful normative arguments about why we should prefer that future. Comprehensive though this book may seem, it is incomplete unless it is read in conjunction with one of Benkler's essays: Yochai Benkler, “Coase's Penguin, or, Linux and the Nature of the Firm,” Yale Law Journal 112 (2002): 369–446. In that essay, Benkler puts forward the vital argument—described in this chapter—about what collaborative production does to Coase's theory of the firm.
Free and open source software has been a subject of considerable interest to commentators. Glyn Moody's Rebel Code: Inside Linux and the Open Source Revolution (Cambridge, Mass.: Perseus Pub., 2001), and Peter Wayner's Free for All: How Linux and the Free Software Movement Undercut the High-Tech Titans (New York: HarperBusiness, 2000), both offer readable and accessible histories of the phenomenon. Eric S. Raymond, The Cathedral and the Bazaar: Musings on Linux and Open Source by an Accidental Revolutionary, revised edition (Sebastopol, Calif.: O'Reilly, 2001), is a classic philosophy of the movement, written by a key participant—author of the phrase, famous among geeks, “given enough eyeballs, all bugs are shallow.” Steve Weber, in The Success of Open Source (Cambridge, Mass.: Harvard University Press, 2004), offers a scholarly argument that the success of free and open source software is not an exception to economic principles but a vindication of them. I agree, though the emphasis that Benkler and I put forward is rather different. To get a sense of the argument that free software (open source software's normatively charged cousin) is desirable for its political and moral implications, not just because of its efficiency or commercial success, one should read the essays of Richard Stallman, the true father of free software and a fine polemical, but rigorous, essayist. Richard Stallman, Free Software, Free Society: Selected Essays of Richard M. Stallman, ed. Joshua Gay (Boston: GNU Press, 2002). Another strong collection of essays can be found in Joseph Feller, Brian Fitzgerald, Scott A. Hissam, and Karim R. Lakhani, eds., Perspectives on Free and Open Source Software (Cambridge, Mass.: MIT Press, 2005).
If you only have time to read a single essay on the subject it should be Eben Moglen's “Anarchism Triumphant: Free Software and the Death of Copyright,” First Monday 4 (1999), available at ‹http://firstmonday.org/htbin/cgiwrap/bin/ojs/index.php/fm/article/view/684/594› [Ed. note: originally published as ‹http://www.firstmonday.dk/issues/issue4_8/moglen/›, the link has changed].
It is easy to miss these changes. They run against the grain of some of our most basic Economics 101 intuitions, intuitions honed in the industrial economy at a time when the only serious alternative seen was state Communism--an alternative almost universally considered unattractive today. The undeniable economic success of free software has prompted some leading-edge economists to try to understand why many thousands of loosely networked free software developers can compete with Microsoft at its own game and produce a massive operating system--GNU/Linux. That growing literature, consistent with its own goals, has focused on software and the particulars of the free and open-source software development communities, although Eric von Hippel's notion of "user-driven innovation" has begun to expand that focus to thinking about how individual need and creativity drive innovation at the individual level, and its diffusion through networks of like-minded individuals. The political implications of free software have been central to the free software movement and its founder, Richard Stallman, and were developed provocatively and with great insight by Eben Moglen. Free software is but one salient example of a much broader phenomenon. Why can fifty thousand volunteers successfully coauthor Wikipedia, the most serious online alternative to the Encyclopedia Britannica, and then turn around and give it away for free? Why do 4.5 million volunteers contribute their leftover computer cycles to create the most powerful supercomputer on Earth, SETI@home? Without a broadly accepted analytic model to explain these phenomena, we tend to treat them as curiosities, perhaps transient fads, possibly of significance in one market segment or another.
We should try instead to see them for what they are: a new mode of production emerging in the middle of the most advanced economies in the world--those that are the most fully computer networked and for which information goods and services have come to occupy the highest-valued roles.
An excellent example of a business strategy based on nonexclusivity is IBM's. The firm has obtained the largest number of patents every year from 1993 to 2004, amassing in total more than 29,000 patents. IBM has also, however, been one of the firms most aggressively engaged in adapting its business model to the emergence of free software. Figure 2.1 shows what happened to the relative weight of patent royalties, licenses, and sales in IBM's revenues and revenues that the firm described as coming from "Linux-related services." Within a span of four years, the Linux-related services category moved from accounting for practically no revenues, to providing double the revenues from all patent-related sources, of the firm that has been the most patent-productive in the United States. IBM has described itself as having invested more than a billion dollars in free software development, hired programmers to help develop the Linux kernel and other free software, and donated patents to the Free Software Foundation. What this does for the firm is provide it with a better operating system for its server business--making the servers better, faster, more reliable, and therefore more valuable to consumers. Participating in free software development has also allowed IBM to develop service relationships with its customers, building on free software to offer customer-specific solutions. In other words, IBM has combined both supply-side and demand-side strategies to adopt a nonproprietary business model that has generated more than $2 billion yearly of business for the firm. Its strategy is, if not symbiotic, certainly complementary to free software.
Industrial organization literature provides a prominent place for the transaction costs view of markets and firms, based on insights of Ronald Coase and Oliver Williamson. On this view, people use markets when the gains from doing so, net of transaction costs, exceed the gains from doing the same thing in a managed firm, net of the costs of organizing and managing a firm. Firms emerge when the opposite is true, and transaction costs can best be reduced by bringing an activity into a managed context that requires no individual transactions to allocate this resource or that effort. The emergence of free and open-source software, and the phenomenal success of its flagships, the GNU/Linux operating system, the Apache Web server, Perl, and many others, should cause us to take a second look at this dominant paradigm.【18 For an excellent history of the free software movement and of open-source development, see Glyn Moody, Rebel Code: Inside Linux and the Open Source Revolution (New York: Perseus Publishing, 2001). 】 Free software projects do not rely on markets or on managerial hierarchies to organize production. Programmers do not generally participate in a project because someone who is their boss told them to, though some do. They do not generally participate in a project because someone offers them a price to do so, though some participants do focus on long-term appropriation through money-oriented activities, like consulting or service contracts. However, the critical mass of participation in projects cannot be explained by the direct presence of a price or even a future monetary return. This is particularly true of the all-important, microlevel decisions: who will work, with what software, on what project. In other words, programmers participate in free software projects without following the signals generated by market-based, firm-based, or hybrid models.
In chapter 2 I focused on how the networked information economy departs from the industrial information economy by improving the efficacy of nonmarket production generally. Free software offers a glimpse at a more basic and radical challenge. It suggests that the networked environment makes possible a new modality of organizing production: radically decentralized, collaborative, and nonproprietary; based on sharing resources and outputs among widely distributed, loosely connected individuals who cooperate with each other without relying on either market signals or managerial commands. This is what I call "commons-based peer production."
Free software has played a critical role in the recognition of peer production, because software is a functional good with measurable qualities. It can be more or less authoritatively tested against its market-based competitors. And, in many instances, free software has prevailed. About 70 percent of Web server software, in particular for critical e-commerce sites, runs on the Apache Web server--free software.【21 Netcraft, April 2004 Web Server Survey, ‹http://news.netcraft.com/archives/web_server_survey.html›. 】 More than half of all back-office e-mail functions are run by one free software program or another. Google, Amazon, and CNN.com, for example, run their Web servers on the GNU/Linux operating system. They do this, presumably, because they believe this peer-produced operating system is more reliable than the alternatives, not because the system is "free." It would be absurd to risk a higher rate of failure in their core business activities in order to save a few hundred thousand dollars on licensing fees. Companies like IBM and Hewlett Packard, consumer electronics manufacturers, as well as military and other mission-critical government agencies around the world have begun to adopt business and service strategies that rely on and extend free software. They do this because it allows them to build better equipment, sell better services, or better fulfill their public role, even though they do not control the software development process and cannot claim proprietary rights of exclusion in the products of their contributions.
The next major step came when a person with a more practical, rather than prophetic, approach to his work began developing one central component of the operating system--the kernel. Linus Torvalds began to share the early implementations of his kernel, called Linux, with others, under the GPL. These others then modified, added, contributed, and shared among themselves these pieces of the operating system. Building on top of Stallman's foundation, Torvalds crystallized a model of production that was fundamentally different from those that preceded it. His model was based on voluntary contributions and ubiquitous, recursive sharing; on small incremental improvements to a project by widely dispersed people, some of whom contributed a lot, others a little. Based on our usual assumptions about volunteer projects and decentralized production processes that have no managers, this was a model that could not succeed. But it did.
The most surprising thing that the open source movement has shown, in real life, is that this simple model can operate on very different scales, from the small, three-person model I described for simple projects, up to the many thousands of people involved in writing the Linux kernel and the GNU/Linux operating system--an immensely difficult production task. SourceForge, the most popular hosting site and meeting place for such projects, has close to 100,000 registered projects, and nearly a million registered users. The economics of this phenomenon are complex. In the larger-scale models, actual organization form is more diverse than the simple, three-person model. In particular, in some of the larger projects, most prominently the Linux kernel development process, a certain kind of meritocratic hierarchy is clearly present. However, it is a hierarchy that is very different in style, practical implementation, and organizational role from that of the manager in the firm. I explain this in chapter 4, as part of the analysis of the organizational forms of peer production. For now, all we need is a broad outline of how peer-production projects look, as we turn to observe case studies of kindred production models in areas outside of software.
First and foremost, the Wikipedia project is self-consciously an encyclopedia--rather than a dictionary, discussion forum, web portal, etc. Wikipedia's participants commonly follow, and enforce, a few basic policies that seem essential to keeping the project running smoothly and productively. First, because we have a huge variety of participants of all ideologies, and from around the world, Wikipedia is committed to making its articles as unbiased as possible. The aim is not to write articles from a single objective point of view--this is a common misunderstanding of the policy--but rather, to fairly and sympathetically present all views on an issue. See "neutral point of view" page for further explanation. 【26 Yochai Benkler, "Coase's Penguin, or Linux and the Nature of the Firm," Yale Law Journal 112 (2002): 369. 】
Most of the distributed computing projects provide a series of utilities and statistics intended to allow contributors to attach meaning to their contributions in a variety of ways. The projects appear to be eclectic in their implicit social and psychological theories of the motivations for participation in the projects. Sites describe the scientific purpose of the models and the specific scientific output, including posting articles that have used the calculations. In these components, the project organizers seem to assume some degree of taste for generalized altruism and the pursuit of meaning in contributing to a common goal. They also implement a variety of mechanisms to reinforce the sense of purpose, such as providing aggregate statistics about the total computations performed by the project as a whole. However, the sites also seem to assume a healthy dose of what is known in the anthropology of gift literature as agonistic giving--that is, giving intended to show that the person giving is greater than or more important than others, who gave less. For example, most of the sites allow individuals to track their own contributions, and provide "user of the month"-type rankings. An interesting characteristic of quite a few of these is the ability to create "teams" of users, who in turn compete on who has provided more cycles or work units. SETI@home in particular taps into ready-made nationalisms, by offering country-level statistics. Some of the team names on Folding@home also suggest other, out-of-project bonding measures, such as national or ethnic bonds (for example, Overclockers Australia or Alliance Francophone), technical minority status (for example, Linux or MacAddict4Life), and organizational affiliation (University of Tennessee or University of Alabama), as well as shared cultural reference points (Knights who say Ni!). 
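The contribution statistics described above amount, mechanically, to little more than tallying and ranking work units per user and per team. A minimal sketch in Python, using entirely hypothetical users, teams, and work-unit counts (the function name `rankings` and the log format are illustrative assumptions, not any project's actual interface):

```python
from collections import defaultdict

# Hypothetical contribution log: (user, team, work_units) tuples, standing in
# for the per-user statistics a project like SETI@home or Folding@home publishes.
contributions = [
    ("alice", "Overclockers Australia", 120),
    ("bob", "Alliance Francophone", 95),
    ("carol", "Overclockers Australia", 40),
    ("dan", "Alliance Francophone", 30),
]

def rankings(log):
    """Tally work units per user and per team, and return each tally
    sorted from largest contributor to smallest."""
    users = defaultdict(int)
    teams = defaultdict(int)
    for user, team, units in log:
        users[user] += units
        teams[team] += units
    by_units = lambda tallies: sorted(tallies.items(), key=lambda kv: -kv[1])
    return by_units(users), by_units(teams)

user_rank, team_rank = rankings(contributions)
print(user_rank[0])  # top individual contributor: ('alice', 120)
print(team_rank[0])  # top team: ('Overclockers Australia', 160)
```

The same aggregation, keyed by country instead of team, would yield the country-level statistics that SETI@home uses to tap into ready-made nationalisms.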
In addition, the sites offer platforms for simple connectedness and mutual companionship, by offering user fora to discuss the science and the social participation involved. It is possible that these sites are shooting in the dark, as far as motivating sharing is concerned. It is also possible, however, that they have tapped into a valuable insight, which is that people behave sociably and generously for all sorts of different reasons, and that at least in this domain, adding reasons to participate--some agonistic, some altruistic, some reciprocity-seeking--does not have a crowding-out effect.
The increasing salience of nonmarket production in general, and peer production in particular, raises three puzzles from an economics perspective. First, why do people participate? What is their motivation when they work for or contribute resources to a project for which they are not paid or directly rewarded? Second, why now, why here? What, if anything, is special about the digitally networked environment that would lead us to believe that peer production is here to stay as an important economic phenomenon, as opposed to a fad that will pass as the medium matures and patterns of behavior settle toward those more familiar to us from the economy of steel, coal, and temp agencies? Third, is it efficient to have all these people sharing their computers and donating their time and creative effort? Moving through the answers to these questions, it becomes clear that the diverse and complex patterns of behavior observed on the Internet, from Viking ship hobbyists to the developers of the GNU/Linux operating system, are perfectly consistent with much of our contemporary understanding of human economic behavior. We need to assume no fundamental change in the nature of humanity; we need not declare the end of economics as we know it. We merely need to see that the material conditions of production in the networked information economy have changed in ways that increase the relative salience of social sharing and exchange as a modality of economic production. That is, behaviors and motivation patterns familiar to us from social relations generally continue to cohere in their own patterns. What has changed is that now these patterns of behavior have become effective beyond the domains of building social relations of mutual interest and fulfilling our emotional and psychological needs of companionship and mutual recognition.
They have come to play a substantial role as modes of motivating, informing, and organizing productive behavior at the very core of the information economy. And it is this increasing role as a modality of information production that ripples through the rest of this book. It is the feasibility of producing information, knowledge, and culture through social, rather than market and proprietary relations--through cooperative peer production and coordinated individual action--that creates the opportunities for greater autonomous action, a more critical culture, a more discursively engaged and better informed republic, and perhaps a more equitable global community.
Cooperation in peer-production processes is usually maintained by some combination of technical architecture, social norms, legal rules, and a technically backed hierarchy that is validated by social norms. Wikipedia is the strongest example of a discourse-centric model of cooperation based on social norms. However, even Wikipedia includes, ultimately, a small number of people with system administrator privileges who can eliminate accounts or block users in the event that someone is being genuinely obstructionist. This technical fallback, however, appears only after substantial play has been given to self-policing by participants, and to informal and quasi-formal community-based dispute resolution mechanisms. Slashdot, by contrast, provides a strong model of a sophisticated technical system intended to assure that no one can "defect" from the cooperative enterprise of commenting and moderating comments. It limits behavior enabled by the system to avoid destructive behavior before it happens, rather than policing it after the fact. The Slash code does this by technically limiting the power any given person has to moderate anyone else up or down, and by making every moderator the subject of a peer review system whose judgments are enforced technically--that is, when any given user is described by a sufficiently large number of other users as unfair, that user automatically loses the technical ability to moderate the comments of others. The system itself is a free software project, licensed under the GPL (General Public License)--which is itself the quintessential example of how law is used to prevent some types of defection from the common enterprise of peer production of software. The particular type of defection that the GPL protects against is appropriation of the joint product by any single individual or firm, the risk of which would make it less attractive for anyone to contribute to the project to begin with.
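The two technical limits just described--a finite allotment of moderation power, and automatic loss of moderation rights once enough peers judge a moderator unfair--can be sketched as follows. This is a minimal illustration in Python, not the actual Slash code; the class name, the point allotment, and the unfairness threshold are all hypothetical:

```python
MOD_POINTS = 5        # hypothetical per-user allotment of moderation points
UNFAIR_THRESHOLD = 3  # hypothetical cutoff; the real system's rule differs

class Moderator:
    """Sketch of a moderator whose power is bounded both by a point
    budget and by peer review of his or her past judgments."""

    def __init__(self, name):
        self.name = name
        self.points = MOD_POINTS
        self.unfair_votes = 0

    def can_moderate(self):
        # Power is limited in two ways: remaining points, and standing
        # with the peer review (meta-moderation) system.
        return self.points > 0 and self.unfair_votes < UNFAIR_THRESHOLD

    def moderate(self, comment, delta):
        # Each act of moderation spends a point, so no one can moderate
        # everyone else up or down without limit.
        if not self.can_moderate():
            raise PermissionError(f"{self.name} may not moderate")
        comment["score"] += delta
        self.points -= 1

    def judged_unfair(self):
        # Meta-moderation: each "unfair" verdict from other users counts
        # against the moderator; enough of them revoke the ability entirely.
        self.unfair_votes += 1
```

The point is the enforcement style: the constraint is built into the code path itself, so defection is prevented before it happens rather than punished afterward.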
The GPL assures that, as a legal matter, no one who contributes to a free software project need worry that some other contributor will take the project and make it exclusively their own. The ultimate quality judgments regarding what is incorporated into the "formal" releases of free software projects provide the clearest example of the extent to which a meritocratic hierarchy can be used to integrate diverse contributions into a finished single product. In the case of the Linux kernel development project (see chapter 3), it was always within the power of Linus Torvalds, who initiated the project, to decide which contributions should be included in a new release, and which should not. But it is a funny sort of hierarchy, whose quirkiness Steve Weber well explicates.【38 Steve Weber, The Success of Open Source (Cambridge, MA: Harvard University Press, 2004). 】 Torvalds's authority is persuasive, not legal or technical, and certainly not determinative. He can do nothing but persuade; he cannot prevent others from developing anything they want and adding it to their own version of the kernel, or from distributing that alternative version. There is nothing he can do to prevent the entire community of users, or some subsection of it, from rejecting his judgment about what ought to be included in the kernel. Anyone is legally free to do as they please. So these projects are based on a hierarchy of meritocratic respect, on social norms, and, to a great extent, on the mutual recognition by most players in this game that it is to everybody's advantage to have someone overlay a peer review system with some leadership.
Consider the example I presented in chapter 2 of IBM's relationship to the free and open source software development community. IBM, as I explained there, has shown more than $2 billion a year in "Linux-related revenues." Prior to IBM's commitment to adapting to what the firm sees as the inevitability of free and open source software, the company either developed in house or bought from external vendors the software it needed as part of its hardware business, on the one hand, and its software services--customization, enterprise solutions, and so forth--on the other hand. In each case, the software development follows a well-recognized supply chain model. Through either an employment contract or a supply contract the company secures a legal right to require either an employee or a vendor to deliver a given output at a given time. In reliance on that notion of a supply chain that is fixed or determined by a contract, the company turns around and promises to its clients that it will deliver the integrated product or service that includes the contracted-for component. With free or open source software, that relationship changes. IBM is effectively relying for its inputs on a loosely defined cloud of people who are engaged in productive social relations. It is making the judgment that the probability that a sufficiently good product will emerge out of this cloud is high enough that it can undertake a contractual obligation to its clients, even though no one in the cloud is specifically contractually committed to it to produce the specific inputs the firm needs in the time frame in which it needs them. This apparent shift from a contractually deterministic supply chain to a probabilistic supply chain is less dramatic, however, than it seems.
Even when contracts are signed with employees or suppliers, they merely provide a probability that the employee or the supplier will in fact supply in time and at appropriate quality, given the difficulties of coordination and implementation. A broad literature in organization theory has developed around the effort to map the various strategies of collaboration and control intended to improve the likelihood that the different components of the production process will deliver what they are supposed to: from early efforts at vertical integration, to relational contracting, pragmatic collaboration, or Toyota's fabled flexible specialization. The presence of a formalized enforceable contract, for outputs in which the supplier can claim and transfer a property right, may change the probability of the desired outcome, but not the fact that in entering its own contract with its clients, the company is making a prediction about the required availability of necessary inputs in time. When the company turns instead to the cloud of social production for its inputs, it is making a similar prediction. And, as with more engaged forms of relational contracting, pragmatic collaborations, or other models of iterated relations with co-producers, the company may engage with the social process in order to improve the probability that the required inputs will in fact be produced in time. In the case of companies like IBM or Red Hat, this means, at least partly, paying employees to participate in the open source development projects. But managing this relationship is tricky. The firms must do so without seeking to, or even seeming to seek to, take over the project; for to take over the project in order to steer it more "predictably" toward the firm's needs is to kill the goose that lays the golden eggs. 
For IBM and more recently Nokia, supporting the social processes on which they rely has also meant contributing hundreds of patents to the Free Software Foundation, or openly licensing them to the software development community, so as to extend the protective umbrella created by these patents against suits by competitors. As the companies that adopt this strategic reorientation become more integrated into the peer-production process itself, the boundary of the firm becomes more porous. Participation in the discussions and governance of open source development projects creates new ambiguity as to where, in relation to what is "inside" and "outside" of the firm boundary, the social process is. In some cases, a firm may begin to provide utilities or platforms for the users whose outputs it then uses in its own products. The Open Source Development Network (OSDN), for example, provides platforms for Slashdot and SourceForge. In these cases, the notion that there are discrete "suppliers" and "consumers," and that each of these is clearly demarcated from the other and outside of the set of stable relations that form the inside of the firm, becomes somewhat attenuated.
Second Life and Jedi Saga are merely examples, perhaps trivial ones, within the entertainment domain. They represent a shift in possibilities open both to human beings in the networked information economy and to the firms that sell them the tools for becoming active creators and users of their information environment. They are stark examples because of the centrality of the couch potato as the image of human action in television culture. Their characteristics are representative of the shift in the individual's role that is typical of the networked information economy in general and of peer production in particular. Linus Torvalds, the original creator of the Linux kernel development community, was, to use Eric Raymond's characterization, a designer with an itch to scratch. Peer-production projects often are composed of people who want to do something in the world and turn to the network to find a community of peers willing to work together to make that wish a reality. Michael Hart had been working in various contexts for more than thirty years when he--at first gradually, and more recently with increasing speed--harnessed the contributions of hundreds of volunteers to Project Gutenberg in pursuit of his goal to create a globally accessible library of public domain e-texts. Charles Franks was a computer programmer from Las Vegas when he decided he had a more efficient way to proofread those e-texts, and built an interface that allowed volunteers to compare scanned images of original texts with the e-texts available on Project Gutenberg. After working independently for a couple of years, he joined forces with Hart. Franks's facility now clears the volunteer work of more than one thousand proofreaders, who proof between two hundred and three hundred books a month.
Each of the thousands of volunteers who participate in free software development projects, in Wikipedia, in the Open Directory Project, or in any of the many other peer-production projects, is living some version, as a major or minor part of their lives, of the possibilities captured by the stories of a Linus Torvalds, a Michael Hart, or The Jedi Saga. Each has decided to take advantage of some combination of technical, organizational, and social conditions within which we have come to live, and to become an active creator in his or her world, rather than merely to accept what was already there. The belief that it is possible to make something valuable happen in the world, and the practice of actually acting on that belief, represent a qualitative improvement in the condition of individual freedom. They mark the emergence of new practices of self-directed agency as a lived experience, going beyond mere formal permissibility and theoretical possibility.
The software industry offers a baseline case because of the proven large scope for peer production in free software. As in other information-intensive industries, government funding and research have played an enormously important role, and university research provides much of the basic science. However, the relative role of individuals, nonprofits, and nonproprietary market producers is larger in software than in the other sectors. First, two-thirds of revenues derived from software in the United States are from services [pg 321] and do not depend on proprietary exclusion. Like IBM's "Linux-related services" category, for which the company claimed more than two billion dollars of revenue for 2003, these services do not depend on exclusion from the software, but on charging for service relationships.【111 For the sources of numbers for the software industry, see chapter 2 in this volume. IBM numbers, in particular, are identified in figure 2.1. 】 Second, some of the most basic elements of the software environment--like standards and protocols--are developed in nonprofit associations, like the Internet Engineering Task Force or the World Wide Web Consortium. Third, the role of individuals engaged in peer production--the free and open-source software development communities--is very large. Together, these make for an organizational ecology highly conducive to nonproprietary production, whose outputs can be freely usable around the globe. The other sectors have some degree of similar components, and commons-based strategies for development can focus on filling in the missing components and on leveraging nonproprietary components already in place.
The licensing or pooling component is more proactive, and is likely the most significant component of the project. BIOS is setting up a licensing and pooling arrangement, "primed" by CAMBIA's own significant innovations in tools, which are licensed to all of the initiative's participants on a free model, with grant-back provisions that perform an openness-binding function similar to copyleft.【124 Wim Broothaerts et al., "Gene Transfer to Plants by Diverse Species of Bacteria," Nature 433 (2005): 629. 】 In coarse terms, this means that anyone who builds upon the [pg 343] contributions of others must contribute improvements back to the other participants. One aspect of this model is that it does not assume that all research comes from academic institutions or from traditional government-funded, nongovernmental, or intergovernmental research institutes. It tries to create a framework that, like the open-source development community, engages commercial and noncommercial, public and private, organized and individual participants in a cooperative research network. The platform for this collaboration is "BioForge," styled after SourceForge, one of the major free and open-source software development platforms. The commitment to engage many different innovators is most clearly seen in the efforts of BIOS to include major international commercial providers and local potential commercial breeders alongside the more likely targets of a commons-based initiative. Central to this move is the belief that in agricultural science, the basic tools can, although this may be hard, be separated from specific applications or products. All actors, including the commercial ones, therefore have an interest in the open and efficient development of tools, leaving competition and profit making for the market in applications.
At the other end of the spectrum, BIOS's focus on making tools freely available is built on the proposition that innovation for food security involves more than biotechnology alone. It involves environmental management, locale-specific adaptations, and social and economic adoption in forms that are locally and internally sustainable, as opposed to dependent on a constant inflow of commoditized seed and other inputs. The range of participants is, then, much wider than envisioned by PIPRA or the GCP. It ranges from multinational corporations through academic scientists, to farmers and local associations, pooling their efforts in a communications platform and institutional model that is very similar to the way in which the GNU/Linux operating system has been developed. As of this writing, the BIOS project is still in its early infancy, and cannot be evaluated by its outputs. However, its structure offers the crispest example of the extent to which the peer-production model in particular, and commons-based production more generally, can be transposed into other areas of innovation at the very heart of what makes for human development--the ability to feed oneself adequately.
Another case did not end so well for the defendant. It involved a suit by the eight Hollywood studios against a hacker magazine, 2600. The studios sought an injunction prohibiting 2600 from making available a program called DeCSS, which circumvents CSS, the copy-protection scheme used to control access to DVDs. CSS prevents copying or any use of DVDs unauthorized by the vendor. DeCSS was written by a fifteen-year-old Norwegian named Jon Johansen, who claimed (though the district court discounted his claim) to have written it as part of an effort to create a DVD player for GNU/Linux-based machines. A copy of DeCSS, together with a story about it, was posted on the 2600 site. The industry obtained an injunction against 2600, prohibiting not only the posting of DeCSS, but also linking to other sites that post the program--that is, telling users where they can get the program, rather than actually distributing a circumvention program. That decision may or may not have been correct on the merits. There are strong arguments in favor of the proposition that making DVDs compatible with GNU/Linux systems is a fair use. There are strong arguments that the DMCA goes much farther than it needs to in restricting speech of software programmers and Web authors, and so is invalid under the First Amendment. The court rejected these arguments.
Sean Doyle and Adrian Gropper opened the doors to this project, providing unparalleled insight, hospitality, challenge, and curiosity. Axel Roch introduced me to Volker Grassmuck, and to much else. Volker Grassmuck introduced me to Berlin's Free Software world and invited me to participate in the Wizards of OS conferences. Udhay Shankar introduced me to almost everyone I know, sometimes after the fact. Shiv Sastry helped me find lodging in Bangalore at his Aunt Anasuya Sastry's house, which is called "Silicon Valley" and which was truly a lovely place to stay. Bharath Chari and Ram Sundaram let me haunt their office and cat-5 cables [pg xiv] during one of the more turbulent periods of their careers. Glenn Otis Brown visited, drank, talked, invited, challenged, entertained, chided, encouraged, drove, was driven, and gave and received advice. Ross Reedstrom welcomed me to the Rice Linux Users' Group and to Connexions. Brent Hendricks did yeoman's work, suffering my questions and intrusions. Geneva Henry, Jenn Drummond, Chuck Bearden, Kathy Fletcher, Manpreet Kaur, Mark Husband, Max Starkenberg, Elvena Mayo, Joey King, and Joel Thierstein have been welcoming and enthusiastic at every meeting. Sid Burris has challenged and respected my work, which has been an honor. Rich Baraniuk listens to everything I say, for better or for worse; he is a magnificent collaborator and friend.
The fifth component, the practice of coordination and collaboration (chapter 7), is the most talked about: the idea of tens or hundreds of thousands of people volunteering their time to contribute to the creation of complex software. In this chapter I show how novel forms of coordination developed in the 1990s and how they worked in the canonical cases of Apache and Linux; I also highlight how coordination facilitates the commitment to adaptability (or modifiability) over against planning and hierarchy, and how this commitment resolves the tension between individual virtuosity and the need for collective control.
Empirically speaking, the actors in my stories are figuring something out, something unfamiliar, troubling, imprecise, and occasionally shocking to everyone involved at different times and to differing extents.【13 The language of "figuring out" has its immediate source in the work of Kim Fortun, "Figuring Out Ethnography." Fortun's work refines two other sources, the work of Bruno Latour in Science in Action and that of Hans-Jörg Rheinberger in Toward a History of Epistemic Things. Latour describes the difference between "science made" and "science in the making" and how the careful analysis of new objects can reveal how they come to be. Rheinberger extends this approach through analysis of the detailed practices involved in figuring out a new object or a new process—practices which participants cannot quite name or explain in precise terms until after the fact. 】 There are two kinds of figuring-out stories: the contemporary ones in which I have been an active participant (those of Connexions and Creative Commons), and the historical ones conducted through "archival" research and rereading of certain kinds of texts, discussions, and analyses-at-the-time (those of UNIX, EMACS, Linux, Apache, and Open Systems). Some are stories of technical figuring out, but most are stories of figuring out a problem that appears to have emerged. Some of these stories involve callow and earnest actors, some involve scheming and strategy, but in all of them the figuring out is presented "in the making" and not as something that can be conveniently narrated as obvious and uncontested with the benefit of hindsight. Throughout this book, I tell stories that illustrate what geeks are like in some respects, but, more important, that show them in the midst of figuring things out—a practice that can happen both in discussion and in the course of designing, planning, executing, writing, debugging, hacking, and fixing.
Because the stories I tell here are in fact recent by the standards of historical scholarship, there is not much by way of comparison in terms of the empirical material. I rely on a number of books and articles on the history of the early Internet, especially Janet Abbate's scholarship and the single historical work on UNIX, Peter Salus's A Quarter Century of Unix.【16 In addition to Abbate and Salus, see Norberg and O'Neill, Transforming Computer Technology; Naughton, A Brief History of the Future; Hafner, Where Wizards Stay Up Late; Waldrop, The Dream Machine; Segaller, Nerds 2.0.1. For a classic autodocumentation of one aspect of the Internet, see Hauben and Hauben, Netizens. 】 There are also a couple of excellent journalistic works, such as Glyn Moody's Rebel Code: Inside Linux and the Open Source Revolution (which, like Two Bits, relies heavily on the novel accessibility of detailed discussions carried out on public mailing lists). Similarly, the scholarship on Free Software and its history is just starting to establish itself around a coherent set of questions.【17 Kelty, "Culture's Open Sources"; Coleman, "The Social Construction of Freedom"; Ratto, "The Pressure of Openness"; Joseph Feller et al., Perspectives [pg 315] on Free and Open Source Software; see also ‹http://freesoftware.mit.edu/›, organized by Karim Lakhani, which is a large collection of work on Free Software projects. Early work in this area derived both from the writings of practitioners such as Raymond and from business and management scholars who noticed in Free Software a remarkable, surprising set of seeming contradictions. The best of these works to date is Steven Weber, The Success of Open Source. Weber's conclusions are similar to those presented here, and he has a kind of cryptoethnographic familiarity (that he does not explicitly avow) with the actors and practices. Yochai Benkler's Wealth of Networks extends and generalizes some of Weber's argument. 】
Until the mid-1990s, hacker, geek, and computer nerd designated a very specific type: programmers and lurkers on relatively underground networks, usually college students, computer scientists, and "amateurs" or "hobbyists." A classic mock self-diagnostic called the Geek Code, by Robert Hayden, accurately and humorously detailed the various ways in which one could be a geek in 1996—UNIX/Linux skills, love/hate of Star Trek, particular eating and clothing habits—but as Hayden himself points out, the geeks of the early 1990s exist no longer. The elite subcultural, relatively homogeneous group it once was has been overrun: "The Internet of 1996 was still a wild untamed virgin paradise of geeks and eggheads unpopulated by script kiddies, and the denizens of AOL. When things changed, I seriously lost my way. I mean, all the ‘geek' that was the Internet [pg 36] was gone and replaced by Xfiles buzzwords and politicians passing laws about a technology they refused to comprehend."【25 See The Geek Code, ‹http://www.geekcode.com/›. 】
Berlin, November 1999. I am in a very hip club in Mitte called WMF. It's about eight o'clock—five hours too early for me to be a hipster, but the context is extremely cool. WMF is in a hard-to-find, abandoned building in the former East; it is partially converted, filled with a mixture of new and old furnishings, video projectors, speakers, makeshift bars, and dance-floor lighting. A crowd of around fifty people lingers amid smoke and Beck's beer bottles, [pg 37] sitting on stools and chairs and sofas and the floor. We are listening to an academic read a paper about Claude Shannon, the MIT engineer credited with the creation of information theory. The author is smoking and reading in German while the audience politely listens. He speaks for about seventy minutes. There are questions and some perfunctory discussion. As the crowd breaks up, I find myself, in halting German that quickly converts to English, having a series of animated conversations about the GNU General Public License, the Debian Linux Distribution, open standards in net radio, and a variety of things for which Claude Shannon is the perfect ghostly technopaterfamilias, even if his seventy-minute invocation has clashed heavily with the surroundings.
Before long, I am talking with Volker Grassmuck, founding member of Mikro and organizer of the successful "Wizards of OS" conference, held earlier in the year, which had the very intriguing subtitle "Operating Systems and Social Systems." Grassmuck is inviting me to participate in a planning session for the next WOS, held at the Chaos Computer Congress, a hacker gathering that occurs each year in December in Berlin. In the following months I will meet a huge number of people who seem, uncharacteristically for artists [pg 38] and activists, strangely obsessed with configuring their Linux distributions or hacking the http protocol or attending German Parliament hearings on copyright reform. The political lives of these folks have indeed mixed up operating systems and social systems in ways that are more than metaphorical.
Perhaps the most familiar and famous of these wars is that between Apple and Microsoft (formerly between Apple and IBM), a conflict that is often played out in dramatic and broad strokes that imply fundamental differences, when in fact the differences are extremely slight.【59 The Apple-Microsoft conflict was given memorable expression by Umberto Eco in a widely read piece that compared the Apple user interface [pg 320] to Catholicism and the PC user interface to Protestantism ("La bustina di Minerva," Espresso, 30 September 1994, back page). 】 Geeks are also familiar with a wealth of less well-known "holy wars": EMACS versus vi; KDE versus Gnome; Linux versus BSD; Oracle versus all other databases.【60 One entry on Wikipedia differentiates religious wars from run-of-the-mill "flame wars" as follows: "Whereas a flame war is usually a particular spate of flaming against a non-flamy background, a holy war is a drawn-out disagreement that may last years or even span careers" ("Flaming [Internet]," ‹http://en.wikipedia.org/wiki/Flame_war› [accessed 16 January 2006]). 】
In addition to the obvious pleasure with which they deploy the sectarian aspects of the Protestant Reformation, geeks also allow themselves to see their struggles as those of Luther-like adepts, confronted by powerful worldly institutions that are distinct but intertwined: the Catholic Church and absolutist monarchs. Sometimes these comparisons are meant to mock theological argument; sometimes they are more straightforwardly hagiographic. For instance, a 1998 article in Salon compares Martin Luther and Linus Torvalds (originator of the Linux kernel).
A critical point in the emergence of Free Software occurred in 1998-99: new names, new narratives, but also new wealth and new stakes. "Open Source" was premised on dotcom promises of cost-cutting and "disintermediation" and various other schemes to make money on it (Cygnus Solutions, an early Free Software company, playfully tagged itself as "Making Free Software More Affordable"). VA Linux, for instance, which sold personal-computer systems pre-installed with Open Source operating systems, had the largest single initial public offering (IPO) of the stock-market bubble, seeing a 700 percent share-price increase in one day. "Free Software" by contrast fanned kindling flames of worry over intellectual-property expansionism and hitched itself to a nascent legal resistance to the 1998 Digital Millennium Copyright Act and Sonny Bono Copyright Term Extension Act. Prior to 1998, Free Software referred either to the Free Software Foundation (and the watchful, micromanaging eye of Stallman) or to one of thousands of different commercial, avocational, or university-research projects, processes, licenses, and ideologies that had a variety of names: sourceware, freeware, shareware, open software, public domain software, and so on. The term Open Source, by contrast, sought to encompass them all in one movement.
Fomenting Movements

The period from 1 April 1998, when the Mozilla source code was first released, to 1 April 1999, when Zawinski announced its failure, couldn't have been a headier, more exciting time for participants in Free Software. Netscape's decision to release the source code was a tremendous opportunity for geeks involved in Free Software. It came in the midst of the rollicking dotcom bubble. It also came in the midst of the widespread adoption of [pg 108] key Free Software tools: the Linux operating system for servers, the Apache Web server for Web pages, the perl and python scripting languages for building quick Internet applications, and a number of other lower-level tools like Bind (an implementation of the DNS protocol) or sendmail for e-mail.
Perhaps most important, Netscape's decision came in a period of fevered and intense self-reflection among people who had been involved in Free Software in some way, stretching back to the mid-1980s. Eric Raymond's article "The Cathedral and the Bazaar," delivered at the Linux Kongress in 1997 and the O'Reilly Perl Conference the same year, had started a buzz among Free Software hackers. It was cited by Frank Hecker and Eric Hahn at Netscape as one of the sources for their thinking about the decision to free Mozilla; Raymond and Bruce Perens had both been asked to consult with Netscape on Free Software strategy. In April of the same year Tim O'Reilly, a publisher of handbooks for Free Software, organized a conference called the Freeware Summit.
All through 1998 and 1999, buzz around Open Source built. Little-known companies such as Red Hat, VA Linux, Cygnus, Slackware, and SuSe, which had been providing Free Software support and services to customers, suddenly entered media and business consciousness. Articles in the mainstream press circulated throughout the spring and summer of 1998, often attempting to make sense of the name change and whether it meant a corresponding change in practice. A front-cover article in Forbes, which featured photos of Stallman, Larry Wall, Brian Behlendorf, and Torvalds (figure 2), was noncommittal, cycling between Free Software, Open Source, and Freeware.【105 Josh McHugh, "For the Love of Hacking," Forbes, 10 August 1998, 94-100. 】
By December 1999, the buzz had reached a fever pitch. When VA Linux, a legitimate company which actually made something real—computers with Linux installed on them—went public, its share price gained 700 percent in one day, making it the single [pg 112] most valuable initial public offering of the era. VA Linux took the unconventional step of allowing contributors to the Linux kernel to buy into the stock before the IPO, thus bringing at least a partial set of these contributors into the mainstream Ponzi scheme of the Internet dotcom economy. Those who managed to sell their stock ended up benefiting from the boom, whether or not their contributions to Free Software truly merited it. In a roundabout way, Raymond, O'Reilly, Perens, and others behind the name change had achieved recognition for the central role of Free Software in the success of the Internet—and now its true name could be known: Open Source.
According to most of the scholarly literature, the function of a movement is to narrate the shared goals and to recruit new members. But is this what happens in Free Software or Open Source?【106 On social movements—the closest analog, developed long ago—see Gerlach and Hine, People, Power, Change, and Freeman and Johnson, Waves of Protest. However, the Free Software and Open Source Movements do not have "causes" of the kind that conventional movements do, other than the perpetuation of Free and Open Source Software (see Coleman, "Political Agnosticism"; Chan, "Coding Free Software"). Similarly, there is no single development methodology that would cover only Open Source. Advocates of Open Source are all too willing to exclude those individuals or organizations who follow the same "development methodology" but do not use a Free Software license—such as Microsoft's oft-mocked "shared-source" program. The list of licenses approved by both the Free Software Foundation and the Open Source Initiative is substantially the same. Further, the Debian Free Software Guidelines and the "Open Source Definition" are almost identical (compare ‹http://www.gnu.org/philosophy/license-list.html› with ‹http://www.opensource.org/licenses/› [both accessed 30 June 2006]). 】 To begin with, movement is an awkward word; not all participants would define their participation this way. Richard Stallman suggests that Free Software is a social movement, while Open Source is a development methodology. Similarly, some Open Source proponents see it as a pragmatic methodology and Free Software as a dogmatic philosophy. While there are specific entities like the Free Software Foundation and the Open Source Initiative, they do not encompass all of Free Software or Open Source.
Free Software and Open Source are neither corporations nor organizations nor consortia (for there are no organizations to consort); they are neither national, subnational, nor international; they are not "collectives" because no membership is required or assumed—indeed to hear someone assert "I belong" to Free Software or Open Source would sound absurd to anyone who does. Neither are they shady bands of hackers, crackers, or thieves meeting in the dead of night, which is to say that they are not an "informal" organization, because there is no formal equivalent to mimic or annul. Nor are they quite a crowd, for a crowd can attract participants who have no idea what the goal of the crowd is; also, crowds are temporary, while movements extend over time. It may be that movement is the best term of the lot, but unlike social movements, whose organization and momentum are fueled by shared causes or broken by ideological dispute, Free Software and Open Source share practices first, and ideologies second. It is this fact that is the strongest confirmation that they are a recursive public, a form of public that is as concerned with the material practical means of becoming public as it is with any given public debate.
The movement, as a practice of discussion and argument, is made up of stories. It is a practice of storytelling: affect- and intellect-laden lore that orients existing participants toward a particular problem, contests other histories, parries attacks from outside, and draws in new recruits.【107 It is, in the terms of Actor Network Theory, a process of "enrollment" in which participants find ways to rhetorically align—and to disalign—their interests. It does not constitute the substance of their interest, however. See Latour, Science in Action; Callon, "Some Elements of a Sociology of Translation." 】 This includes proselytism and evangelism (and the usable pasts of protestant reformations, singularities, rebellion and iconoclasm are often salient here), whether for the reform of intellectual-property law or for the adoption of Linux in the trenches of corporate America. It includes both heartfelt allegiance in the name of social justice as well as political agnosticism stripped of all ideology.【108 Coleman, "Political Agnosticism." 】 Every time Free Software is introduced to someone, discussed in the media, analyzed in a scholarly work, or installed in a workplace, a story of either Free Software or Open Source is used to explain its purpose, its momentum, and its temporality. At the extremes are the prophets and proselytes themselves: Eric Raymond describes Open Source as an evolutionarily necessary outcome of the natural tendency of human societies toward economies of abundance, while Richard Stallman describes it as a defense of the fundamental freedoms of creativity and speech, using a variety of philosophical theories of liberty, justice, and the defense of freedom.【109 See, respectively, Raymond, The Cathedral and the Bazaar, and Williams, Free as in Freedom. 】
Even scholarly analyses must begin with a potted history drawn from the self-narration of geeks who make or advocate free software.【110 For example, Castells, The Internet Galaxy, and Weber, The Success of Open Source both tell versions of the same story of origins and development. 】 Indeed, as a methodological aside, one reason it is so easy to track such stories and narratives is because geeks like to tell and, more important, like to archive such stories—to create Web pages, definitions, encyclopedia entries, dictionaries, and mini-histories and to save every scrap of correspondence, every fight, and every resolution related to their activities. This "archival hubris" yields a very peculiar and specific kind of fieldsite: one in which a kind [pg 115] of "as-it-happens" ethnographic observation is possible not only through "being there" in the moment but also by being there in the massive, proliferating archives of moments past. Understanding the movement as a changing entity requires constantly glancing back at its future promises and the conditions of their making.
Throughout the 1970s, the low licensing fees, the inclusion of the source code, and its conceptual integrity meant that UNIX was ported to a remarkable number of other machines. In many ways, academics found it just as appealing, if not more, to be involved in the creation and improvement of a cutting-edge system by licensing and porting the software themselves, rather than by having it provided to them, without the source code, by a company. Peter Salus, for instance, suggests that people experienced the lack of support from Bell Labs as a kind of spur to develop and share their own fixes. The means by which source code was shared, and the norms and practices of sharing, porting, forking, and modifying source code were developed in this period as part of the development of UNIX itself—the technical design of the system facilitates and in some cases mirrors the norms and practices of sharing that developed: operating systems and social systems.【133 The simultaneous development of the operating system and the norms for creating, sharing, documenting, and extending it are often referred to as the "UNIX philosophy." It includes the central idea that one should build on the ideas (software) of others (see Gancarz, The Unix Philosophy and Linux and the UNIX Philosophy). See also Raymond, The Art of UNIX Programming. 】
Minix was not commercial software, but neither was it Free Software. It was copyrighted and controlled by Tanenbaum's publisher, Prentice Hall. Because it used no AT&T source code, Minix was also legally independent, a legal object of its own. The fact that it was intended to be legally distinct from, yet conceptually true to, UNIX is a clear indication of the kinds of tensions that govern the creation and sharing of source code. The ironic apotheosis of Minix as the pedagogical gold standard for studying UNIX came in 1991-92, when a young Linus Torvalds created a "fork" of Minix, also rewritten from scratch, that would go on to become the paradigmatic piece of Free Software: Linux. Tanenbaum's purpose for Minix was that it remain a pedagogically useful operating system—small, concise, and illustrative—whereas Torvalds wanted to extend and expand his version of Minix to take full advantage of the kinds of hardware being produced in the 1990s. Both, however, were committed to source-code visibility and sharing as the swiftest route to complete comprehension of operating-systems principles.
According to Don Libes, Bell Labs allowed Berkeley to distribute its extensions to UNIX so long as the recipients also had a license from Bell Labs for the original UNIX (an arrangement similar to the one that governed Lions's Commentary).【144 Libes and Ressler, Life with UNIX, 16-17. 】 From about 1976 until about 1981, BSD slowly became an independent distribution—indeed, a complete version of UNIX—well-known for the vi editor and the Pascal compiler, but also for the addition of virtual memory and its implementation on DEC's VAX machines.【145 A recent court case between the Utah-based SCO—the current owner of the legal rights to the original UNIX source code—and IBM raised yet again the question of how much of the original UNIX source code exists in the BSD distribution. SCO alleges that IBM (and Linus Torvalds) inserted SCO-owned UNIX source code into the Linux kernel. However, the incredibly circuitous route of the "original" source code makes these claims hard to ferret out: it was developed at Bell Labs, licensed to multiple universities, used as a basis for BSD, sold to an earlier version of the company SCO (then known as the Santa Cruz Operation), which created a version called Xenix in cooperation with Microsoft. See the diagram by Eric Lévénez at ‹http://www.levenez.com/unix/›. For more detail on this case, see www.groklaw.com. 】 It should be clear that the unusual quasi-commercial status of AT&T's UNIX allowed for this situation in a way that a fully commercial computer corporation would never have permitted. Consider, for instance, the fact that many UNIX users—students at a university, for instance—essentially could not know whether they were using an AT&T product or something called BSD UNIX created at Berkeley. The operating system functioned in the same way and, except for the presence of copyright notices that occasionally flashed on the screen, did not make any show of asserting its brand identity (that would come later, in the 1980s).
Whereas a commercial computer manufacturer would have allowed something like BSD only if it were incorporated into and distributed as a single, marketable, and identifiable product with a clever name, AT&T turned something of a blind eye to the proliferation and spread of AT&T UNIX, and the result was a series of forks in the project: distinct bodies of source code, each an instance of something called UNIX.
Openness and open systems are key to understanding the practices of Free Software: the open-systems battles of the 1980s set the context for Free Software, leaving in their wake a partially articulated infrastructure of operating systems, networks, and markets that resulted from figuring out open systems. The failure to create a standard UNIX operating system opened the door for Microsoft Windows NT, but it also set the stage for the Linux operating-system kernel to emerge and spread. The success of the TCP/IP protocols forced multiple competing networking schemes into a single standard—and a singular entity, the Internet—which carried with it a set of built-in goals that mirror the moral-technical order of Free Software.
Such discussion has continued and grown exponentially over the last twenty years, to the point that Free Software hackers are now nearly as deeply educated about intellectual property law as they are about software code.【258 See Coleman, "The Social Construction of Freedom," chap. 6, on the Debian New Maintainer Process, for an example of how induction into a Free Software project stresses the legal as much as the technical, if not more. 】 Far from representing the triumph of the hacker ethic, the GNU General Public License represents the concrete, tangible outcome of a relatively wide-ranging cultural conversation hemmed in by changing laws, court decisions, practices both commercial and academic, and experiments with the limits and forms of new media and new technology.
The rest of the story is quickly told: Stallman resigned from the AI Lab at MIT and started the Free Software Foundation in 1985; he created a raft of new tools, but ultimately no full UNIX operating system, and issued General Public License 1.0 in 1989. In 1990 he was awarded a MacArthur "genius grant." During the 1990s, he was involved in various high-profile battles among a new generation of hackers; those controversies included the debate around Linus Torvalds's creation of Linux (which Stallman insisted be referred to as GNU/Linux), the forking of EMACS into Xemacs, and Stallman's own participation in—and exclusion from—conferences and events devoted to Free Software. [pg 207]
The Free Software Foundation represents a recognition on his part that individual and communal independence would come at the price of a legally and bureaucratically recognizable entity, set apart from MIT and responsible only to itself. The Free Software Foundation took a classic form: a nonprofit organization with a hierarchy. But by the early 1990s, a new set of experiments would begin that questioned the look of such an entity. The stories of Linux and Apache reveal how these ventures both depended on the work of the Free Software Foundation and departed from the hierarchical tradition it represented, in order to create new, similarly embedded sociotechnical forms of coordination.
The final component of Free Software is coordination. For many participants and observers, this is the central innovation and essential significance of Open Source: the possibility of enticing potentially huge numbers of volunteers to work freely on a software project, leveraging the law of large numbers, "peer production," "gift economies," and "self-organizing social economies."【261 Research on coordination in Free Software forms the central core of recent academic work. Two of the most widely read pieces, Yochai Benkler's "Coase's Penguin" and Steven Weber's The Success of Open Source, are directed at classic research questions about collective action. Rishab Ghosh's "Cooking Pot Markets" and Eric Raymond's The Cathedral and the Bazaar set many of the terms of debate. Josh Lerner's and Jean Tirole's "Some Simple Economics of Open Source" was an early contribution. Other important works on the subject are Feller et al., Perspectives on Free and Open Source Software; Tuomi, Networks of Innovation; Von Hippel, Democratizing Innovation. 】 Coordination in Free Software is of a distinct kind that emerged in the 1990s, directly out of the issues of sharing source code, conceiving open systems, and writing copyright licenses—all necessary precursors to the practices of coordination. The stories surrounding these issues find continuation in those of the Linux operating-system kernel, of the Apache Web server, and of Source Code Management tools (SCMs); together these stories reveal how coordination worked and what it looked like in the 1990s.
Coordination is important because it collapses and resolves the distinction between technical and social forms into a meaningful [pg 211] whole for participants. On the one hand, there is the coordination and management of people; on the other, there is the coordination of source code, patches, fixes, bug reports, versions, and distributions—but together there is a meaningful technosocial practice of managing, decision-making, and accounting that leads to the collaborative production of complex software and networks. Such coordination would be unexceptional, essentially mimicking long-familiar corporate practices of engineering, except for one key fact: it has no goals. Coordination in Free Software privileges adaptability over planning. This involves more than simply allowing any kind of modification; the structure of Free Software coordination actually gives precedence to a generalized openness to change, rather than to the following of shared plans, goals, or ideals dictated or controlled by a hierarchy of individuals.【262 On the distinction between adaptability and adaptation, see Federico Iannacci, "The Linux Managing Model," ‹http://opensource.mit.edu/papers/iannacci2.pdf›. Matt Ratto characterizes the activity of Linux-kernel developers as a "culture of re-working" and a "design for re-design," and captures the exquisite details of such a practice both in coding and in the discussion between developers, an activity he dubs the "pressure of openness" that "results as a contradiction between the need to maintain productive collaborative activity and the simultaneous need to remain open to new development directions" ("The Pressure of Openness," 112-38). 】
Adaptability does not mean randomness or anarchy, however; it is a very specific way of resolving the tension between the individual curiosity and virtuosity of hackers, and the collective coordination necessary to create and use complex software and networks. No man is an island, but no archipelago is a nation, so to speak. Adaptability preserves the "joy" and "fun" of programming without sacrificing the careful engineering of a stable product. Linux and Apache should be understood as the results of this kind of coordination: experiments with adaptability that have worked, to the surprise of many who have insisted that complexity requires planning and hierarchy. Goals and planning are the province of governance—the practice of goal-setting, orientation, and definition of control—but adaptability is the province of critique, and this is why Free Software is a recursive public: it stands outside power and offers powerful criticism in the form of working alternatives. It is not the domain of the new—after all Linux is just a rewrite of UNIX—but the domain of critical and responsive public direction of a collective undertaking.
Linux and Apache are more than pieces of software; they are organizations of an unfamiliar kind. My claim that they are "recursive publics" is useful insofar as it gives a name to a practice that is neither corporate nor academic, neither profit nor nonprofit, neither governmental nor nongovernmental. The concept of recursive public includes, within the spectrum of political activity, the creation, modification, and maintenance of software, networks, and legal documents. While a "public" in most theories is a body of [pg 212] people and a discourse that give expressive form to some concern, "recursive public" is meant to suggest that geeks not only give expressive form to some set of concerns (e.g., that software should be free or that intellectual property rights are too expansive) but also give concrete infrastructural form to the means of expression itself. Linux and Apache are tools for creating networks by which expression of new kinds can be guaranteed and by which further infrastructural experimentation can be pursued. For geeks, hacking and programming are variants of free speech and freedom of assembly.
From UNIX to Minix to Linux
Linux and Apache are the two paradigmatic cases of Free Software in the 1990s, both for hackers and for scholars of Free Software. Linux is a UNIX-like operating-system kernel, bootstrapped out of the Minix operating system created by Andrew Tanenbaum.【263 Linux is often called an operating system, which Stallman objects to on the theory that a kernel is only one part of an operating system. Stallman suggests that it be called GNU/Linux to reflect the use of GNU operating-system tools in combination with the Linux kernel. This not-so-subtle ploy to take credit for Linux reveals the complexity of the distinctions. The kernel is at the heart of hundreds of different "distributions"—such as Debian, Red Hat, SuSe, and Ubuntu Linux—all of which also use GNU tools, but [pg 338] which are often collections of software larger than just an operating system. Everyone involved seems to have an intuitive sense of what an operating system is (thanks to the pedagogical success of UNIX), but few can draw any firm lines around the object itself. 】 Apache is the continuation of the original National Center for Supercomputing Applications (NCSA) project to create a Web server (Rob McCool's original program, called httpd), bootstrapped out of a distributed collection of people who were using and improving that software.
Linux and Apache are both experiments in coordination. Both projects evolved decision-making systems through experiment: a voting system in Apache's case and a structured hierarchy of decision-makers, with Linus Torvalds as benevolent dictator, in Linux's case. Both projects also explored novel technical tools for coordination, especially Source Code Management (SCM) tools such as the Concurrent Versions System (cvs). Both are also cited as exemplars of how "fun," "joy," or interest determine individual participation and of how it is possible to maintain and encourage that participation and mutual aid instead of narrowing the focus or eliminating possible routes for participation.
Beyond these specific experiments, the stories of Linux and Apache are detailed here because both projects were central to the construction and expansion of the Internet of the 1990s by allowing a massive number of both corporate and noncorporate sites to cheaply install and run servers on the Internet. Were Linux and Apache nothing more than hobbyist projects with a few thousand [pg 213] interested tinkerers, rather than the core technical components of an emerging planetary network, they would probably not represent the same kind of revolutionary transformation ultimately branded a "movement" in 1998-99.
Linus Torvalds's creation of the Linux kernel is often cited as the first instance of the real "Open Source" development model, and it has quickly become the most studied of the Free Software projects.【264 Eric Raymond directed attention primarily to Linux in The Cathedral and the Bazaar. Many other projects preceded Torvalds's kernel, however, including the tools that form the core of both UNIX and the Internet: Paul Vixie's implementation of the Domain Name System (DNS) known as BIND; Eric Allman's sendmail for routing e-mail; the scripting languages perl (created by Larry Wall), python (Guido van Rossum), and tcl/tk (John Ousterhout); the X Windows research project at MIT; and the derivatives of the original BSD UNIX, FreeBSD and OpenBSD. On the development model of FreeBSD, see Jorgensen, "Putting It All in the Trunk" and "Incremental and Decentralized Integration in FreeBSD." The story of the genesis of Linux is very nicely told in Moody, Rebel Code, and Williams, Free as in Freedom; there are also a number of papers—available through Free/Opensource Research Community, ‹http://freesoftware.mit.edu/›—that analyze the development dynamics of the Linux kernel. See especially Ratto, "Embedded Technical Expression" and "The Pressure of Openness." I have conducted much of my analysis of Linux by reading the Linux Kernel Mailing List archives, ‹http://lkml.org›. There are also annotated summaries of the Linux Kernel Mailing List discussions at ‹http://kerneltraffic.org›. 】 Following its appearance in late 1991, Linux grew quickly from a small, barely working kernel to a fully functional replacement for the various commercial UNIX systems that had resulted from the UNIX wars of the 1980s. It has become versatile enough to be used on desktop PCs with very little memory and small CPUs, as well as in "clusters" that allow for massively parallel computing power.
When Torvalds started, he was blessed with an eager audience of hackers keen on seeing a UNIX system run on desktop computers and a personal style of encouragement that produced enormous positive feedback. Torvalds is often given credit for creating, through his "management style," a "new generation" of Free Software—a younger generation than that of Stallman and Raymond. Linus and Linux are not in fact the causes of this change, but the results of being at the right place at the right time and joining together a number of existing components. Indeed, the title of Torvalds's semi-autobiographical reflection on Linux—Just for Fun: The Story of an Accidental Revolutionary—captures some of the character of its genesis.
The fact of Linus Torvalds's pedagogical embedding in the world of UNIX, Minix, the Free Software Foundation, and the Usenet should not be underestimated, as it often is in hagiographical accounts of the Linux operating system. Without this relatively robust moral-technical order or infrastructure within which it was possible to be at the right place at the right time, Torvalds's late-night dorm-room project would have amounted to little more than that—but the pieces were all in place for his modest goals to be transformed into something much more significant.
So the system is based on Minix, just as Minix had been based on UNIX—piggy-backed or bootstrapped, rather than rewritten in an entirely different fashion, that is, rather than becoming a different kind of operating system. And yet there are clearly concerns about the need to create something that is not Minix, rather than simply extending or "debugging" Minix. This concern is key to understanding what happened to Linux in 1991.
By all accounts, Prentice Hall was not restrictive in its sublicensing of the operating system, if people wanted to create an "enhanced" [pg 217] version of Minix. Similarly, Tanenbaum's frequent presence on comp.os.minix testified to his commitment to sharing his knowledge about the system with anyone who wanted it—not just paying customers. Nonetheless, Torvalds's pointed use of the word free and his decision not to reuse any of the code is a clear indication of his desire to build a system completely unencumbered by restrictions, based perhaps on a kind of intuitive folkloric sense of the dangers associated with cases like that of EMACS.【270 Indeed, initially, Torvalds's terms of distribution for Linux were more restrictive than the GPL, including limitations on distributing it for a fee or for handling costs. Torvalds eventually loosened the restrictions and switched to the GPL in February 1992. Torvalds's release notes for Linux 0.12 say, "The Linux copyright will change: I've had a couple of requests [pg 339] to make it compatible with the GNU copyleft, removing the ‘you may not distribute it for money' condition. I agree. I propose that the copyright be changed so that it conforms to GNU—pending approval of the persons who have helped write code. I assume this is going to be no problem for anybody: If you have grievances (‘I wrote that code assuming the copyright would stay the same') mail me. Otherwise The GNU copyleft takes effect as of the first of February. If you do not know the gist of the GNU copyright—read it" (‹http://www.kernel.org/pub/linux/kernel/Historic/old-versions/RELNOTES-0.12›). 】
The most significant aspect of Torvalds's initial message, however, is his request: "I'd like to know what features most people would want. Any suggestions are welcome, but I won't promise I'll implement them." Torvalds's announcement and the subsequent interest it generated clearly reveal the issues of coordination and organization that would come to be a feature of Linux. The reason Torvalds had so many eager contributors to Linux from the very start was that he enthusiastically took them off Tanenbaum's hands.
Tanenbaum's role in the story of Linux is usually that of the straw man—a crotchety old computer-science professor who opposes the revolutionary young Torvalds. Tanenbaum did have a certain revolutionary reputation himself, since Minix was used in classrooms around the world and could be installed on IBM PCs (something no other commercial UNIX vendors had achieved), but he was also a natural target for people like Torvalds: the tenured professor espousing the textbook version of an operating system. So, despite the fact that a very large number of people were using or knew of Minix as a UNIX operating system (estimates of comp.os.minix subscribers were at 40,000), Tanenbaum was emphatically not interested in collaboration or collaborative debugging, especially if debugging also meant creating extensions and adding features that would make the system bigger and harder to use as a stripped-down tool for teaching. For Tanenbaum, this point was central: "I've been repeatedly offered virtual memory, paging, symbolic links, window systems, and all manner of features. I have usually declined because I am still trying to keep the system simple enough for students to understand. You can put all this stuff in your version, but I won't [pg 218] put it in mine. I think it is this point which irks the people who say ‘MINIX is not free,' not the $60."【271 Message-ID: email@example.com . 】
By contrast, Torvalds's "fun" project had no goals. Being a cocky nineteen-year-old student with little better to do (no textbooks to write, no students, grants, research projects, or committee meetings), Torvalds was keen to accept all the ready-made help he could find to make his project better. And with 40,000 Minix users, he had a more or less instant set of contributors. Stallman's audience for EMACS in the early 1980s, by contrast, was limited to about a hundred distinct computers, which may have translated into thousands, but certainly not tens of thousands of users. Tanenbaum's work in creating a generation of students who not only understood the internals of an operating system but, more specifically, understood the internals of the UNIX operating system created a huge pool of competent and eager UNIX hackers. It was the work of porting UNIX not only to various machines but to a generation of minds as well that set the stage for this event—and this is an essential, though often overlooked component of the success of Linux.
Many accounts of the Linux story focus on the fight between Torvalds and Tanenbaum, a fight carried out on comp.os.minix with the subject line "Linux is obsolete."【272 Message-ID: firstname.lastname@example.org . Key parts of the controversy were reprinted in DiBona et al., Open Sources. 】 Tanenbaum argued that Torvalds was reinventing the wheel, writing an operating system that, as far as the state of the art was concerned, was now obsolete. Torvalds, by contrast, asserted that it was better to make something quick and dirty that worked, invite contributions, and worry about making it state of the art later. Far from illustrating some kind of outmoded conservatism on Tanenbaum's part, the debate highlights the distinction between forms of coordination and the meanings of collaboration. For Tanenbaum, the goals of Minix were either pedagogical or academic: to teach operating-system essentials or to explore new possibilities in operating-system design. By this model, Linux could do neither; it couldn't be used in the classroom because [pg 219] it would quickly become too complex and feature-laden to teach, and it wasn't pushing the boundaries of research because it was an out-of-date operating system. Torvalds, by contrast, had no goals. What drove his progress was a commitment to fun and to a largely inarticulate notion of what interested him and others, defined at the outset almost entirely against Minix and other free operating systems, like FreeBSD. In this sense, it could only emerge out of the context—which set the constraints on its design—of UNIX, open systems, Minix, GNU, and BSD.
Both Tanenbaum and Torvalds operated under a model of coordination in which one person was ultimately responsible for the entire project: Tanenbaum oversaw Minix and ensured that it remained true to its goals of serving a pedagogical audience; Torvalds would oversee Linux, but he would incorporate as many different features as users wanted or could contribute. Very quickly—with a pool of 40,000 potential contributors—Torvalds would be in the same position Tanenbaum was in, that is, forced to make decisions about the goals of Linux and about which enhancements would go into it and which would not. What makes the story of Linux so interesting to observers is that it appears that Torvalds made no decision: he accepted almost everything.
Tanenbaum's goals and plans for Minix were clear and autocratically formed. Control, hierarchy, and restriction are after all appropriate in the classroom. But Torvalds wanted to do more. He wanted to go on learning and to try out alternatives, and with Minix as the only widely available way to do so, his decision to part ways starts to make sense; clearly he was not alone in his desire to explore and extend what he had learned. Nonetheless, Torvalds faced the problem of coordinating a new project and making similar decisions about its direction. On this point, Linux has been the subject of much reflection by both insiders and outsiders. Despite images of Linux as either an anarchic bazaar or an autocratic dictatorship, the reality is more subtle: it includes a hierarchy of contributors, maintainers, and "trusted lieutenants" and a sophisticated, informal, and intuitive sense of "good taste" gained through reading and incorporating the work of co-developers.
While it was possible for Torvalds to remain in charge as an individual for the first few years of Linux (1991-95, roughly), he eventually began to delegate some of that control to people who would make decisions about different subcomponents of the kernel. [pg 220] It was thus possible to incorporate more of the "patches" (pieces of code) contributed by volunteers, by distributing some of the work of evaluating them to people other than Torvalds. This informal hierarchy slowly developed into a formal one, as Steven Weber points out: "The final de facto ‘grant' of authority came when Torvalds began publicly to reroute relevant submissions to the lieutenants. In 1996 the decision structure became more formal with an explicit differentiation between ‘credited developers' and ‘maintainers.' . . . If this sounds very much like a hierarchical decision structure, that is because it is one—albeit one in which participation is strictly voluntary."【273 Steven Weber, The Success of Open Source, 164. 】
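The "patches" circulating through this hierarchy are simply textual diffs. A minimal sketch of what such a patch looks like, using Python's difflib (the file name and contents here are invented for illustration):

```python
import difflib

# Two versions of a hypothetical source file: the maintainer's copy
# and a contributor's modified copy.
original = ["int main() {\n", "    return 0;\n", "}\n"]
modified = ["int main() {\n",
            '    printf("hello\\n");\n',
            "    return 0;\n",
            "}\n"]

# A unified diff is the "patch" a volunteer would mail to the list;
# a maintainer (or lieutenant) reviews it and decides whether to apply it.
patch = "".join(difflib.unified_diff(original, modified,
                                     fromfile="a/main.c", tofile="b/main.c"))
print(patch)
```

Because a patch records only the lines that changed, a maintainer can evaluate a contribution without seeing, or trusting, anything else about the contributor's copy of the tree.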
By 1995-96, Torvalds and lieutenants faced considerable challenges with regard to hierarchy and decision-making, as the project had grown in size and complexity. The first widely remembered response to the ongoing crisis of benevolent dictatorship in Linux was the creation of "loadable kernel modules," conceived as a way to release some of the constant pressure to decide which patches would be incorporated into the kernel. The decision to modularize [pg 221] Linux was simultaneously technical and social: the software-code base would be rewritten to allow for external loadable modules to be inserted "on the fly," rather than all being compiled into one large binary chunk; at the same time, it meant that the responsibility to ensure that the modules worked devolved from Torvalds to the creator of the module. The decision repudiated Torvalds's early opposition to Tanenbaum in the "monolithic vs. microkernel" debate by inviting contributors to separate core from peripheral functions of an operating system (though the Linux kernel remains monolithic compared to classic microkernels). It also allowed for a significant proliferation of new ideas and related projects. It both contracted and distributed the hierarchy; now Linus was in charge of a tighter project, but more people could work with him according to structured technical and social rules of responsibility.
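The mechanism can be illustrated by a userspace analogy (this is not kernel code, and the plugin name and interface are invented): a running program loads an extension "on the fly" instead of having every feature compiled into one monolithic binary.

```python
import importlib.util
import os
import tempfile

# A hypothetical extension written by a contributor, loaded on demand.
PLUGIN_SOURCE = 'def init():\n    return "extension loaded"\n'

def load_module(path):
    """Load a Python file at runtime—analogous to inserting a loadable
    kernel module instead of recompiling one large binary chunk. If the
    module is broken, the failure surfaces here, at load time, and the
    responsibility lies with the module's author, not the core maintainer."""
    spec = importlib.util.spec_from_file_location("plugin", path)
    mod = importlib.util.module_from_spec(spec)
    spec.loader.exec_module(mod)
    return mod

with tempfile.TemporaryDirectory() as d:
    path = os.path.join(d, "plugin.py")
    with open(path, "w") as f:
        f.write(PLUGIN_SOURCE)
    print(load_module(path).init())  # -> extension loaded
```

The design choice mirrors the social one described above: the core stays small and tightly controlled, while extensions can proliferate under their own authors' responsibility.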
Creating loadable modules changed the look of Linux, but not because of any planning or design decisions set out in advance. The choice is an example of the privileged adaptability of the Linux project, resolving the tension between the curiosity and virtuosity of individual contributors and the need for hierarchical control in order to manage complexity. The commitment to adaptability dissolves the distinction between the technical means of coordination and the social means of management. It is about producing a meaningful whole by which both people and code can be coordinated—an achievement vigorously defended by kernel hackers.
The adaptable organization and structure of Linux is often described in evolutionary terms, as something without teleological purpose, but responding to an environment. Indeed, Torvalds himself has a weakness for this kind of explanation.
Let's just be honest, and admit that it [Linux] wasn't designed.
And I know better than most that what I envisioned 10 years ago has nothing in common with what Linux is today. There was certainly no premeditated design there.【274 Quoted in Zack Brown, "Kernel Traffic #146 for 17Dec2001," Kernel Traffic, ‹http://www.kerneltraffic.org/kernel-traffic/kt20011217_146.html›; also quoted in Federico Iannacci, "The Linux Managing Model," ‹http://opensource.mit.edu/papers/iannacci2.pdf›. 】
Adaptability does not answer the questions of intelligent design. Why, for example, does a car have four wheels and two headlights? Often these discussions are polarized: either technical objects are designed, or they are the result of random mutations. What this opposition overlooks is the fact that design and the coordination of collaboration go hand in hand; one reveals the limits and possibilities of the other. Linux represents a particular example of such a problematic—one that has become the paradigmatic case of Free Software—but there have been many others, including UNIX, for which the engineers created a system that reflected the distributed collaboration of users around the world even as the lawyers tried to make it conform to legal rules about licensing and practical concerns about bookkeeping and support.
Because it privileges adaptability over planning, Linux is a recursive public: at once an operating system and a social system. It privileges openness to new directions, at every level. It privileges the right to propose changes by actually creating them and trying to convince others to use and incorporate them. It privileges the right to fork the software into new and different kinds of systems. Given what it privileges, Linux ends up evolving differently than do systems whose life and design are constrained by corporate organization, or by strict engineering design principles, or by legal or marketing definitions of products—in short, by clear goals. What makes this distinction between the goal-oriented design principle and the principle of adaptability important is its relationship to politics. Goals and planning are the subject of negotiation and consensus, or of autocratic decision-making; adaptability is the province of critique. It should be remembered that Linux is by no means an attempt to create something radically new; it is a rewrite of a UNIX operating system, as Torvalds points out, but one that through adaptation can end up becoming something new.
The Apache Web server and the Apache Group (now called the Apache Software Foundation) provide a second illuminating example of the how and why of coordination in Free Software of the 1990s. As with the case of Linux, the development of the Apache project illustrates how adaptability is privileged over planning [pg 223] and, in particular, how this privileging is intended to resolve the tensions between individual curiosity and virtuosity and collective control and decision-making. It is also the story of the progressive evolution of coordination, the simultaneously technical and social mechanisms of coordinating people and code, patches and votes.
Harthill's injunction to collaborate seems surprising in the context of a mailing list and project created to facilitate collaboration, but the injunction is specific: collaborate by making plans and sharing goals. Implicit in his words is the tension between a project with clear plans and goals, an overarching design to which everyone contributes, and a group platform without clear goals that provides individuals with a setting in which to try out alternatives. Implicit as well is the spectrum between debugging an existing piece of software with a stable identity and rewriting its fundamental aspects to make it something new. The meaning of collaboration bifurcates here: on the one hand, the privileging of the autonomous work of individuals, which is then submitted to group peer review and incorporated; on the other, the privileging of a set of shared goals to which the actions and labor of individuals are subordinated.【292 Gabriella Coleman captures this nicely in her discussion of the tension between the individual virtuosity of the hacker and the corporate populism of groups like Apache or, in her example, the Debian distribution of Linux. See Coleman, The Social Construction of Freedom. 】
The technical and social forms that Linux and Apache take are enabled by the tools they build and use, from bug-tracking tools and mailing lists to the Web servers and kernels themselves. One such tool plays a very special role in the emergence of these organizations: Source Code Management systems (SCMs). SCMs are tools for coordinating people and code; they allow multiple people in dispersed locales to work simultaneously on the same object, the same source code, without the need for a central coordinating overseer and without the risk of stepping on each other's toes. The history of SCMs—especially in the case of Linux—also illustrates the recursive-depth problem: namely, is Free Software still free if it is created with non-free tools?
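The coordination an SCM performs can be sketched as a three-way merge: given a common base version and two contributors' concurrent copies, changes to different lines combine automatically, and only overlapping edits demand a human decision. The sketch below is deliberately naive—it assumes both edits only replace existing lines, with no insertions or deletions, an assumption real tools like cvs do not make:

```python
def merge3(base, ours, theirs):
    """Naive line-based three-way merge. Disjoint changes merge cleanly;
    the same line changed differently on both sides is a conflict that
    no tool can resolve—only the people involved can."""
    merged = []
    for b, o, t in zip(base, ours, theirs):
        if o != b and t != b and o != t:
            raise ValueError("conflict: both sides changed the same line")
        merged.append(o if o != b else t)
    return merged

base   = ["alpha", "beta", "gamma"]
ours   = ["alpha", "BETA", "gamma"]   # contributor 1 edits line 2
theirs = ["alpha", "beta", "GAMMA"]   # contributor 2 edits line 3
print(merge3(base, ours, theirs))     # -> ['alpha', 'BETA', 'GAMMA']
```

The point of the sketch is that most coordination is mechanical—two dispersed contributors rarely step on each other's toes—which is precisely what lets an SCM replace a central overseer for the routine cases while leaving genuine conflicts to human judgment.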
Both the Apache project and the Linux kernel project use SCMs. In the case of Apache the original patch-and-vote system quickly began to strain the patience, time, and energy of participants as the number of contributors and patches began to grow. From the very beginning of the project, the contributor Paul Richards had urged the group to make use of cvs. He had extensive experience with the system in the FreeBSD project and was convinced that it provided a superior alternative to the patch-and-vote system. Few other contributors had much experience with it, however, so it wasn't until over a year after Richards began his admonitions that cvs was adopted. However, cvs is not a simple replacement for a patch-and-vote system; it necessitates a different kind of organization. Richards recognized the trade-off. The patch-and-vote system created a very high level of quality assurance and peer review of the patches that people submitted, while the cvs system allowed individuals to make more changes that might not meet the same level of quality assurance. The cvs system allowed branches—stable, testing, experimental—with different levels of quality assurance, while the patch-and-vote system was inherently directed at one final and stable version. As the case of Shambhala [pg 232] showed, under the patch-and-vote system experimental versions would remain unofficial garage projects, rather than serve as official branches with people responsible for committing changes.
The Linux kernel has also struggled with various issues surrounding SCMs and the management of responsibility they imply. The story of the so-called VGER tree and the creation of a new SCM called BitKeeper is exemplary in this respect.【296 See Steven Weber, The Success of Open Source, 117-19; Moody, Rebel Code, 172-78. See also Shaikh and Cornford, "Version Management Tools." 】 By 1997, Linux developers had begun to use cvs to manage changes to the source code, though not without resistance. Torvalds was still in charge of the changes to the official stable tree, but as other "lieutenants" came on board, the complexity of the changes to the kernel grew. One such lieutenant was Dave Miller, who maintained a "mirror" of the stable Linux kernel tree, the VGER tree, on a server at Rutgers. In September 1998 a fight broke out among Linux kernel developers over two related issues: first, that Torvalds was failing to incorporate (patch) contributions that had been forwarded to him by various people, including his lieutenants; and second, that as a result the VGER cvs repository was no longer in sync with the stable tree maintained by Torvalds. Two different versions of Linux threatened to emerge.
A great deal of yelling ensued, as nicely captured in Moody's Rebel Code, culminating in the famous phrase, uttered by Larry McVoy: "Linus does not scale." The meaning of this phrase is that the ability of Linux to grow into an ever larger project with increasing complexity, one which can handle myriad uses and functions (to "scale" up), is constrained by the fact that there is only one Linus Torvalds. By all accounts, Linus was and is excellent at what he does—but there is only one Linus. The danger of this situation is the danger of a fork. A fork would mean one or more new versions would proliferate under new leadership, a situation much like [pg 233] the spread of UNIX. Both the licenses and the SCMs are designed to facilitate this, but only as a last resort. Forking also implies dilution and confusion—competing versions of the same thing and potentially unmanageable incompatibilities.
McVoy was well-known in geek circles before Linux. In the late stages of the open-systems era, as an employee of Sun, he had penned an important document called "The Sourceware Operating System Proposal." It was an internal Sun Microsystems document that argued for the company to make its version of UNIX freely available. It was a last-ditch effort to save the dream of open systems. It was also the first such proposition within a company to "go open source," much like the documents that would urge Netscape to Open Source its software in 1998. Despite this early commitment, McVoy chose not to create Bitkeeper as a Free Software project, but to make it quasi-proprietary, a decision that raised a very central question in ideological terms: can one, or should one, create Free Software using non-free tools?
On one side of this controversy, naturally, were Richard Stallman and those sharing his vision of Free Software. On the other were pragmatists like Torvalds claiming no goals and no commitment to "ideology"—only a commitment to "fun." The tension laid bare the way in which recursive publics negotiate and modulate the core components of Free Software from within. Torvalds made a very strong and vocal statement concerning this issue, responding to Stallman's criticisms about the use of non-free software to create Free Software: "Quite frankly, I don't _want_ people using Linux for ideological reasons. I think ideology sucks. This world would be a much better place if people had less ideology, and a whole lot more ‘I do this because it's FUN and because others might find it useful, not because I got religion.'"【297 Linus Torvalds, "Re: [PATCH] Remove Bitkeeper Documentation from Linux Tree," 20 April 2002, ‹http://www.uwsg.indiana.edu/hypermail/linux/kernel/0204.2/1018.html›. Quoted in Shaikh and Cornford, "Version Management Tools." 】
Torvalds emphasizes pragmatism in terms of coordination: the right tool for the job is the right tool for the job. In terms of licenses, [pg 234] however, such pragmatism does not play, and Torvalds has always been strongly committed to the GPL, refusing to let non-GPL software into the kernel. This strategic pragmatism is in fact a recognition of where experimental changes might be proposed, and where practices are settled. The GPL was a stable document, sharing source code widely was a stable practice, but coordinating a project using SCMs was, during this period, still in flux, and thus Bitkeeper was a tool well worth using so long as it remained suitable to Linux development. Torvalds was experimenting with the meaning of coordination: could a non-free tool be used to create Free Software?
McVoy, on the other hand, was on thin ice. He was experimenting with the meaning of Free Software licenses. He created three separate licenses for Bitkeeper in an attempt to play both sides: a commercial license for paying customers, a license for people who sell Bitkeeper, and a license for "free users." The free-user license allowed Linux developers to use the software for free—though it required them to use the latest version—and prohibited them from working on a competing project at the same time. McVoy's attempt to have his cake and eat it, too, created enormous tension in the developer community, a tension that built from 2002, when Torvalds began using Bitkeeper in earnest, to 2005, when he announced he would stop.
The developer Andrew Tridgell, well known for his work on a project called Samba and his reverse engineering of a Microsoft networking protocol, began a project to reverse engineer Bitkeeper by looking at the metadata it produced in the course of being used for the Linux project. By doing so, he crossed a line set up by McVoy's experimental licensing arrangement: the "free as long as you don't copy me" license. Lawyers advised Tridgell to stay silent on the topic while Torvalds publicly berated him for "willful destruction" and a moral lapse of character in trying to reverse engineer Bitkeeper. Bruce Perens defended Tridgell and censured Torvalds for his seemingly contradictory ethics.【298 Andrew Orlowski, "‘Cool it, Linus'—Bruce Perens," Register, 15 April 2005, ‹http://www.theregister.co.uk/2005/04/15/perens_on_torvalds/page2.html›. 】 McVoy never sued Tridgell, and Bitkeeper has limped along as a commercial project, because, [pg 235] much like the EMACS controversy of 1985, the Bitkeeper controversy of 2005 ended with Torvalds simply deciding to create his own SCM, called git.
The story of the VGER tree and Bitkeeper illustrates common tensions within recursive publics, specifically, the depth of the meaning of free. On the one hand, there is Linux itself, an exemplary Free Software project made freely available; on the other hand, however, there is the ability to contribute to this process, a process that is potentially constrained by the use of Bitkeeper. So long as the function of Bitkeeper is completely circumscribed—that is, completely planned—there can be no problem. However, the moment one user sees a way to change or improve the process, and not just the kernel itself, then the restrictions and constraints of Bitkeeper can come into play. While it is not clear that Bitkeeper actually prevented anything, developers clearly recognized it as a potential drag on a generalized commitment to adaptability. Or to put it in terms of recursive publics, only one layer is properly open, that of the kernel itself; the layer beneath it, the process of its construction, is not free in the same sense. It is ironic that Torvalds—otherwise the spokesperson for antiplanning and adaptability—willingly adopted this form of constraint, but not at all surprising that it was collectively rejected.
Novelty, both in the case of Linux and in intellectual property law more generally, is directly related to the interplay of social and technical coordination: goal direction vs. adaptability. The ideal of adaptability promoted by Torvalds suggests a radical alternative to the dominant ideology of creation embedded in contemporary intellectual-property systems. If Linux is "new," it is new through adaptation and the coordination of large numbers of creative contributors who challenge the "design" of an operating system from the bottom up, not from the top down. By contrast, McVoy represents a moral imagination of design in which it is impossible to achieve novelty without extremely expensive investment in top-down, goal-directed, unpolitical design—and it is this activity that the intellectual-property system is designed to reward. Both are engaged, however, in an experiment; both are engaged in "figuring out" what the limits of Free Software are.
Coordination is a key component of Free Software, and is frequently identified as the central component. Free Software is the result of a complicated story of experimentation and construction, and the forms that coordination takes in Free Software are specific outcomes of this longer story. Apache and Linux are both experiments—not scientific experiments per se but collective social experiments in which there are complex technologies and legal tools, systems of coordination and governance, and moral and technical orders already present.
Free Software is an experimental system, a practice that changes with the results of new experiments. The privileging of adaptability makes it a peculiar kind of experiment, however, one not directed by goals, plans, or hierarchical control, but more like what John Dewey suggested throughout his work: the experimental praxis of science extended to the social organization of governance in the service of improving the conditions of freedom. What gives this experimentation significance is the centrality of Free Software—and specifically of Linux and Apache—to the experimental expansion of the Internet. As an infrastructure or a milieu, the Internet is changing the conditions of social organization, changing the relationship of knowledge to power, and changing the orientation of collective life toward governance. Free Software is, arguably, the best example of an attempt to make this transformation public, to ensure that it uses the advantages of adaptability as critique to counter the power of planning as control. Free Software, as a recursive public, proceeds by proposing and providing alternatives. It is a bit like Kant's version of enlightenment: insofar as geeks speak (or hack) as scholars, in a public realm, they have a right to propose criticisms and changes of any sort; as soon as they relinquish [pg 240] that commitment, they become private employees or servants of the sovereign, bound by conscience and power to carry out the duties of their given office. The constitution of a public realm is not a universal activity, however, but a historically specific one: Free Software confronts the specific contemporary technical and legal infrastructure by which it is possible to propose criticisms and offer alternatives. 
What results is a recursive public filled not only with individuals who govern their own actions but also with code and concepts and licenses and forms of coordination that turn these actions into viable, concrete technical forms of life useful to inhabitants of the present.
At about the same time as his idea for a textbook, Rich's research group was switching over to Linux, and Rich was first learning about Open Source and the emergence of a fully free operating system created entirely by volunteers. It isn't clear what Rich's aha! moment was, other than simply when he came to an understanding that such a thing as Linux was actually possible. Nonetheless, at some point, Rich had the idea that his textbook could be an Open Source textbook, that is, a textbook created not just by him, but by DSP researchers all over the world, and made available to everyone to make use of and modify and improve as they saw fit, just like Linux. Together with Brent Hendricks, Yan David Erlich, [pg 249] and Ross Reedstrom, all of whom, as geeks, had a deep familiarity with the history and practices of Free and Open Source Software, Rich started to conceptualize a system; they started to think about modulations of different components of Free and Open Source Software. The idea of a Free Software textbook repository slowly took shape.
The modulated meaning of source code creates all kinds of new questions—specifically with respect to the other four components. In terms of openness, for instance, Connexions modulates this component very little; most of the actors involved are devoted to the ideals of open systems and open standards, insofar as it is a Free Software project of a conventional type. It builds on UNIX (Linux) and the Internet, and the project leaders maintain a nearly fanatical devotion to openness at every level: applications, programming languages, standards, protocols, mark-up languages, interface tools. Every place where there is an open (as opposed to a [pg 256] proprietary) solution—that choice trumps all others (with one noteworthy exception).【310 The most significant exception has been the issue of tools for authoring content in XML. For most of the life of the Connexions project, the XML mark-up language has been well-defined and clear, but there has been no way to write a module in XML, short of directly writing the text and the tags in a text editor. For all but a very small number of possible users, this feels too much like programming, and they experience it as too frustrating to be worth it. The solution (albeit temporary) was to encourage users to make use of a proprietary XML editor (like a word processor, but capable of creating XML content). Indeed, the Connexions project's devotion to openness was tested by one of the most important decisions its participants made: to pursue the creation of an Open Source XML text editor in order to provide access to completely open tools for creating completely open content. 
】 James Boyle recently stated it well: "Wherever possible, design the system to run with open content, on open protocols, to be potentially available to the largest possible number of users, and to accept the widest possible range of experimental modifications from users who can themselves determine the development of the technology."【311 Boyle, "Mertonianism Unbound," 14. 】
Perhaps unsurprisingly, the Connexions team spent a great deal of time at the outset of the project creating a pdf-document-creation system that would essentially mimic the creation of a conventional textbook, with the push of a button.【333 Conventional here is actually quite historically proximate: the system creates a pdf document by translating the XML document into a LaTeX document, then into a pdf document. LaTeX has been, for some twenty years, a standard text-formatting and typesetting language used by some [pg 345] sectors of the publishing industry (notably mathematics, engineering, and computer science). Were it not for the existence of this standard from which to bootstrap, the Connexions project would have faced a considerably more difficult challenge, but much of the infrastructure of publishing has already been partially transformed into a computer-mediated and -controlled system whose final output is a printed book. Later in Connexions's lifetime, the group coordinated with an Internet-publishing startup called Qoop.com to take the final step and make Connexions courses available as print-on-demand, cloth-bound textbooks, complete with ISBNs and back-cover blurbs. 】 But even this process causes a subtle transformation: the concept of "edition" becomes much harder to track. While a conventional textbook is a stable entity that goes through a series of printings and editions, each of which is marked on its publication page, a Connexions document can go through as many versions as an author wants to make changes, all the while without necessarily changing editions. In this respect, the modulation of the concept of source code translates the practices of updating and "versioning" into the realm of textbook writing. Recall the cases ranging from the "continuum" of UNIX versions discussed by Ken Thompson to the complex struggles over version control in the Linux and Apache projects. 
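The footnote's pdf pipeline rests on a mechanical translation step: XML content is first rendered into LaTeX, from which the pdf is typeset. A minimal sketch of that first step, using a toy module in a hypothetical CNXML-like markup (the element names here are assumptions for illustration, not the Connexions schema):

```python
import xml.etree.ElementTree as ET

# A toy module in a hypothetical CNXML-like markup (element names are assumptions).
module = """<module>
  <title>Sampling Theorem</title>
  <para>A bandlimited signal can be reconstructed from its samples.</para>
</module>"""

def to_latex(xml_text: str) -> str:
    """Translate the toy markup into LaTeX, the first stage of an XML-to-pdf pipeline."""
    root = ET.fromstring(xml_text)
    title = root.findtext("title")
    paras = "\n\n".join(p.text for p in root.findall("para"))
    return f"\\section{{{title}}}\n{paras}"

latex = to_latex(module)
```

Because LaTeX already carries two decades of typesetting infrastructure, a translation of roughly this shape is all that stands between structured content and a press-ready document; this is the "bootstrap" the footnote describes.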
In the case of writing source code, exactitude demands that the change of even a single character be tracked and labeled as a version change, whereas a [pg 278] conventional-textbook spelling correction or errata issuance would hardly create the need for a new edition.
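This exactitude can be sketched with a toy content-addressed versioner (the function name is hypothetical), in the spirit of how SCMs derive version identifiers from content: a one-character change, the equivalent of a spelling correction, yields a wholly new version.

```python
import hashlib

def version_id(text: str) -> str:
    """Return a short content-derived version identifier, as content-addressed SCMs do."""
    return hashlib.sha1(text.encode("utf-8")).hexdigest()[:8]

v1 = version_id("The quick brown fox jumps over the lazy dog.")
v2 = version_id("The quick brown fox jumps over the lazy dog!")  # one character changed

# Any edit, however small, produces a distinct version identifier.
assert v1 != v2
```

The same property that makes this indispensable for source code is what dissolves the notion of a stable "edition" for a textbook: there is no smallest change beneath the system's notice.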
In the case of shared software source code, one of the principal reasons for sharing it was to reuse it: to build on it, to link to it, to employ it in ways that made building more complex objects into an easier task. The very design philosophy of UNIX well articulates the necessity of modularity and reuse, and the idea is no less powerful in other areas, such as textbooks. But just as the reuse of software is not simply a feature of software's technical characteristics, the idea of "reusing" scholarly materials implies all kinds of questions that are not simply questions of recombining texts. The ability to share source code—and the ability to create complex software based on it—requires modulations of both the legal meaning of software, as in the case of EMACS, and the organizational form, as in the [pg 290] emergence of Free Software projects other than the Free Software Foundation (the Linux kernel, Perl, Apache, etc.).
The Creative Commons licenses allow authors to grant the use of their work in about a dozen different ways—that is, the license itself comes in versions. One can, for instance, require attribution, prohibit commercial exploitation, allow derivative or modified works to be made and circulated, or some combination of all these. These different combinations actually create different licenses, each of which grants intellectual-property rights under slightly different conditions. For example, say Marshall Sahlins decides to write a paper about how the Internet is cultural; he copyrights the paper ("© 2004 Marshall Sahlins"), he requires that any use of it or any copies of it maintain the copyright notice and the attribution of [pg 295] authorship (these can be different), and he furthermore allows for commercial use of the paper. It would then be legal for a publishing house to take the paper off Sahlins's Linux-based Web server and publish it in a collection without having to ask permission, as long as the paper remains unchanged and he is clearly and unambiguously listed as author of the paper. The publishing house would not get any rights to the work, and Sahlins would not get any royalties. If he had specified noncommercial use, the publisher would instead have needed to contact him and arrange for a separate license (Creative Commons licenses are nonexclusive), under which he could demand some share of revenue and his name on the cover of the book.【345 In December 2006 Creative Commons announced a set of licenses that facilitate the "follow up" licensing of a work, especially one initially issued under a noncommercial license. 】 But say he was, instead, a young scholar seeking only peer recognition and approbation—then royalties would be secondary to maximum circulation. Creative Commons allows authors to assert, as its members put it, "some rights reserved" or even "no rights reserved."
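The "about a dozen" figure follows from simple combinatorics over a few independent conditions. A toy enumeration (labels are illustrative, assuming the early Creative Commons menu in which attribution itself was still optional):

```python
from itertools import product

# Each axis is an independent choice an author makes (illustrative labels):
attribution = ["BY", ""]        # require attribution, or not (optional in early CC)
commercial  = ["", "NC"]        # allow commercial use, or prohibit it
derivatives = ["", "ND", "SA"]  # allow derivatives, prohibit them, or require share-alike

licenses = ["-".join(filter(None, combo)) or "no-conditions"
            for combo in product(attribution, commercial, derivatives)]

# 2 x 2 x 3 = 12 distinct licenses: "about a dozen."
assert len(licenses) == 12
```

In these terms, Sahlins's choice above, attribution required, commercial use allowed, no changes permitted, corresponds to the BY-ND combination.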
Benkler, Yochai. "Coase's Penguin, or Linux and the Nature of the Firm." Yale Law Journal 112.3 (2002): 369-446.
Gancarz, Mike. Linux and the UNIX Philosophy. Boston: Digital Press, 2003.
Moody, Glyn. Rebel Code: Inside Linux and the Open Source Revolution. Cambridge, Mass.: Perseus, 2001.
———. "The Pressure of Openness: The Hybrid Work of Linux Free/Open Source Kernel Developers." Ph.D. diss., University of California, San Diego, 2003.
Raymond, Eric S. The Cathedral and the Bazaar: Musings on Linux and Open Source by an Accidental Revolutionary. Sebastopol, Calif.: O'Reilly Press, 2001. See esp. "Homesteading the Noosphere," 79-135.
Shaikh, Maha, and Tony Cornford. "Version Management Tools: CVS to BK in the Linux Kernel." Paper presented at the Twenty-fifth International Conference on Software Engineering—Taking Stock of the Bazaar: The Third Workshop on Open Source Software Engineering, Portland, Oregon, 3-10 May 2003.
Wayner, Peter. Free for All: How LINUX and the Free Software Movement Undercut the High-Tech Titans. New York: Harper Business, 2000.
The salience of electronic commerce has, at times, obscured an important fact — that the commons is one of the most potent forces driving innovation in our time. Individuals working with one another via social networks are a growing force in our economy and society. This phenomenon has many manifestations, and goes by many names — “peer production,” “social production,” “smart mobs,” the “wisdom of crowds,” “crowdsourcing,” and “the commons.”【3 “Social production” and “peer production” are associated with the work of Yale law professor Yochai Benkler, especially in his 2006 book, The Wealth of Networks. “Smart mobs” is a coinage of Howard Rheingold, author of a 2003 book by the same name. “Crowdsourcing” is the name of a blog run by Jeff Howe and the title of a June 2006 Wired article on the topic. “Wisdom of crowds” is a term coined by James Surowiecki and used as the title of his 2004 book. 】 The basic point is that socially created value is increasingly competing with conventional markets, as GNU/Linux has famously shown. Through an open, accessible commons, one can efficiently tap into the “wisdom of the crowd,” nurture experimentation, accelerate innovation, and foster new forms of democratic practice.
By the late 1990s, this legal scholarship was in full flower, Internet usage was soaring, and the free software movement produced its first significant free operating system, GNU/Linux. The commoners were ready to take practical action. Lessig, then a professor at Harvard Law School, engineered a major constitutional test case, Eldred v. Reno (later Eldred v. Ashcroft), to try to strike down a twenty-year extension of copyright terms — a case that reached the U.S. Supreme Court in 2002. At the same time, Lessig and a number of his colleagues, including MIT computer scientist Hal Abelson, Duke law professor James Boyle, and Villanova law professor Michael W. Carroll, came together to explore innovative ways to protect the public domain. It was a rare moment in history in which an ad hoc salon of brilliant, civic-minded thinkers from diverse fields of endeavor found one another, gave themselves the freedom to dream big thoughts, and embarked upon practical plans to make them real.
Open business. One of the most surprising recent developments has been the rise of “open business” models. Unlike traditional businesses that depend upon proprietary technology or content, a new breed of businesses see lucrative opportunities in exploiting open, participatory networks. The pioneer in this strategy was IBM, which in 2000 embraced GNU/Linux, the open-source computer operating system, as the centerpiece of its service and consulting business.【16 Steve Lohr, “IBM to Give Free Access to 500 Patents,” New York Times, July 11, 2005. See also Steven Weber, The Success of Open Source (Cambridge, Mass.: Harvard University Press, 2004), pp. 202–3. See also Pamela Samuelson, “IBM's Pragmatic Embrace of Open Source,” Communications of the ACM 49, no. 21 (October 2006). 】 Dozens of small, Internet-based companies are now exploiting open networks to build more flexible, sustainable enterprises.
Stallman's atavistic zeal to preserve the hacker community, embodied in the GPL, did not immediately inspire others. In fact, most of the tech world was focused on how to convert software into a marketable product. Initially, the GPL functioned like a spore lying dormant, waiting until a more hospitable climate could activate its full potential. Outside of the tech world, few people knew about the GPL, or cared.〖*2 The GPL is not the only software license around, of course, although it was, and remains, the most demanding in terms of protecting the commons of code. Other popular open-source licenses include the MIT, BSD, and Apache licenses, but each of these permits, rather than requires, that the source code of derivative works also be freely available. The GPL, however, became the license used for Linux, a quirk of history that has had far-reaching implications. 〗 And even most techies were oblivious to the political implications of free software.
In 1991, Torvalds was a twenty-one-year-old computer science student at the University of Helsinki, in Finland. Frustrated by the expense and complexity of Unix, and its inability to work on personal computers, Torvalds set out to build a Unix-like operating system on his IBM AT, which had a 33-megahertz processor and four megabytes of memory. Torvalds released a primitive version of his program to an online newsgroup and was astonished when a hundred hackers responded within a few months to offer suggestions and additions. Over the next few years, hundreds of additional programmers joined the project, which he named “Linux” by combining his first name, “Linus,” with “Unix.” The first official release of his program came in 1994.【27 One useful history of Torvalds and Linux is Glyn Moody, Rebel Code: Inside Linux and the Open Source Revolution (Cambridge, MA: Perseus, 2001). 】
The Linux kernel, when combined with the GNU programs developed by Stallman and his free software colleagues, constituted a complete computer operating system — an astonishing and unexpected achievement. Even seasoned computer scientists could hardly believe that something as complex as an operating system could be developed by thousands of strangers dispersed around the globe, cooperating via the Internet. Everyone assumed that a software program had to be organized by a fairly small group of leaders actively supervising the work of subordinates through a hierarchical authority system — that is, by a single corporation. Yet here was a virtual community of hackers, with no payroll or corporate structure, coming together in a loose, voluntary, quasi-egalitarian way, led by leaders who had earned the trust and respect of some highly talented programmers.
The real innovation of Linux, writes Eric S. Raymond, a leading analyst of the technology, was “not technical, but sociological”:
Linux was rather casually hacked on by huge numbers of volunteers coordinating only through the Internet. Quality was maintained not by rigid standards or autocracy but by the naively simple strategy of releasing every week and getting feedback from hundreds of users within days, creating a sort of rapid Darwinian selection on the mutations introduced by developers. To the amazement of almost everyone, this worked quite well.【28 Eric S. Raymond, “A Brief History of Hackerdom,” ‹http://www.catb.org/~esr/writings/cathedral-bazaar/hacker-history/ar01s06.html›. 】
The Free Software Foundation had a nominal project to develop a kernel, but it was not progressing very quickly. The Linux kernel, while primitive, “was running and ready for experimentation,” writes Steven Weber in his book The Success of Open Source: “Its crude functionality was interesting enough to make people believe that it could, with work, evolve into something important. That promise was critical and drove the broader development process from early on.”【29 Steven Weber, The Success of Open Source (Cambridge, MA: Harvard University Press, 2004), p. 100. 】
There were other powerful forces driving the development of Linux. Throughout the 1990s, Microsoft continued to leverage its monopoly grip over the operating system of personal computers, eventually attracting the attention of the U.S. Department of Justice, which filed an antitrust lawsuit against the company. Software competitors such as Hewlett-Packard, Sun Microsystems, and IBM found that rallying behind an open-source alternative — one that was legally protected against being taken private by anyone else— offered a terrific way to compete against Microsoft.
Given these problems, there was great appeal in a Unix-like operating system with freely available source code. Linux helped address the fragmentation of Unix implementations and the difficulties of competing against the Microsoft monopoly. Knowing that Linux was GPL'd, hackers, academics, and software companies could all contribute to its development without fear that someone might take it private, squander their contributions, or use it in hostile ways. A commons of software code offered a highly pragmatic solution to a market dysfunction.
Stallman's GNU Project and Torvalds's Linux software were clearly synergistic, but they represented very different styles. The GNU Project was a slower, more centrally run project compared to the “release early and often” developmental approach used by the Linux community. In addition, Stallman and Torvalds had temperamental and leadership differences. Stallman has tended to be more overbearing and directive than Torvalds, who does not bring a political analysis to the table and is said to be more tolerant of diverse talents.【31 Torvalds included a brief essay, “Linux kernel management style,” dated October 10, 2004, in the files of the Linux source code, with the annotation, “Wisdom passed down the ages on clay tablets.” It was included as an epilogue in the book Open Life: The Philosophy of Open Source, by Henrik Ingo, and is available at ‹http://www.openlife.cc/node/43›. 】
So despite their natural affinities, the Free Software Community and the Linux community never found their way to a grand merger. Stallman has applauded Linux's success, but he has also resented the eclipse of GNU programs used in the operating system by the Linux name. This prompted Stallman to rechristen the program “GNU/Linux,” a formulation that many people now choose to honor.
Yet many hackers, annoyed at Stallman's political crusades and crusty personal style, committed their own linguistic raid by renaming “free software” as “open source software,” with a twist. As GNU/Linux became more widely used in the 1990s, and more corporations began to seriously consider using it, the word free in “free software” was increasingly seen as a problem. The “free as in free speech, not as in free beer” slogan never quite dispelled popular misconceptions about the intended sense of the word free. Corporate information technology (IT) managers were highly wary about putting mission-critical corporate systems in the hands of software that could be had for free. Imagine telling the boss that you put the company's fate in the hands of a program you downloaded from the Internet for free!
One response to this issue was the rebranding of free software as “open-source” software. A number of leading free software programmers, most notably Bruce Perens, launched an initiative to set forth a consensus definition of software that would be called “open source.” At the time, Perens was deeply involved with a community of hackers in developing a version of Linux known as the Debian GNU/Linux distribution. Perens and other leading hackers not only wanted to shed the off-putting political dimensions of “free software,” they wanted to help people deal with the confusing proliferation of licenses. A lot of software claimed to be free, but who could really tell what that meant when the terms were so complicated and legalistic?
The Open Source Initiative, begun in 1998, helped solve this problem by enumerating criteria that it considered significant in judging a program to be “open.”【33 ‹http://www.opensource.org›. 】 Its criteria, drawn from the Debian community, helped standardize and stabilize the definition of open-source software. Unlike the GPL, permissive software licenses such as BSD and MIT allow a program to be freely copied, modified, and distributed but don't require it. A programmer can choose to make a proprietary derivative without violating the license.
The Linux world behaves in many respects like a free market or an ecology, a collection of selfish agents attempting to maximize utility which in the process produces a self-correcting spontaneous order more elaborate and efficient than any amount of central planning could have achieved. . . . The utility function Linux hackers are maximizing is not classically economic, but is the intangible of their own ego satisfaction and reputation among other hackers.【36 Eric Raymond, “The Cathedral and the Bazaar,” available at ‹http://www.catb.org/~esr/writings/cathedral-bazaar/cathedral-bazaar/ar01s11.html›. 】
Red Hat, a company founded in 1993 by Robert Young, was the first to recognize the potential of selling a custom version (or “distribution”) of GNU/Linux as a branded product, along with technical support. A few years later, IBM became one of the first large corporations to recognize the social realities of GNU/Linux and its larger strategic and competitive implications in the networked environment. In 1998 IBM presciently saw that the new software development ecosystem was becoming far too variegated and robust for any single company to dominate. It understood that its proprietary mainframe software could not dominate the burgeoning, diversified Internet-driven marketplace, and so the company adopted the open-source Apache Web server program in its new line of WebSphere business software.
It was a daring move that began to bring the corporate and open-source worlds closer together. Two years later, in 2000, IBM announced that it would spend $1 billion to help develop GNU/Linux for its customer base. IBM shrewdly realized that its customers wanted to slash costs, overcome system incompatibilities, and avoid expensive technology “lock-ins” to single vendors. GNU/Linux filled this need well. IBM also realized that GNU/Linux could help it compete against Microsoft. By assigning its property rights to the commons, IBM could eliminate expensive property rights litigation, entice other companies to help it improve the code (they could be confident that IBM could not take the code private), and unleash a worldwide torrent of creative energy focused on GNU/Linux. Way ahead of the curve, IBM decided to reposition itself for the emerging networked marketplace by making money through tech service and support, rather than through proprietary software alone.【38 Andrew Leonard, “How Big Blue Fell for Linux,” Salon.com, September 12, 2000, available at ‹http://www.salon.com/tech/fsp/2000/09/12/chapter_7_part_one.print.html›. The competitive logic behind IBM's moves is explored in Pamela Samuelson, “IBM's Pragmatic Embrace of Open Source,” Communications of the ACM 49, no. 21 (October 2006), and Robert P. Merges, “A New Dynamism in the Public Domain,” University of Chicago Law Review 71, no. 183 (Winter 2004). 】
It was not long before other large tech companies realized the benefits of going open source. Amazon and eBay both saw that they could not affordably expand their large computer infrastructures without converting to GNU/Linux. GNU/Linux is now used in everything from Motorola cell phones to NASA supercomputers to laptop computers. In 2005, BusinessWeek magazine wrote, “Linux may bring about the greatest power shift in the computer industry since the birth of the PC, because it lets companies replace expensive proprietary systems with cheap commodity servers.”【39 Steve Hamm, “Linux Inc.,” BusinessWeek, January 31, 2005. 】 As many as one-third of the programmers working on open-source projects are corporate employees, according to a 2002 survey.【40 Cited by Elliot Maxwell in “Open Standards, Open Source, and Open Innovation,” note 80, Berlecon Research, Free/Libre Open Source Software: Survey and Study — Firms' Open Source Activities: Motivations and Policy Implications, FLOSS Final Report, Part 2, at www.berlecon.de/studien/downloads/200207FLOSS_Activities.pdf. 】
With faster computing speeds and cost savings of 50 percent or more on hardware and 20 percent on software, GNU/Linux has demonstrated the value proposition of the commons. Open source demonstrated that it can be cheaper and more efficacious to collaborate in the production of a shared resource based on common standards than to buy and own it strictly as private property.
But how does open source work without a conventional market apparatus? The past few years have seen a proliferation of sociological and economic theories about how open-source communities create value. One formulation, by Rishab Ghosh, compares free software development to a “cooking pot,” in which you can give a little to the pot yet take a lot — with no one else being the poorer. “Value” is not measured economically at the point of transaction, as in a market, but in the nonmonetary flow of value that a project elicits (via volunteers) and generates (through shared software).【41 Rishab Aiyer Ghosh, “Cooking Pot Markets and Balanced Value Flows,” in Rishab Aiyer Ghosh, ed., CODE: Collaborative Ownership and the Digital Economy (Cambridge, MA: MIT Press, 2005), pp. 153–68. 】 Another important formulation, which we will revisit later, comes from Harvard law professor Yochai Benkler, who has written that the Internet makes it cheap and easy to access expertise anywhere on the network, rendering conventional forms of corporate organization costly and cumbersome for many functions. Communities based on social trust and reciprocity are capable of mobilizing creativity and commitment in ways that market incentives often cannot — and this can have profound economic implications.【42 See, e.g., Benkler, “Coase's Penguin, or Linux and the Nature of the Firm,” Yale Law Journal 112, no. 369 (2002); Benkler, “ ‘Sharing Nicely': On Shareable Goods and the Emergence of Sharing as a Modality of Economic Production,” Yale Law Journal 114, no. 273 (2004). 】 Benkler's analysis helps explain how a global corps of volunteers could create an operating system that, in many respects, outperforms software created by a well-paid army of Microsoft employees.
Nearly twenty years after the introduction of the GPL, free software has expanded phenomenally. It has given rise to countless FOSS applications, many of which are major viral hits such as Thunderbird (e-mail), Firefox (Web browser), Ubuntu (desktop GNU/Linux), and Asterisk (Internet telephony). FOSS has set in motion, directly or indirectly, some powerful viral spirals such as the Creative Commons licenses, the iCommons/free culture movement, the Science Commons project, the open educational resource movement, and a new breed of open-business ventures. Yet Richard Stallman sees little connection between these various “open” movements and free software; he regards “open” projects as too vaguely defined to guarantee that their work is truly “free” in the free software sense of the term. “Openness and freedom are not the same thing,” said Stallman, who takes pains to differentiate free software from open-source software, emphasizing the political freedoms that lie at the heart of the former.【44 Interview with Richard Stallman, January 21, 2008. 】
The DMCA has been roundly denounced by software programmers, music fans, and Internet users because it prohibits them from making personal copies, excerpting works for fair use, and reverse engineering software, even with legally purchased products. Under digital rights management systems sanctioned by the DMCA, for example, many CDs and DVDs now carry region codes that prevent consumers from playing them on devices on other continents. DVDs may contain code to prevent them from running on Linux-based computers. Digital journals may “expire” after a given period of time, wiping out library holdings unless another payment is made. Digital textbooks may go blank at the end of the school year, preventing their reuse or resale.
Initially, the goal was more exploratory and improvisational — an earnest attempt to find leverage points for dealing with the intolerable constraints of copyright law. Fortunately, there were instructive precedents, most notably free software, which by 2000, in its open-source guise, was beginning to find champions among corporate IT managers and the business press. Mainstream programmers and corporations started to recognize the virtues of GNU/Linux and open-source software more generally. Moreover, a growing number of people were internalizing the lessons of Code, that the architecture of software and the Internet really does matter.
For all of its brainpower and commitment, Lessig's rump caucus might not have gotten far if it had not found a venturesome source of money, the Center for the Public Domain. The center — originally the Red Hat Center — was a foundation created by entrepreneur Robert Young in 2000 following a highly successful initial public offering of Red Hat stock. As the founder of Red Hat, a commercial vendor of GNU/Linux, Young was eager to repay his debt to the fledgling public-domain subculture. He also realized, with the foresight of an Internet entrepreneur, that strengthening the public domain would only enhance his business prospects over the long term. (It has; Young later founded a print-on-demand publishing house, Lulu.com, that benefits from the free circulation of electronic texts, while making money from printing hard copies.)
Another new business using CC licenses is Lulu, a technology company started by Robert Young, the founder of the Linux vendor Red Hat and benefactor for the Center for the Public Domain. Lulu lets individuals publish and distribute their own books, which can be printed on demand or downloaded. Lulu handles all the details of the publishing process but lets people control their content and rights. Hundreds of people have licensed their works under the CC ShareAlike license and Public Domain Dedication, and under the GNU Project's Free Documentation License.【200 Mia Garlick, “Lulu,” Creative Commons blog, May 17, 2006, at ‹http://creativecommons.org/text/lulu›. 】
Another set of major revisions to the licenses was taken up for discussion in 2006, and agreed upon in February 2007.【248 Mia Garlick, “Version 3.0 Launched,” CC blog, ‹http://creativecommons.org/weblog/entry/7249›. 】 Once again, the layperson would care little for the debates leading to the changes, but considerable, sometimes heated discussion went into the revisions. In general, the 3.0 tweaks sought to make the licenses clearer, more useful, and more enforceable. The issue of “moral rights” under copyright law — an issue in many European countries — is explicitly addressed, as are the complications of the CC licenses and collecting societies. New legal language was introduced to ensure that people who remix works under other licenses, such as the GNU Free Documentation License (FDL), would be able to also use CC-licensed materials in the same work — an important provision for preventing free culture from devolving into “autistic islands” of legally incompatible material. Besides helping align the CC world with Wikipedia (which uses the GNU FDL license), the 3.0 revisions also made harmonizing legal changes to take account of MIT and the Debian software development community.
This history matters, because when Gil was appointed culture minister, he brought with him a rare political sophistication and public veneration. His moral stature and joyous humanity allowed him to transcend politics as conventionally practiced. “Gil wears shoulder-length dreadlocks and is apt to show up at his ministerial offices dressed in the simple white linens that identify him as a follower of the Afro-Brazilian religion candomblé,” wrote American journalist Julian Dibbell in 2004. “Slouching in and out of the elegant Barcelona chairs that furnish his office, taking the occasional sip from a cup of pinkish herbal tea, he looks — and talks — less like an elder statesman than the posthippie, multiculturalist, Taoist intellectual he is.”【257 Julian Dibbell, “We Pledge Allegiance to the Penguin,” Wired, November 2004, at ‹http://www.wired.com/wired/archive/12.11/linux_pr.html›. 】
One of the first collaborations between Creative Commons and the Brazilian government involved the release of a special CC-GPL license in December 2003.【260 Creative Commons press release, “Brazilian Government First to Adopt New ‘CC-GPL,' ” December 2, 2003. 】 This license adapted the General Public License for software by translating it into Portuguese and putting it into the CC's customary “three layers” — a plain-language version, a lawyers' version compatible with the national copyright law, and a machine-readable metadata expression of the license. The CC-GPL license, released in conjunction with the Free Software Foundation, was an important international event because it gave the imprimatur of a major world government to free software and the social ethic of sharing and reuse. Brazil has since become a champion of GNU/Linux and free software in government agencies and the judiciary. It regards free software and open standards as part of a larger fight for a “development agenda” at the World Intellectual Property Organization and the World Trade Organization. In a related vein, Brazil has famously challenged patent and trade policies that made HIV/AIDS drugs prohibitively expensive for thousands of sick Brazilians.
It is worth noting that a commons does not necessarily preclude making money from the fruit of the commons; it's just that any commercial activity cannot interfere with the integrity of social relationships within the commons. In the case of GPL'd software, for example, Red Hat is able to sell its own versions of GNU/Linux only because it does not “take private” any code or inhibit sharing within the commons. The source code is always available to everyone. By contrast, scientists who patent knowledge that they glean from their participation in a scientific community may be seen as “stealing” community knowledge for private gain. The quest for individual profit may also induce ethical corner-cutting, which undermines the integrity of research in the commons.
Free software was one of the earliest demonstrations of the power of online commons as a way to create value. In his classic 1997 essay “The Cathedral and the Bazaar,” hacker Eric S. Raymond provided a seminal analysis explaining how open networks make software development more cost-effective and innovative than software developed by a single firm.【343 Eric Raymond, “The Cathedral and the Bazaar,” May 1997, at ‹http://www.catb.org/~esr/writings/cathedral-bazaar›. The essay has been translated into nineteen languages to date. 】 A wide-open “bazaar” such as the global Linux community can construct a more versatile operating system than one designed by a closed “cathedral” such as Microsoft. “Given enough eyeballs, all bugs are shallow,” Raymond famously declared. Yochai Benkler gave a more formal economic reckoning of the value proposition of open networks in his pioneering 2002 essay “Coase's Penguin, or, Linux and the Nature of the Firm.”【344 Yochai Benkler, “Coase's Penguin, or, Linux and the Nature of the Firm,” Yale Law Journal 112, no. 369 (2002), at ‹http://www.benkler.org/CoasesPenguin.html›. 】 The title is a puckish commentary on how GNU/Linux, whose mascot is a penguin, poses an empirical challenge to economist Ronald Coase's celebrated “transaction cost” theory of the firm. In 1937, Coase stated that the economic rationale for forming a business enterprise is its ability to assert clear property rights and manage employees and production more efficiently than contracting out to the marketplace.
The idea that a company can make money by giving away something for free seems so counterintuitive, if not ridiculous, that conventional business people tend to dismiss it. Sometimes they protesteth too much, as when Microsoft's Steve Ballmer compared the GNU GPL to a “cancer” and lambasted open-source software as having “characteristics of communism.”【352 Joe Wilcox and Stephen Shankland, “Why Microsoft is wary of open source,” CNET, June 18, 2001; and Graham Lea, “MS' Ballmer: Linux is communism,” Register (U.K.), July 31, 2000. 】 In truth, “sharing the wealth” has become a familiar strategy for companies seeking to develop new technology markets. The company that is the first mover in an emerging commercial ecosystem is likely to become the dominant player, which may enable it to extract a disproportionate share of future market rents. Giving away one's code or content can be a great way to become a dominant first mover.
Netscape was one of the first to demonstrate the power of this model with its release of its famous Navigator browser in 1994. The free distribution to Internet users helped develop the Web as a social and technological ecosystem, while helping fuel sales of Netscape's Web server software. (This was before Microsoft arrived on the scene with its Internet Explorer, but that's another story.) At a much larger scale, IBM saw enormous opportunities for building a better product by using GNU/Linux. The system would let IBM leverage other people's talents at a fraction of the cost and strengthen its service relationships with customers. The company now earns more than $2 billion a year from Linux-related services.【353 Yochai Benkler, The Wealth of Networks (Yale University Press, 2006), Figure 2.1 on p. 47. 】
As chance had it, Baraniuk's research group at Rice was just discovering open-source software. “It was 1999, and we were moving all of our workstations to Linux,” he recalled. “It was just so robust and high-quality, even at that time, and it was being worked on by thousands of people.” Baraniuk remembers having an epiphany: “What if we took books and ‘chunked them apart,' just like software? And what if we made the IP open so that the books would be free to re-use and remix in different ways?”
History-making citizenship is not without its deficiencies. Rumors, misinformation, and polarized debate are common in this more open, unmediated environment. Its crowning virtue is its potential ability to mobilize the energies and creativity of huge numbers of people. GNU/Linux improbably drew upon the talents of tens of thousands of programmers; certainly our contemporary world with its countless problems could use some of this elixir — platforms that can elicit distributed creativity, specialized talent, passionate commitment, and social legitimacy. In 2005 Joi Ito, then chairman of the board of the Creative Commons, wrote: “Traditional forms of representative democracy can barely manage the scale, complexity and speed of the issues in the world today. Representatives of sovereign nations negotiating with each other in global dialog are limited in their ability to solve global issues. The monolithic media and its increasingly simplistic representation of the world cannot provide the competition of ideas necessary to reach informed, viable consensus.”【447 Joichi Ito, “Emergent Democracy,” chapter 1 in John Lebkowsky and Mitch Ratcliffe, eds., Extreme Democracy (Durham, NC: Lulu.com, 2005), at ‹http://extremedemocracy.com/chapters/Chapter%20One-Ito.pdf›. 】 Ito concluded that a new, not-yet-understood model of “emergent democracy” is likely to materialize as the digital revolution proceeds. A civic order consisting of “intentional blog communities, ad hoc advocacy coalitions and activist networks” could begin to tackle many urgent problems.
As projects like GNU/Linux, Wikipedia, open courseware, open-access journals, open databases, municipal Wi-Fi, collections of CC-licensed content, and other commons begin to cross-link and coalesce, the commons paradigm is migrating from the margins of culture to the center. The viral spiral, after years of building its infrastructure and social networks, may be approaching a Cambrian explosion, an evolutionary leap.