What Is Free Software
by Karl Fogel, author of Producing Open Source Software: How to Run a Successful Free Software Project
In This Article:
- From Free to Proprietary
- Richard Stallman and the Free Software Foundation
- The Rise of Open Collaboration
- Is It Free or Open Source?
- The Future of Free Software
Free software is software that may be modified and redistributed freely by anyone, with no significant restrictions on how the code may be changed, the uses to which it may be put, or the parties with whom it may be shared.
From this simple definition flow many unexpected consequences. Today, free software is a large body of high-quality code on which much of the internet depends for critical functions, and it constitutes the core operating system for an increasing number of desktop machines as well. But free software is much more than just a collection of programs. It is also a political movement, a programming methodology, and a business model--although not necessarily to the same people at the same time. Indeed, even the term free software is controversial; as we'll see later, some people prefer to call it open source software. The story of how free software became so technologically successful, even as it became ideologically fractious, starts in the early days of the computer industry.
In the beginning, most software was free by default--free not only in the sense of "zero cost," but also in the sense of "freedom." The early computer industry was organized mainly around selling hardware, with each company offering its own unique design, incompatible with everyone else's. The customers, mostly engineers and scientists, were encouraged to improve the manufacturer-supplied software, and even to share their improvements with each other. Because hardware was not yet standardized, and since software portability tools such as compilers and interpreters were not yet commonplace, there was little risk of such improvements being useful on a competitor's machine anyway.
But as the industry developed, it slowly standardized on a few basic hardware designs, with multiple manufacturers for each design. At the same time, advances in compiler and interpreter technology made software portable in source code form. (Source code is the set of human-readable instructions that define how a program behaves; to study or modify a program, you need its source code.) With these developments, it became normal to write a single program and expect it to run on different kinds of machines. This had deep implications for the manufacturers: it meant that a customer could now undertake a major software engineering effort without being locked to a particular brand of computer. Furthermore, as computer architectures became standardized, raw performance differences between them got smaller and smaller. Manufacturers realized they would need to distinguish themselves on something other than just the quality of their hardware, and treating software as a sales asset began to make more and more sense.
Thus the era of easy and informal code sharing slowly faded away, and software became a source of proprietary value. People still did share, of course, sometimes legally and sometimes not. But an important mental shift had taken place: unrestricted sharing was no longer the assumed default. One had to first check to make sure it was OK to share, or else share covertly.
In some places, however, sharing was preserved as a standard practice. For example, in universities, the free exchange of information was a cultural norm, and academia was at least partially insulated from the commercial pressures of the computer industry. One such haven was the Artificial Intelligence Laboratory at MIT, where a young programmer named Richard Stallman worked in the 1970s and early '80s. As he later wrote:
We did not call our software "free software," because that term did not yet exist; but that is what it was. Whenever people from another university or a company wanted to port and use a program, we gladly let them. If you saw someone using an unfamiliar and interesting program, you could always ask to see the source code, so that you could read it, change it, or cannibalize parts of it to make a new program.
Around 1980, however, industry trends finally started to affect even the AI Lab. A private company hired away many of the Lab's programmers to work on a proprietary operating system. Since their work would now take place under an exclusive license, they would not be free to share with their former colleagues anymore. At the same time, the AI Lab acquired new computer equipment that also came with a proprietary operating system; the members of the Lab would not be free to examine or change the source code without permission from the company that sold them the machine.
Stallman saw the situation as a stark political choice:
The modern computers of the era, such as the VAX or the 68020, had their own operating systems, but none of them were free software: you had to sign a nondisclosure agreement even to get an executable copy.
This meant that the first step in using a computer was to promise not to help your neighbor. A cooperating community was forbidden. The rule made by the owners of proprietary software was, "If you share with your neighbor, you are a pirate. If you want any changes, beg us to make them."
His response was to resign from the AI Lab and form an independent nonprofit organization, the Free Software Foundation (FSF). Its flagship project would be GNU, a whimsically named but quite serious effort to build a completely free operating system, in which users would be guaranteed the right to study, modify, and share the source code. He was, in other words, trying to re-create what had been destroyed at the AI Lab, but on a worldwide scale and without the vulnerabilities that had led to the AI Lab's demise as a sharing community.