Peer to Peer was Here
I was wondering why Intel was slammed so hard by people in the P2P field, when Tim Berners-Lee got away with creating the W3C with pretty much the same structure. (Very rarely has anybody grumbled about the W3C -- and usually just when it wasn't making progress on something.) My answer to this question centers on the context in which the W3C started, versus the current P2P field. When Berners-Lee proposed the W3C, Netscape had ravaged Mosaic (through superior technology, to be sure) and emerged as the ferocious lion dominating the WWW savanna. While Berners-Lee's organization was slanted toward heavy-weight corporations, that was widely seen as the only group that could cage the lion and save the Web from a looming monopoly.
By contrast, the P2P arena is completely open; there's no way to tell what the relationships are among the players or who will win. People want some coordination and standardization, but they're not going to put up with the faintest attempt to draw a line and say who's in or who's out. Intel, like Sun with their JXTA proposal, is coming into a very different environment from Berners-Lee. It's also, of course, a different era in computing history, characterized by a triumphant open-source movement.
Microsoft, which was present at the conference, has not stepped forward the way Intel and Sun have, and is probably smart to avoid anything in the peer-to-peer community that people might interpret as embrace-and-extend. It did, however, help promote SOAP as a potential unifying technology between its .NET platform and Web services developed outside the .NET framework.
On Friday morning, John Perry Barlow made a surprise appearance on the P2P conference stage. He offered a number of pungent observations, including his oft-heard maxim that "information is not a noun, but a verb." If physical commodities are nouns and information is a verb, perhaps peer-to-peer is a preposition. It's the ineluctable, nearly invisible line that ties everything around it together. And if you follow that metaphor, perhaps what's really important in P2P is the 2.
Politics at the Forefront
Just when we -- the writers, the researchers, the entrepreneurs -- thought we were getting the point across that peer-to-peer was a critical new technological innovation, BANG!: along came another court decision Feb. 12 with a message of doom for Napster. Once again, just two days before the conference, the newspapers were full of talk about illegal activities as the characteristic trait of peer-to-peer.
Although Napster's name was invoked throughout the conference -- with Clay Shirky giving a stunning keynote at the start of the conference that drew valuable lessons from Napster's popularity -- the music service came up mostly as an example of good social engineering.
Only on the final day did political issues take center stage. The impetus was a rousing keynote by Lawrence Lessig, a professor at Stanford Law School, who generated more energy by far than any other session to date. Lessig immediately showed his deep caring for, and knowledge of, the various disciplines relevant to the social meaning of new technologies. He expertly wove computing history, the technological dilemmas confounding current policy, and, of course, the laws of intellectual property into an exhortation that, "The public should be aware of the extraordinary system of control that is being rammed down our throats in the name of the Constitution."
What we're losing is the public domain, the right to build on the work of others and the subtle mesh of rights that copyright law calls "fair use." These include educational uses, citation for the purposes of commentary or criticism, and (in technology) the ability to investigate a vendor's product in order to hook into it or compete with it.
I've noted that people do care about policy and demonstrate a desire to influence the political sphere, but rarely do they know how to harness and carry through that desire. Lessig provided that focus on the last day of the conference, and this sensitivity to policy issues carried on throughout the day.
A Wrap-up and a Look Forward
Overall, the great thing about the P2P conference was that it gave us all a chance to re-examine our fundamental premises and purpose. Many conferences feature people standing in tight clumps arguing about the best strategy for integrating RMI and COM or some such narrow topic. While we got plenty of that at the P2P conference, we also got questions like: "What can allow people to make the best use of the Internet?" or "What could people accomplish if they could share all the resources on a million computers?" Not in a fluffy or superficial way, but a serious exploration of a serious technical question teeming with social implications.
In a session about successors to Web crawlers, Cory Doctorow of OpenCola, Inc. pointed out that artificial intelligence projects use machine intelligence to aggregate human intelligence. For instance, Deep Blue applied machine intelligence to processing huge archives of chess end-games played by grandmasters. Peer-to-peer is a powerful new model for combining machine intelligence (programmed by innumerable programmers) with human intelligence across the world.
There will never be another peer-to-peer conference like this, the first. But there will be another O'Reilly peer-to-peer conference Sept. 17-20, 2001 in Washington, D.C. The second conference will be totally different from this one, I'm sure, because the field will be different. What each will turn out to mean remains unknown until people show up.
Discuss this article in the O'Reilly Network General Forum.
Return to OpenP2P.com.