P2P: everything old is new again



By Dan Berkes
During the Dark Ages of the commercial Internet (circa 1996), we were told — make that ordered — by the media, content, and technology companies that we would just love push technology. Too bad none of us really did. Approaching 2001, we’re told that we’ll love peer-to-peer. If it’s so great, then why does the hype sound so familiar?

Push is the future of the Internet! It will revolutionize the way you use your computer to gather information, make purchases, and collaborate with your fellow electronic community members.

No, wait a minute.

P2P is the future of the Internet! It will revolutionize the way you use your computer to gather information, make purchases, and collaborate with your fellow electronic community members.

If you were anywhere near a computer with an Internet connection back in 1997, then surely you remember the hype surrounding push. The whole concept of push was that you shouldn’t have to seek out information, but that information should seek you out.

Leading the charge of the push brigade was PointCast, offering users a screen saver that delivered top headlines, current weather, and stock quotes. In the stampede that soon followed PointCast’s lead, virtually every content service on the Web offered a push version of its site.

Netscape, Microsoft, and a handful of other companies aided and abetted these content publishers, creating clients, servers, and publishing back-ends to cash in on the hype. Even AOL was rumored to be working on a push protocol for its millions of members.

Users would no longer access the Internet from their desktop; the best of the Net would reach them through … wait for it … a Webtop.

Advertisers were almost unable to contain their glee over the prospect of a captive audience — one figure projected $19.1 billion in annual revenue from display advertising and purchases.

In its March 1997 issue, Wired predicted the death of the Web browser, if not the Web itself: “Sure, we’ll always have Web pages. We still have postcards and telegrams, don’t we?”

Over at ZDNet, where webmasters were urged to “get with the push program,” Jesse Berst was a little concerned over which one of the dozens of push programs would become the de facto standard: “The bad news is, we don’t know which ‘broadcast standard’ will be chosen. You don’t want to litter your desktop with a half-dozen ‘tuners.’ (If you hunt around, you’ll find about 30 such products out there already.)”

What about the public?

It seems that precious few of these push puff stories bothered to find out what the average user felt about push. The general consensus back then seems to have been that push was so right and such a natural evolution of network content delivery that everyone would simply embrace the new technology.

When was the last time you used a push client?

After a long and lingering illness, proprietary push finally kicked the bucket. Most say it was a technology before its time — PointCast’s initial clients sucked down so much bandwidth that many network administrators banned its presence on the corporate desktop.

You can bet that most content publishers were less than thrilled about this relationship. Granted, venture capital was still flowing like water, but few companies relished the prospect of forking over licensing fees, buying new technology, and hiring additional support staff simply to publish their content in a different format.

That could be counted as the first corporate Open Source revolt.

The days of Webtops and the fragmented tribes of push technology companies have long since vanished into the past, but push does continue in some ways — in some very Open Source ways, as a matter of fact.

What we might consider push today bears faint resemblance to its ancestors. Instead of information being pushed to your desktop on a pre-set schedule, content from a single site can now be distributed to a network of participating sites, by anyone who feels like doing so, using a technology known as the Rich Site Summary (RSS) format.

RSS was Netscape’s baby, initially developed for the portions of its sites that visitors can personalize. Released to the public in 1999, this openly available format is well on its way to becoming the de facto standard for publishing content across a wide variety of Web sites and distribution platforms — you can even grab an RSS NewsForge feed for your own site if you so desire.
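Curious what the format actually looks like on the wire? Here’s a minimal sketch of reading a feed with Python’s standard library; the feed XML below is a made-up example in the early RSS style, not an actual NewsForge feed.

# Minimal sketch of consuming an RSS feed with the Python standard
# library. The feed below is a made-up example in the RSS 0.91 style;
# a real site would serve this document over HTTP.
import xml.etree.ElementTree as ET

FEED = """\
<rss version="0.91">
  <channel>
    <title>Example Site</title>
    <item>
      <title>First headline</title>
      <link>http://example.com/story-one</link>
    </item>
    <item>
      <title>Second headline</title>
      <link>http://example.com/story-two</link>
    </item>
  </channel>
</rss>"""

root = ET.fromstring(FEED)
for item in root.iter("item"):
    # Each <item> carries one headline and the URL it points back to.
    print(item.findtext("title"), "->", item.findtext("link"))

Any participating site only has to fetch a document like this and render the items however it pleases — no licensed client, server, or publishing back-end required.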

While RSS may not strictly qualify as an Open Source project, the message is clear: Not even a watered-down version of push was able to gain headway until the technology behind it was freely available.

Since the push revolution that never happened finally went away, companies and pundits have been just a little more cautious about endorsing Internet fads, sticking with inevitable safe bets like wireless, broadband, and Linux.

Because everyone yearns to be driving the bandwagon, it was only a matter of time before someone started the ball rolling on the next big thing, and it has arrived with a vengeance.

As Napster was fighting its way through the court system, someone took a look at the number of users attracted to that service and to other file-swapping services like Gnutella and the now-defunct Scour.

This was a revolution, someone declared, and its name shall be peer-to-peer, or P2P for those fond of acronyms.

The Peer-to-Peer Working Group helpfully defines the concept: “Peer-to-Peer computing is a sharing of computer resources and services by direct exchange.”

The Washington Post weighs in with a more complete example, based on Gnutella: “Members of a network using Gnutella software in essence form a search engine of their own that expands its search exponentially. When a Gnutella user has a query, the software sends it to ten computers on the network. If the first ten computers don’t have the file, each computer sends it to ten other computers and so on, until, designers say, an estimated million computers would be looking for it in just five to ten seconds.”
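The arithmetic behind that million-computer estimate is easy to check. Here’s a minimal sketch in Python; the fan-out of ten comes straight from the Post’s description, while the hop limit of six is an illustrative assumption, not a setting from any real Gnutella client.

# Fan-out arithmetic from the description above: each peer forwards
# a query to ten neighbors until a hop limit is reached.
FANOUT = 10  # neighbors each peer forwards the query to (per the Post)
HOPS = 6     # how far the query travels -- an assumed limit for illustration

def peers_reached(fanout, hops):
    # Total peers that see the query: fanout + fanout^2 + ... + fanout^hops.
    return sum(fanout ** hop for hop in range(1, hops + 1))

print(peers_reached(FANOUT, HOPS))  # 1111110 -- the "estimated million computers"

Exponential growth does the heavy lifting: six hops at a fan-out of ten is all it takes to pass the million mark.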

Why is everyone so excited about a glorified file transfer protocol?

Advertisers can barely contain their excitement at the prospect of a world hooked on P2P and an entire Internet force-fed advertising.

Software developers are following right behind with a slate of announced products that will facilitate the entire P2P process from start to finish. The price for companies to buy into that software — if there’s even a market for the commercial products by the time they’re ready — will most likely be substantial.

On the Internet, content is free — unless you’re peddling porn or the Wall Street Journal. Ironic comparisons aside, precious few other mainstream sites have had much luck in getting users to pay up. P2P could be the last hope for quite a few sites that can’t figure out how to wean themselves from total reliance on advertising revenue.

Even staid O’Reilly is getting in on the P2P game, holding “the first and most important conference on P2P” next February in San Francisco.

Time and time again, news articles trumpet the overwhelming success of Napster and Gnutella as proof that P2P is the future of Internet use. There’s enormous focus on the potential of P2P and the standards for the developing technology behind it, but little attention is paid to the current reality of P2P.

Napster and Gnutella owe much of their popularity to the fact that anyone can download, at no cost, music in MP3 format and other files that would otherwise cost money to use. No one knows if a fee-based P2P service will actually work.

The Peer-to-Peer Working Group aims to create a common reference infrastructure for P2P computing, and you’re invited to contribute if you’ve got a spare $5,000 lying around. Those who just want to look around and are willing to keep their mouths shut can get in on the proceedings for as little as $500.

With that sort of requirement to get in on developing a standard, it seems that fragmentation is almost inevitable.

Gnutella is P2P’s Open Source baby, with clients available for almost every popular operating system around. The program officially has no license — it’s the orphan of a project that lawsuit-shy AOL pulled the plug on during its infancy — so some clients have source code available while others do not.

Thanks to its open nature, Gnutella is the current base model most P2P adherents are working with to create something a little more distinctive, a little more specialized — something a company can feel good about burdening with a license.

Before anyone starts selling P2P as a commercial product, the current user community should be taken into consideration. Would anyone have even considered the protocol to be viable if Napster hadn’t been dragged into court? Is it possible that the attraction is for something that has a bit of a forbidden aura to it?

Take a gander at these selling points from the original release notes of Gnutella.

“The primary benefits of Gnutella over existing systems include: No one’s trying to make any money off of this, so you don’t have to tolerate ads or corporate dogma… Distributed nature of servant makes it pretty damned tough for college administrators to block access… Ability to change the port you listen on makes it even harder for those college administrators to block access.”

It seems impossible that any corporation would release an advertising-free product, and it’s always possible that users could learn to live with a few ads standing between them and a file transfer.

The bigger question is: How will these companies get on the good side of network administrators? Napster isn’t welcome on many college campuses, and push dinosaur PointCast was only slightly less unwelcome in its day. No administrator will welcome a commercial product that dodges port blocks and slows down the Internet for the rest of the campus.

Will P2P turn out to be the push of 2001? At this point, the hype and fanfare surrounding P2P smells an awful lot like the hoopla that surrounded push technology. Perhaps if the pundits ask the users what they want this time around, the end result might be different.
