eDonkey2000 was one of the oldest still-active peer-to-peer programs, predating the completely decentralized network model. Although it was Napster that blew open the door to P2P, eDonkey popularized several features that proved important to later systems: identifying files by their MD4 hash, which served both as a file integrity check and as a fraud-prevention measure, and the ed2k: URI scheme, which simplified searches by letting users link to files on the network from HTML pages.
eDonkey2000 was among the first (if not the first) to implement both ideas, and a member of the first generation of multi-source transfer applications. All are now staples in file-sharing, and practically a given in newly conceived protocols.
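The ed2k: link format those clients share is simple: it packs a filename, the file's size in bytes, and the hex-encoded MD4 hash into a pipe-delimited URI. A minimal sketch in Python (the function name and the sample values below are illustrative, not taken from any real client or file):

```python
# Sketch of the ed2k: URI layout used by eDonkey-compatible clients:
#   ed2k://|file|<name>|<size-in-bytes>|<hex MD4 hash>|/
# make_ed2k_link is a hypothetical helper; the name, size, and hash
# passed to it below are made-up illustrative values.

def make_ed2k_link(name: str, size: int, md4_hex: str) -> str:
    """Assemble an ed2k file link from its three components."""
    return f"ed2k://|file|{name}|{size}|{md4_hex.upper()}|/"

link = make_ed2k_link("example.iso", 7340032,
                      "31d6cfe0d16ae931b73c59d7e0c089c0")
print(link)
```

Real clients compute the hash over the file's contents (in chunks, for large files); the link itself is just this string, which is why it can be dropped into any HTML page.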
Moreover, all of these features live on in the eDonkey-compatible clones that sprang up as eDonkey2000 evolved from a one-man show into a corporate entity. eMule, xMule, and aMule have taken eDonkey's Multisource File Transfer Protocol and extended it. MetaMachine, on the other hand, shifted its resources back and forth between eDonkey2000 and its Overnet project, fracturing its own market well before the recording company lawyers started their harassment in 2005.
Asked what eDonkey2000's death would mean for his own project, eMule developer Hendrik Breitkreuz replied, in short: nothing. The real question, he thought, was why anyone was still using eDonkey2000, since despite its age and shortcomings it still had plenty of users. Answering his own question, Breitkreuz said, "I suppose there are many reasons: it is a very stable and large network" using a "feature-rich" protocol. "So, if you have a large network, nice protocol, and good software to access it, why would you not stick to it?"
In other words, the value is in the size of the network, not the features of the client. And that might be the key object lesson for open source.
Closed networks vs. open source
As near as I can tell, no open source P2P project has managed to build a large-scale network effect comparable in size to the proprietary alternatives. Gnutella, Freenet, and other open source P2P projects have all spawned smaller, often fractured networks as they compete with each other. I would not put BitTorrent in the same category, because most of the indexing and searching for files occurs out-of-network today.
We see the same thing in other "community-centric" spaces -- instant messaging, voice over IP, and social networking Web services, for example. Where building a large network of users is important, free software has not done as well as the competition.
Certainly venture capital funding for proprietary projects is part of the reason for their higher popularity -- if nothing else, the marketing and advertising dollars bring more people through the door -- but I think their success has more to do with their ability to keep their networks closed, and thereby keep configuration details hidden from the user. Consider the complexity of setting up a SIP client compared to that of setting up Skype. Skype can get away with asking for a username alone because everything else in its protocol and network topology is pre-set and hard-wired. That makes it simpler for new users to understand, and simpler for new users to join, thus growing the network.
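To make that comparison concrete, here is a sketch of the first-run configuration burden each approach imposes. The field names approximate a typical SIP client's account dialog; they are illustrative, not drawn from any one program:

```python
# Illustrative first-run configuration burden: an open protocol (SIP)
# versus a closed network (Skype). Field names and hostnames are
# made-up examples, not any specific client's settings.

sip_setup = {
    "username": "alice",
    "domain": "sip.example.org",       # which provider? the user must know
    "registrar": "sip.example.org",
    "outbound_proxy": "proxy.example.org",
    "transport": "UDP",                # or TCP/TLS -- another decision
    "stun_server": "stun.example.org", # NAT traversal, often mandatory
}

skype_setup = {
    "username": "alice",               # everything else is hard-wired
}

print(f"SIP asks for {len(sip_setup)} fields; "
      f"Skype asks for {len(skype_setup)}.")
```

Every one of those extra SIP fields exists because the protocol leaves the choice open; the closed network erases the choices, and with them the friction.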
Open source, on the other hand, likes to hash out details of the protocols and standards in public, forking and merging, and always with the doors wide open. Most open source advocates believe that this development model results in better code and better protocols. To many of them it would be heresy to suggest building a closed network that locks in users, especially for as callous a reason as building market share.
Can you build a sustainable, "virtuous cycle" networked community on open source code, where anyone is free to start up a competing service? I'm looking for examples. Because as eDonkey2000's obituary reminds us, in the community-centric computing space, open source always seems to finish in second place.