One of the key accomplishments of Linux over the past 25 years has been the “professionalization” of open source. What started as a small passion project for creator Linus Torvalds in 1991 now runs much of modern society, creating billions of dollars in economic value and bringing companies from diverse industries around the world together to work on the technology.
Hundreds of companies employ thousands of developers to contribute code to the Linux kernel. It’s a common codebase that they have built diverse products and businesses on and that they therefore have a vested interest in maintaining and improving over the long term.
The legacy of Linux, in other words, is a whole new way of doing business based on collaboration, Jim Zemlin, Executive Director of The Linux Foundation, said this week in his keynote at LinuxCon in Toronto.
“You can better yourself while bettering others at the same time. Linux has proven it…,” Zemlin said. “Sharing is a higher purpose – it matters. That’s the magic of open source. That’s what this movement, and Linux in particular, has accomplished… billions of dollars are being invested into a future that’s based on sharing.”
Linus Torvalds and his fellow Linux kernel developers have blazed the trail for other open source projects, which are now defining the future of whole new technology ecosystems. Automotive, software-defined networking and storage, the Internet of Things, and many other areas are now being built around open source technologies.
“Linux has put open source on the map. We have shown it’s a viable development model,” said Olaf Kirch, vice president of Linux Enterprise R&D at SUSE, in his keynote at LinuxCon North America last week.
We talked with a few Linux kernel developers at LinuxCon about what it was like for them to see Linux evolve from a small effort of volunteer contributors into a large, professional project. Here are their stories. You can also read more about Linus Torvalds’ reflections on the past 25 years.
Theodore “Ted” T’so, staff engineer at Google and maintainer of the ext4 Linux subsystem
Ted T’so has been involved in Linux kernel development since the summer of 1991, with the 0.09 kernel. At the time, the only way to download Linux was over a rather slow connection from an FTP server in Finland. So, Ted, who was the first North American kernel developer, set up and hosted the tsx-11.mit.edu FTP server, which was his personal desktop.
“I worked at MIT during the very early days, and lots of software packages were distributed on Usenet,” he said. People were accustomed to the collaborative model because of projects like Perl and through Usenet. It was common to send patches to the author, and people were familiar with the GNU, BSD, and MIT licenses.
No one gave it a name, he said; it was just the way people did things.
T’so likened the success of the collaborative approach to the “stone soup” model, wherein everyone has their own interests. If you’re writing software for yourself, he said, it’s much simpler. These days, you may have a product manager and a features expert trying to figure out what users want.
In the early days of Linux, much like with emacs, vi, or Perl, developers were writing software for themselves to solve their own problems. In essence, he said, everyone was their own product manager or user experience expert.
The earliest pieces of code were software that was designed to make the developer’s life easier. Everyone has their own features that they care about, he said. “If the project or the problem is important to you, you’ll work on it.”
He said, “When I was working for MIT and my day job was security software and Linux was just a hobby, I could work on it as much as I liked.” Once it becomes a day job, however, you may have to work on things that bring value to your company, rather than the things that are important to you.
In some enlightened companies, he said, a developer may be able to make the case for the importance of a particular problem, and the company may then allow them time to work on it (Google’s 20 percent time, for example). For T’so, working on filesystem-level encryption became important to Google because it solves some problems in Android. In a case like this, he said, the corporatization of open source is an “adder”; it may allow you to do even more work, and you may even get help.
“You can actually get minions,” he said with a laugh. “You can get people to help, and if it’s already an open source project, then the path of least resistance is to release as open source.”
“If you can find ways to make it in everybody’s interest to collaborate, they will.” Additionally, he noted, if as an open source developer, you can find those points of collaboration, you can be very successful.
Guardians of Process
James Bottomley, Distinguished Engineer at IBM, is the maintainer of the SCSI subsystem, PA-RISC Linux, and the 53c700 set of drivers.
James Bottomley first started contributing to the Linux kernel in 1992, on the 0.99.15 kernel, he thinks. He described submitting his first kernel patch, which was for the Linux NFS client. There was a bug causing certain files to be truncated. Within a week, he and some other developers tracked down the bug, wrote a patch, and sent it off to kernel developer Alan Cox. Now, in contrast, he said, you have to submit a patch through the kernel mailing list. “You can no longer just send it to a developer that you happen to know.”
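The mailing-list workflow Bottomley describes can be sketched with git’s standard patch tooling. This is a generic illustration against a throwaway repository standing in for a real kernel tree; the file name and commit subject are made up:

```shell
# Sketch of the modern patch-submission workflow (hypothetical example).
# A throwaway repo stands in for a real kernel checkout.
set -e
workdir=$(mktemp -d)
cd "$workdir"
git init -q demo
cd demo
git config user.name  "Demo Dev"
git config user.email "dev@example.com"

# A made-up one-line fix, committed with a kernel-style subject line.
echo "/* hypothetical fix */" > fix.c
git add fix.c
git commit -qm "example: fix hypothetical truncation bug"

# Turn the commit into an emailable patch file. On a real kernel tree you
# would then find the right recipients with scripts/get_maintainer.pl and
# send the patch to the relevant list with git send-email, rather than
# mailing a developer you happen to know.
git format-patch -1
```

The result is a `0001-example-fix-hypothetical-truncation-bug.patch` file formatted for email review on the list.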
Bottomley said the attraction of open source for many developers was that it freed them from dependence on the proprietary process. For many developers, he said, problem-solving is the interesting bit. The ability to work on problems that you have a personal interest in stimulates the open source ecosystem.
In the 1990s, developers and contributors came to understand the open source process by doing, not by theorizing. In the early days, developers were hands-on techies. Now they’re guardians of process, Bottomley said.
A Career Prospect
Rafael Wysocki, Linux core ACPI and power management maintainer and a software engineer at Intel Open Source Technology Center.
Rafael Wysocki was a Linux user for 10 years before he started to contribute. He was teaching programming at a university in Warsaw, Poland, at the time and felt that if you were teaching something you should also be a practitioner. So he sent his first patch in 2005, which was merged by Dave Miller.
“I had a 64-bit laptop and I wanted to be able to suspend it, not shut it down and reboot,” he said. “I decided, why don’t I fix that for myself? So I started work on hibernation.”
He had the idea that maybe some day he could do it full time, but didn’t expect that one day he would be maintaining a subsystem of the Linux kernel.
“And it’s not just one. I maintain a few of them now,” he said.
When he first started contributing to the Linux kernel, “everyone was skilled enough to look at the code and fix the problems they saw on their systems. I could fix 64-bit hibernation without knowing a lot about the kernel.”
Today it’s much harder for a developer to start this way. “Now people are getting involved through their jobs,” he said. “Today you first get a job with a company that happens to work on the Linux kernel and that way you get involved in the community.”
Collaboration Is Key
Mimi Zohar, linux-integrity subsystem maintainer and member of the Secure Systems Group at the IBM T.J. Watson Research Center
Mimi Zohar was working on firewalls at IBM in the mid-1990s when the company moved to Linux from AIX. Then in 2004 or 2005 she started working on what’s now called the integrity subsystem of the Linux kernel.
At the time, there was only one Linux security module (LSM), SELinux. IMA (the Integrity Measurement Architecture), which was limited to taking file measurements, was being developed by a colleague. And EVM (the Extended Verification Module), which Zohar was working on, verified both file data and file metadata integrity.
“I inherited IMA and ended up upstreaming it first,” she said. Verifying file data integrity was subsequently upstreamed not as EVM, but as IMA-appraisal. EVM was eventually upstreamed, but was limited to verifying file metadata.
The first piece of code that she wrote for Linux was upstreamed in 2009 and now she maintains EVM, trusted keys, encrypted keys, and IMA.
“A lot’s happened with security since I first started working on Linux,” Zohar said. “Today there are three major LSMs and a couple of minor LSMs. It will be interesting to see new use cases for the minor LSMs.”
Because of the interconnected nature of the Linux subsystems, it can still be challenging to upstream patches.
“By being in LSM, you have hooks in all the other maintainers’ subsystems. It’s not just your own subsystem you’re modifying. So you have to get everybody’s approval,” Zohar said. “The last set of patches I upstreamed, I think, touched six different subsystems. Everybody is busy. So getting others to review patches isn’t always easy. The key is collaboration. In this case, with Luis Rodriguez and Kees Cook’s help, we were able to upstream these patches relatively quickly.”
Learn more about who is contributing to the Linux kernel, its pace of development, and much more in the 2016 Linux Kernel Development report. Download the report now!