Real World Linux 2004, Day 2: Keynotes

Author: David 'cdlu' Graham

TORONTO — Real World Linux 2004 Conference and Expo is under way this week at the Metro Convention Center in Canada’s largest city, and NewsForge is there. Day 2 of the conference saw a lot more people and a lot more happening. (Go here for a report on Day 1.)

Wednesday’s first keynote featured Novell’s Nat Friedman, who talked for well over an hour.

Friedman’s address began with a series of screen shots providing all longtime Linux users present with flashbacks to days gone by. The first was of X with its internal default background and the minimal twm window manager, an analog clock, a calculator, and a calendar. This he called “Linux Desktop: 1992.”

Next up, “Linux Desktop: 1995” showed tremendous progress with the inclusion of an xterm running the elm mail client, a digital clock, and the fvwm window manager, which offered actual functionality and support for multiple virtual desktops. It also had the Netscape Web browser.

“Linux Desktop: 1997” showed new progress with the addition of an early version of the GNOME desktop environment and a spreadsheet written, as Friedman pointed out, by Miguel de Icaza. By 1997, he noted, even xterm was configurable.

The slide of “Today’s Linux Desktop” showed OpenOffice.org and Evolution. OpenOffice.org, he mentioned, is a descendant of StarOffice, which Sun bought a few years ago from Marco Boerries and released to the public, complete with 6.5 million lines of German-commented code.

The slides did a very good job of making the point that Linux has made significant progress over the last several years.

Linux’s Zeitgeist is good

For the next part of his presentation, Friedman discussed Google’s Zeitgeist, the company’s weekly release of statistics on search terms, the operating systems behind searches, and other search engine data. In the Google Zeitgeist, Linux is listed as the operating system of 1% of the Web browsers that connect to Google. Curious for more information, he contacted Google to find out whether the 1% was exact or rounded. He was told that it is just shy of 1.5%, and should soon jump to 2%.

By and large, Friedman told the packed room, Linux adoption is coming from the Unix market. Linux systems are more cost-effective than their Unix counterparts, and it only makes business sense to switch to Linux from the old Unices. Linux’s adoption rate (the percentage increase in new users), he added, is faster than Apple’s.

He went on to list a number of Linux adoption success stories, including Largo, Fla.’s 900 users; Sao Paulo, Brazil’s 10,000; Munich, Germany’s 14,183 Linux desktops across seven departments at the city government level; Spain’s 300,000; and Thailand’s estimated 1 million Linux desktop deployments.

In Spain, he explained, the school districts in two regions aim to have one Linux computer available for every two students. It has reached the point there that students ask to use “the GNU/Linux” rather than “the computer,” and parents ask to use Mozilla rather than the Web. Running Linux instead of a proprietary operating system on those 300,000 computers also means the Spanish government can keep the money it is spending inside the country by contracting Spanish citizens to do the work. Friedman pointed the audience to linex.org for more information about the Spanish deployment, and Thailinux.org for the Thai deployment of a million Linux computers.

Friedman told of a Brazilian bank, Banrisul, that has adopted Linux on all its ATMs and is so proud of the fact that every ATM displays an image of Tux in the bottom left corner.

The city of Largo has converted its fleet of computers for all aspects of civil administration to Linux, using thin clients (computers that boot off the network and run all their software from a central server rather than operating independently), right down to the displays in its emergency vehicles, which use CDMA cellular wireless Internet connectivity.

The city of Largo, he told us, buys its thin clients on eBay for as little as it can pay and keeps spares stockpiled in case one fails, a far cheaper approach than using conventional computers.

Friedman comes from Ximian, the company founded by creators of the GNOME desktop environment. Ximian was recently bought by Novell, and Novell is a major sponsor of Real World Linux 2004; its logo is plastered everywhere, right down to our ID badges at the conference. It was only natural that he’d talk at least a little about his new employer.

Novell, he said, is adopting Linux on an accelerated schedule. The goal is to have it deployed on a full-time basis (i.e., no dual-booting other operating systems) across 50% of the company, roughly 3,000 desktop systems, by October 31, 2004. At present the company is up to about 1,000 installed Linux desktop systems.

The major barriers to corporate adoption of Open Source Software, Friedman said, are:

  • usability
  • application availability
  • interoperability
  • management and administration

Problems more ‘perceived than real’

His basic take on the problems is that they are more perceived than real.

For usability, he argued that users learn patterns in whatever they do, and grow used to them. The key to solving the problem is pursuing intuitive and robust applications: applications should make logical sense to the people using them, and should not crash.

Apple, he said, created the “Apple Human Interface Guidelines,” which inspired the creation of the GNOME Human Interface Guidelines. The guidelines spell out, he said, down to the pixel, how things should look and work to give the best user experience.

One ease-of-use and intuitiveness problem he demonstrated was the “Apply” button found in many configuration programs. He showed a simple change in “gconf” that resized icons without any press of “Apply.” What real-life analogy does the Apply button have? he asked.

In real life, he said, you don’t pour water out of the pitcher and then press “Apply” to have it show up in the glass.
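
To make the instant-apply idea concrete, here is a minimal sketch, assuming the GConf tooling of the era, of changing a setting from a script. The key path /apps/example/icon_size is hypothetical, invented purely for illustration; gconftool-2 was the standard command-line client for GConf.

# Minimal sketch: writing a GConf key takes effect immediately in any
# application watching that key; there is no separate "Apply" step.
# The key path below is hypothetical.
import subprocess

def set_icon_size(pixels):
    subprocess.run(
        ["gconftool-2", "--type", "int",
         "--set", "/apps/example/icon_size", str(pixels)],
        check=True,
    )

set_icon_size(48)  # watchers pick up the new size at once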

Duplication of effort, he said, is an inherent fact of open source development. It is not, however, a problem, unless and until two projects reach the point of specialization where they can no longer be sanely reintegrated. KDE and GNOME, he said, are not at that point yet, but do risk getting there.

Code duplication is useful to the community, he argued, because it means every permutation of a solution to a problem will be tried. The best one can then live on.

After Friedman’s presentation, I found myself at Dr. Jasper Kamperman’s presentation on a comparison of open source versus proprietary code quality.

Kamperman works for Reasoning Inc., which analyzes source code for otherwise virtually undetectable, or at least hard-to-find, problems such as uninitialized variables, memory leaks, and resource leaks, depending on the language.
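
To give a flavor of one class of defect such tools hunt for, here is a toy sketch, in no way Reasoning’s actual analysis, that flags a variable read inside a function but never assigned there, using Python’s ast module:

# Toy illustration only: flag names read inside a function but never
# assigned in it (and not parameters or builtins). Real analyzers are
# flow-sensitive and catch far more, e.g. that 'x' below is assigned
# on only one branch.
import ast
import builtins

SOURCE = '''
def handler(flag):
    if flag:
        x = 1
    return x + y   # 'y' is never assigned anywhere in handler()
'''

def report_unassigned_reads(source):
    for func in ast.walk(ast.parse(source)):
        if not isinstance(func, ast.FunctionDef):
            continue
        # Pass 1: collect parameters and every name the function assigns.
        assigned = {arg.arg for arg in func.args.args}
        for node in ast.walk(func):
            if isinstance(node, ast.Assign):
                for target in node.targets:
                    if isinstance(target, ast.Name):
                        assigned.add(target.id)
        known = assigned | set(dir(builtins))
        # Pass 2: flag loads of names that are neither assigned nor builtins.
        for node in ast.walk(func):
            if (isinstance(node, ast.Name)
                    and isinstance(node.ctx, ast.Load)
                    and node.id not in known):
                print(f"{func.name}: '{node.id}' is read but never assigned"
                      f" (line {node.lineno})")

report_unassigned_reads(SOURCE)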

Jasper Kamperman explains his comparison study between open and closed source software.

Due to an NDA he could not give us the name of the proprietary software he used for the comparison, so it is anybody’s guess what it was. Without knowing which product was analyzed, it is difficult to judge how meaningful the comparison is.

For his study, he compared the Linux kernel 2.4.19 TCP/IP stack with an unnamed commercial TCP/IP stack. Apache, Tomcat, and MySQL were likewise compared to commercial implementations of the same types of programs.

His fundamental conclusion was that at the development stage, open and closed source software are roughly comparable in the number of errors per thousand lines of code, but open source software tends to have fewer errors per thousand lines by the final release. He presumes that this is due to peer review.
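
For reference, “errors per thousand lines of code” (defect density) is computed straightforwardly; the figures below are invented for illustration, not taken from Kamperman’s data:

# Defect density: defects per thousand lines of code (KLOC).
# The numbers here are made up for illustration.
def defects_per_kloc(defects, lines_of_code):
    return defects / (lines_of_code / 1000.0)

print(defects_per_kloc(21, 60_000))   # 21 defects in 60 KLOC -> 0.35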

Executives recognizing value of Linux

In the afternoon, Anne Lotz-Turner of CATAALLIANCE moderated a panel consisting of Ross Button, VP of emerging technologies at CGI; Joseph Dal Molin, president of E-cology Corporation; and Jon “maddog” Hall, executive director of Linux International.

Lotz-Turner opened the panel by discussing a survey in which 60% of those queried were corporate executives. Of that group, she told the audience, 13% did not include open source software as part of their long-term strategy. Fifty-five percent acknowledged that their company uses open source software for something, ranging from a task as mundane as a Web server to running the entire company. Thirty percent said they had made a conscious decision to use open source software.

Survey respondents told the surveyors that their key factors in deciding what software to use are, in order:

    1. Reliability
    2. Performance
    3. Price
    4. Security
    5. Interoperability

Open source, she noted, is strong in all those categories.

The problems, the respondents said, were:

    1. Intellectual property concerns
    2. Time-consuming to research open source options

After her fast-paced introduction, she asked Hall to start the discussion by answering the question: What are the biggest obstacles to open source adoption?

His answer was to the point: Inertia.

Companies, he said, aren’t using open source because the applications they want for their specific specialized purposes are not supported under Linux. The companies that make those applications don’t want to offer them under Linux or other open source operating systems because no companies are using those systems. It’s a vicious circle.

Linux, Hall said, first gained fame from supercomputer clusters running it and later from embedded systems. Linux has conquered most markets, but the desktop battle is closer to the final frontier than the first fight.

Before 1980, he told the audience and fellow panelists, companies hired other companies or individuals to write software for their purpose. If the software wasn’t adequately functional or documented, the contracted company simply wasn’t paid. In so doing, companies had tight control of their software.

Since 1980, companies have taken to using prepackaged software programs, and have forfeited that control.

Ross Button, Joseph Dal Molin, and Jon “maddog” Hall are introduced by moderator Anne Lotz-Turner.

He described the pre-1980 system as the first wave in software development, the packaged sets as the second wave, and open source software as the third wave, the one that is now starting to wash over the industry.

Dal Molin was asked by the moderator why he thought Canada has been slow on the uptake of open source software.

Dal Molin’s answer was as simple as Hall’s: inertia.

He focused more on mindset, though, and described open source not as a product, but as a paradigm. Open source, he said, has been around a long time under other names. Peer-reviewed medicine and science have been around a lot longer than open source as we know it, and peer review is a tried and true way to develop medical technology.

Companies and organizations need an environment to explore open source that is free of the propaganda and FUD (fear, uncertainty, and doubt) so prolific on the Internet, he said. Companies need to experiment with open source, and they will find that they benefit, Dal Molin said.

Ripple effect of Y2K

When Button’s turn came up, he discussed the ripple effect of the Y2K upgrade craze. Companies the world over, he pointed out, spent a lot of money and time upgrading their computer systems and looking over ancient but still usable code. That was five years ago, and many companies are now at a point where they are seeking to upgrade.

Sixty to seventy percent of corporate IT budgets, he said, go to maintaining existing infrastructure. The balance can go to researching and purchasing new equipment.

The moderator returned to Hall and asked him what should be done to overcome the inertia he had talked about earlier.

His response was that companies should not worry at first about how hard a transition will be or how much retraining will cost, but should look at what they already have. Many companies will find that employees in some offices are already running Linux, and some may be running it at home. At larger companies, many employees may predate the move to Windows and still remember the old Unix mainframes. In short, companies need to figure out what knowledge their employees already have.

Instead of converting existing projects over to Linux, he suggested, companies should have new projects use it. People do not have to be retrained if they’re trained on Linux in the first place.

A member of the audience and the panel discussed the idea that companies that open up their proprietary source code will get more in return: the cost of maintaining that code internally can exceed the cost of releasing it publicly and having the community at large assist in its maintenance.

Another member of the audience asked for the panelists’ opinion on the ongoing fight between SCO and the Linux community.

Hall responded that adoption was really not being affected. With large corporate backers like HP and IBM ignoring the threat, smaller companies are following along on the assumption that if the threat had any merit, those large companies would be acting differently.

Calgary tells of its experience

Immediately following the panel discussion, another keynote presentation started upstairs. This one was presented by D.J. Coppersmith of Hewlett-Packard Co. and Dan Ryan of the City of Calgary, Alberta.

Coppersmith’s section of the presentation was little more than a buzzword-infested pitch for HP products and services, supported by a professionally made PowerPoint deck. His purpose was to convey how wonderful HP was for helping the city of Calgary convert to Linux.

Ryan’s presentation was a little more interesting. The city of Calgary, approaching 1 million in population, is in the process of converting to Linux, and generally is very Internet-aware. Ryan said that about 85% of Calgarians are on the Internet, 62% of them using it on a daily basis. The city has also grown very quickly over the past several years, but the IT budget, not being of political importance to the city council, has not grown with it.

Calgary started a series of pilot projects to convert its old Unix servers to Linux on x86 hardware. The pilots went so well that the city went ahead with plans to switch to Linux. Ryan said that it was a done deal, that there was no going back in the foreseeable future.

D.J. Coppersmith and Dan Ryan take questions following their keynote address.

Linux has allowed the city of Calgary to reduce the number of servers it needs; lower hardware, licensing, and maintenance costs; and improve performance on the city’s database systems, on the order of a 200% to 600% improvement over the old systems.

Ryan said that processes that used to take 60 hours on their 8-CPU Unix servers could be completed in only 13.5 hours on their two-processor Intel systems running Linux.
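
Those two figures line up with the improvement range quoted above, as a quick back-of-the-envelope check shows:

# Back-of-the-envelope check of the quoted batch-job numbers.
old_hours, new_hours = 60.0, 13.5
speedup = old_hours / new_hours       # about 4.4x
improvement = (speedup - 1) * 100     # about 344%, within the quoted 200%-600%
print(f"{speedup:.1f}x faster ({improvement:.0f}% improvement)")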

The only hiccup they encountered was that they needed to upgrade from Oracle 8i to Oracle 9i, but they took advantage of the opportunity to downgrade their license to a per-CPU standard license instead of a more complex enterprise-level license.

One of the key factors he credited for the success of the move was the involvement of city employees in the migration. By drawing on their input, city administrators found that morale was high and more could be accomplished.

The switch from Unix to Linux is already saving the city of Calgary $500,000 per year from its IT budget, Ryan said. The result was that old computers, not employees, got laid off.

More tomorrow.