October 21, 2003

Interview: Jon 'maddog' Hall

Author: Robin 'Roblimo' Miller

Jon 'maddog' Hall is chairman and prime speaker for the Enterprise Linux Forum being held this week at the Washington, DC, convention center. The conference is sponsored by a number of Linux-oriented companies, including OSDN. We hadn't chatted with maddog for a while, so this seemed as good an excuse as any. Email interview transcript follows...

Do you expect to see more managers or techies at Enterprise Linux Forum?

This conference is for C*Os who specifically do NOT...

How has the attitude of non-technical managers toward Linux changed in the last year or two?

They have become more accepting, particularly because companies they know and
trust are behind it. Or they have had reports from their staff about successes
in Linux, or that they have read about successes in news articles, etc.

Do you often get asked about Microsoft's many "sponsored" TCO studies that show Windows is a more cost-effective server OS? What do you say in response to those surveys' findings?

I will admit that the TCO studies between Wintel and Lintel are very close.
After all, you are dealing with an operating system that normally has its price
buried in the system price, and often has the office package bundled too, so it
is hard to see what the real prices and costs are. It is much easier to show
the TCO savings between Lintel and SolSPARC, especially when that TCO
is usually based on hardware, software, and (re)training. If the comparison is
like hardware to like hardware and the only difference is the operating system,
I can see where people might think that Wintel has a lower TCO.

On the other hand, those companies that are thinking about changing from
SolSPARC to Wintel should really be thinking about changing from SolSPARC to
Lintel, which would automatically slow down the large growth in Wintel
servers, which today sell at a 42% share of the market, and increase
the share of Lintel servers. This should really be a no-brainer,
because the systems administrators already in place with SolSPARC
would naturally learn Lintel servers faster than Wintel servers. And the
client system users would see no difference.

Now so far we have been talking about TCO based solely on hardware, software,
and (re)training. But TCO is really a lot more than that. It covers things like:

  • flexibility, and how tight a fit you can have between your
    actual problem and the inflexible, closed-source applications that
    you have been given

  • stability, and whether your systems are vulnerable to crashes,
    virus attacks, denial of service, etc., which cut down productivity

I give a whole talk about the various things that cause businesses to lose
money through their software not working the way they want it to work. Then
I ask, "What would happen if each desktop lost $5 per day in productivity?" If
you have 500 desktop systems, this amounts to $2,500 per day times
an average working year of 200 days, or about $500,000 per year.

Now I will admit that we will probably never get the loss down to zero, but
what if we were able to reduce the loss by just $2 per desktop per day by
using Free and Open Source tools and hiring a FOSS programmer to implement
FOSS changes to make the software fit the business better? This would save
$200,000 per year, and probably make for happier employees because the
software would be easier to use.
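The back-of-the-envelope arithmetic above can be sketched in a few lines of Python (a minimal illustration using the figures from the example; the dollar amounts and the 200-day working year are the assumptions stated in the text, not measured data):

```python
# Illustrative TCO productivity-loss arithmetic from the example above.
# All inputs are assumed figures: $5/day loss, 500 desktops, 200 working days.

def annual_productivity_loss(loss_per_desktop_per_day, desktops, working_days=200):
    """Total yearly productivity loss across a fleet of desktops."""
    return loss_per_desktop_per_day * desktops * working_days

current_loss = annual_productivity_loss(5.00, 500)  # $5/day across 500 desktops
reduced_loss = annual_productivity_loss(3.00, 500)  # loss cut by $2/day per desktop
savings = current_loss - reduced_loss

print(current_loss)  # 500000.0
print(savings)       # 200000.0
```

Changing any of the assumed inputs (fleet size, working days, per-desktop loss) scales the result linearly, which is why even a small per-desktop improvement adds up.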

Also, a lot of the TCO studies seem to assume there is no (re)training cost
in going from current Windows desktops to "2003", when there obviously is.

Finally, there is nothing which says that Microsoft will keep the same
licensing practices in the future.

How about security? Now that proprietary software companies,
especially Microsoft, seem to have become conscious of security, is Linux still perceived as more secure than Windows? Is there a difference in that perception between techies and execs? Is the execs' perception changing? If so, in which direction?

I have a friend who has been in the computer security business for
about twenty years. He recently gave up in disgust and went off to raise
horses in Pennsylvania. Did he quit because he was not making money? No. In
a letter to me he stated that he was tired of telling people how to protect
themselves and then watching them not listen to him.

Another friend of mine, and one whose integrity is without peer in my book,
was recently fired by his company because he co-authored a study that showed
that our economy was at risk due to the "monoculture" of Wintel.

I tell people that the only way to win in the security area is through constant
vigilance. That having been said, I believe that Linux is still perceived to be,
and truly is, more secure than Microsoft's products. Microsoft went in with
the presumption of trying to make everything "easy": hence the ability to
launch applications from email automatically, the unprotected address books,
the (ironic) "openness" of some data, which can be used by viruses and
hackers, and the "closing" of other data, which could be legitimately useful.

Microsoft has not understood that just because they decide to drop
support for a particular product, not all of their customers stop using
it, and viruses that cannot penetrate newer versions get a foothold in
older versions still running. This is best illustrated by Office 97, which
has a very large, widely known security hole that has not been patched
because the product is "retired". We are not living in a single-user,
stand-alone environment anymore, as many Microsoft systems were for so long.
A networked environment needs overall security.

Open Source allows this problem to be solved by providing the source code for
all versions of the software. Security patches can be back-ported to as many
systems as are necessary to make the entire network secure again. Microsoft
should either release the source code to these "retired" products, or put in
a time-bomb to kill them when they become "obsolete". Or they should just
face the music and make critical patches available to anyone who wants them,
no matter what the age of the software.


Linux on the desktop: Are you seeing more interest in it? I notice that this forum has a 'front office' track...


Hmmmm, I did not even think of that angle when we named the track, but the
"Front Office" does have to deal with the executives (maybe we should have
named it "back room"?).

On the other hand, I am wildly enthusiastic about the Linux desktop, and I
believe that the years 2003, 2004, and 2005 will finally see the migration of
Linux to the desktop.

Of course, people have been using Linux on the desktop for years. I have not
had any proprietary Microsoft products in my environment for over two years.
Yet even as long ago as 1998, when asked about the desktop, I was saying that
for the average person it would arrive in this time frame.
Having brought out three operating system/hardware combinations while working
for Digital Equipment Corporation, I know that it takes time to build the
application base, support base, customer base, etc. that allow a desktop
to flourish. Linux is now reaching that critical mass.

Projects like Munich, companies like Ball Music, and the emergence of new
programmers and systems administrators from college where they learned Linux
are starting to turn the tide. And the economy has helped. People are looking
to do more, faster, with less.

What about patent and other infringement threats a la SCO? Are you hearing about any potential corporate Linux users pulling back because of this problem?

I have heard about one or two. But then other companies, which are in the
multi-operating-system business and so have no real ax to grind with respect
to Linux, tell me that more and more companies are now moving. I think that
the SCO thing caught people off guard. But the more people think about it,
the more that SCO fails to deliver "the smoking gun", and the more that people
apply business and legal logic to it, the less they fear it.

Recently Silicon Graphics went through the million or so lines of code that
they had contributed to Linux. They had always told their engineers to be
careful in this area, and not to contribute code that might belong to someone
else, but they decided under the circumstances to review it again. Out of all
that code they found approximately 200 lines in one routine that
*might* be construed as belonging to the System V code stream. They then
found two or three places inside the Linux kernel where there were similar
pieces of code, and they replaced the suspect code with that other code.

I have been telling people who ask me about these issues:

  • Either there is code that belongs to SCO or there is not
  • If there is, we have the resources to go into a clean room and
    rewrite it
  • Next question

What can *we* do about patent and infringement threats?

Write to your congressional representatives and senators. I am tired of
patching this; we need to go for the jugular. Patents and copyrights were
invented and are sustained by the government "for the common good." I consider
Free and Open Source software one of the greatest accomplishments "for the
common good" that I have ever seen, and it is being choked by Mickey Mouse
(pun intended) organizations that are trying to make a buck off contributed
code, code that was created "of the people, by the people, and for the
people." And if some person considers this "un-American," please have them
meet Mr. Lincoln.

Either we should bring the laws regarding software into the twenty-first
century so a reasonable software programmer can reasonably respect ownership
rights (or defend them) without unreasonable time and legal costs, or only
large companies will be able to create software in the future. It becomes
too complicated otherwise.

I believe that "the common good" in a world of world-wide software development
cannot tolerate software patents, needs a shorter time frame on software
copyright, and needs an easier method of determining software ownership or
non-ownership.

Do you feel there's any chance of a single company -- SCO, IBM, Microsoft, whomever -- ever dominating Linux development?

There is always a "chance" of something happening. There is a "chance" that
the moon will be hit by a meteor and come crashing into the earth, but we
live with it. In business we call this "risk", and it is managed along with
everything else.

The good news is that the GPL is an effective force against any company (or
even group of companies) controlling Linux. If the Linux community felt
that this was happening, they could fork off a different code stream.

But I also think that a lot of these companies (with the possible exception
of a large, Redmond-based company) see no reason to try and "dominate" Linux
development. They learned their lesson from Unix (which, in their defense,
started in a different time and under different circumstances), and they
see that they can gain more of what people want by working together, at lower
cost, than they could by working apart. Ironically, I think that Microsoft
helped to teach them this lesson. Companies shipping Windows or WNT only
had to provide a few engineers to develop device drivers and boot-path
support, not the hundreds of engineers that it took to develop HP/UX or
AIX, or Solaris, or Digital Unix. And customers who bought Microsoft products
still bought the system vendor's hardware and services.

Now if Microsoft had just maintained a reasonable business manner....

Can volunteer free software organizations with no formal corporate backing (like Debian) remain viable in the face of increasing Linux commercialization and corporate use?

Absolutely. As I go around the world I find that more and more people are
discovering and using Debian. Non-commercial distributions like Debian are
under no compulsion whatsoever to put their distribution out "before its time."

There is also room for distributions that are "bleeding edge", that bring out
features that might be considered to be "unsupportable" by mainstream
distribution groups. I encourage this type of distribution: while
"mainstream" distributions are concerned with "stability" and have longer
update cycles, "bleeding edge" and "specialized" distributions can move
quickly. I especially like the distributions based on Knoppix, some of which
are used to create Beowulf systems, some to create audio/visual collections,
etc. I think these are kind of "cool", and I enjoy trying them out.

Where's Linux International going these days? Any plans to update the Web site and make it more useful? Or has much of LI's advocacy role now been assumed by OSDL?

OSDL has mostly been a technical organization, with limited "advocacy" roles
to date. LI and OSDL work together in a lot of areas, and we are investigating
ways of working even closer.

