July 17, 2003

A centralized server architecture could be the killer Unix app

By Paul Murphy

If you're like me, you probably use Linux or Unix at home and then go to work
and wonder why all the people struggling with Microsoft products don't just upgrade to
Linux. Despite all the cost and security issues, business's commitment
to Microsoft's Windows desktop products shows few signs of waning. The key to turning this situation around may be to place the user's perspective above that of the systems staff.

Ask people why they make the systems decisions they do, and you'll get some interesting answers. Unix and Mac users usually talk in terms of the available technologies and the requirements they're addressing with their chosen platform. A primary driver is something like functionality,
sufficiency at lowest cost, or the opportunity to contribute to a community.

Ask the average Windows user or corporate CIO the same question, however,
and the primary behavioral determinant usually turns out to be some variation
on "I'm doing what everyone expects, because it's what everyone is doing."

It doesn't take Mr. Spock to realize that making systems choices without
considering either the technology or the requirements to be addressed
cannot possibly be logical. But logical or not, that is the fundamental attitude those
working toward wider acceptance for Unix ideas, including open source and
network computing, are up against.

Back in the mid-'80s, when the Macintosh was first introduced,
no reasonable individual could objectively weigh the merits of the Mac against those of the PC
and decide in favor of the PC. What appears to have happened to the Mac then, and seems
to be happening to Linux today, is that the better product succeeded
when users made technology decisions based on
their own needs, but failed when organizations allowed those
decisions to be heavily influenced by systems professionals.

In 1984 users bought Macs but organizations bought IBM. Today technical users
install Linux but organizations buy Microsoft. The products have changed
but the underlying behavior hasn't, and it is this difference between individual
and organizational behavior that now blocks the widespread acceptance of
desktop Linux.

Part of the reason may be found in an odd phenomenon: individual decisions tend to
be prospective, meaning heavily influenced by the expectation of value
to be received, while organizational decisions tend to be
retrospective, meaning heavily influenced by the vendor's track record. (This
is why authors like Tom Clancy and J. K. Rowling, for example, collected rejection
notices from a string of major publishers before getting their first books into print.)
Thus organizations then bought the IBM PC/AT and now buy Microsoft Windows while
individuals bought MacWrite and MacDraw then and OpenOffice.org products now.

How can organizations break out of this self-perpetuating loop? Only through application of
an external force.

Outside of the United States Linux is getting that extra kick from
national economic policy. Governments in
countries like Germany and China are pushing Linux mainly because it isn't American,
while India and Japan push it in an effort to seize national economic advantage
in computer services. Obviously, however, those forces not only don't apply within the US,
but represent long-term competitive threats.

Inside the United States what's needed is a true killer application for Unix, something that
is so obviously preferable to what came before that it can overcome organizational inertia and drive
significant cultural change.

The replacement of the Microsoft client/server architecture with the Unix business architecture is
a candidate for the role of that missing killer application.

In contrast to the "one man, one computer" fragmentation inherent in the Microsoft client/server, a properly implemented
Unix architecture tends to extend the reach of the individual by providing the
coordination and focus needed for large groups of people to pursue a common goal.

Unfortunately, the fact that it's possible for Unix to draw organizations together
doesn't mean that it usually does. On the contrary,
that requirement for decentralized control makes the Unix business architecture
anathema to traditional data center managers
and almost completely unintelligible to Windows people struggling to achieve productivity
and uptime through desktop lockdown and server centralization.

What's a smart display?
A smart display is a terminal with a big screen and a powerful graphics engine. It typically runs Java/OS, an X server, or Display PostScript, often concurrently. Notice, however,
that applications run only on the server, not the client. This
is not a diskless workstation (a.k.a. a Microsoft thin client); there is no
double licensing, no Microsoft security issue, no Windows server, and no local
application processing.
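The remote-application model described above is ordinary X11 remote display: the terminal runs only the X server, and pointing an application's DISPLAY at it is all that's needed. A minimal sketch (the hostname is hypothetical, not from the article):

```shell
# On a smart display the X server runs locally; every application
# executes on the Unix host and paints its window over the network.
# "display42" is a hypothetical terminal name used for illustration.
export DISPLAY=display42:0
# xclock &    # would run on the host but appear on display42's screen
echo "$DISPLAY"
```

The same mechanism underlies XDMCP-managed terminals such as the IBM Network Station mentioned below.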

From the user perspective a smart display offers fast, high-resolution, big-screen graphics
with no noise, no heat, and a reasonable expectation that it'll go 300,000 hours
between failures.

Reliability is a big part of the appeal of these devices to systems
management staff too, but the best thing about them is the exact opposite of
what you might expect: the range of software, inherent security, and easy updateability
of the Unix host environment mean that smart display
users can be given greater desktop control and greater
application flexibility than is possible at any practical cost with the Microsoft client/server architecture.

The 1998 IBM Redbook "IBM Network Station - RS/6000 Notebook" by Laurent Kahn
and Akihiko Tanishita provides an excellent, if somewhat dated, introduction
to the technology, covering setup, operations, benefits, and typical business
deployments of smart display terminals.

As a result of this management mismatch, most Unix
installations in business are fundamentally mismanaged, and there are currently
very few business success stories for the Unix/smart display architecture.

Sun itself provides one example of a success in the making. According to
a recent presentation by Bill Vass, Sun's CIO, about running Sun on Sun,
he now has around 25,000 smart displays installed and is starting to see
significant benefits.

To demonstrate the benefits this architecture offers users, just turn on a smart display, log in,
and show users that the applications they care about work instantly and without any of the
frustrations -- file losses, reboots, help desks, multiple sign-ons, noise, having to
relearn the OS every few months, cantankerous application clients, viruses, PC
networking, and so on -- that characterize the business PC.

It's almost equally easy to demonstrate some direct cost savings to organizations. Just talk about things like reduced capital costs, reduced maintenance, and the benefits of having a single
point of service for software and file management; then discuss the
elimination of various Microsoft Windows cost sinks including PC networking, the help desk, and
the maintenance organization.

Try to go beyond that, however, and the supporting research hasn't been done. For example, it
is reasonable to believe that organizational returns on major systems investments in things like an
ERP/SCM application are significantly higher with the Unix architecture than with client/server,
but the research needed to support or debunk the idea has not been done.

Big numbers, little evidence
One line of reasoning holds that the inherent complexity of the Windows architecture
requires the company to operate a PC help desk and thus to put first-line
application support into the hands of people who know the infrastructure but don't
know the user's job or applications.

Replacing the client/server architecture with the Unix approach eliminates most of
the ambiguities in problem diagnosis because things that aren't there can't break.
That eliminates the need for a help desk and allows companies to have lead users provide
application support to their peers,
thereby removing both cost and social barriers to user experimentation and learning.

The resulting productivity increase flows directly to the bottom line. For a company adding
a billion dollars in
value to its annual inputs, a 1% change amounts to a $10 million increase
in funds available for shareholder distribution or re-investment --
easily dwarfing all other systems-related cost/benefit considerations.
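The arithmetic behind that claim is simple to check (the figures are the article's illustration, not measured data):

```shell
# A firm adding $1B of value annually captures $10M from a 1% gain.
value_added=1000000000        # annual value added, in dollars
freed_funds=$(( value_added / 100 ))   # a 1% productivity improvement
echo "$freed_funds"           # 10000000
```
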

The critical issue with Unix for business isn't technology but management.
The key component is a deep commitment to combining decentralized control with
centralized services.
Real long-term productivity gains don't come from cash savings on software, desktops, or
staffing, but from peeling away layers of complexity and risk to let people do
their jobs as they want to do them. Unfortunately that's also where the biggest risks with
a Unix/smart display architecture are: if systems
management tries to use the system to seize control of
business processes -- in effect trying to recreate
the old mainframe terminal days -- everyone loses. Think of Unix
as a central information switch and you
can see the danger: put the wrong people in control of it, and they'll strangle the company.

To counteract that, you have to set things up to give
users genuine control of their applications and computing environment. You arrange for
things to stay that way by educating users to exert a balancing force against IT's
centralizing tendency. The best way to truly educate users is to get them to adopt Linux or BSD at home.
A knowledgeable user base can counter any attempt by the IT people to
pull off the kind of power grab represented by desktop lockdown and related Microsoft client/server
strategies.

Though the way to better productivity is through a different organizational model rather than reducing hardware or software costs, there are some clear lessons one can learn by running the numbers. Next week we'll look at the differences in demonstrable costs for the two architectures.

Paul Murphy wrote and published The Unix Guide to Defenestration. Murphy is a 20-year veteran of the IT consulting industry.
