
The Case for Centralized Computing

Dan Feldman offers an analysis of the systems available for centralized computing in a computer lab, and how Linux can fit into them.

Throughout the early history of computing, systems were predominantly centralized. Mainframes, and later UNIX servers, provided the computing might, while inexpensive terminals provided the interface for the users. With the advent of PCs, the role of central servers declined. Today, many organizations are going through a reverse change, gradually centralizing their services and limiting the functionality of client machines.

This movement toward centralized systems provides a potential killer application for Linux. As the only operating system which provides an excellent solution for both high-volume servers and thin clients, Linux has the potential to make inroads in the business and educational markets where centralized systems are increasingly common.

Terminals, Clients and Servers -- Oh My!

First, some definitions. A centralized system is one in which a single server provides most or all of the disk space, memory, and processor time its users consume. A client-server system, on the other hand, relies primarily on the computing power of the users' own machines, possibly storing data in a central location. Of course, there is no firm boundary; the clients of today are far more powerful than the servers of thirty years ago.

Several vendors have attempted revivals of the central server phenomenon. The most famous is Sun's JavaStation, a widely advertised device that could download and run small, custom applications written in Java. Almost immediately, Microsoft retaliated with its NetPC standard, which mostly required minor changes to the design of commodity PCs to make them more terminal-like. Both attempts failed. The JavaStation required a total rewrite of every application in Java, and didn't work well with the Web, which was just becoming a business tool at the time. The NetPC simply wasn't appreciably different from the ordinary PCs already on the market.

Both the JavaStation and the NetPC downloaded each application from the server each time it was used. When the installation of 100 Mbps networks increased the speed of client-server communication by an order of magnitude, Sun and Microsoft realized that terminals could now transfer images of the user's desktop in real time over the network. This let administrators install applications, unmodified, on the server alone, without worrying about network communication among the clients. Microsoft licensed multi-user technology from Citrix to build Windows Terminal Services, which lets inexpensive terminals share a single server, while Sun created its Sun Ray terminals, which receive their graphics from a Sun server.

The idea of transferring graphics commands over the network had been tried once before -- with X terminals. X terminals receive graphics commands over the network, and can even connect to multiple servers at once, but the X protocol is uncompressed, and can't keep up with today's graphical applications.

How Linux Fits In

Linux, as a multi-user system, fits naturally into a centralized infrastructure. Nearly every Linux distribution comes with XFree86, a very good X server implementation, so setting up a machine to run programs from a server requires only a few changes to the X configuration. On the server side, a few lines added to the XDM configuration allow terminals to connect over the network.
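For the curious, here is a minimal sketch of that configuration; the file locations and the hostname server.example.lab are assumptions and vary from distribution to distribution. The idea is simply to enable XDMCP on the server and point each client's X server at it:

    On the server, in /etc/X11/xdm/xdm-config ("!" starts a comment in
    this file), comment out the line that shuts XDMCP off:
        ! DisplayManager.requestPort:   0

    On the server, in /etc/X11/xdm/Xaccess, allow hosts to request a login
    window (a bare "*" allows any host; restrict it in a real lab):
        *

    On each client, start the local X server and query the server:
        X -query server.example.lab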

However, X by itself is not a satisfactory solution. Running a modern desktop environment like KDE will bog down a fast network with only a few users. Though X can be compressed using a local proxy, setting such a system up can be a challenge. X servers are also very large -- up to 80MB -- which doesn't seem appropriate for a supposedly thin client. And, for those concerned with security, X sessions are quite vulnerable to simple sniffing.
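One widely available workaround for both the bandwidth and the sniffing problems is to tunnel X through ssh, which can compress and encrypt the connection. This isn't a true thin-client setup -- each client still runs a full X server plus an ssh client -- but as a rough sketch (the hostname and application are placeholders):

    ssh -X -C user@appserver.example.lab netscape

The -X flag forwards the X connection over the encrypted channel, and -C compresses it.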

There are other ways to use Linux in a centralized system. The VNC system from AT&T Research displays X programs using a very small client. The client runs on nearly any OS, which allows easy integration with existing Windows systems; to run a program on a Linux server from a Windows machine, one simply opens the VNC client and turns the PC into a terminal. VNC is a compressed protocol, and the required bandwidth decreases with every release. The disadvantages of VNC are that it is even less secure than X -- a program exists that can pull VNC sessions off the network and "replay" them pixel for pixel -- and that the server needs a framebuffer for each client connection, which can increase memory requirements considerably.
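In practice, turning a PC into a terminal for a Linux box takes only a couple of commands. A minimal sketch, with an arbitrary display number, geometry, and hostname:

    # On the Linux server: start a virtual desktop as display :1
    vncserver :1 -geometry 1024x768 -depth 16

    # On the Windows (or any other) client: connect to that desktop
    vncviewer linuxserver.example.lab:1

The vncserver script starts an Xvnc server plus a window manager; the framebuffer it allocates is the per-client memory cost mentioned above.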

A Theoretical Case Study

Ultimately, the choice of terminal system comes down to cost. The cost of maintenance, the cost of hardware, the cost of setting up the system, and, for the Microsoft and Sun systems, the cost of software must all be factored into the total cost of a centralized computing system.

My school recently paid $800 a seat for a cluster of Compaq iPAQs running Windows NT. This is a good example of where centralized systems could save thousands of dollars. A lab full of fifty of these systems would cost $40,000, a useful baseline against which to compare the terminal-based systems. (My school actually paid $65,000 for thirty machines, because it bought very expensive monitors for each station and a $15,000 server to store a few hundred MB of student files. For the sake of argument, we're assuming there are monitors to spare and that the individuals designing the lab don't have $30 million in taxpayers' money they need to get rid of.)

The computers in this lab need to:

  • Run a word processor and other simple business programs.
  • Be able to browse the Web.
  • Provide email and 50MB of file storage for each of 1,000 users.
  • Accommodate 50 simultaneous users (that is, all the stations can be used at once).

Our hypothetical lab is owned by an educational institution such as a large high school or a small college, so educational discounts are included below. They usually don't make very much difference in the total prices; in fact, the cheapest solution is made by a company with no educational discount at all.

There are many ways to construct such a lab in a centralized manner. Here are the prices for a few, not including monitors and incidentals:

  • Purchasing a Sun Ray system provides an all-in-one solution. A Sun Ray package of 50 terminals, a Sun server, and the necessary software costs $36,390 with their educational discount, or $737 a seat. Note that to meet our requirement of 50GB of storage space for users, an extra hard drive is needed in the server.
  • The Microsoft system is more difficult to price out. A document on the Microsoft Web site recommends a quad-processor 500 MHz Pentium III with 4GB RAM for 100 users. A 1 GHz Pentium III with 2GB RAM and a 36GB hard drive should handle 50 users nicely, and such a server costs $6,317 from Dell. The server will need three 20-seat licenses for Windows Terminal Services, adding up to $2,364 (including a small educational discount). Finally, fifty Capio Windows Terminals, model 325, cost $12,600. The total comes out to $21,281, or $426 a seat. (If we ignore Microsoft's recommendations and use the same server setup the systems below use, the cost per seat is $493.)
  • Apple iMacs running OS X might soon be a viable terminal solution. Fifty low-end iMacs and a dual-G4 server with 40GB of storage cost $43,500, or $870 a seat. Assuming the monitor built into each iMac would cost $200 separately, the iMacs compare favorably with the Sun Ray.
  • X terminals are hard to find, but IBM sells a device called the Network Station 2200 that includes an X server for $587 (it runs NetBSD, so there are no worries about selling out to a commercial empire). Due to the nature of the X Window System, server requirements vary widely, but it is safe to assume that a desktop environment like KDE or GNOME, plus a Web browser or word processor, would take up about 64MB of RAM per user. A Dell server with the required 4GB RAM and a 1 GHz processor costs $12,080. Fifty stations add up to $41,430, or $828 per station -- actually more expensive than standalone machines!
  • For about half the price of the IBM unit, one could build one's own X terminals running Linux. Fifty $300 generic PCs, plus the server described above, come out to $27,080 -- $541 per seat.
  • VNC terminals are very inexpensive. The New Internet Computer (NIC), a simple network appliance with an 800x600 video card, Linux, and a VNC client, costs $200. The memory requirements are slightly larger than with X terminals, but fifty clients should still fit nicely into 4GB. The whole package comes out to $22,080 -- $441 a seat. Running the NIC as an X terminal is also possible, and would cost a few dollars less.
  • Wouldn't it be nice if, instead of buying obscene amounts of memory on the server, we could make use of some of the 64MB RAM inside each NIC? The NIC reads its operating system off a CD-ROM, and it's trivial to burn a new one with a custom distribution. That means we can burn a Linux kernel to the CD-ROM and have it mount a remote NFS partition as root, then load the desktop environment and other software. A Dell server with 128MB RAM and a 73GB hard drive (10,000 RPM, since 50 clients will access it at once) costs $2,311, so our lab will cost $12,311 -- just $246 a seat. While this might be the slowest solution (each machine is limited to the internal 266 MHz processor), it is by far the cheapest.
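To make the last option concrete, here is a hedged sketch of the diskless setup; the addresses, paths, and export options are assumptions, and the kernel on the CD needs NFS-root support compiled in. The kernel is told to mount its root filesystem over NFS, and the server exports a single read-only tree for all fifty clients plus writable home directories:

    # Kernel command line burned onto the NIC's boot CD
    # (ip=bootp instead of ip=dhcp on older kernels)
    root=/dev/nfs nfsroot=192.168.1.1:/export/nicroot ip=dhcp ro

    # /etc/exports on the server
    /export/nicroot   192.168.1.0/255.255.255.0(ro,no_root_squash)
    /home             192.168.1.0/255.255.255.0(rw)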

Putting It Into Practice

All the terminals in the world are useless without the right software. Most computers in business and education need to do four things: browse the Web, access email, run productivity applications, and access whatever custom databases exist. Browsing the Web and checking email are tasks any OS can handle; on the server, each user in the lab can be given an email account, and a proxy server can speed up Web browsing. Productivity applications do exist on Linux (I'm writing this in WordPerfect over a VNC link, and KDE's KWord is making great strides), but this is one area where Windows machines are still far superior. (For the Sun Ray, the open source word processors and StarOffice are available, but commercial office suites like Corel's are not.) Custom databases, of course, will need to be rewritten to work under a different OS, but usually both Windows and Unix versions already exist in some form. Many companies run corporate databases through an X client that connects to the user's PC, so a Linux system will fit in naturally.
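Handling a thousand accounts sounds daunting, but on a Unix-like server it reduces to a short loop, and local mail delivery gives each new account an inbox for free. A hedged sketch, assuming a students.txt file with one username per line:

    # Create the lab accounts with home directories; lock each one until
    # a real password is assigned
    while read user; do
        useradd -m -c "Lab user" "$user"
        passwd -l "$user"
    done < students.txt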

Usability certainly varies between the different terminal systems. Windows Terminal Server has very good compression -- after all, Microsoft can fine-tune it for its own graphics API -- and the user has access to most Microsoft applications. VNC isn't as fast, but the next version is rumored to incorporate new algorithms to speed it up.

Running the applications directly on the NIC is an attractive possibility because it is so cheap, but the overhead of creating a custom boot CD, burning 50 copies of it, and building the distribution on the server may justify the roughly $10,000 premium of a more server-centric solution. The Windows solution is, typically, extremely simple to set up, though handling the creation and management of 1,000 accounts is certainly easier on Linux. VNC is not trivial to set up -- it often involves patching the X server, setting up XDM, and reconfiguring inetd -- but an experienced sysadmin should be able to do it in a day or two.
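For reference, the usual trick for serving VNC desktops on demand is to let inetd spawn Xvnc, which then queries the local XDM for a login screen. The service name, port, and geometry below are assumptions:

    # /etc/services -- one entry per desktop geometry
    vnc-1024x768    5950/tcp

    # /etc/inetd.conf -- spawn a fresh VNC X server per connection
    vnc-1024x768 stream tcp nowait nobody /usr/bin/Xvnc Xvnc -inetd -query localhost -once -geometry 1024x768 -depth 16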

For some purposes, using terminals is a major advantage. Updating applications is simple, and with VNC or a Sun Ray, users can suspend their desktops and resume them at a different terminal. Sharing files and checking email inboxes become simple when everyone uses the same server. With VNC, a teacher or supervisor can monitor a user and take control of a session; X allows users to run programs on each other's displays.
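That last trick is ordinary X behavior. A quick (and deliberately permissive) sketch -- a real deployment should use xauth cookies rather than xhost:

    # On the student's terminal: allow the teacher's machine to connect
    xhost +teacher-station

    # On the teacher's machine: pop a program up on the student's display
    DISPLAY=student-station:0 xclock &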

Maintenance on centralized systems is mostly a matter of updating the server. The Linux, Windows, and Solaris servers should really only be modified by experts in those systems; after all, a small error could take 50 machines down at once. That's why a Windows shop shouldn't consider a Linux-based network, or vice versa, unless it is willing to hire new system operators or invest in training.

In Conclusion

All the systems examined would save our hypothetical lab at least several thousand dollars. Though the Linux-based solutions are not perfect, they are somewhat cheaper than the Microsoft and Sun systems, and would probably run acceptably in a commercial environment. It is also worth noting that none of these solutions is complete -- in the real world, each station would need a monitor, hubs and switches would need to be purchased, and a backup system would need to be put in place.

Are centralized systems really worth the effort it takes to choose and configure one? No one would recommend a centralized system for a home network with three computers, for a team of programmers who constantly push the limits of the operating system, for a LAN party of Quake players, or for running an air traffic control system. But for applications with a large number of similar workstations running similar, non-graphics-intensive programs, where absolute reliability isn't necessary, centralized systems can greatly reduce cost and sometimes even increase functionality. It's time for terminals to have their place in the spotlight again.


Dan Feldman is a high school student in Seattle. He likes to write programs in Python, find recipes for dirt-cheap computers, and experiment with cool Open Source stuff. He can be reached at cloudfree@mostlysunny.com.



Recent comments

01 Jan 2001 12:40 davedykstra

GraphOn has a much superior but secret X protocol
I'm not persuaded that thin clients are a killer app for Linux, but they are at least a niche. I want to make you folks aware, though, that we have found that the X compression applications from GraphOn, GoGlobal and Bridges, perform much better than VNC or LBX for both low bandwidth and high latency. Unfortunately, they're keeping their protocol a secret. Their Unix clients are really awful so far, too. Perhaps somebody could figure out what they're doing and make an open implementation.

24 Dec 2000 09:57 10t8or

If U won't burn a CD every time... =)
(...)
Running the applications directly on the NIC is an attractive possibility because it is so cheap, but the overhead of creating a custom boot CD, burning 50 copies of it, and building the distribution on the server may justify the roughly $10,000 premium of a more server-centric solution.
(...)
aka the bootprom solution...

Isn't it possible to put Ethernet cards with boot EEPROMs in the client machines? They would request the kernel from the server. It would also be possible to offer some choices, like "which OS do you like?", with windoze, Windows Terminal Server, local Linux, or a remote XDM session on an alien *nix in the list!
A medium Linux server running bootpd would be enough.
No CDs to burn and no hassles with kernel updates or the like; the user on the client gets the freedom he wants: a choice, and ease of use!

Francisco.

24 Dec 2000 04:07 jlampton

WeirdX, VNC, and terminals for all
If only the JavaStation could have stayed around long enough for WeirdX... I'm sure that with a little tweaking (perhaps via a proxy), one could set up a compressed stream based on a gzip stream and possibly an encrypted data port. I attend a public residential school that has Windows machines with NetWare clients. I normally connect to my *nix boxes using VNC to do work, which works nicely. I have also been on the network while a friend of mine was using VNC on his machine as well; this tends to get ugly. I don't know if there have been any speed improvements; this was a year ago. I personally like the idea of terminal-based systems, even in the business environment, due to ease of maintenance. (500+ Windows boxes tend to get quirky when you have curious users who don't always do what is best for their machines.) Just my 2 cents.

23 Dec 2000 17:18 sporty

*nix
It's not all about Linux. You can use *BSD with this, too. Don't beat down NetBSD, as implied in the IBM example, by suggesting a custom-built Linux server would be better. Any free *nix would do just as well.

