Much discussion occurred within the group during the few days following Bruce's announcement. The discussion focused on defining exactly what problem the LSB was trying to solve, and how best to solve it. The LSB is not a solution looking for a problem; instead, it is a solution to a very real, sometimes subtle, problem. The end result of the discussions was a better understanding of the problem, and of which solutions were viable given all of the real-world constraints that we have to live with.
The recent announcement of the LSB's re-organization reflects this improved understanding of the problem and the solutions. It also shows commitment to the LSB from many of the distribution vendors, and several of the ISVs. We all want Linux to succeed, and we all believe that the LSB is a much-needed part of making it happen.
What problem is the LSB trying to solve? In a word: Compatibility. Linux is trying to break out of a traditional UNIX catch-22. You need more applications ported to Linux to increase the sales of Linux, but you need greater sales of Linux to convince ISVs to port their applications to Linux. Right now, Linux has an increasing market share and ISVs are interested, but some technical problems are causing their interest to diminish. The big one is Compatibility.
In order for an ISV to consider the entire Linux market as a single target, every version of Linux from every distribution vendor must be Compatible, at least within a certain large subset that is needed by an application. Today, this is not true, and from the ISV's perspective, the Linux market is really a collection of smaller markets based around similar, but distinct, distributions. This "fragmentation" is not appealing to ISVs.
Linux is a collection of components. Each distribution picks the set of components that will be bundled together to form a release. Fortunately, there is only a single source for many of the core components that make up Linux. The problem, however, begins to show itself in two ways:
First, as each release is prepared, a distribution vendor will usually pick the latest (or near latest) version of a component. This is a Good Thing because innovation and improvements get out to end users quickly. This is also a Bad Thing because each version of these components tends to have a slightly different set of features & mis-features. Version A is not always completely compatible with version B of the same component. (BTW, this is not a Linux-specific problem; it happens to everyone.) These differences make subsequent releases by the same vendor not Compatible.
Second, I said that there was only a single source for _many_ of the components. There are quite a few areas where distributions get to pick from a selection of similar components. This is a Good Thing because it allows for innovation and improvements. A little competition usually makes products (or components) better. This is also a Bad Thing because most of the choices are not Compatible with each other. These differences make releases by different vendors not Compatible.
Third (yeah, I know I said there were only two), not all vendors choose to include all components in their releases. Usually, the subset of components which is common across all distributions is very large, and is more than adequate for most applications. However, there does exist the possibility that a vendor may leave out a component that is needed by some application.
I describe this method of producing a release as defining a feature/version list. If all distributions were made from the same feature/version list, then they would be Compatible, at least until you get down to the particular bug fixes that may be made by each vendor. As it turns out, each release is made from a unique feature/version list, and the problems described above creep in.
Defining a particular feature/version list and mandating that everyone use it is not an acceptable solution. It goes against the very grain of what makes Linux successful: the ability to innovate and move forward. We must find another way to solve this problem.
Last month, in his editorial, Jordan Hubbard pointed out the frightening prospect that we may all be repeating the history of 8-10 years ago. I have always heard that the best way to avoid repeating history is to learn from it. There is a lot of opportunity to do just that with this problem. You see, others have faced this same Compatibility problem in the past, and solved it. The LSB can learn from their solutions, and develop a similar solution quickly.
There is a long list of groups, such as 88Open, ABI+, MIPS-ABI, Sparc International, and DICOP (in Japan), that have managed to mend the UNIX fractures within their respective architectures. Most of these groups did not have the advantages that Linux has today. Oftentimes, these groups had to reconcile differences between OSes that did not share a common lineage (BSD vs SYSV vs Mach). Linux only has to reconcile the differences between versions of most components and a few competing components.
Looking back at how these groups solved the problem, we find that they based their standards on a behavioral specification instead of a feature/version specification. In short, their specifications read something like this:
"Functions X, Y, and Z are located in the library libfoo.so. Their behavior is described by the source standards A, B, and C."
The source standards A, B, and C were often things like ANSI C, POSIX, and XPG. To verify that an OS correctly implemented X, Y and Z, test suites were developed that actually tested their behavior.
A behavioral approach works very well from an application's perspective. An application interfaces to the OS by calling functions that are located in shared libraries. The application doesn't care how the libraries were built, only that the functions exist, and that they behave as expected.
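As a concrete illustration (mine, not part of the LSB's actual work), the "functions exist in the expected library" half of this contract can be sketched with Python's ctypes. The soname "libm.so.6" is an assumption: it is the usual name of the math library on Linux/glibc systems, not a universal one.

```python
import ctypes

# Load the shared library by its soname. "libm.so.6" is an assumed
# Linux/glibc name; other systems name the math library differently.
libm = ctypes.CDLL("libm.so.6")

# From the application's point of view, all that matters is that the
# expected symbols are present in the expected library -- not how the
# library was built. hasattr() triggers a dlsym() lookup under the hood.
for symbol in ("cos", "sin", "sqrt"):
    assert hasattr(libm, symbol), f"libm.so.6 is missing {symbol}"

print("all symbols found")
```

The same idea, applied library by library and symbol by symbol, is what a specification's "location" clauses boil down to in practice.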
Getting back to Linux, the same techniques can be applied. We can list which functions are located in which libraries and define their behavior by referencing a source standard such as POSIX. There are a lot of little details that have to be taken care of simply because there are hundreds of functions whose locations and behavior have to be written down. While work begins on developing a test suite for this much, additional problems such as what commands are available and the infamous "package format" can be addressed.
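A behavioral test goes one step further than checking that a symbol exists: it calls the function and checks for the result that the referenced source standard requires. Here is a minimal sketch of that idea, again using Python's ctypes, with the glibc sonames "libm.so.6" and "libc.so.6" assumed.

```python
import ctypes

# Assumed Linux/glibc sonames for the math and C libraries.
libm = ctypes.CDLL("libm.so.6")
libc = ctypes.CDLL("libc.so.6")

# Declare prototypes so ctypes marshals the C argument types correctly.
libm.cos.restype = ctypes.c_double
libm.cos.argtypes = [ctypes.c_double]
libc.strtol.restype = ctypes.c_long
libc.strtol.argtypes = [ctypes.c_char_p,
                        ctypes.POINTER(ctypes.c_char_p),
                        ctypes.c_int]

# Behavioral checks: ISO C / POSIX define what these calls must return,
# so the test exercises the behavior, not the implementation.
assert libm.cos(0.0) == 1.0
assert libc.strtol(b"42", None, 10) == 42

print("behavioral checks passed")
```

A real conformance suite is, of course, vastly larger, covering error cases, edge values, and errno behavior for hundreds of interfaces, but each individual check has this same shape.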
Is this a lot of work? Yes, but not as much as it could be. By learning from the experience of previous groups, we have solved much of the initial "how do we solve this problem" problem. By including other standards by reference, we immediately pick up thousands of pages of specification without writing down more than a paragraph or two.
We can very quickly specify what is already common and prevent any further divergence so that the perceived "fracture" doesn't become something that requires "medical attention". We can then roll up our sleeves and start solving some of the really tough problems like how an application can add something to the system startup mechanism, and in what format documentation should be provided.
Linux is faced with an incredible opportunity right now, one that many OS vendors have been unable to obtain no matter what they tried: Linux is popular, it has impressive growth, is technically sound, and has media attention. Microsoft OSes have only one significant advantage: Compatibility.
Most DOS programs from as far back as DOS 3.10 have continued to run through DOS 4 & 5 & 6, Windows 3.1, 3.11, WFW, NT 3.X & 4.X, and Win95 & Win98. Any "well behaved" program written for any of those OSes continues to run today. This is quite a technical achievement. ISVs can maximize their potential market by writing applications for whichever OS contains the minimal functionality they need, knowing that the applications will run on that OS and all OSes that have come after it.
Sadly, Linux does not have this kind of Compatibility (yet). Linux has been growing at a phenomenal rate over the past few years, always adding and improving features. This kind of growth, while good for Linux, has often made it impossible, or at least very difficult, to maintain Compatibility between different releases.
The time has come, however, to solidify some of the core of Linux, and to begin guaranteeing a certain level of compatibility across releases. If Linux can prove that useful applications written within certain reasonable boundaries will run across releases, and will continue to run in the future, then Microsoft loses its big technical advantage.
Laying the foundation for Compatibility is a large part of what the LSB is trying to accomplish.