
What is Wrong with Make?

Evolution is a slow process. Getting rid of old bad habits is never easy. This article is a critique of the Make build tool. I'll list its shortcomings this week and suggest a few more modern alternatives next week.

Make looks fine on the surface

The Make build tool has been with us for almost 30 years. It is somewhat sad to see today's complex new projects treating Make as their only choice. I often get the question, "What is wrong with Make?" So often, in fact, that the obvious answer becomes, "The first wrong thing is people's ignorance of Make's limitations with respect to their requirements." I'll start my critique with a list of mistaken beliefs about this classic tool.

Myth 1: Make-based builds are portable

Several Make clones are Open Source C software, and all build platforms have a Make preinstalled. Make-based builds are known to be portable and they really are, compared to some IDE solutions. The problem is that the Make tool, as it was originally implemented, has to rely on shell commands and on features of the filesystem. These two are notorious sources of platform incompatibilities.

Consequently, the fact that every system has a Make is probably true, but not relevant. The statement "Every system has a shell" is also true; that doesn't mean shell scripts are portable, and nobody claims they are. Another problem is that the original Make lacked some fundamental features, and later clones added the missing features in different, incompatible ways. The obvious example is the if-then-else construct. It is not really possible to build real-life projects without if-then-else, so one has to use workarounds based either on the "include" directive or on conditional macro expansion. In the best case, you get the desired functionality, but you lose the readability of the build description.
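As an illustration, here is a sketch of both approaches. The `ifeq` syntax is one clone's dialect (GNU Make), and the `PLATFORM` variable is hypothetical:

```make
# Clone-specific conditional (GNU Make dialect; other clones spell this
# differently or lack it entirely):
ifeq ($(PLATFORM),linux)
CFLAGS = -O2 -DLINUX
else
CFLAGS = -O2
endif

# Portable workaround via conditional macro expansion: the value of
# PLATFORM selects which CFLAGS_* macro gets expanded (PLATFORM must be
# set to "linux" or "other"). It runs on any Make, but the reader has
# to mentally compose macro names to follow it.
CFLAGS_linux = -O2 -DLINUX
CFLAGS_other = -O2
CFLAGS = $(CFLAGS_$(PLATFORM))
```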

Make-based builds are not portable. They rely too much on features of the surrounding system, and Make clones are incompatible with each other.

Myth 2: Make-based builds are scalable

Many of us have had the experience of expanding a given codebase by adding a few source files or by adding a static library. In carefully designed builds, it is not a lot of work to make the new code part of the final product. You just add a short new Makefile in a new directory and rely on a recursive call of Make.
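A minimal sketch of such a recursive setup (the directory names are hypothetical): the top-level Makefile simply re-invokes Make in each component directory.

```make
# Top-level Makefile of a recursive build: each component keeps its own
# short Makefile, and the top level just visits the directories in turn.
# (-C is a GNU Make convenience; other clones use "cd dir && make".)
SUBDIRS = lib app

all:
	for dir in $(SUBDIRS); do \
	    $(MAKE) -C $$dir all || exit 1; \
	done
```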

The problem comes when the project gets really big. In big projects, keeping the build fast enough is a challenging task. Changing the binary deployment of the product also becomes far more complex than it needs to be. With a typical recursive Make, the architecture of the software product is nailed down to the directory structure. Adding or removing a single piece is still OK, but other restructuring is much more disruptive. A sad side effect is that, over time, developers come to think about the software not as a set of abstract components and their interfaces, but as a set of directories full of source code, and that builds up even more resistance to change.

The speed issues in large builds are well documented in Peter Miller's seminal article "Recursive Make Considered Harmful". When using recursive Make, there is no simple way to guarantee that every component is visited only once during the build. Indeed, the dependency tree is split in memory between several processes that don't know a lot about each other. This makes it hard to optimize for speed. Another problem is that Make uses just one namespace, and recursion is never easy with only one global namespace. If you want to configure subtargets differently (to pass parameters to Make processes started by Make), you'll get an error-prone build setup.

Recursive Make is not the only way to implement Make-based builds. You may wonder: if it is harmful, why is it so widely used? One argument people mention is that recursive Make is easier to set up. This is not a very solid argument; for example, John Graham-Cumming demonstrates a simple non-recursive Make system in a recent DDJ article. There is also a more subtle argument in favor of recursive Make: it allows developers to use their knowledge of the project to easily start only a smaller part of the build. In non-recursive systems, you can start a smaller part, but the system will usually first "collect" all the build descriptions, which is slower or has undesired side effects (or both). It is also possible to design systems that sit somewhere in between. Some homegrown Make-based systems use clever checks outside Make to avoid starting new Make processes when there is nothing to be built, then act recursively for the rest of the cases. Anyway, for good or bad reasons, the fact remains that Make-based builds scale painfully.
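A hedged sketch of the non-recursive alternative in the Graham-Cumming style, assuming GNU-style functions and hypothetical module names: one top-level Makefile includes a small fragment from each module, so a single Make process sees the whole dependency tree.

```make
# Single Make process: every module contributes a fragment to one
# global dependency tree instead of starting its own Make.
MODULES := lib app
include $(patsubst %,%/module.mk,$(MODULES))

# A fragment such as lib/module.mk only appends to global lists:
#   SRC += lib/parser.c lib/lexer.c
all: $(SRC:.c=.o)
```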

Myth 3: The Make tool is simple and easy

The Make tool paradigm is elegant and powerful. I have seen nice expert systems implemented only with Makefiles. People like things that are elegant and powerful. Unfortunately, Make implementations are poor and dirty by today's standards. The Makefile syntax is obscure and, consequently, more difficult than you may think. Make wins hands down the contest for the messiest namespace in all programming tools I have ever seen.

File targets and command targets (also known as phony targets) share the same namespace, making it next to impossible to build files with certain names. Shell environment variables and Make macros also share a single namespace, and Make macros cover both variable and function concepts (as in functional programming languages). Macros can be defined inside the Makefile or on the command line. All of this leads to complex scoping rules and bad surprises for novice and expert users alike.
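The target-namespace clash is easy to demonstrate; this sketch assumes the .PHONY escape hatch that later clones introduced:

```make
# If a file named "clean" ever appears in this directory, the target is
# considered up to date and the command below silently stops running:
clean:
	rm -f *.o

# The escape hatch is to declare the name phony -- a feature added by
# later clones and not supported the same way by every Make:
.PHONY: clean
```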

Make-based builds rely heavily on shell environment variables. The best-known consequence is that the build is difficult to reproduce for another user on another build machine. A more subtle issue is that the build is difficult to document. Most modern systems let you ask what the parameters of the build are and which values are meaningful for them. Make does nothing to help you provide this feature, despite the fact that Make-based builds definitely need it.

Even without several sources for variables, a single namespace is still too messy. One namespace means that target platform-wide variables, build machine-wide variables, project-wide variables, and individual user customizations are right next to each other. Without solid naming conventions, you certainly don't know what you can change without endangering the entire build.

Another area of misleading simplicity is the fact that many Make macros have one-letter names. Many Make clones improved on that by introducing more understandable aliases for those names. Of course, they did so in incompatible ways, so you cannot benefit from the improvement if you want your build description to stay portable across Make clones.
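For reference, the classic one-letter automatic macros look like this in a typical pattern rule (the `%` pattern-rule syntax itself is a later, clone-specific addition):

```make
# $@ expands to the target and $< to the first prerequisite -- terse
# names that every Makefile reader simply has to memorize.
%.o: %.c
	$(CC) $(CFLAGS) -c $< -o $@
```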

Myth 4: Make is fast

Make and many of its clones are implemented in C or C++, and these implementations are fast: correctly designed Make-based builds show little overhead while building. But before you enjoy Make's raw speed too much, remember that fast is not safe and safe is not fast. Because of this fundamental tension, you should look suspiciously at anyone claiming amazing speed. Indeed, Make achieves some of its speed by performing only superficial checks. This kind of speed gain strikes back, because you'll feel the need for complete builds from scratch more often.

Make also expects you to know how to build just the smaller part of the project that you are currently working on. This strikes back later, when more time is spent integrating the work of several developers into the product.

More Make problems

In addition to the characteristics described above, often considered strong points of Make, there are also a few characteristics that are acknowledged shortcomings of Make. Here is a short list of them.

Reliability issues

Make-based builds are not safe, and nobody claims that they are. The main reason is that Make relies on timestamps, not on content signatures, to detect file changes. In local area networks, several computers with their own clocks frequently share the filesystem containing the sources for the build. When those clocks get out of sync (yes, that happens), you may get inaccurate builds, especially when running parallel builds. Moreover, Make stores no stamps between builds. This is a very bad choice, because it forces Make to use risky heuristics for change detection. This is how Make fails to detect that a file was replaced by one older than the previous version (which happens quite often in virtual filesystems).

Not using content signatures is especially painful when an automatically generated header file is included in all the sources (like config.h in the GNU build system). Complex solutions involving dummy stamp files have been developed to prevent Make from rebuilding the entire project when that central header file is regenerated with the same content as before. An even more insidious safety issue is that Make does not detect changes in the environment variables it uses or in the binary executables of the tool chain. This is usually compensated for with proprietary logic in homegrown Make systems, which makes the build more complex and more fragile.
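One widespread shape of that workaround, sketched here with a hypothetical gen-config script: the header is rewritten only when its content really changed, so its timestamp does not trigger a cascade of recompilations.

```make
# Regenerate into a temporary file first; replace config.h only when
# the content differs, leaving its timestamp (and every dependent
# object file) untouched otherwise. Real systems add a separate stamp
# file so this rule itself is not re-run on every build.
config.h: config.h.in
	./gen-config config.h.in > config.h.tmp
	@if cmp -s config.h.tmp config.h; then \
	    rm -f config.h.tmp; \
	else \
	    mv config.h.tmp config.h; \
	fi
```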

Implicit dependencies

One cannot criticize Make's implicit dependency detection, for a good reason: Make has no such mechanism. To deal with header file inclusion in C sources, several separate tools exist, as well as special options to some compilers. High-level tools wrapping Make use them to provide a more or less portable "automatic dependencies" feature. Despite the efforts and good will of those higher-level tools, Make blocks good solutions for automatic dependencies for two reasons:

  1. Make doesn't really support dynamic many-to-one relationships. It does support many-to-one, but not if the "many" part changes from one build to the next. For example, Make will not notice a newly added dependency if it is new in the list but old on disk (older than the target, according to its timestamp). Make also lacks support for dynamic one-to-many, which makes it inappropriate for Java builds (with Java, a single file can produce a variable number of outputs).
  2. Make doesn't really support using automatic dependencies and updating them in one run. This forces multiple Make invocations for a complete build. (Did you ever wonder why the ubiquitous sequence "make depend; make; make install" has never been folded into a single Make call?)
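The resulting multi-pass idiom usually looks something like this sketch (gcc's -MM dependency output is assumed; other compilers need other flags):

```make
SRC = main.c util.c

# Pass 1: "make depend" asks the compiler to emit Makefile-syntax
# dependency lines into a fragment...
depend:
	$(CC) -MM $(SRC) > .depend

# ...which pass 2, the ordinary "make", reads back in. The leading dash
# keeps Make quiet when .depend does not exist yet (GNU spelling; other
# clones use ".include" or plain "include").
-include .depend
```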

Limited syntax

The lack of portable if-then-else was already mentioned. There are many other idiosyncrasies in the Makefile syntax. Some of them have been fixed in later Make clones (in incompatible ways, as you may expect). Here is a list:

  1. Space characters are significant in a painful way. For example, spaces at the end of a line (between a macro value and the comment sign following the macro definition) are kept as part of the value. This has always generated, and still generates, very hard-to-track bugs.
  2. In the original Make, there was no way to print something at parsing time. Moreover, Make has a bad habit of silently ignoring what it doesn't understand inside the Makefile. The combination of the two is a killer. I have stared at Makefile code that wasn't working as expected, only to discover that some of it was silently and entirely ignored because of a typo somewhere above.
  3. There are no "and"/"or" operators in logical expressions. This forces you to deeply nest either the non-portable if-then-else or its portable but unreadable equivalents.
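The first item deserves a concrete illustration; the macro name below is hypothetical:

```make
# The space before the comment sign is invisible but becomes part of
# the macro's value:
PROG = myapp # the program name
# $(PROG).o now expands to "myapp .o" -- two words -- and the build
# dies with a baffling "no rule to make target" error.
```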

All this is annoying, but it is far less serious than Make's decision to rely on shell syntax for the executable part of a rule. Decent build tools gained, over the years, libraries of rules supporting various tool chains. Not so with Make: it has instead a hard-coded set of rules rarely used in real-life-sized projects. In practice, that set does more harm than good, since one of those rules may, and sometimes does, trigger by accident when the Makefile author doesn't intend it. The lack of good support for a portable library of rules is, in my opinion, the biggest shortcoming of Make and its direct clones.

Difficult debugging

The inference process of Make may be elegant, but its trace and debug features date back to the Stone Age. Most clones improved on that. Nevertheless, when Make decides to rebuild something contrary to the user's expectation, most users will find the time needed to understand that behavior not worth the effort (unless it yields an immediate fatal error when running the result of the build, of course). I have noticed that Makefile authors tend to forget the following:

  • How rules are preferred when more than one rule can build the same target.
  • How to inhibit and when to inhibit the default built-in rules.
  • How the scoping rules for macros work in general and in their own build setup.

While not completely impossible, tracking and debugging Make-based builds is tedious. As a consequence, Makefile authors will continue to spend too much time fixing their mistakes or, under high time pressure, will simply ignore all behavior that they don't understand.

What about generated Makefiles?

You may argue that all this discussion about Makefile syntax is pointless today, and that would be fair. Indeed, many Makefiles are now generated automatically from higher-level build descriptions, and the manual maintenance burden is gone. With these new tools, can you forget about Make's problems and just use it? Yes and no. Yes for the syntax-related problems, yes for the portability problems (to some extent), and definitely no for all the rest (reliability issues, debugging problematic builds, etc.). See my discussion of other build systems to understand how the shortcomings of Make affect the tools built on top of them.

Conclusion on Make

The Make build tool was, and still is, a nice tool when not stretched beyond its limits. It fits best in projects of several dozen source files built in homogeneous environments (always the same tool chain, always the same target platform, and so on). But it cannot really keep up with the requirements of today's large projects.

After I tell people what is wrong with Make, the next question is always the same: "If it is so awkward, why is it so widely used?" The answer is not about technology; it is about economics. In short, workarounds are cheaper than complete solutions. To displace something, the new thing has to be everything the old thing was, and then some more (some more crucial features, not just more sugar). And on top of that, it has to be cheaper. Despite the difficulty of being so much more, in my humble opinion, the time of retreat has come for the Make tool. Next week, I'll offer my look at the alternatives.

[The second article is here.]

Recent comments

02 Jul 2005 01:20 tomfm

maybe it's your requirements that are the problem
"The Make build tool was and still is a nice tool if not stretched beyond its limits. It fits best in projects of several dozen source files working in homogeneous environments (always the same tool chain, always the same target platform, etc.). But it cannot really keep up with today's requirements of large projects."

Yes, that's what it was designed for. Those are the kinds of projects real people can handle.

Large projects and cross-platform projects are not automatically a requirement. Large projects can be broken up into individual, small components, and instead of writing entire projects in a cross platform way, you can write a small cross platform library and implement the rest of the project in terms of it.

If "make" doesn't get the trick done, then the problem is probably with your project organization, not with "make".

However, if you want something that is slightly cleaner than "make", there is Plan 9's "mk".

02 Jul 2005 02:42 colinbreame

Re: maybe it's your requirements that are the problem

> However, if you want something that is slightly cleaner than "make", there is Plan 9's "mk".

I've briefly looked at Plan 9's mk and it looks very much like make to me. Can anyone summarise the differences?

02 Jul 2005 03:12 tarzeau

GNUstep Make
Have you looked at that?

02 Jul 2005 05:10 adulau

Old can be sometimes better...
"Evolution is a slow process. Getting rid of old bad habits is never easy." Yes, this is true. But sometimes in information technology there are technologies so simple that we need them, and we should avoid overcomplexity.

As you said, Make is still enough for small-scale projects, but there are (or will be) other approaches in the software building tools area. We will see... A nice possible candidate is rake (rake.rubyforge.org/), but there are a bunch of others available.

A nice example of very old technology still in use is UUCP (not really related to software building). UUCP is less complex than SMTP but can be used as an easy approach to store-and-forward mail. Services like UUCP over SSH (www.uucpssh.org/) are quite active and permit overcoming some xDSL effects on email (like non-official IPs and the like).

Sometimes, we still love old technologies just because they work well and avoid the never ending spiral of complexity.

Thanks a lot for your article,

Just my 0.02 EUR,

02 Jul 2005 06:24 refeak

What's wrong with developers?

The simplicity and versatility of make provide a software tool constrained only by your imagination.

Applying a topological sort to targets and dependencies is useful for so much more than source code. (See D. Knuth, The Art of Computer Programming, Volume 1.)

My suggestions to use make for a work-flow project just never seem to be accepted? :)

Make + Vi are my IDE
Linux Build (home.valornet.com/lorr...)

02 Jul 2005 07:04 jepler

GNU Make is just fine for big projects
GNU Make is just fine for big projects, but (like any one of the alternative build systems you may propose) you have to know what you're doing.

Using GNU Make, we build a large (approx. 6000 object file) project for two architectures. A make that rebuilds no targets takes just 4 seconds with full dependency checking. If you need to work with a project of this size, you'll find that you want to increase the size of some of the xxx_BUCKETS constants in the Make source. It looks like we increased FILE_BUCKETS from 1007 to 20101, DIRECTORY_BUCKETS to 1999, and DIRFILE_BUCKETS to 1087.

With distcc, we get an impressive speedup factor. In fact, the limit seems to be that the 100mbit ethernet is saturated with preprocessed source files and object files when many files are rebuilding.

In a separate tree, we build some external libraries (tcl, tk, python, expat, scew, blt, tktable, tkimg, trio, and glew) all from a single non-recursive Makefile. This generates around 500 object files and libraries. A make that rebuilds no targets takes 1/10 second. A full rebuild takes only 40 seconds, probably less time than invoking the recursive makes of all of the libraries when you use their recursive, bulky automake-generated makefiles.

While you mention it, you show no signs of having learned the lessons in Recursive Make Considered Harmful. You also say things that are downright wrong, like "Make doesn't really support using automatic dependencies and updating those automatic dependencies in one run"---It does. In our system, one invocation of "make" computes any needed dependencies and then automatically restarts to use the new dependencies to build the targets.

What other complaints did you have about Make?

Honestly, I just don't care about files with spaces in their names, but I'll wager it's just a matter of taking care when quoting filenames, as for the shell. No implicit dependency generation? That's just another way of saying you'll be surprised one day that your whiz-bang Make replacement has a different idea of how #include works than your compiler does. For instance, if your compiler is gcc, does whiz-bang-build get -include, -I-, -imacros, and other command-line switches right? What about gnu cpp's "#include_next" or "#pragma once"? I know that my build system does, because it invokes gcc in dependency-generating mode.

It is claimed that using a hash of file contents rather than timestamps fixes a major problem in Make. However, you can still end up with files that should be regenerated but are not. Scenario: whiz-bang-build is invoked. Hash(F) = H. It decides that target T, which depends on F, must be rebuilt. User changes F to F', and then T is rebuilt. User puts back F. At the next invocation Hash(F) = H, so whiz-bang-build concludes T need not be rebuilt. However, T is really built from F', not from F. Will this happen rarely? Sure. But so do the problems with timestamp-based rebuilding.

Finally, hiding in there, is one valid criticism, though it doesn't present a problem in any build system I've used: generating more than one target from a single invocation of a rule. The need for this exists in our system, but the rule is so trivial that it can simply be invoked twice, once for each target.

02 Jul 2005 09:06 yayoubetcha

Thanks for the article, but...
I do appreciate the article. It is important that institutionalized methodologies be examined from time to time. I do agree with much of what you say, but GNUmake is portable. The tools used with it are portable. Here I define "Portable" as "the most portable". If you are on a system in which GNUmake does not function as advertised, then you are on a system in which no other "portable" tool will operate better either.

GNUmake is used in producing large projects like nothing else. Including the Verilog and VHDL used in the production of some rather popular microprocessor chips; operating systems; etc.

Of course, there are difficulties, but I would say there are difficulties using a number of important technologies, such as the C language. What is the integer range of an "int" (8b, 16b, 32b, or 64b)? Is this portable as you define it? The fact that people create bugs in C and have a difficult time debugging their source should not be counted as a failing of the technology.

Is Subversion better than CVS? Why do CVS projects outnumber Subversion by 1000:1 (just a guess)? The answer is community understanding. The better of the technologies should not be compared white-paper to white-paper. Big-Mo has more to do with it than technology.

As for Make, I am working on the ultimate successor right now: the new GNU DWIMNWIS-Make (Do What I Mean, Not What I Say). It will be ready as soon as I finish DWIMNWIS-C++.

02 Jul 2005 12:20 Alpt

Scons is elegant, Make isn't
www.scons.org/

It works, it is python and elegant, it is portable, and above all it's easy.

Why do I have to use Make?

In 50 lines of Make I've done 1/10 of what I've done with scons.

Surely there are a lot of nice alternatives.

Stop the autoconf insanity ;)

freshmeat.net/articles...

02 Jul 2005 13:23 wyo

No hint how to overcome the limitations of Make
Unfortunately, the article doesn't say a word about how to overcome the limitations of Make. I agree with many of the objections and am not fond of Make either, but I don't see a better solution than to improve Make further. At least, none of the other solutions I've looked at attract me either.

On the other side, most developers simply write much too complex Makefiles. Just look once at my most complex Makefile (cvs.sourceforge.net/vi...); it's not for a very big project, but still of a significant size. You'll probably agree it's quite understandable.

02 Jul 2005 16:53 emoreau

This guy has something to $ell...
Can't wait to read the next article.

Bet this guy wants to $ell us a replacement.

For the last 10 years, I have used some of the most popular IDEs... I think that they look easier to learn, but they are not easier to use than a text-based GNU/Linux/(UN*X) system, especially when you can use it in a multi-windowed environment.

Vim + GNU Make + Bash + X11 + SVN is my favorite. Screen is a good X11 alternative.

02 Jul 2005 16:56 Carl0

Re: Scons is elegant, Make isn't
I see no fewer issues with SCons scripts than with Makefiles, relative to the number of applications using either one. It always comes down to the developers, who are interested in their applications but maybe not so much in portable build scripts...

02 Jul 2005 22:11 NelsonIS

Weak article
Still, check out aap (www.a-a-p.org or freshmeat.net/projects...). It's basically make with some default steps built in that you can easily override, the power of Python for any complex steps, and it does checksumming rather than just timestamp checks. It's a very cool program; I highly recommend looking at it.

To be specific, I don't think that safe and fast are mutually exclusive. Make can be made safe; used properly, there isn't anything unsafe about it. If you don't have your clock set or don't have dependencies properly defined, then yes, things might not build, or a needed rebuild might silently not happen. I know of no build system that enforces "safety", though; if you choose not to build a safe system, then it's not safe. I also assert that if you're spending a lot of time solving scaling problems with make, then you probably need to split your project up; it sounds like bad engineering to me. Why don't you have a simple high-level make that builds the components and then a make for each component? I think it's a manageable problem. It's also somewhat weak to criticize without suggesting something better.

02 Jul 2005 23:11 csimpson

Re: Scons is elegant, Make isn't

> www.scons.org/
>
> It works, it is python and elegant, it is portable, and above all it's easy.
> Why do I have to use Make?

Part of the problem is that a lot of the replacements are written in $FAVORITE_LANGUAGE. Like it or not, the most widely accepted and widely distributed tools are written solely in C. Expecting Python to be installed by default on most Linux distributions is acceptable these days, but expecting it on most commercial systems is not. I don't think you are going to find Python by default on Solaris, AIX, or Irix.

What you will find though is a C compiler. If some forward thinking developer would come out, write a build system that makes sense, get commercial support behind it, and implement it well, then we might finally have a viable alternative to make. But until those conditions are met, we are stuck with autoconf/automake madness.

02 Jul 2005 23:15 billposer

interesting article

Interesting article. I use make a lot, not only for software but for other tasks with dependencies, such as formatting and printing. It's true that it can be painful and frustrating.

I don't think it solves everything, but makepp (makepp.sourceforge.net/) is an attempt to improve on make that has some interesting features. One of them is that if you change the build rules it detects the changes and rebuilds as necessary. It also avoids recursion.

I wonder whether a good solution might be a scripting language extension providing: an API for topological sort; a simple syntax for stating the basic dependencies, from which you could escape to the full language if necessary (thereby solving the lack of if/then/else); and an API for appropriate "change sensors": timestamps, hashes, diffs, whatever. Of course there could be bindings for multiple languages.

02 Jul 2005 23:18 billposer

Re: No hint how to overcome the limitations of Make

> Unfortunately the article doesn't say a word about how to overcome the limitations of Make.

To be fair, he says he's going to talk about alternatives to make next week, so he isn't done yet. This is just the first part.

02 Jul 2005 23:27 csimpson

Re: Thanks for the article, but...

> Is Subversion better than CVS? Why do CVS projects outnumber Subversion by 1000:1 (just a guess)? The answer is community understanding. The better of the technologies should not be compared white-paper to white-paper. Big-Mo has more to do with it than technology.

I'll answer that (rhetorical) question. Because CVS has been around longer, and it's difficult to port old projects to a new version control system while maintaining all of the old version information. As new projects start, though, many are considering Subversion as a viable alternative to CVS. If you want any more evidence of that, look at XCode's support for Subversion as well as CVS.

There really isn't community understanding of CVS anyway. CVS is no longer suited for the job for which it is being widely used and developers are using hacks and workarounds to do basic check-in and check-out tasks. I would like to see how many developers there are that actually understand CVS rather than just using it. Those are the sort of people who would just as easily and happily use Subversion if it would integrate with their IDE.

The same problem would be true of any make replacement. Old projects would be loath to port over old build systems for no reason, but new projects, or new major versions involving a significant rewrite, would consider using a make replacement if a single, solid, community-supported alternative existed.

04 Jul 2005 06:09 xnc

Re: Scons is elegant, Make isn't

> www.scons.org/
>
> It works, it is python

Well that kills it right there. If any build system is going to be portable and usable for the proverbial "everyone" then it can't rely on anything more than Bourne shell. That is the only scripting option that can be assured to exist on all platforms. At least it's supposed to be.

04 Jul 2005 12:57 whitemice

Re: Scons is elegant, Make isn't
>> Part of the problem is that a lot of the replacements are written in $FAVORITE_LANGUAGE.

Agree.

>> I don't think you are going to find Python by default on Solaris, AIX, or Irix.

Yep. If it isn't compiled to an executable with minimal dependencies, it is NOT a viable solution. Even on a Linux system, requiring Python creates far too much of a dependency problem.

06 Jul 2005 07:32 ulriceriksson

Predictions
I predict that the next article will suggest some other build tool which is less portable than make.

I predict that this tool will be no more scalable than make for large builds.

Finally, I predict that the tool won't solve the problem of partial rebuilds.

Tune in next week for more predictions.

06 Jul 2005 13:04 buildsmith

Re: GNU Make is just fine for big projects

> GNU Make is just fine for big projects,

> but ... you have

> to know what you're doing.

That is part of the problem. I see that you have

solid make knowledge. But you have to consider

that, in large companies, people with various level

of make knowledge change makefiles. "know what

you are doing" is not fulfilled as frequently as you

and me would like that to happen.

> A make that rebuilds

> no targets takes just 4 seconds with

> full dependency checking.

Granted, I didn't see yet anything else to come

close to this speed.

> you show no signs

> of having learned the lessons in

> Recursive Make Considered Harmful.

That is a bit hasty. You cannot know what I

learned and what I didn't without seeing any piece

of my work, isn't it? It may be that my builds from

the time I was using make are very similar to the

builds that you describe in your comment...

> You also say things that are downright wrong, ... In our system, one invocation of "make" computes any needed dependencies and then automatically restarts to use the new dependencies to build the targets.

Could you elaborate a bit on "automatically restarts"? I'm interested, and I hope you have a good solution (see also my comments on omake in the second part of the article). If it is just recursion (calling make from a makefile), it doesn't invalidate my statement: recursion is fundamentally the same as multi-pass, with the dependency tree split in memory across several processes. I stated that it is not possible in one pass, not that it is not possible at all ("make depends" followed by "make" works, doesn't it?).

> What about gnu cpp's "#include_next" or "#pragma once"? I know that my build system does, because it invokes gcc in dependency-generating mode.

That argument is not a solid one. As a matter of fact, I have dealt quite a lot with builds where gcc -MM was used. The issue is that GNU is not the only compiler in the world, and you may want to use another one for a valid reason. Those builds used two compilers: gcc for the dependencies and the native one for the actual compilation (for run-time speed of the product, or simply because there was no other compiler for that embedded platform). In such a setup, there can be disagreements like the ones you describe anyway. But if the dependency analysis is in the build tool, at least you get consistent behavior on all build machines, whatever the compilation toolchain. Even more, you can define a consistent API to extend the behavior to other kinds of builds (TeX, Java, etc.), which you will have a hard time accomplishing with the C compiler as your parser.

> User changes F to F', and then T is rebuilt. User puts back F. At the next invocation Hash(F) = H, so whiz-bang-build concludes T need not be rebuilt.

Stop right there. Where do you get the idea that the user puts back F and the stored hash is still H? In all the hash-based build tools I know, as soon as F' is used to build T, the stored hash is updated as well (it becomes H').

> generating more than one target from a single invocation of a rule. ... the rule is so trivial that it can simply be invoked twice, once for each target.

Only if you know a priori the number and the names of the resulting files. There is a big difference between "more than one file, but a fixed number" and "more than one file, with the number unknown at makefile parse time". With some make clones you can put rules without executable bodies "in parallel"; that solves the first case but not the second (unless you somehow generate the makefile, or you have another multi-pass mechanism).
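For the fixed-number case, it may be worth noting that GNU make has an idiom for this: a pattern rule with several target patterns runs its recipe once and is expected to produce all the named files. A sketch (file names hypothetical; recipe lines must start with a tab):

```make
# One bison run produces both the .c and the .h; GNU make understands
# that this one recipe satisfies the two pattern targets together.
%.tab.c %.tab.h: %.y
	bison -d -o $*.tab.c $<
```

This still assumes the two output names are known when the makefile is parsed, which is exactly the limitation the comment points out.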

About GNU make in large organizations: you say you don't care about spaces in file names. OK, I agree. But you say nothing about the spaces between the end of a macro definition and the # sign of a comment. Did you ever encounter such a bug? What would you do to somebody who introduced one into one of your nicely crafted makefiles? Is it intuitive that those spaces are kept? Is there documentation you can show people to prove that keeping them is normal? If none of your ifs was ever ruined by such an innocent comment addition to a makefile, then it is like never having been in the army and talking about railguns.
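For readers who have not been bitten by it: GNU make keeps the whitespace between a variable's value and a trailing # comment, so the spaces silently become part of the value (the GNU make manual itself warns about this). A minimal illustration:

```make
# The four spaces before the comment are kept in the value,
# so $(dir) is "/foo/bar    ", not "/foo/bar".
dir := /foo/bar    # directory to put the frobs in

# This test therefore fails, even though it looks obviously true.
ifeq ($(dir),/foo/bar)
all: ; @echo equal
else
all: ; @echo not equal
endif
```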

Please read the second part of the article as well; it has some more clues about how I came to the conclusion that GNU make is not enough for the large projects that I need to build and release.

06 Jul 2005 16:22 buildsmith

Re: Scons is elegant, Make isn't

> % www.scons.org/
> % It works, it is python
>
> Well that kills it right there. If any build system is going to be portable and usable for the proverbial "everyone" then it can't rely on anything more than Bourne shell.

... and a C compiler.

Indeed, a lethal arrow. But look, it's not completely dead; it is still moving a toe.

Joking apart, you have to know that most modern build tools deal with the bootstrap issue in some way. Only a shell, only a C compiler: OK, everybody has to agree on that. But that has been acknowledged for a while (it is the philosophy behind the GNU Build System, isn't it? And GBS is not really new). My point is that, before throwing tomatoes at a tool written in language X or depending on heavy package Y, you should ask whether and how it can bootstrap.

In the particular case of SCons, this is not an extraordinary quality, but it is not bad either. It can quite easily be made to dump, from its native build descriptions, shell scripts that perform full or incremental builds. It is not an out-of-the-box feature, but it worked fine for me (I could compile C, build documentation, and copy files, all from one script, without SCons or Python installed).

Please read the second part as well.

06 Jul 2005 16:27 buildsmith

Re: Scons is elegant, Make isn't

> ... the developers who are interested in their applications, but maybe not so much in portable build scripts...

Casual developers and developers of small projects are most certainly not interested. Big software companies selling multi-platform applications certainly are.

06 Jul 2005 16:32 buildsmith

Re: Scons is elegant, Make isn't
I agree with the requirements you put forward. But you jump too fast to implementation decisions. Please read the answer below about bootstrapping with SCons.

06 Jul 2005 17:22 buildsmith

Re: Thanks for the article, but...
Thank you for your insightful comments. I very much agree with the considerations about the C language (I'm one of the embedded software engineers who learned the hard way the limitations and pitfalls of C development. Now that I know them, I still don't have the freedom to choose something else, Java or C++, because of the target platforms I build for).

> ..., but GNUmake is portable. The tools used with it are portable.

I'm afraid we disagree here. First, to clarify: I don't question the portability of GNU make itself (or of cp or sed, for that matter). GNU make is a portable enough piece of software. What I question is the portability of makefiles and of the scripts used in makefiles. Beyond that, I criticize the syntax of makefiles, the (lack of) debugging features, the difficulty of adding features to the build in a portable way, etc.

You say that GNU make is used to build a lot of sizeable software projects all over the world. True; I was building such a project with make. Because of the complexity (the number of teams, target platforms, and build platforms involved), features were added to the build system (by me and by others) up to the point where the make tools were the weakest link in the chain (several make clones were involved, among them GNU make 3.79.1).

It is true that it was possible to build a GNU make binary on all our build machines. But that wouldn't have helped a bit in making builds safer or making the tool friendlier for its users.

I have to admit that converging on that single make flavor could have brought something. The problem was that GNU make lacked features we wanted (some were added in 3.80); it was actually only our fallback for the less capable platforms. Moving all teams to GNU make was clearly perceived as a painful, lowest-common-denominator move.

06 Jul 2005 17:55 buildsmith

Re: maybe it's your requirements that are the problem

> If "make" doesn't get the trick done, then the problem is probably with your project organization, not with "make".

Well, Tom, this is not quite a mature argument. Complexity comes with the size of the project, trust me on that. Imagine the following scenario:

A software company decides to embark on a million-dollar project with a big automotive manufacturer. The software product has many thousands of C/C++ files, it already has a 5-year history (meaning the design is even older than that), and it is developed by about 50 people on two continents. They want to hire you as release manager. What do you do? Decline the offer and go to Alaska to build 50-file projects on Linux that are fine with make? I guess not. I, at least, would get excited about the job. And if they took me, I would do what is in my power: search for the best tool for the job. I simply cannot scale down the requirements, I simply cannot redesign the software, and quitting the job would simply make me a coward, wouldn't you agree?

> However, if you want something that is slightly cleaner than "make", there is Plan 9's "mk".

Something that is only slightly better than make will not justify the cost of migrating the build system for any existing large body of code. It has to be a lot better than make. Fortunately, such tools exist, both commercial and open source.

06 Jul 2005 19:09 buildsmith

Re: Weak article

> check out aap

I know A-A-P and I have studied its pros and cons. It didn't make my "recommended" list. If asked, I'll post the reasons after the second part of the article.

> To be specific, I don't think that safe
> and fast are mutually exclusive.

I wonder whether you believe that more checks can be done in no time, or that more checks are not needed at all. Both are plain wrong.

> Make can be made safe,

This is theoretically true. With a lot of patching outside make, it can be achieved. But there are better tools that already do it, like the A-A-P you mentioned yourself.

> used properly, there
> isn't anything unsafe about it,

What do you mean by "used properly"? If I don't write myself a script to check whether the compiler changed, whether the compiler options changed, or whether the makefile itself changed, does that mean I don't use make properly? Do you use make properly, then?

> if you choose
> not to build a safe system then it's not safe.

Sure, but some tools make the job easier and some make it harder. Make clones are poor in this respect.

> if you're spending a lot of time solving scaling
> problems ... it
> sounds like bad engineering to me;

You don't always have the freedom to redesign the software product. Does your employer encourage you to develop the C code around the limitations of the build tool? In real life, it is the opposite. Please read also my answer below to the comment on high requirements.

> why don't you have a simple high level make
> that builds the components and then a
> make for each component? I think it's
> a managable problem.

I'm sorry, but this tells us that you are unaware of the complexity of build systems in real-life projects. Your statements are true but overly simplistic. There are many examples on the net that may help you understand where the complexity lies (integrate a module into the Linux kernel configuration, for example; or take the Perl distribution, build it on a few platforms, change a header, and see whether it is rebuilt properly). On my side, when I choose a build tool, I have to ask whether a change to a template that generates a C header file, included by only one source file for a lex/yacc tool and by no other C file, will trigger the proper rebuild of the project.

On your side, if you don't see any limitations in make, feel free to stick with the tool; but please, until you get some more experience, don't question the needs other people may have.

> It's also somewhat
> weak to not suggest something better.

It is also somewhat weak not to read the first line, which says that the second part of the article will suggest alternatives ;-). Be my guest next week.

07 Jul 2005 01:43 trsk

Re: interesting article

> I wonder if a good solution might not be to create a scripting language extension that provided an api for a topological sort, a simple syntax for stating the basic dependencies, from which if necessary you could escape to the full language, thereby solving the problem of the lack of if/then/else, and an api for appropriate "change sensors": time-stamps, hashes, diffs, whatever. Of course there could be bindings for multiple languages.

Check out bras (bras.berlios.de/); it does all of that, using Tcl as the scripting language. This straightforward procedural approach is much cleaner than make and its derivatives.
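The shape of that API is easy to sketch. Below is a hypothetical miniature in Python (every name here is invented for illustration): a dependency table, a topological sort from the standard library, and a pluggable content-hash "change sensor". Real tools such as bras or SCons are of course far more elaborate.

```python
import hashlib
from graphlib import TopologicalSorter  # stdlib topological sort (Python 3.9+)

def md5_sensor(path):
    """A "change sensor" based on file content rather than timestamps."""
    with open(path, "rb") as f:
        return hashlib.md5(f.read()).hexdigest()

class Build:
    """Toy build graph: targets, prerequisites, and a pluggable sensor."""

    def __init__(self, sensor=md5_sensor):
        self.deps = {}       # target -> list of prerequisites
        self.actions = {}    # target -> callable that (re)builds it
        self.state = {}      # source file -> sensor value at last run
        self.sensor = sensor

    def rule(self, target, prereqs, action):
        self.deps[target] = list(prereqs)
        self.actions[target] = action

    def run(self):
        changed = set()
        # The topological sort guarantees prerequisites are visited first.
        for node in TopologicalSorter(self.deps).static_order():
            if node not in self.actions:
                # A leaf source file: ask the sensor whether it changed.
                value = self.sensor(node)
                if self.state.get(node) != value:
                    self.state[node] = value
                    changed.add(node)
            elif any(p in changed for p in self.deps[node]):
                self.actions[node]()   # the "escape to the full language"
                changed.add(node)
        return changed
```

The `action` callables are where the full host language takes over, which is exactly the if/then/else escape hatch the parent comment asks for.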

07 Jul 2005 01:48 trsk

Re: Weak article
The article is a lot stronger than your response.

> Make can be made safe, used properly, there isn't anything unsafe about it,

So there's a way to use make "properly" so that it checks content rather than just timestamps? Do tell.

> It's also somewhat weak to not suggest something better.

Perhaps it's your eyesight that's weak.

07 Jul 2005 01:57 trsk

Re: No hint how to overcome the limitations of Make

> > Unfortunately the article doesn't say a word about how to overcome the limitations of Make.
>
> To be fair, he says he's going to talk about alternatives to make next week, so he isn't done yet. This is just the first part.

Indeed; you have to wonder what is wrong with people so incapable of reading an article, or who accuse people generously writing technical articles of "selling" something, or who make "predictions" that the alternatives won't address the noted problems before reading about them, or who deny that any of the noted problems are problems. With that sort of negativity, anti-intellectualism, and hidebound adherence to the status quo, it's no wonder that the general level of software quality is as low as it is.

07 Jul 2005 02:02 trsk

Re: Scons is elegant, Make isn't

> I see not less issues with scons scripts as with make files compared to the number of applications using either one.

What the heck does that mean? How does one compare numbers of issues to numbers of applications? This sort of mindless naysaying contributes nothing, but it does help explain why technically inferior tools dominate.

07 Jul 2005 02:13 trsk

Re: Scons is elegant, Make isn't

> % www.scons.org/
> % It works, it is python
>
> Well that kills it right there. If any build system is going to be portable and usable for the proverbial "everyone" then it can't rely on anything more than Bourne shell. That is the only scripting option that can be assured to exist on all platforms. At least it's supposed to be.

That's nonsense, because Make rules can invoke any command under the sun, as noted in the article (which few of the people responding seem to have read, or absorbed). The weakness of the Bourne shell as a scripting language, rather than as a command invocation "language", makes makefiles very non-portable. cons (Perl) and scons (Python) are far more portable because fewer components are involved: the vagaries of the various sed, expr, dependency generators, etc. aren't relevant, because none of them is needed to perform the basics, as they are with sh.

07 Jul 2005 02:18 trsk

Re: Thanks for the article, but...

> Of course, there are difficulties, but I would say there are difficulties using a number of important technologies, such as the C language.

And this is relevant how, exactly? Because one tool has problems, that's a good reason to use another tool that has problems? Google "logical fallacies" and you'll find that you've committed a few here.

07 Jul 2005 02:42 trsk

Re: GNU Make is just fine for big projects

> But you don't say anything about spaces between the end of a macro and the # sign of a comment.

Indeed, it seems a lot of people are only interested in justifying their existing practices; they can't be bothered to actually read and absorb your article. It's curious that they don't just read the article, think "nothing here for me", and move on, rather than going out of their way to naysay.

Of course it's *possible* to use GNU Make for large projects, but that doesn't necessarily make it the best choice. Over and over, I have found that organizations that move away from the dominant tools (e.g., perl, C++, make) produce higher-quality products faster, with much lower maintenance costs. As with Windows, dominance comes from factors other than technical superiority.

07 Jul 2005 02:51 trsk

Re: maybe it's your requirements that are the problem

> If "make" doesn't get the trick done, then the problem is probably with your project organization, not with "make".

Classic argumentum ad ignorantiam: "I've never seen a system for which make isn't an adequate tool, therefore there isn't one." Much like Bill saying that no one would ever need more than 640K.

07 Jul 2005 07:35 jepler

Re: GNU Make is just fine for big projects

> Could you elaborate a bit on "automatically restarts"? I'm interested and I hope you have a good solution (see also my comments on omake in the second part of the article). If it is just recursion (calling make from a makefile) then it doesn't invalidate my statement (recursion is fundamentally the same as multi-pass; the dependency tree is split in memory across several processes). I stated that it is not possible in one pass, not that it is not possible at all ("make depends" followed by "make" works, doesn't it?).

Read the section 'How Makefiles Are Remade' in make.info. When there is a rule like 'depends/redhat/%.d: %.c', and later you '-include $(patsubst %.c,depends/redhat/%.d,$(SRCS))', it all happens like magic. A .d file might read 'depends/redhat/libexample/example.d objects/redhat/libexample/example.o: libexample/example.c libexample/example_int.h include/libexample.h'
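The idiom being described looks roughly like this (a sketch following the commenter's paths; the gcc flags shown are one plausible way to generate the .d files, not necessarily what that build used):

```make
SRCS := libexample/example.c

# Generate one .d file per source with the compiler's dependency mode.
depends/redhat/%.d: %.c
	@mkdir -p $(dir $@)
	gcc -MM -MT 'objects/redhat/$*.o $@' -o $@ $<

# Including the .d files makes them targets in their own right: make
# rebuilds any missing or stale ones, then restarts itself and rereads
# the fresh dependencies ("How Makefiles Are Remade" in the manual).
-include $(patsubst %.c,depends/redhat/%.d,$(SRCS))
```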

> % User changes F to F', and then T is rebuilt. User puts back F. At the next invocation Hash(F) = H, so whiz-bang-build concludes T need not be rebuilt.
>
> Stop right there. Where do you get that the user puts back F and the hash is still H? In all build tools using hashes that I know, as soon as F' is used to build T the hash is updated as well (becomes H').

For the hash-based system to work, it must be atomic to calculate H = Hash(F) *and* build T. If it's not, there's a window for a misuser of the system to replace F with F', so that T is built from something that doesn't actually hash to H. The old contents of F can be restored any time before the next build invocation.
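A sketch of the race in code may help. This hypothetical checker (invented names, not any real tool's API) hashes the source just before running the build action; between that hash and the action there is exactly the window described, and the recorded hash is whatever was read before the action ran:

```python
import hashlib

def file_hash(path):
    """Content hash of a prerequisite file."""
    with open(path, "rb") as f:
        return hashlib.md5(f.read()).hexdigest()

class HashBuilder:
    """Minimal hash-based up-to-date check (for illustration only)."""

    def __init__(self):
        self.stored = {}  # target -> hash of its source at last build

    def build(self, target, source, action):
        h = file_hash(source)            # H = Hash(F) is computed here ...
        if self.stored.get(target) == h:
            return False                 # source unchanged: skip the rebuild
        action()                         # ... but F could be swapped before this
        self.stored[target] = h          # the window: we record the earlier hash
        return True
```

In practice, as the thread notes, the window between `file_hash` and `action` is tiny, but it is not zero.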

> [Y]ou don't say anything about spaces between the end of a macro and the # sign of a comment. Did you encounter such a bug?

What bug? This is documented in make.info. The behavior you describe isn't even related to comments; it is related to trailing whitespace. Are you planning to sell us a build system that doesn't have a configuration or rule file? I'm betting there will be some feature of your tool's syntax that will trip up someone, somewhere.

07 Jul 2005 07:47 jepler

Re: maybe it's your requirements that are the problem

> Well, Tom, this is not quite a mature argument. Complexity comes with the size of the project, trust me on it. Imagine the following scenario:
>
> A software company decides to embark on a million dollar project with a big automotive manufacturer. The software product has many thousands of C/C++ files, it has a 5-year history already (meaning the design is even older than that) and it is developed by about 50 people on 2 continents. They want to hire you as release manager. What do you do? Decline the offer and go to Alaska to build 50-file projects on Linux that are fine with make? I guess not. Me at least, I would get excited about the job. And if they take me I would do what is in my power to do: search the best tool for the job. I simply cannot scale down the requirements, I simply cannot redesign the software and quitting the job would simply make me a coward, wouldn't you agree?

You're right. I'd use GNU Make for this project. Developers would either run their editors remotely on a single site, or use local editors but store all files on the remote site using a network filesystem. New binaries would be sent to the second site using "xdelta" or similar technology to reduce the amount of data transferred.

If necessary, I would place a full, independent build system at each site, but this leads to integration problems that are completely outside the purview of the build system.

And don't scare me with talk of a 5-year-old system, as though that's positively ancient. The system I work on displays the message "Copyright (C) 1983-2005" on startup. This is the system I mentioned in an earlier reply to this article, where we use a non-recursive make crafted by one of my co-workers after learning the lessons of Recursive Make Considered Harmful.

07 Jul 2005 17:01 buildsmith

Re: GNU Make is just fine for big projects

> Read the section 'How Makefiles Are Remade' in make.info. When there is a rule like 'depends/redhat/%.d: %.c', and later you '-include' the generated .d files, it all happens like magic.

Thank you, I'll study it.

> For the hash-based system to work, it must be atomic to calculate H = Hash(F) *and* build T.

The time window you point out does exist. The tools I know compute the signature a tiny bit before or a tiny bit after the actual build command. But the issue is mostly theoretical; the window is so short that, for all practical purposes, it might as well not exist.

> What bug? This is documented in make.info. The behavior you describe isn't even related to comments; it is related to trailing whitespace. Are you planning to sell us a build system that doesn't have a configuration or rule file? I'm betting there will be some feature of your tool's syntax that will trip up someone, somewhere.

True, it is not related to comments (it just hit me through well-intentioned comments). True, it is not really a bug. But if it is a feature, it is a damn nasty and useless one.

Every syntax has its disadvantages: Jam has its issues with semicolons, Ant's XML is overly verbose, and even the well-known scripting languages used in some build descriptions (Python, Perl, Tcl, and to a lesser extent Ruby) have their weak points. But makefile syntax is the worst of them (for the reasons I mentioned in the article). BTW, I'm not selling anything to anyone.

07 Jul 2005 17:53 buildsmith

Re: maybe it's your requirements that are the problem

> I'd use GNU Make for this project.

I do understand that you master your subject, and it is likely that you would do a fair job with GNU make on such a project, but that is no reason to fixate on GNU make. There are plenty of better tools (both radically different ones and incrementally improved close relatives of make).

As I mentioned in a reply above, the project I am talking about already used several make clones (on different sites, but also at the same site on different build platforms), among them GNU make 3.79.1. GNU make was the one with the fewest features, and it was used only as a fallback on the least capable build machines.

It is true that moving to GNU make as the unique make flavor would have helped (along with updating it to 3.80). Nevertheless, moving all teams to GNU make was clearly perceived as a painful, lowest-common-denominator move and, for that reason, it never happened. We moved to SCons last year (Python was already a requirement in our build chain, because it generated the makefiles for all those make flavors from one XML file).

For the actual development, as you say, we do both remote-access work and code-base replication with local work.

> And don't scare me with talk of a 5-year-old system,

Cool down ;-) It was not meant to "scare" anybody. I was just saying that enough time has passed to make the initial design dusty and "heavy on our backs", but not enough to pave the way for a complete redesign (at least not in our managers' eyes). BTW, I hope for your sake that you never face a request for a physical deployment redesign (libs, .so files, and executables), because that is really painful with hand-crafted makefiles on a system of your size.

As a general comment, you are probably a bit too deep into the GNU world. True, it is large enough that one can accomplish quite a lot without ever setting foot outside the brave GNU world. But that would be a pity.

07 Jul 2005 21:40 paxmark1

smake
What about Mr. Schilling's smake? Does it have potential?

Peace, Mark

08 Jul 2005 13:13 tomfm

Re: maybe it's your requirements that are the problem
> Well, Tom, this is not quite a mature argument. Complexity comes with the size of the project, trust me on it.

Well, Adrian, fortunately we don't have to rely on anybody's (imagined or real) superior knowledge; we can simply look at the facts. "Make" was used to create the original Bell Labs UNIX system. It was used to create the BSD system. It's being used for the Linux kernel, for Solaris, and for thousands of other complex projects, both open source and proprietary. There are dozens of "make" replacements available, yet almost everybody still sticks with the basic thing, often even eschewing the GNU extensions.

"Make" may be ugly and messy, but it works well enough for people so that they don't bother replacing it. Since we know that "make" has been used by many people to build big and complex systems, we have to infer that if "make" is such a problem for you, then the problem must be with the way you organize and manage your projects, not the tool itself.

> A software company decides to embark on a million dollar project with a big automotive manufacturer. The software product has many thousands of C/C++ files, it has a 5-year history already (meaning the design is even older than that) and it is developed by about 50 people on 2 continents. They want to hire you as release manager. What do you do? Decline the offer and go to Alaska to build 50-file projects on Linux that are fine with make?

If C/C++ software isn't built using standard UNIX tools, that's a bad sign already. Regardless of what build tools it uses, if the software doesn't decompose into subprojects with at most a few dozen files each, then there is a really serious architectural problem. Yes, that's an offer I would decline.

> And if they take me I would do what is in my power to do: search the best tool for the job.

So, you are saying that you are the kind of person who would introduce a lot of non-standard tools into a build process in an effort to avoid modularizing the software. See above for what I think about the fruits of your labor.

08 Jul 2005 13:18 tomfm

Re: maybe it's your requirements that are the problem
> Classic argumentum ad ignorantiam: "I've never seen a system for which make isn't an adequate tool, therefore there isn't one." Much like Bill saying that no one would ever need more than 640K.

No, the argument goes the other way around: lots of people have used "make" for the development of big and complex software systems. Therefore, if you have serious problems using "make" for the development of big and complex software systems, the problem must be with you, not with "make".

08 Jul 2005 18:13 trsk

Re: maybe it's your requirements that are the problem

> > Classic argumentum ad ignorantiam: "I've never seen a system for which make isn't an adequate tool, therefore there isn't one." Much like Bill saying that no one would ever need more than 640K.
>
> No, the argument goes the other way around: lots of people have used "make" for the development of big and complex software systems. Therefore, if you have serious problems using "make" for the development of big and complex software systems, the problem must be with you, not with "make".

That argument is fallacious in exactly the way I pointed out -- you haven't seen a system too complex for make, therefore you think there isn't one. It's telling that you give unix as an example, because it's a poor one. unix is largely a set of disjoint tools; it simply isn't very complex in terms of build requirements.

The other point is that, yes, people use make; people have also written operating systems in assembly language, and people write programs in Intercal and brainf*ck. That fact gives no indication of how good those tools are for the job.

08 Jul 2005 18:50 trsk

Re: maybe it's your requirements that are the problem

> So, you are saying that you are the kind of person who would introduce a lot of non-standard tools into a build process in an effort to avoid modularizing the software. See above for what I think about the fruits of your labor.

Well, really, Tom, why should anyone care what you think of Adrian or the sort of person he is? That sort of ad hominem is indeed immature. Adrian has pointed out a number of specific problems with Make, none of which you have refuted. And he isn't alone, considering how many people and organizations have developed tools in an attempt to address its problems -- cmake, nmake, smake, makepp, makexs, ant, bras, cons, scons, cook, jam, rake, ... It's hard to think of any other tool that has spawned so many alternatives. (cvs comes close, and for the same reason: it doesn't scale to the level at which some projects operate -- and you can make the same fallacious argument that the fact that people use cvs proves it's adequate, but it proves no such thing.) Even GNU make itself, with 3.80, added a large amount of functionality in response to the serious problems people have with it; but there are well-known problems with that sort of accretion of features, where, rather than picking the best design, Paul Smith had to shoehorn it into an existing piece of very complicated code without breaking backwards compatibility with every previous error in design. 3.81 is still in beta nearly 3 years after 3.80 was released, and the kinds of responses one sees to bug reports for Make are telling, such as "Ooh, this is complicated." and "Changing this will be fairly challenging, and I don't have time to solve this problem before the next release, so I am deferring this bug although I'll leave it open. In the meantime I'm going to install a check to disallow target/prerequisite definitions inside evals in command script contexts, so at least you'll get an error and not a core dump. As for your issue, I suggest you post to the help-make@gnu.org list with a more detailed description of exactly what you want to do. I strongly suspect that your situation can be handled through auto-generated included makefiles."

These kinds of complications, and the need to auto-generate makefiles, are a consequence of the sorts of problems that Adrian pointed out, and your rude and irrelevant attacks on his competence have no bearing on that.

08 Jul 2005 19:16 trsk

By the way ...
You write "Yes, that's an offer I would decline." and then you have the gall to write "So, you are saying that you are the kind of person who would introduce a lot of non-standard tools into a build process in an effort to avoid modularizing the software." Who's doing more avoiding, the person who takes on the project or the person who walks away? But your first statement recognized the obvious fact that someone who is "hired as a release manager" is not being asked to "modularize the software", which you promptly forgot when your second statement came blowing out of your ... ass.

08 Jul 2005 21:18 rocky

GNU Make debugger

For the most part I agree with the deficiencies pointed out including the fact that debugging is difficult.

However I want to point out that there is a stab at real debugger for GNU make (based on 3.80). freshmeat.net/projects...

It may be much less "Stone Age" than suggested above and definitely more helpful than the tracing that comes with GNU make via option -d.

A difficult part of the design of such a debugger is figuring out what things to put in, and then in some cases how to get them into the code.

For those who are interested and want to get involved the best sources are probably what's in CVS

The module name to use is "remake".

For those who really want a tarball, the last one released is here

09 Jul 2005 03:57 Carl0

Re: Scons is elegant, Make isn't
> This sort of mindless naysaying contributes nothing, but it does help explain why technically inferior tools dominate.

Uh, thanks. I did not say anything about the quality of any of these tools. I spoke about those who use them.

10 Jul 2005 13:11 coudercd

Re: Scons is elegant, Make isn't

> Part of the problem is that a lot of the replacements are written in $FAVORITE_LANGUAGE. Like it or not, the most widely accepted and widely distributed tools are written solely in C. Expecting Python to be installed by default on most Linux distributions is acceptable these days, but expecting it on most commercial systems is not. I don't think you are going to find Python by default on Solaris, AIX, or Irix.
>
> What you will find though is a C compiler. If some forward thinking developer would come out, write a build system that makes sense, get commercial support behind it, and implement it well, then we might finally have a viable alternative to make. But until those conditions are met, we are stuck with autoconf/automake madness.

The PMK project (pmk.sf.net) is an alternative to autoconf written in C. It also aims to be easier to use than autotools.

10 Jul 2005 21:03 buildsmith

Re: By the way ...
Hey, trsk, please cool down and watch your language. Everyone has the right to voice their opinion. Please try to be more respectful.

13 Jul 2005 16:03 buildsmith

Re: maybe it's your requirements that are the problem

> "Make" has been used to create the original Bell Labs UNIX system. ... the Linux kernel, for Solaris, and thousands of other complex projects,

Your statement is correct, but your conclusion is wrong. First, the fact that the Gizeh pyramids were built with very primitive means is not an argument to close all the crane factories around the world (not even a single one). Second, you seem to ignore how much code surrounds Make in large code bases like the ones you mention. Make did the job, but with a lot of external help. In short, the fact that something is possible doesn't mean it is smart to do it.

> yet almost everybody still sticks with the basic thing, ... it works well enough for people so that they don't bother replacing it.

"Everybody using only the basic thing and not replacing it" is largely exaggerated. Actually, *all* the release managers I know work either on patching your basic thing or on replacing it completely.

> if "make" is such a problem for you,

The Make tool is not a big problem only for me (it was my bread and butter for a long time, but I moved on to better tools years ago). Why don't you take a look at the book on the autotools (link in the article) to see what the authors of those tools have to say about GNU make?

> the problem must be with the way you organize and manage your projects,

I won't stand up to defend the code base layout of that project. It's clear that it could have been much better. Nevertheless, *nobody* will redesign a product of thousands of source files just to better fit a build tool, especially when there are better tools out there (and increasing competition in this field; read the next part of the article).

> If C/C++ software isn't built using standard UNIX tools, that's a bad sign already.

Then I am working in an industry, embedded systems engineering, that is full of bad signs. You have to know that, for no-OS embedded systems, MS Windows has made large inroads as the preferred build machine. In other words, when a customer comes to me, he has a hardware platform and some compilation toolchain made by the hardware vendor that runs only on MS Windows. If I want to sell him my C software, it has to be compiled for that platform with that toolchain. I am a Unix kind of guy, and I use Cygwin whenever I can, yet I have no say in my customer's hardware decision, and that binds us to a given toolchain 99% of the time.

Your statement about how something *should* be built clearly shows that your experience is limited, maybe not in number of years, but at least in the number of different platforms you have seen. You question other people's requirements too easily, and you assume degrees of freedom that not everybody has.

> Regardless of what build tools it uses, if the software doesn't decompose into subprojects with at most a few dozen files each, then there is a really serious architectural problem.

That is correct and I have to agree.

> Yes, that's an offer I would decline.

That is childish. That kind of denial/protest will never lead to any constructive outcome. Me, I would accept the job and I would try to change something (like pushing a build tool that not only allows modularity but also enforces it).

> So, you are saying that you are the kind of person who would introduce a lot of non-standard tools into a build process

It is worse than what you think. All those non-standard additions are already there (shell scripts, Perl scripts, etc.). There are plenty of open source products where you can check that. Take something you like, say, the Linux kernel build. First try to change something in the build, and then we can talk again.

> in an effort to avoid modularizing the software.

"Modular" is an overloaded buzzword. Let me give you an example that is hard to achieve with Make: modular deployment, meaning the ability to change, on customer request, how the code is distributed into executables and libraries (static and/or dynamic). How do you go about this requirement with makefiles? The way we did it in the past was with makefile generation from a deployment description file. That added complexity (multipass, even if only one make call) and safety holes (it was very hard to detect all the situations with relevant changes and regenerate the makefiles appropriately). We moved to a build tool that is more programmable, and those issues were gone.
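To make the pain point concrete, here is a minimal sketch (module and object names are invented for illustration) of what deployment baked into a hand-written makefile looks like; regrouping objects for a different customer means editing, or regenerating, these definitions by hand:

```makefile
# Hypothetical deployment hard-coded in the makefile: which objects go
# into which library. Moving log.o from libcore to libui for one
# customer means editing (or regenerating) these variable definitions,
# which is exactly the multipass fragility described above.
CORE_OBJS := alloc.o log.o parse.o
UI_OBJS   := widget.o theme.o

all: libcore.a libui.a

libcore.a: $(CORE_OBJS)
	$(AR) rcs $@ $^

libui.a: $(UI_OBJS)
	$(AR) rcs $@ $^
```

A more programmable tool can compute these groupings from a single deployment description instead of freezing them into rules.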

In conclusion, if you want "to stick to the basic thing", feel free to do it, but don't bend the world to the set of requirements you are used to dealing with. You risk an anti-progress reputation, and I hope that is not your goal. BTW, I wouldn't dare ask you to believe something blindly, but a little bit of faith helps sometimes.

15 Jul 2005 11:30 proghelper

I *Strongly* agree that these are myths
First and foremost, the Make system is not portable at all. I have had many bad experiences using Make on multiple platforms.

Also, for dynamically changing software systems (where new files are added or removed daily), Make is quite tough to maintain.

I would also like Automake to be much more user-friendly than it is today.

19 Jul 2005 03:09 trsk

Re: By the way ...

> Hey, trsk, please cool down and control your language. Anyone has the right to speak out their opinion. Please try to be more respectful.

If Tom had the right to his comments, I had a right to mine ... which were a lot less offensive.

19 Jul 2005 03:40 trsk

Re: Predictions

> I predict that the next article will suggest some other build tool which is less portable than make.
>
> I predict that this tool will be no more scalable than make for large builds.
>
> Finally, I predict that the tool won't solve the problem of partial rebuilds.
>
> Tune in next week for more predictions.

Your predictions were of course off the mark, and of course you didn't show up the next week to make more silly and offensive predictions.

19 Jul 2005 03:42 trsk

Re: This guy has something to $ell...

> Can't wait to read the next article. Bet this guy want to $ell us a replacement.

You lose your bet.

19 Jul 2005 04:08 Avatar ulriceriksson

Re: Predictions

> Your predictions were of course off the
> mark and of course you
> didn't show up the next week to make
> more silly and offensive
> predictions.

Interesting theory. Let's see, shall we?

I predicted that the next article would suggest some tool which is less portable than make, is no more scalable than make and does not solve the problem of partial rebuilds. In fact, the next article suggested a whole bunch of tools, several of which certainly fit my predictions.

I apologize for the lack of further offensive predictions. The reason was that the other article was actually pretty good, so I didn't think it deserved any.

26 Jul 2005 00:27 Avatar jengelh

Make gone wrong?
The Linux kernel uses (g)make. The BSD kernels use make. After ./configure, you use make. Though those are three different things, it all boils down to one: we are still using plain make, as provided by our OS/distribution. So it must all be perfect.

And to comment on parallel builds: though I end up with "missing file" when building wxWidgets (uses ./configure) with -j2 or more, I do not get problems with the big majority of other software, including the Linux kernel when building it with -j bigFancyValue. So, as has been pointed out before, this is a problem either with an improperly written Makefile or with the author.
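For what it's worth, the "improperly written Makefile" failure usually reduces to an undeclared dependency on a generated file. A contrived sketch (all names invented) of the kind of makefile that works serially but breaks under -j:

```makefile
# With -j2, make may build "headers" and "main.o" concurrently, because
# nothing tells it that main.o needs the generated file. A serial build
# happens to run "headers" first (left to right), so the bug stays
# hidden until someone uses -j.
all: headers main.o

headers:
	./generate-version > version.h

main.o: main.c        # missing prerequisite: version.h
	$(CC) -c main.c

# The fix is simply to declare the missing edge:
# main.o: main.c version.h
```

So the tool is behaving as specified; it is the dependency graph that is incomplete.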

26 Jul 2005 10:17 buildsmith

Re: Make gone wrong?

> we are still using the plain make, as provided by our OS/distribution. So it must be all perfect.

Starting from a correct statement, you jump to a plainly wrong conclusion. The fact that something is possible doesn't mean at all that it is smart to do. The fact that the pyramids exist is not a reason to throw away all the Caterpillar machines.

> as it was pointed out before, this is a problem either with improperly written Makefile, or with the author.

This is not really open-minded. Your argument says that no problem exists because you didn't face any, or because you didn't hear of other people facing problems.

This only shows that you lack experience, and it also shows that you don't read a lot. Read the second part of this article, read the discussion below started by Tom, and try to be more constructive in your future comments.

If you never faced the need to change then feel free to stick with what is best for you. But, please, don't take this as an argument to deny requirements that other people may have.

26 Jul 2005 10:31 buildsmith

Re: Predictions
The article was split in two for technical reasons on this site. It was written and designed from day one as a single article.

BTW, I'm glad that you decided to quit the future-teller business ;-)

26 Jul 2005 11:18 Avatar ulriceriksson

Re: Predictions
I cancel one performance, and suddenly I'm rumoured to be retired? ;-)

How about this: One year from now, Make won't have been significantly displaced by any tool which is significantly better in any of the respects mentioned above.

Oh, and one more: one year from now, C will still dominate the Programming Language category on Freshmeat.

C and Make. Not because they're all that great, but because they're there.

03 Aug 2005 07:30 buildsmith

Re: Predictions
C and Make, indeed. I have to agree, whether I like it or not.

In my group, we almost replaced them (we have OOC, an object-oriented language that works with straight ANSI C compilers, and SBuild, a tool that generates SCons build descriptions). From my experience with replacing them, I can surely tell that C and Make are here to stay...

03 Nov 2005 17:16 melledejong

Re: maybe it's your requirements that are the problem
Not having a solid requirements spec will always cause serious project problems, whatever development tools are used.

15 Nov 2005 19:48 nogin

Check out OMake!

The OMake build tool is designed specifically to address all these limitations of Make, while preserving the "spirit" of Make.

OMake is a build system with a style and syntax similar to GNU make's, but with many additional features, designed to scale from tiny projects, where an OMakefile might be as small as one or two lines, to large projects spanning multiple directories. It has native support for commands that produce several targets at once. It also includes fast, reliable, automated, scriptable dependency analysis using MD5 digests. It is highly portable (Linux, Windows, Cygwin, Mac OS X, FreeBSD, etc.) and comes with built-in functions that provide the most common features of programs like grep, sed, and awk. OMake also provides active filesystem monitoring that restarts builds automatically when source files are modified. OMake comes with default configuration files that simplify standard compilation tasks. A companion command interpreter that can be used interactively is included.

See OMake Project page (freshmeat.net/projects...) for detail.

27 Dec 2005 11:32 rocky

Re: GNU Make debugger

> For the most part I agree with the deficiencies pointed out including the fact that debugging is difficult.
>
> However I want to point out that there is a stab at real debugger for GNU make (based on 3.80). freshmeat.net/projects...
>
> It may be much less "Stone Age" than suggested above and definitely more helpful than the tracing that comes with GNU make via option -d.
>
> A difficult part of the design of such a debugger is figuring out what things to put in, and then in some cases how to get them into the code.
>
> For those who are interested and want to get involved the best sources are probably what's in CVS
>
> For the module name, use is "remake".
>
> For those that really want a tarball, the last one released is here

The links seem to have disappeared. The main one is bashdb.sourceforge.net (bashdb.sourceforge.net...) or the Freshmeat (freshmeat.net/projects...) announcement.

I've now used this in a number of places on automake-generated Makefiles and have been able to figure out what's wrong and patch things. In some cases, just using the tracing option (--trace or -x) rather than a full debugger was all that was needed.

04 Apr 2006 11:30 buildsmith

Re: Check out OMake!
I am sad to notice that this is the third tool called "omake" that I have encountered. Opus Make and the omake of IBM/Rational/Atria at least have compatible syntax. This omake has a functional-style syntax (which I agree is better) and it gets its name from that language. But the name is not happily chosen. It seems that people don't read a lot before implementing another make-like tool ;-)

04 Apr 2006 11:37 buildsmith

Maintaining dependencies with GNU Make
I have to apologize for my statement saying that "there is no way to implement dependency computation in one pass" using GNU make.

I knew it was possible with omake and its %restart command. It turns out that it is possible with GNU make as well. It is suggested in the comments on this article, and it is also nicely explained in the Dr. Dobb's Journal April 2006 issue, in the article "Dependency Management" by John Graham-Cumming. I have to admit that, although I got close, I didn't quite get there in my old GNU make based build system.
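For readers who want the gist without the article: the usual GNU make idiom lets the compiler emit dependency fragments as a side effect of compilation and re-includes them on the next run, so no separate "make depend" pass is needed. A rough sketch, assuming GCC-style -MMD/-MP flags (file names are illustrative):

```makefile
SRCS := main.c util.c
OBJS := $(SRCS:.c=.o)

prog: $(OBJS)
	$(CC) -o $@ $(OBJS)

# -MMD writes main.d next to main.o as a side effect of compiling;
# -MP adds phony targets so deleted headers don't break later builds.
%.o: %.c
	$(CC) -MMD -MP -c -o $@ $<

# "-include" pulls in the generated fragments, silently skipping them
# on the first, clean build.
-include $(OBJS:.o=.d)
```

On the first run there are no .d files and everything compiles anyway; from the second run on, header changes trigger exactly the right recompilations.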
