[H-GEN] gnu make command-line target ordering problems?

Ben Carlyle benc at foxboro.com.au
Tue Oct 19 00:21:08 EDT 1999


[ Humbug *General* list - semi-serious discussions about Humbug and
Unix-related topics. ]

Anthony Towns wrote:

> Basically, I'm arguing two things:
> 
>         * the current behaviour of GNU make is a perfectly reasonable
>           interpretation of `make these goals in parallel'. The
>           alternative is too, but at least it can be simulated by make
>           foo && make bar, whereas simulating the other requires changes
>           to the Makefiles.

>         * your Makefiles would make more sense if they didn't depend on
>           either behaviour. Whether this is possible/realistic in your
>           environment's another matter.

I agree with you on the first point, and thus think the documentation
should be updated.  On the second point... I suppose so.  It becomes
a rather trivial point with GNU make, since invoking it is so cheap.
The old make system would take roughly 10 seconds per invocation, so
it was a major concern to keep sequential orderings on a single
command-line.  Still, the scripter in me wants to minimise invocations
even with GNU make.
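
For the record, the difference I care about boils down to this (the
goal names are from my own makefiles):

---
# One invocation, relying on the goals being made strictly in the
# order given on the command line:
make pre.objects do.objects post.objects

# What I fall back to if that ordering isn't guaranteed: one
# invocation per step, chained in the shell.
make pre.objects && make do.objects && make post.objects
---

With the old make that meant three 10-second start-ups instead of one.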

> The Proper(tm) way of dealing with this would be something more akin to
> having a `target.d' file for each target, specifying how it is to be
>           built, and having each target depend on its own rules.

Consider this situation:

A single template library is shared between all object files in a
directory.  Due to bugs in the compiler handling of templates, the
template database must be entirely eradicated before objects are
built.  Due to a problematic design, all templates must be shipped
to a central location immediately after compilation.

Solution 1:  My current/old approach
---
objects:  #  This could be a script, if you like
	make pre.objects && make do.objects && make post.objects
pre.objects:
	clean_templates
post.objects:
	ship_templates
do.objects:  ${OBJS}

# Standard implicit rules are used for the ${OBJS} constituents,
# making use of various CCFLAGS and CPPFLAGS macros.
---

Would I be right in believing the following solution is what you
suggest?

Solution 2:  Your suggested approach (the one I tried before Solution 1)
2a ---
objects: ${OBJS}
	ship_templates
clean_out_templates:
	clean_templates
include ${OBJS:%.o=%.d}
---
or perhaps
2b ---
objects: clean_out_templates ${OBJS} ship_all_templates
clean_out_templates:
	clean_templates
ship_all_templates: ${OBJS}
	ship_templates
include ${OBJS:%.o=%.d}
---

where
foo.d is
---
foo.o foo.d: clean_out_templates foo.c foo.h bar.h
---
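
(For concreteness, I imagine those .d files being generated by a rule
something like the one below.  This assumes a compiler that understands
-MM for dependency output; the sed just tacks clean_out_templates onto
the prerequisite list.)

---
%.d: %.c
	${CC} -MM ${CPPFLAGS} $< | \
	sed 's/^\(.*\)\.o[ :]*/\1.o \1.d: clean_out_templates /' > $@
---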

This fully expresses the dependencies, but is quite specialised
and has enough problems of its own to warrant further thought[1].
I opted in my design for a more general approach that could be
used in several applications.  Most important targets have a
pre.% do.% post.% structure that can be used to guarantee sequence
across different included makefiles in specialised cases.
I consider these atoms to be "sub-targets".  They are not part of
the make philosophy, and use make only as a vehicle to put the
commands together.  The real targets are things like "objects",
"libraries", and "applications".  These targets provide the
dependencies between the lower-level targets.
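
Roughly, in skeleton form (the dependencies between the real targets
here are only illustrative):

---
# The real targets, centrally defined; each sequences its own
# pre./do./post. sub-targets, and the targets themselves carry the
# higher-level dependencies:
objects:
	make pre.objects && make do.objects && make post.objects
libraries: objects
	make pre.libraries && make do.libraries && make post.libraries
applications: libraries
	make pre.applications && make do.applications && make post.applications
---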

[1]  It's complicated.  It's difficult to understand.  It forces
     targets that could otherwise be centrally defined to know
     about the ${OBJS} macro and hence must be included after the
     definition of ${OBJS}.  It doesn't solve the problem of
     building a single object file and having its templates
     shipped (as my solution does not).  I don't think it adds
     anything to the make system, despite it being more
     "spiritually aware".  If you have other solutions I'd be
     thrilled to hear about them.

> (Hmmm. Perhaps a better way of dealing with this would be something
> like: ``.SEQUENTIAL : clean'' to ensure that clean is *never* executed
> in parallel)

...but clean should be executed in parallel.  I have four hundred
binary directories to traverse :)
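
(The clean is driven by make precisely so that -j can fan it out over
the directories; something along these lines, with BINDIRS standing in
for the real list:)

---
clean: ${BINDIRS:%=clean.%}

clean.%:
	cd $* && make clean
---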

> > This case is perhaps more complicated than you realise.
> >
> > Firstly, I am controlling a centralised rule set that is
> > included by each makefile.  The rule set has to be strict
> > enough to keep the makefiles simple, but also versatile
> > enough to make everything from C and C++ code to gperf
> *blink*. gperf? The perfect hash generator? Neat.

It is, rather :)

> This is your project not mine, so take this with a grain of salt, but
> still. Simplifying the dependencies is all very well, but not at the expense
> of accuracy. Changing, say:

>         a : x y
>         b : y z

> to:     a b : x y z

In simple cases like this, I agree with you.  Dependencies should
be fully expressed.

> > Now you'll argue that I should be running these as shell scripts,
> > and that could work at the top-most level... in fact that is
> > truly what I am doing. [...] At every level life is
> > simple, despite the size.
> ...in which case I don't see what the problem is.

As you have already said... it doesn't fit well with the make
philosophies.  It also requires a helluva lot of make executions
if you can't specify them one after the other on a single
command-line.  (I'd estimate ~2500 or 3000)

> > I call into question your expertise with systems of this size
> > and complexity.  `Always' is a very strong word in software
> > engineering.
> You should always adhere to the assumptions your software makes. Make
> assumes the Makefile specifies all the dependencies accurately, and that
> there's nothing more to building a project than satisfying dependencies.

I think make assumes that all the dependencies you tell it about
are accurate, and those which are too complicated for it to handle
will be executed outside make.  I walk a middle line, where the
external executions are nameable targets without make's explicit
knowledge.

> (And ad hominem attacks are pretty weak. Tsk.)
'course they are, but it doesn't mean they don't apply ;)

> > That's not a bad idea.  In my CM tool the only simple way to do that
> > is to completely delete the work areas, then sync them back out of
> > the database.  This takes an hour or two for a system of this size,
> > so until I have a better solution I'll continue to use make for this
> > purpose.
> Using CM tools that can't do basic things properly isn't Right either, of
> course. Determining the relevance of this to anything remotely real is
> left as an exercise to the interested reader.

OTOH, when a company invests 100k in a CM system there is a certain
political momentum, especially after a similar amount again is
invested to get it only to this state.  It does certain things
rather well; however, it falls short of my expectations of a mature
product in many ways.


Benjamin.
(chained to the tool of the devil)
