[CMake] Effort to create a new generator (tup)

Bill Hoffman bill.hoffman at kitware.com
Thu May 6 08:28:34 EDT 2010


On 5/5/2010 9:33 PM, Mike Shal wrote:

>>   Not sure yet, perhaps none of it.  It would have to build custom commands
>> and targets.   Basically, support for code generators like
>> swig/moc/lex/yacc.  I guess the auto-depend stuff in tup would work for
>> C/C++, but the Fortran 90 stuff in CMake would not work.  It would be an
>> interesting project.  To be honest I really did not research tup very much
>> at all.  CMake is also used to build very large projects like KDE; it sounds
>> like tup is designed with that in mind, but it would have to handle very
>> large numbers of targets and .o files.
>
> I've used lex and yacc with it before (or at least flex and bison),
> but not the other code generators you mentioned. I also don't have any
> experience with Fortran 90, so I can't speak to that. The dependency
> analysis is done during the sub-program's execution based on which
> files are opened for read and write - it doesn't have any domain
> specific knowledge. If the file access patterns are funky, it's
> possible tup won't work with it. On the plus side, I don't think you
> will be able to construct a program so large that tup won't be
> able to scale to it.

The problem with Fortran 90 is that you have to figure out the dependencies
to determine the order in which files are built.  It has a system where
"include-like" files are generated by the compiler.  So, if you have:

a.f90 -> produces a.mod
b.f90 -> uses a.mod

and you compile b before a, then b fails to compile.  To solve this,
CMake's dependency code parses all of the .f90 files in a target first,
then creates dependency information so that b.f90 depends on a.f90.  I
suppose this part could be pushed to generate time for the tup generator,
but it might make for a slow generate step in the CMake process.
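
In CMake terms the target itself is simple; the ordering comes entirely
from the depend scan (the library name below is just for illustration):

  # A Fortran library with an inter-module dependency.  CMake's depend
  # step scans both sources, sees that b.f90 uses the module produced
  # by a.f90, and orders the compiles accordingly.
  add_library(mylib a.f90 b.f90)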


For the code generators, we just need to compile an executable and then 
be able to use that executable to create some code that is then compiled 
into a library or executable.
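
Roughly this pattern, in CMake terms (the generator and file names are
made up for illustration):

  # Build the code generator itself.
  add_executable(mygen mygen.c)

  # Run it to produce a source file.  DEPENDS on the mygen target means
  # the generator is rebuilt and re-run when its own sources change.
  add_custom_command(
    OUTPUT  ${CMAKE_CURRENT_BINARY_DIR}/generated.c
    COMMAND mygen ${CMAKE_CURRENT_SOURCE_DIR}/input.txt
            ${CMAKE_CURRENT_BINARY_DIR}/generated.c
    DEPENDS mygen ${CMAKE_CURRENT_SOURCE_DIR}/input.txt)

  # Compile the generated code into a library.
  add_library(gencode ${CMAKE_CURRENT_BINARY_DIR}/generated.c)

A tup generator would have to turn both the add_executable and the
add_custom_command into tup rules, with the file-level dependencies
spelled out.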

As far as scalability goes, are there any studies that have been done with
large numbers of files?  One way to find out would be to create the
generator.

How would a large tup system be organized?   Would there be a tup file 
for each target?   Would there be one huge tup file for the whole project?

>
>>
>>   Would tup support the idea of building .o files with -j 5, and linking
>> executables with -j 2, say?  Some targets are more expensive than others,
>> and you want to limit parallelism sometimes.  Note, the makefiles in CMake
>> do NOT do this now; it would be a nice feature.
>
> I don't see how this could be done without some domain specific
> knowledge, or by adding this info manually into the build
> configuration. I would think the optimal values would also vary widely
CMake has that domain knowledge.  It knows what is a link and what is a
compile.  If there were a way to express that to tup, that would be very
cool.  I have had repeated requests for this feature, and it is not
possible to do with make; I was hoping perhaps it is with tup.
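
Just as a sketch of the kind of information CMake could hand to such a
generator (the pool names and properties below are assumptions, not
something the current generators provide):

  # Sketch only: declare job pools and assign compiles and links to
  # them, so a generator that understands pools could cap each one.
  set_property(GLOBAL PROPERTY JOB_POOLS compile_jobs=5 link_jobs=2)
  set(CMAKE_JOB_POOL_COMPILE compile_jobs)
  set(CMAKE_JOB_POOL_LINK    link_jobs)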

> for different users based on their particular machines. One advantage
> of tup here is that incremental builds are always accurate, so you
CMake builds are already very accurate with incremental builds.  :)


-Bill

