C++ depot build tool
rachelbythebay.com

“If you've used Make, you're used to creating Makefiles. In them, you manually build this immense graph where all of the parts hopefully add up to a working program.”
Actually, I used to generate rules using gcc -MM, which does pretty much this: it tracks dependencies in source files by following their #include directives.
Nevertheless, I'm happy to see that the idea of automatically handling dependencies is still being worked on. I'd be so happy to trash all these SCons config files we have to maintain on a near-daily basis.
I think she's talking about adding link rules based on the #includes. But yeah, you can learn how to use Make, or you can write yet-another replacement tool that other people who never learned to use Make get excited about.
Most make tools indeed suck, but some don't. And Make itself also sucks -- it was just there first.
djb/apenwarr's "redo" is a good make alternative, infinitely simpler and about as capable.
tup is a good make alternative, that does away with dependency definitions, recursive makes, etc. It just works and works quickly.
Ah makefiles. It's 2014 and they still can't handle spaces in directory names.
http://savannah.gnu.org/bugs/?712
I'm pretty sure that trying to replace them at this point is sacrilege.
Replacing Makefiles isn't the hard part. People do it all the time.
The hard part is figuring out a migration path away from Autotools. Not even replacing them, that's relatively easy. Integrating them into a world dominated by Autotools, that's difficult.
Hm. At first glance this doesn't seem too bad: idiomatic autotools usage involves files that are relatively easy to parse and are also relatively declarative, so making a non-autotools buildsystem that's capable of directly reading configure.ac and Makefile.am files seems doable.
Why doesn't this work in practice? Non-idiomatic usage? Custom autoconf scripts?
Really digging the idea of using include files to figure out dependencies. Also seems ideal for beginners: look at the number of questions on SO asking 'which library to include for xxx.h' from people who don't yet grasp the difference between compiling and linking. Here you just say 'hey, I have this header, that's for this lib', directly. It won't be for me though (well, this version at least; who knows what the future brings): it puts constraints on the directory structure and on how you include your files, and is probably gcc-only. That leaves basically none of my projects as candidates for testing this. But still, nice.
Also seems ideal for beginners: look up the number of questions on SO asking 'which library to include for xxx.h'
Pretty sure pkg-config can help with that. Don't see why you need SO.
Because beginners have no clue what pkg-config is?
Also it's the first-step problem of knowing what to ask pkg-config (or any other tool that can tell you). Header file names don't always map to the library or package name.
I really like the idea and having read the manual I think it's something that I want. However, it doesn't seem to be open source so how do I know that I can trust the binaries? The binary download is also over HTTP and not HTTPS.
My site has a https version, but you're right: it's still a mystery meat binary blob. No particular reason to trust those. They could be rather evil.
Looks quite promising! But: "It will sniff out all of the objects which also need to be built to make it work, and it will build them first." Seems like every source file will always be compiled, regardless of whether it has changed since the last build. So it's just for rather small projects.
Hi, thanks for checking it out. I had no idea this would be found all of a sudden. I haven't actually thought about this in a while (I just use it now... it's just another tool for me...) so bear with me here.
It won't rebuild stuff that already exists. Once it decides that it needs to make foo.cc and foo.h into foo.o, it won't recompile foo.o unless foo.cc or foo.h changes... or one of its source dependencies changes.
It's something like this: take all of the timestamps for all of the inputs and outputs for any given target. The oldest output has to still be newer than the newest input. If any input is newer than any output, then we need to build.
It sounds weird, but if you write it out like a number line it makes sense.
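That rule (every output must be newer than every input) can be sketched in shell with the `-nt` ("newer than") test. All file names below are placeholders for the demo:

```shell
# Placeholder files: two inputs, two outputs, outputs newer than every input.
mkdir -p /tmp/bb-demo && cd /tmp/bb-demo
touch -d '1 hour ago' foo.cc foo.h   # inputs (older)
touch foo.o foo.bin                  # outputs (newer)

check() {
  for in_f in foo.cc foo.h; do
    for out_f in foo.o foo.bin; do
      # [ a -nt b ] succeeds when a is newer than b: any such pair
      # means some output predates some input, so we must rebuild.
      if [ "$in_f" -nt "$out_f" ]; then echo rebuild; return; fi
    done
  done
  echo up-to-date
}

check            # prints: up-to-date
touch foo.h      # an input just changed
check            # prints: rebuild
```

This is the same comparison Make does per rule; the "number line" framing just makes the all-pairs version easier to see.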
Maybe it compares output file timestamps against input file timestamps to decide whether to re-build, as make does? It's hard to figure out details like this without either documentation or source.
Used in conjunction with ccache, this may not matter too much.
cool!
A comment for the ROOT users (http://root.cern.ch/) among the HN readers here: this appears to be very similar to 'ckon', which emerged from my PhD work starting in 2011 and takes the humongous headache out of building C++ software modules within the ROOT analysis framework: http://tschaume.github.io/ckon/
@rachelbythebay: Since 'ckon' uses the same principles as your depot build tool, I thought you might be interested to take a look: https://github.com/tschaume/ckon :-)
Seems cool!
Does anyone know if the "everything for your project must be contained within a single directory root" constraint can be tricked using symbolic links?
I've never tried symlinks but I don't see why it wouldn't work. The key is that "foo/bar.h" needs to be readable with cwd == "src". If whatever filesystem you have will make content appear there, it should just work.
Having just tried it... sure, it'll work. Starting in my "depot" dir...
---
$ mkdir /tmp/hn
$ echo 'int main() { return 0; }' > /tmp/hn/hn.cc
$ ln -s /tmp/hn src/hn
$ bb hn/hn
I1106 144900 4720 build/dep.cc:591] Compiling: hn/hn
I1106 144902 4720 build/deptracker.cc:184] Linking: hn/hn
-rwxr-xr-x 1 u g 7364 Nov 6 14:49 bin/hn/hn
$ bin/hn/hn
$ echo $?
0
system_header {
  name: "gnuradio"
  name_type: DIRECTORY
  cflag: "-I/usr/local/include/gnuradio"
}

system_header {
  name: "microhttpd.h"
  ldflag: "-lmicrohttpd"
}

system_header {
  name: "mysql/mysql.h"
  ldflag: "-L/usr/lib64/mysql"
  ldflag: "-lmysqlclient"
}
Let's see... hard-coded absolute paths, compiler-specific flags... Yeah I'll stick with QBS and wait for proper modules.
No, you're right, it's goofy. Clowny, you might say.
It should be more like this:
pkg-config --identify-file=microhttpd.h --> "libmicrohttpd"
The problem now is that you can't go from the .h name to the package name. Once you have the package name, you can use pkg-config to give you --libs and --cflags, but there's a (big) piece of the puzzle missing at the moment.
Changing pkg-config and its users to add that mapping would be amazing.
Once we had that, tools like this could see #include <foo.h>, look it up to a package name, use that to get the cflags and ldflags, and that would be it. No config needed.
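Half of that pipeline already exists today: given a package name, pkg-config hands back the flags. A self-contained demo (it fakes a .pc file so it runs anywhere; a real install would ship its own):

```shell
# Stub out a package so the demo doesn't depend on libmicrohttpd being
# installed; the fields mirror what a real libmicrohttpd.pc contains.
mkdir -p /tmp/pc-demo
cat > /tmp/pc-demo/libmicrohttpd.pc <<'EOF'
Name: libmicrohttpd
Description: demo stub
Version: 0.9
Libs: -lmicrohttpd
Cflags:
EOF
export PKG_CONFIG_PATH=/tmp/pc-demo

# Package name -> flags: this direction works today.
pkg-config --libs libmicrohttpd    # prints: -lmicrohttpd

# Header name -> package name: this direction does NOT exist.
# The --identify-file flag above is hypothetical; it's the missing piece.
```

So a tool that sees `#include <microhttpd.h>` still has no standard way to learn the string "libmicrohttpd" in the first place, which is exactly the mapping the thread is asking for.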
Cool tool, making building easier seems to be trending at the moment. :)
The page ("deps.html") that introduces the .build.conf file says "You can use pkg-config to help find those flags if your system has that installed", but then the actual configuration shown uses only absolute paths.
That threw me off, since it's not at all obvious how or where you can insert calls to external tools (like pkg-config) in the static-looking configuration. I think it'd be a good idea to edit in an example showing pkg-config being used.
Did Rachel update bb or release a new version or something? Just curious why this is appearing here now.
Probably a response to last night's "Building C Projects" https://news.ycombinator.com/item?id=8563005
Ah, that makes sense. If I pay close enough attention, I can track this rhythm of hacker news. Of course, I have a happier life if I don't :-)