For those unfamiliar, Nix is a powerful package manager and build system that ensures reproducible builds via strict dependency isolation. However, one of its limitations has been that it builds packages as monolithic units - if you change one source file, Nix rebuilds the entire package from scratch.
nix-ninja solves this by bringing compilation-unit-level granularity to Nix builds. It targets ninja build files, a format for describing build graphs that is emitted by popular build systems like CMake and Meson. This means that when you modify a single source file, only the affected compilation units are rebuilt, significantly reducing build times.
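To make the idea concrete, here's a rough conceptual sketch in plain Nix (this is not nix-ninja's actual output or API, just an illustration of per-compilation-unit caching): each object file becomes its own derivation, and only the link step depends on all of them.

```nix
# Conceptual sketch only -- not nix-ninja's real implementation. The idea:
# roughly one derivation per ninja build edge, so each object file is
# cached by the hash of its own inputs rather than by the whole source tree.
{ pkgs ? import <nixpkgs> { } }:
let
  # Hypothetical single translation unit; editing other sources in the
  # tree would not invalidate this derivation.
  mainObject = pkgs.stdenv.mkDerivation {
    name = "main.o";
    src = ./src;                       # hypothetical source directory
    buildPhase = "c++ -c main.cc -o main.o";
    installPhase = "cp main.o $out";
  };
in
# The link step only reruns when one of its object-file inputs changes.
pkgs.stdenv.mkDerivation {
  name = "app";
  dontUnpack = true;
  buildPhase = "c++ ${mainObject} -o app";
  installPhase = "cp app $out";
}
```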
We're excited because this gives Nix fine-grained caching with early cutoff optimisation (see the Build Systems à la Carte paper). Combined with remote build farms like Nixbuild.net, this moves Nix into the incremental cloud build systems arena, alongside Google's Bazel and Meta's Buck2.
Dynamic derivations and content-addressed derivations are still experimental features in Nix, so we're hoping to accelerate their maturation by providing a compelling use-case and implementation. Our north star is to have Hydra (nixpkgs' CI runner) support incremental compilation in nixpkgs for slow builds like LLVM.
nix-ninja can compile Nix itself today, but we're still very early in its development. Given community interest in dynamic derivations and incremental compilation in Nix, we decided to open source it in a pre-alpha state to involve the community in its design.
Please take a look. We'll be available in the comments to answer any questions: https://github.com/pdtpartners/nix-ninja
The GitHub RFC says it's only 80% complete
The short version is that dynamic derivations depend on content-addressed derivations, and content-addressed derivations have been in experimental-feature limbo for a while now. However (!), with a little push and some judicious scope cutting, we can get the latter stabilized. And then the remaining road to stable dynamic derivations should be very smooth.
It's a more complicated story to tell because the work is a series of yak shaves, but it's not actually a huge amount of work.
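For anyone who hasn't seen one, here is a minimal sketch of a floating content-addressed derivation (it assumes the ca-derivations experimental feature is enabled): the store path is computed from the build output rather than from the inputs, which is what makes early cutoff possible.

```nix
# Minimal sketch of a floating content-addressed derivation; requires the
# ca-derivations experimental feature to be enabled.
{ pkgs ? import <nixpkgs> { } }:
pkgs.runCommand "ca-example"
  {
    __contentAddressed = true;   # store path is derived from the output contents
    outputHashMode = "recursive";
    outputHashAlgo = "sha256";
  }
  ''
    echo "hello" > $out
  ''
```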
What other formats do you think are worth targeting?
That's because there's a target that depends on a generated source file for Nix's Bison parser. Other targets compile incrementally at a speed comparable to regular ninja, and so far we've observed that Nix's sandboxing overhead is negligible.
A long-term goal is that if you don't change headers and only change a single C/C++ file anywhere in the build graph for your entire system, you get a quick recompile of just that file plus a relink of only the executables/shared libraries that object file is linked into.
This will require boiling the ocean to get all packages' build systems using Ninja or something similar. But that's hopefully:
- less boiling than rewriting the whole world of open source in Bazel/Buck2.
- a far more incremental, crowd-source-able project, as you could convert packages one-by-one, starting with big builds like LLVM and Chromium as Edgar says.
Nix provides the toolchain and dependency management (like where boost comes from), as well as the ability to execute builds remotely, which makes it attractive for scaling out large builds (think `-j 999`). Nix also lets you do things like patch boost and then recompile both boost and its downstream dependents (incrementally and with early cutoff if using nix-ninja), all in one build graph.
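As a sketch of that workflow, an ordinary nixpkgs overlay like the one below patches boost, and everything downstream of boost in the same build graph is rebuilt against the patched version (the patch file name here is made up):

```nix
# Standard nixpkgs overlay; any package depending on boost is rebuilt
# against the patched version. The patch file name is hypothetical.
final: prev: {
  boost = prev.boost.overrideAttrs (old: {
    patches = (old.patches or [ ]) ++ [ ./fix-asio.patch ];
  });
}
```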
All in all, probably not useful if you don't already need features from Nix. But if you do, this should speed up your builds significantly.
Fewer features is probably good: make was originally just a dependency-tracking tool, but got abused as a full build system.
Any performance improvements, such as incremental compilation, incremental builds, interpreters, copy-on-write, or hot-code reloading, are always welcome.