"But you can use 3rd party repositories!" Yeah, and I also can just download the library from its author's site. I mean, if I trust them enough to run their library, why do I need opinionated middle-men?
These are very, very common problems; not edge cases.
Put another way: y'all know we got all these other package management/containerization/isolation systems in large part because people tried the C-library-install-by-hand/system-package-all-the-things approaches and found them severely lacking, right? CPAN was considered a godsend for a reason. NPM, for all its hilarious failings, even more so.
Honestly? Over the course of my career, I've only rarely encountered these sorts of problems. When I have, they've come from poorly engineered libraries anyway.
That risk/QA load can be worth it, but it isn't always. For an OS, it helps to be able to upgrade SSL (for instance).
In my use cases, all this is a strong net negative. npm-based projects randomly break when new "compatible" versions of libraries get installed for new devs. C/C++ projects don't build because of include/lib path issues, or because some specific version isn't installed, or who knows what.
If I need you to install the SDL 2.3.whatever libraries exactly, or use react 16.8.whatever to be sure the app runs, what's the point of using a complex system that will almost certainly ensure you have the wrong version? Just check it in: either pin an explicit version or commit the library's code and build it yourself.
1. The accepted solution to what you're describing, in terms of development, is passing appropriate flags to `./configure` specifying the paths of the alternative library versions you want to use, e.g. something like `./configure CPPFLAGS=-I/opt/foo/include LDFLAGS=-L/opt/foo/lib`. It's as simple as it gets.
As for where to get those libraries when the distro doesn't provide the right version: `./configure` is basically a script. Nothing's stopping you from printing a couple of FTP mirrors in its output to use as wget targets.
2. As for the problem of distributing binaries along with up-to-date libraries, the appropriate solution is a distro package manager. A C package manager wouldn't come into this equation at all, unless you wanted to compile from scratch to account for your specific circumstances, in which case, goto 1.
I primarily write C nowadays to regain sanity from doing my day job, and the fact that there is zero bit rot and no setup/fixing/fiddling to get things running is in stark contrast to the horrors I have to deal with professionally.
The idea of a protocol for “what artifacts in what languages does $thing depend on and how will it find them?” as discussed in the article would be incredibly powerful…IFF it were adopted widely enough to become a real standard.
I think you're going to need to know that either way if you want to run a dynamically linked binary using a library provided by the OS. A package manager (for example Cargo) isn't going to help here because you haven't vendored the library.
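To make "using a library provided by the OS" concrete: at runtime the loader resolves a soname against whatever the distro shipped, nothing more. Here's a minimal sketch that asks for the distro's OpenSSL by soname and prints its version; the soname `libssl.so.3` and the `OpenSSL_version_num` entry point match OpenSSL 3.x, so adjust both if your distro ships something else:

    #include <stdio.h>
    #include <dlfcn.h>

    int main(void) {
        /* Resolve OpenSSL by soname against whatever the distro shipped. */
        void *handle = dlopen("libssl.so.3", RTLD_NOW);
        if (!handle) {
            fprintf(stderr, "loader says: %s\n", dlerror());
            return 1;
        }
        /* OpenSSL_version_num() reports the runtime version as a number. */
        unsigned long (*version_num)(void) =
            (unsigned long (*)(void))dlsym(handle, "OpenSSL_version_num");
        if (version_num)
            printf("distro OpenSSL: %#lx\n", version_num());
        dlclose(handle);
        return 0;
    }

Build with `cc probe.c` (add `-ldl` on glibc older than 2.34). Notice there's no version negotiation anywhere in that path; you get exactly the one libssl the distro chose, which is why you still need to know its package name.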
To match the npm or pip model you'd go with nix or guix or cmake and you'd vendor everything and the user would be expected to build from scratch locally.
Alternatively you could avoid having to think about distro package managers by distributing with something like flatpak. That way you only need to figure out the name of the libssl package the one time.
Really, issues shouldn't arise unless you try to use a library that doesn't have a sane build system. You go to vendor it and it's a headache to integrate. I guess there are probably more of those in the C world than elsewhere, but you could maybe just try not using them?
I'm not very familiar with MySQL, but for C (which is what we're talking about here) I typed mysql here and it gave me a bunch of suggestions: https://packages.debian.org/search?suite=default&section=all... Debian doesn't ship binary blobs, so I guess that's not a problem.
"I have to build something on 10 different distros" is not actually a problem that many people have.
Also, let the distros package your software. If you're not doing that, or if you're working against the distros, then you're storing up trouble.
GLIBC_2.38 not found
If you're supplying your own binaries and not compiling/linking them against the distro-supplied glibc, that's on you.
But that's not the point I'm making. I'm attacking the idea that they're "working just fine" when the above is a bug that nearly everyone hits in the wild, both as a user and as a developer shipping software on Linux. It's not the only one caused by the model, but it's certainly one of the most common.
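For anyone who hasn't dug into it: that error is glibc symbol versioning. Every versioned symbol your binary imports gets bound to a version tag from the glibc it was linked against (`objdump -T ./yourbinary | grep GLIBC` shows the tags), and an older runtime glibc that doesn't define a tag refuses to load the binary. As a rough sketch of the mechanism, not a recommendation, you can pin a reference to an older tag by hand; `memcpy@GLIBC_2.2.5` is the classic x86-64 example, but check what your own glibc actually exports before borrowing it:

    #include <string.h>

    /* Bind our memcpy references to the x86-64 baseline tag instead of the
       newest tag on the build machine.  Compile with -fno-builtin-memcpy so
       GCC doesn't inline the call away; other arches use different tags. */
    __asm__(".symver memcpy, memcpy@GLIBC_2.2.5");

    int main(void) {
        char dst[16];
        memcpy(dst, "hello", 6);  /* resolved as memcpy@GLIBC_2.2.5 */
        return dst[0] == 'h' ? 0 : 1;
    }

Doing that for every symbol is exactly the kind of thing nobody should be doing by hand, which is why the boring fix is to link on the oldest distro you intend to support.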
Plus, we already have great C package management. It's called CMake.
That would be fine if it only affected that first layer, a basic library and a basic app, but this kind of habit stacks up layer upon layer, and then ends up baked into multiple layers of software used by many people.
Not sure that I would go so far as to suggest these kinds of languages with runaway dependency cultures shouldn't exist, but I will go so far as to say any languages that don't already have that culture need to be preserved with respect like uncontacted tribes in the Amazon. You aren't just managing a language, you are also managing process and mind. Some seemingly inefficient and seemingly less powerful processes and ways of thinking have value that isn't always immediately obvious to people.
I am not sure if it is just me, but I seem to constantly run into broken vcpkg packages with bad security patches that keep them from compiling, cmake scripts that can't find the binaries, missing headers and other fun issues.
Avoid at all cost.
And so when it comes to dynamic dependencies (including shared libraries) that aren't resolved until runtime, you hit language-level constraints. With C libraries the problem isn't merely that distribution packagers chose to support a single version of each dependency because it's easy; it's that the loader (provided by your C toolchain) isn't designed to support anything else.
And if you've ever dug into the guts of glibc's loader, it's 40 years of unreadable cruft. If you want to take a shot at the C-shaped hole, start there: decouple it from the toolchain and add support for multiple-version resolution and the other basic features you'd expect of module resolution in 2026.
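To see how little the stock loader gives you to work with, here's a sketch of the closest it gets to multiple-version resolution today: dlopen each version with RTLD_LOCAL so their symbols stay out of the global namespace, and fish entry points out by hand. The two libcurl sonames are only an assumed stand-in for "two installed versions of one library"; the trick collapses as soon as the versions pull in conflicting transitive dependencies:

    #include <stdio.h>
    #include <dlfcn.h>

    /* Load one version of libcurl privately and ask it for its version
       string.  RTLD_LOCAL keeps its symbols out of the global namespace,
       so two versions can coexist in one process -- barely. */
    static const char *version_of(const char *soname) {
        void *h = dlopen(soname, RTLD_NOW | RTLD_LOCAL);
        if (!h) return dlerror();
        const char *(*ver)(void) =
            (const char *(*)(void))dlsym(h, "curl_version");
        return ver ? ver() : "no curl_version symbol";
        /* handles deliberately leaked; dlclose would unmap the strings */
    }

    int main(void) {
        printf("old: %s\n", version_of("libcurl.so.3"));
        printf("new: %s\n", version_of("libcurl.so.4"));
        return 0;
    }

glibc also has dlmopen() for fully separate link namespaces, but it's sharp-edged enough that nobody builds real version resolution on it. Either way the selection logic ends up in every application instead of in the loader, which is exactly the hole.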
You meant: it's 40 years of debugged and hardened run-everywhere never-fails code, I suppose.
I've written two C package managers in my life. The most recent one is mildly better than the first from a decade ago, but still not quite right. If I ever build one I think is good enough I'll share it, only to most likely learn about 50 edge cases I didn't think of :)