Right now there is no language that is good at io-uring. There are ok offerings, but nothing really has modern async joy that works with uring.
Whoever hammers out a good solution here is going to have a massive leg up. Rust is amazing in so many ways, but it has been quite a brutal road trying to support io-uring well, and efforts are still a bit primitive, shall we say. If Zig can nail this down that would be fantastic!!
I would much rather Zig keep learning, keep changing, and keep making things new and better than have it try to appease those who are too conservative for the project, unwilling to accept change and improvement, people focused on stability. It takes a lot of learning to make really good systems, to play with fit and finish. Zig is doing the good work. Imo we ought to be thankful.
Hardly anything radical.
A language doing it today is doing it in the context of an ecosystem where even higher level languages exist and they have made the choice to target a lower level of abstraction.
And GPGPUs as well, like senders/receivers whose main sponsor is NVidia.
1. Standardise on a sync/async agnostic IO interface (or something similar) so you don’t get fragmentation in the ecosystem.
2. Stackless coroutines. These should give the most efficient async IO code, and efficient code is one of the reasons I want to use a low-level language.
If you want to compete with C, you can't do so without understanding that its stability, and developers' focus on mastering its practices, design, limitations, and tooling, have been among its major successes.
Is there any reason to be rushing it? Zig isn't languishing without activity. Big things are happening, and it's better in my opinion for them to get the big important stuff right early than it is to get something stable that is harder to change and improve later.
"Competing with C" means innovating, not striving to meet feature parity so it can be frozen in time. It's not as though C has anything terribly exciting going on with it. Let them cook.
> They are now available to tinker with, by constructing one’s application using std.Io.Evented. They should be considered experimental because there is important followup work to be done before they can be used reliably and robustly:
And then they proceed to list six important pieces of followup work to be done.
"should be considered experimental because there is important followup work to be done"
I guess not clear enough to some people.
I don’t see anything wrong with this, it’s kind of how Windows forces developers to use DLLs to access syscalls (the syscall numbers can change), which IMO is a good architectural decision.
(The green threads coro stack stuff makes this more important.)
[1]: https://github.com/ziglang/zig/issues/23475#issuecomment-279...
There's a lot of hate in these comments. Nobody is forcing you to use Zig and it's not trying to be "done" right now. And in fact, if the only thing they were focusing on was putting a bow on the project to call it "1.0", it probably wouldn't achieve any of its long-term goals of being a mainstream systems programming language. If it takes another five years or fifteen, as long as the project moves forward with the same energy, it's going to be fine.
For a fairly small project that's largely one dude, this is far more than most of us have or could hope to ever achieve ourselves. Give the people putting in the work credit where credit is due.
It might be because I've done it a few times now, and/or because of the existence of LLMs, but this is the most fun I've had doing it, and the most productive I've been, and the engine absolutely rips performance-wise.
Zig makes it very easy to do this kind of lowish-level data-oriented programming, and tbh, I'm hooked. I was using rust for my performance critical services but dancing around the strictness and verbosity of memory management in rust gives me nothing in comparison and just gets in my way. This is partially a skill issue, but life is short and I just want to make fast, well organized software that works.
That thing, right here, is probably going to be rewritten 5 times and what not.
If you are actively using Zig (for some reason?), I guess it's great news, but for the Grand Majority of the devs in here, it's like an announcement that it's raining in Kuldīga...
So m'yeah. I was following Zig for a while, but I just don't think I am going to see a 1.0 release in my lifetime.
And tbh, I take a 'living' language any day over a language that's ossified because of strict backward compatibility requirements. When updating a 3rd-party dependency to a new major version it's also expected that the code needs to be fixed (except in Zig those breaking changes are in the minor versions, but for 0.x that's also expected).
I actually hope that even after 1.x, Zig will have a strategy to keep the stdlib lean by aggressively removing deprecated interfaces (maybe via separate stdlib interface versions, e.g. `const std = @import("std/v1");`); those versions could be slim compatibility wrappers around a single core stdlib implementation.
Maybe you would, but >95% of serious projects wouldn't. The typical lifetime of a codebase intended for a lasting application is over 15 or 20 years (in industrial control or aerospace, where low-level languages are commonly used, codebases typically last for over 30 years), and while such changes are manageable early on, they become less so over time.
You say "strict" as if it were out of some kind of stubborn princple, where in fact backward compatibility is one of the things people who write "serious" software want most. Backward compatibility is so popular that at some point it's hard to find any feature that is in high-enough demand to justify breaking it. Even in established languages there's always a group of people who want somethng badly enough they don't mind breaking compatibility for it, but they're almost always a rather small minority. Furthermore, a good record of preserving compatibility in the past makes a language more attractive even for greenfield projects written by people who care about backward compatibility, who, in "serious" software, make up the majority. When you pick a language for such a project, the expectation of how the language will evolve over the next 20 years is a major concern on day one (a startup might not care, but most such software is not written by startups).
Either those applications are actively maintained, or they aren't. Part of that active maintenance is deciding whether to upgrade to a new compiler toolchain version (when in doubt, "never change a running system"); old compiler toolchains won't suddenly stop working.
FWIW, trying to build a 20 or 30 year old C or C++ application in a modern compiler also isn't exactly trivial, depending on the complexity of the code base (especially when there's UB lurking in the code, or the code depends on specific compiler bugs to be present - e.g. changing anything in a project setup always comes with risks attached).
Of course, but you want to make that as easy as you can. Compatibility is never binary (which is why I hate semantic versioning), but you should strive for the greatest compatibility for the greatest portion of users.
> FWIW, trying to build a 20 or 30 year old C or C++ application in a modern compiler also isn't exactly trivial
I know that well (especially for C++; in C the situation is somewhat different), and the backward compatibility of C++ compilers leaves much to be desired.
It’s not like Clojure or Common Lisp, where decades-old software still runs, mostly unmodified, the same today, with any changes mainly limited to code written for a different environment or even compiler implementation. This is largely because they take breaking user code way more seriously. A lot of code written in these languages seems to have a similar timelessness too. Software can be “done”.
No, not even remotely. ABI-stability in C++ means that C++ is stuck with suboptimal implementations of stdlib functions, whereas Rust only stabilizes the exposed interface without stabilizing implementation details.
> Unfortunately editions don't allow breaking changes in the standard library
Surprisingly, this isn't true in practice either. The only thing that Rust needs to guarantee here is that once a specific symbol is exported from the stdlib, that symbol needs to be exported forever. But this still gives an immense amount of flexibility. For example, a new edition could "remove" a deprecated function by completely disallowing any use of a given symbol, while still allowing code on an older edition to access that symbol. Likewise, it's possible to "swap out" a deprecated item for a new item by atomically moving the deprecated item to a new namespace and making the existing item an alias to that new location, then in the new edition you can change the alias to point to the new item instead while leaving the old item accessible (people are exploring this possibility for making non-poisoning mutexes the default in the next edition).
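A rough Rust sketch of that "alias swap" idea (module and type names here are made up for illustration, not the actual stdlib layout): the old symbol stays exported forever, and only what the well-known name points to would change in a new edition.

    // Hypothetical illustration of swapping a stdlib item across editions.
    mod sync_v1 {
        pub struct Mutex; // the old, "poisoning" design
    }

    mod sync_v2 {
        #[allow(dead_code)]
        pub struct Mutex; // a hypothetical non-poisoning replacement
    }

    // Today the familiar name is an alias to the old item; a future edition
    // could repoint this alias at `sync_v2::Mutex` while keeping
    // `sync_v1::Mutex` reachable for code on older editions.
    #[deprecated(note = "illustrative only; a new edition could repoint this alias")]
    pub type Mutex = sync_v1::Mutex;

    fn main() {
        #[allow(deprecated)]
        let _old: Mutex = sync_v1::Mutex; // old-edition code keeps compiling
    }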
One business domain that Rust currently doesn't have an answer for is selling commercial SDKs as binary libraries, and those are exactly the kind of customers who get pissed off when C and C++ compilers break ABIs.
Microsoft mentions this in the adoption issues they are having with Rust (see talks from Victor Ciura), and while they can work around this with DLLs and COM/WinRT, it isn't optimal; after all, Rust's safety gets reduced to the OS ABI for DLLs and COM.
Do you know one industry that likes very much tossing closed-source proprietary blobs over the wall?
Game studios, and everyone that works in the games industry providing tooling for AAA studios.
And you can still have it internally, if your deps have sources, or by compiling artifacts that only allow a single Rust version (additional rules may apply).
There is work on Rust ABI (crabi), but there isn't a huge push for it.
You know what else is common in the games industry? C# and NDA's.
C# means that game development is no longer a C/C++ monoculture, and if someone can make their engine or middleware usable with C# through an API shim, Native AOT, or some other integration, there are similar paths forward for using Rust, Zig, or whatever else.
NDA's means that making source available isn't as much of a concern. Quite a bit of the modern game development stack is actually source-available, especially when you're talking about game engines.
Thus it will never be even considered outside the tech bubble.
The ISO C++ standard is silent on what the ABI actually looks like; the ABI not being broken in most C and C++ compilers is a consequence of customers of those compilers not being happy about breakages.
It's a straitjacket that has application in a few select cases.
Things ABI prevents in C++:
- better shared_ptr
- adding UTF8 to regex
- int128_t standardisation
- make most of <cstring> constexpr
And so on: https://cor3ntin.github.io/posts/abi/
I get you might have particular criteria on this. But it's a feature that comes with huge, massive downsides.
In theory. In practice the standards committee, consisting of compiler vendors and some of their users, shape the standard, and thus the standard just so happens to conspire to avoid ABI breakages.
This is in part why Google bowed out of C++ standardization years ago.
The entire C language, C ABI, and standard library specs, combined, are probably fewer words than the Promise spec from ECMA-262.
A small language that stays consistent and predictable lets developers evolve it in best practices, patterns, design choices, tooling. C has achieved all that.
No evolving language has anywhere near that freedom.
I don't want an ever-evolving Zig either, for what it's worth. And I like Zig.
I don't think any single developer can resolve all of the design tensions a programming language has; you can't make it ergonomic on your own.
But a small, modern, stable C would still be welcome, besides Odin.
Not if you look into C23, include all the compiler extensions devs keep thinking are part of ISO C, and remember that the "C ABI" is really one ABI per existing OS written in C.
Zig is one of my favorite new languages, I really like the cross-compiler too. I'm not a regular user yet but I'm hopeful for its long-term success as a language and ecosystem. It's still early days, beta/dev level instability is expected, and even fundamental changes in design. I think community input and feedback can be particularly valuable at this stage.
If it’s a compiler frontend -> LLVM interaction bug, I think you are commenting in the wrong spot - it should go in a separate issue, not in the PR about the io_uring backend. Also, interaction bugs where a compiler frontend triggers a bug in LLVM aren’t uncommon, since Rust was the first major frontend other than clang to exercise those code paths. Indeed the (your?) fix in LLVM for this issue mentions Rust is impacted too.
I agree with the higher level points about stability and I don’t like Zig not being a safe language in this day and age, but I think your criticism about quality is a bit harsh if your source of this complaint is that they haven’t put a workaround for an LLVM bug.
True in general but in the cloud especially, saving server resources can make a significant impact on the bottom line. There are not nearly enough performance engineers who understand how to take inefficient systems and make improvements to move towards theoretical maximum efficiency. When the system is written in an inefficient language like Python or Node, fundamentally, you have no choice but to start to move the hotpath behind FFI and drop down to a systems language. At that point your choices are basically C, C++, Rust, or Zig. Of the four choices, Zig today is already simplest to learn, with fewer footguns, easier to work with, easier to read and write, and easier to test. And you're not going to write 100k LOC of optimized hotpath code. And when you understand the cost savings involved in reducing your compute needs by sometimes more than 90% by getting the hotpath optimized, you understand that there is very much indeed a business case to learning Zig today.
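As a rough sketch of that "hotpath behind FFI" pattern (Rust used here purely for illustration; the shape is the same for C, C++, or Zig, and the function is a made-up example): compile the kernel as a C-ABI library and call it from the Python/Node service via ctypes/FFI.

    // A made-up hot loop exposed over the C ABI so a Python/Node caller can
    // load it (e.g. via ctypes) instead of running the loop in the slow language.
    #[no_mangle]
    pub extern "C" fn dot_product(a: *const f64, b: *const f64, len: usize) -> f64 {
        // Safety: the caller promises `a` and `b` each point to `len` readable f64s.
        let (a, b) = unsafe {
            (
                std::slice::from_raw_parts(a, len),
                std::slice::from_raw_parts(b, len),
            )
        };
        a.iter().zip(b).map(|(x, y)| x * y).sum()
    }

    fn main() {
        // Native smoke test; over FFI you'd build this as a cdylib instead.
        let (a, b) = ([1.0_f64, 2.0, 3.0], [4.0_f64, 5.0, 6.0]);
        println!("{}", dot_product(a.as_ptr(), b.as_ptr(), a.len())); // 32
    }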
Personally, it is a huge pain to rewrite things and update dependencies because the code I am depending on is moving out from under me. I also found this to be a big problem in Rust.
And another huge upside is you have access to best of everything. As an example, I am heavily using fuzz testing and I can very easily use honggfuzz which is the best fuzzer according to all research I could find, and also according to my experience so far.
From this perspective, it doesn’t make sense to use zig over c for professional work. If I am writing a lot of code then I don’t want to rewrite it. If am writing a very small amount of code with no dependencies, then it doesn’t matter what I use and this is the only case where I think zig might make sense.
Real example: I had to wait some seconds to compile and run benchmarks for a library and it re-compiles instantly (<100ms) with c.
Zig does have a single compilation unit and that might have some advantages, but in practice it is a hard disadvantage. And I've never seen anyone point this out online.
For people like me building something from scratch, I would really recommend trying to learn C with the Modern C book and doing it in C.
With the exception of fewer footguns, where Rust definitely takes the cake and Zig comes in second, I'd say Zig is in last place in all of these. This really screams that you aren't aware of the C/C++ testing/tooling ecosystem.
I say this as a fan of Zig, by the way.
That's a very good point, actually. However...
> with fewer footguns
..the Crab People[0] would definitely quibble with that particular claim of yours.
[0] https://en.wikipedia.org/wiki/Crab_People of course.
I really see no advantage for Zig over Rust after you get past those first two weeks.
Zig is trying to get me instant compilation and I see that as a huge advantage for Zig (even past the first 2 weeks).
I'll probably stick with Rust as my "low level language" due to its safety, type system, maturity, library ecosystem, and career opportunities.
But I remain jealous of Zig's willingness to do extreme things to make compilation faster.
A full build was definitely much faster, but not as useful. Especially when using a build system with shared networked caching (Bazel for example).
Yes those projects were a bloated mess, as it always seems to be.
And avoid header libraries, C++ isn't a scripting language.
But I digress. I was thinking of Zig in comparison to C when I wrote that. I don't have a problem conceding that point, but I still believe the overall argument is correct to point to Zig specifically in the case of writing code to optimize a hotpath behind FFI; it is much easier to get to more optimal code and cross-compilation is easier to boot (i.e. to support Darwin/AppleSilicon for dev laptops, and both Linux/x64 and Linux/arm64 for cloud servers).
In theory no. In practice it really does.
> unsafe code is still unsafe
Ok, but most rust code is not unsafe while all zig code is unsafe.
> and the borrow checker and Rust's language complexity are their own kind of footguns
Please elaborate. They are something to learn but I don’t see the footgun. A footgun is a surprising defect that’s pointed at your foot and easy to trigger (i.e. you do something wrong and your foot blows off). I can’t think how the borrow checker causes that when it’s the exact opposite - you can’t ever create a footgun without doing unsafe because it won’t even compile.
> but I still believe the overall argument is correct to point to Zig specifically in the case of writing code to optimize a hotpath behind FFI; it is much easier to get to more optimal code and cross-compilation is easier to boot (i.e. to support Darwin/AppleSilicon for dev laptops, and both Linux/x64 and Linux/arm64 for cloud servers).
I agree cross compilation with zig is significantly easier, but Rust isn’t that hard, especially with the cross-rs crate making it significantly simpler. Performance-wise, Rust is going to be better - zig makes you choose between safety and performance, and even in unsafe mode there are various things that give Rust better codegen. For example, zig follows the C path of manual noalias annotations, which has been proven to be non-scalable and difficult to make operational. Rust does this for all references automatically because mutable aliasing isn’t allowed in the language.
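A small illustration of that aliasing point (the function below is made up): in C you'd need `restrict` on both pointers before the compiler may assume they don't overlap, whereas here the signature alone guarantees it.

    // Because `dst` is `&mut` and `src` is `&`, the compiler knows they cannot
    // overlap, so it can keep `src` values in registers and vectorize freely --
    // the equivalent of annotating both pointers `restrict` in C, but automatic.
    fn add_into(dst: &mut [f32], src: &[f32]) {
        for (d, s) in dst.iter_mut().zip(src) {
            *d += *s;
        }
    }

    fn main() {
        let mut dst = vec![1.0_f32; 4];
        let src = vec![2.0_f32; 4];
        add_into(&mut dst, &src);
        println!("{dst:?}"); // [3.0, 3.0, 3.0, 3.0]
    }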
Close, but not the way I think of a footgun. A footgun is code that was written in a naive way, looks correct, gets submitted, and then you find out it was erroneous. Good design makes it easy for people to do the right thing and difficult to do the wrong thing.
In Rust it is extremely easy to hit the borrow checker including for code which is otherwise safe and which you know is safe. You walk on eggshells around the borrow checker hoping that it won't fire and shoot you in the foot and force you to rewrite. It is not a runtime footgun, it is a devtime footgun.
Which, to be fair, is sometimes desired. When you have a 1M+ LOC codebase, dozens of junior engineers working on it, and requirements for memory safety and low latency. Fair enough trade-off in that case.
But in Zig, you can just call defer on a deinit function. Complexity is the eternal enemy, and this is just a much simpler approach. The price of that simplicity is that you need to behave like an adult, which if the codebase (hotpath optimization) is <1k LOC I think is eminently reasonable.
You’re contradicting yourself a bit here I think. Erroneous code generally won’t compile, whereas in Zig it will happily do so. Also, Zig has plenty of footguns (e.g. forgetting to call defer on a deinit, or misusing noalias, or having an out-of-bounds access result in memory corruption). IMHO the zig footgun story with respect to UB is largely unchanged relative to C/C++. It’s mildly better, but it’s closer to C/C++ than to being a safe language, and UB is a huge ass footgun in any moderate-complexity codebase.
The only major UB from C that zig doesn’t address is use after free afaik. How is that largely unchanged???
Just having an actual strong type system w/o the “billion dollar mistake” is a large change.
* Double free
* Out of bounds array access
* Dereferencing null pointers
* Misaligned pointer dereference
* Accessing uninitialized memory
* Signed integer overflow
* Accessing a union field for which the active tag is something else.
https://github.com/ityonemo/clr
(Btw: you can't null pointer dereference in zig without using the navigation operator which will panic on null; you can't misalign a pointer unless you use @alignCast which will also create a panic)
But also, there is no reason why it should have to be in the main compiler. I've architected it as a dlload plugin. It's even crazier! The output is a zig program which you must compile and run to get the final result.
Some of those tools are static, others are dynamic, some require a special build, others are hybrid, others exist on all modern IDEs.
So it can be a mix of lint, clang-tidy, VS analysis, CLion, ASan, UBSan, hardened runtimes, contracts (Frama-C), PVS, PurifyPlus, Insure++,....
If you believe I mischaracterized zig, please enlighten me as to what I got wrong specifically rather than attacking me ad hominem
Arguing about whether certain static analysis should be opt in or opt out is just extremely uninteresting. It’s not like folks are auditing the unsafe blocks in their dependencies anyways.
If you want to talk about actual type system issues that’s more interesting.
The proof is in the pudding. TigerBeetle, despite having a quite opinionated style, was still almost hit by UB and basically got lucky it wasn’t a worse failure. By contrast, even though unsafe isn’t audited for all dependencies, it does in practice seem to make UB extremely unlikely. And there’s ongoing work in the ecosystem to create safe abstractions that move existing unsafe into well-tested and centralized things
There is no point throwing it all away to get back to the starting line.
It has issues like panicking or segfaulting when using some data types (arrow array types) in the wrong place.
It is extremely difficult to write an arrow implementation in Rust.
It is much easier to do it in zig or c(without strict aliasing).
I also had the same experience with glommio in Rust.
Also the binary that we produce compiles in several minutes and is above 30mb. This is an insane amount of bloat. And unfortunately I don’t think there is another feasible way of doing this kind of work in rust, because it is so hard to write proper low level code.
I don’t agree with noalias being bad personally. I found it is the only way to do it. It is much harder to write code with pointers that implicitly alias, as C has by default and Rust has as the only option. And you don’t ever need to use noalias except in some rare places.
To make it clear, I mean the huge footgun in rust is producing a ton of bloat and subpar code because you can’t write much and you end up depending on too many libraries
Nothing is forcing you to do that other than it’s easy to add dependencies. I don’t see how zig is much different
Hashmap is a good example to this. I was able to fairly easily port some parts of hashbrown to c but I’m pretty sure I can’t write that code in Rust in a reasonable amount of time.
Yes, with almost complete lack of documentation and learning materials it is definitely the easiest language to learn.
I remember when learning Zig, the documentation for the language itself was extensive, complete, and easily greppable due to being all on one page.
The standard library was a lot less intuitive, but I suspect that has more to do with the amount of churn it's still going through.
The build system also needs more extensive documentation in the same way that the stdlib does, but it has a guide that got me reasonably far with what came out of the box.
Its Zig 0.15 effort started in August and was only complete in October (see first PR at https://github.com/ghostty-org/ghostty/pull/8372). And many issues were encountered and solved along the way. And of course during all of this they also encountered an issue in Zig itself: https://github.com/ziglang/zig/issues/24627
My main Zig project will need over 1,000 edits to get it up there :O I've already had Claude spec out the changes required. I'll just have it or Codex or whatever fork itself into one agent per file and bang on it (for the stuff I can't regex myself) ;)
But the IO thing is frankly a good idea. I/O dependency injection, when I've used it in the past, has made testing quite a bit simpler (such as routing any I/O stream into a string to assert on) and the code much easier to reason about. The extra argument is a bit annoying, but that's the price of purity and it's worth it.
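A tiny sketch of that style of I/O injection (Rust used here just for illustration; Zig's std.Io interface will look different): the stream is a parameter, so a test can capture the output in a buffer and assert on the string.

    use std::io::Write;

    // The function never names a concrete stream; callers inject one.
    fn greet(mut out: impl Write, name: &str) -> std::io::Result<()> {
        writeln!(out, "hello, {name}")
    }

    fn main() {
        // "Test": route the stream into an in-memory buffer and assert on it.
        let mut buf: Vec<u8> = Vec::new();
        greet(&mut buf, "zig").unwrap();
        assert_eq!(String::from_utf8(buf).unwrap(), "hello, zig\n");

        // "Production": same code, real stream.
        greet(std::io::stdout(), "world").unwrap();
    }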
It is such a readable language that I found it easier learning the API than most languages.
Zig has this on its side. Reading the unit tests directly from the code gives, most of the time, a good example too.
I hear great things about the language but only have so many hours in the day and so many usable neurons to spend in that day. Someday it would be nice to play with it.
The easiest way to embrace any new language is to have a compelling reason to use it. I've not hit that point yet.
This would translate as ~"eats pussy", where "broûter" is a verb reserved for animals feeding on grass, implying a hefty bush.
Though perhaps the Zig developers have promised this will not happen.
Lol, I’ll borrow this.
Kudos Zig contributors!
I am not understanding the point here. Do people expect them to ship 1.0 before they know it is good or ready?
No wonder software quality has deteriorated rapidly in the past 20 years.
LLMs are good at dealing with things they've seen before, not at novel things.
When novel things arise, you will either have to burn a shed ton of tokens on "reasoning", hand hold them (so you're doing advanced find and replace in this example, where you have to be incredibly precise and detailed about your language, to the point it might be quicker to just make the changes), or you have to wait until the next trained model that has seen the new pattern emerges, or quite often, all of the above.
Real-world example: Claude isn't familiar with the latest Zig, so I had it write a language guide for 0.15.2 (here: https://gist.github.com/pmarreck/44d95e869036027f9edf332ce9a...) which pointed out all the differences, and that's been extremely helpful in having me not even have to touch a line of code to do the updates.
On top of that, for any Zig dependency I pull in which is written to an earlier version, I have forked it and applied these updates correctly (or it has, under my guidance, really), 100% of the time.
On the off chance that guide is not in its context, it has seen the expected warning or error message, googled it, and done the correct correction 100% of the time. Which is exactly what a human would do.
Let's play the falsifiability game: Find me a real-world example of an upgrade to a newer API from the just-previous-to-that API that a modern LLM will fail to do correctly. Your choice of beer or coffee awaits you if you provide a link to it.
that’s why it sounds to me like these people commenting haven’t even used these models yet.
Tbh, while impressive that it appears to work, that guide looks very tailored to the Zig stdlib subset used in your projects and also looks like a lot more work than just fixing the errors manually ;) For a large code base which would amortise the cost of this guide I still wouldn't trust the automatic update without carefully reviewing each change.
I still wouldn't want to deal with that much churn in my language, but I fully believe an agent could handle the majority of, if not all of, the migration between versions.
People see the languages/libraries they use as their sellable articles. And why wouldn’t they? Every job application begins with a vetting of which “tools” you can operate. A language, as a tool, necessarily seeks to grow its applicability, thereby securing the ROI of its users.
And even when not tied to direct monetary incentives, it can still be tied to one’s ability to participate and influence the direction of various open source efforts.
Mix in barely informed decision makers, seeking to treat those engineers as interchangeable assets, and the admirable position being promoted above falls down the priority chain.
> You don't need to treat it like an identity.
This is an eternal problem in this industry and it is by far the most annoying thing about it.
It's another language stack that would need to be maintained within Linux distributions for years to come (security support, architecture support etc).
Upstream developers always seem to assume that there is no cost associated to introducing new software stacks. But in the end, someone has to maintain it. And they keep forgetting the purpose of software is to serve users, not developers.
And I'm not sure what's so revolutionary about Zig that couldn't have been solved by improving other languages.
For Zig in particular, the language isn't even stable enough that you can compile packages like Ghostty with any recent version of the Zig compiler. It has to be a very specific version of the compiler.
Personally I'm glad that there are more people trying to break out of the C tar pit. Even if I'd never chose to use the language.
Developers are the users of these software stacks though? I don't really understand your point.
I don't have any horse in the game, but I do think Zig is interesting. This remark is funny to me because it's literally one of the tenets the Zig devs make decisions by!
https://ziglang.org/documentation/master/#Zen
> * Together we serve the users.
And if the new software stack just improves a fraction of the ecosystem, it isn't worth the effort.
So what is your point?
> Why do people "worry" about Zig potentially never becoming mainstream?
There are people who want to learn Zig because they're excited to gradually transition their way away from some other ecosystem, and into the Zig ecosystem, as the Zig ecosystem takes root and develops.
But they don't want to regret that decision. They don't want to end up in a place where they're writing blog posts about how much they love Zig, and are the maintainer of five popular Zig libraries, and yet still feel forced to use C++ for their next actual app project (where they're then mostly unable to make use of those Zig libraries!) just because Zig, not being sufficiently "mainstream", can't attract/force the corporate owners of big fat libraries (Vulkan, CUDA, LLVM, etc) to invest effort into integrating with it, and continuously maintaining those integrations for it (i.e. including a Zig build in their CI matrix, so that their upstream changes can't silently break that integration.)
Rust will also never replace C or C++ in any meaningful way, at best new code gets written in new languages (and Rust being only one among many, and among languages used for new projects will also be C and C++, just maybe not that often).
I think the era of 'pop star languages' is over, the programming language future is highly diverse (and that's a good thing).
Not only do I disagree it never will, I think it's already well on its way to doing exactly that.
C++ on the other hand? Possibly, though I think that it's just as much because of the own-goals of the C++ standards committee as it is the successes of Rust. I don't really consider Zig a competitor in this space because if you're reaching for C++, you are reaching for a level of abstraction that Zig is unwilling to provide.
This is ironic since these two crowds are mostly solving the same type of problems. It's just democrats vs republicans type of split, some of it is just for show and philosophical.
This is a painfully shallow framing.
Yes, programming languages solve problems by emitting instructions that a programmable logic chip can use to perform calculations on input, resulting in output. And the scaffolding you use to get there isn't just a matter of philosophical show. Rust, as a first-order decision, will refuse to emit perfectly valid programs because it's unable to prove their correctness. Zig will emit any program it has enough information to emit. People coding in Rust offload much of the effort of understanding and proving that correctness to the compiler. In Zig that relationship is reversed: the compiler offloads that responsibility to the programmer.
The person you responded to is correct, for some people: Rust solves the difficult and annoying problems; for others it creates difficult and annoying problems.
Some people like creating art, some people like creating software. I guess you could frame that as philosophical, but to call it a political show betrays ignorance of the interactions between systems and the predispositions of individuals.
Interestingly, Carbon is kinda trying to tackle both at the same time (though starting from C++ in their case) which is a bit of a challenge.
> Carbon: graduating from the experiment
https://ndctoronto.com/agenda/carbon-graduating-from-the-exp...
As for it being widely adopted, people keep missing the point that Carbon is mostly for Google themselves, as a means to integrate into existing C++ projects.
They are the very first ones to assert that for greenfield projects there are already plenty of safe languages to choose from.
In case you are well familiar with, for instance, pattern matching: do you have any opinions on the pattern matching that is currently proposed for Carbon?
https://docs.carbon-lang.dev/docs/design/pattern_matching.ht...
Regarding the linked pattern matching proposal, it seems alright to me, not everything has to be ML like.
match (0, 1, 2) {
case (F(), 0, G()) => ...
}
> Here (F(), 0, G()) is not an expression, but three separate expressions in a tuple pattern. As a result, this code will call F() but not G(), because the mismatch between the middle tuple elements will cause pattern matching to fail before reaching G(). Other than this short-circuiting behavior, a tuple pattern of expression patterns behaves the same as if it were a single expression pattern.

How would that work with exhaustiveness checking? As far as I can tell, they themselves believe that Carbon's exhaustiveness checking will be very poor.
And OK with implicit conversions? Especially when combined with their way of handling templates for pattern matching?
https://tomas-svojanovsky.medium.com/mitchell-hashimoto-go-a...
https://www.youtube.com/watch?v=dJ5-41u-e7k
https://weeklyrust.substack.com/p/why-roc-is-moving-away-fro...
Perhaps there is room for both... via C FFI interop, of course, lol
(C FFI will probably long outlast C itself...)
Those are the companies I care about when choosing which programming languages to invest my time in.
Both undoubtedly are talented programmers, but you overestimate the impact and importance of these projects.
GitHub stars and HN posts are not a very good indicator of what happens in the real world
Rust has definitely gained some ground, while there are hardly any relevant products using Zig.
I mean, less is true, but “hardly” is doing a lot of work there
The other tailwind for Zig is that it’s easier than ever to translate an existing codebase with tests into a new language with AI.
That requires literally rethinking every language and standard library facility and asking "is this putting up artificial roadblocks or even invoking straight UB when one tries to use it idiomatically in unsafe contexts?", then coming up with more flexible, more "C like" facilities in that case. It's hard work but quite doable.
In other words, the Rust approach to safety is to make as few unsafe LoC as possible, and the Zig approach is to make every unsafe line as safe as possible. As long as their philosophical approach to safety is such that Zig makes writing unsafe code easy and Rust avoids it as much as possible, then writing unsafe code in Zig will always be easier.
This has a big effect on unsafe code. When unsafe code gets called indirectly from safe code, the unsafe code has to make sure that whatever the safe code does, the result is still safe. This requires very careful design of the interface provided by the unsafe code.
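The classic small example of that interface discipline (a made-up function, not anything from this thread): the safe wrapper must establish the invariant the unsafe code relies on, no matter how the safe caller uses it.

    // Safe interface over an unchecked access: whatever the (safe) caller passes,
    // the invariant "index is in bounds" is checked before the unsafe code runs.
    fn checked_get(v: &[u8], i: usize) -> Option<u8> {
        if i < v.len() {
            // Safety: `i < v.len()` was just verified above.
            Some(unsafe { *v.get_unchecked(i) })
        } else {
            None
        }
    }

    fn main() {
        let v = [10u8, 20, 30];
        assert_eq!(checked_get(&v, 1), Some(20));
        assert_eq!(checked_get(&v, 9), None);
    }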
I think it is a research question whether Zig would make writing Rust unsafe code a lot easier. In particular the boundary between safe Rust and unsafe code written in Zig could become very tricky. Possibly tricky enough that it would be as complex to write as unsafe Rust today.
For instance, any &mut reference in Rust is assumed not to be aliased, and any & reference not involving UnsafeCell<...> is assumed not to be written to. These implied contracts can be loosened, e.g. by using &Cell<...> if applicable (may alias, can be read or written safely, but only "as a whole object": access to the internals does not escape beyond any single operation), which is arguably closer to idiomatic C.
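For example (a made-up function), &Cell<i32> arguments may freely alias and both be written through, which is much closer to how plain pointers behave in idiomatic C:

    use std::cell::Cell;

    // Both parameters may point at the same object; writes go through either one.
    fn bump_both(a: &Cell<i32>, b: &Cell<i32>) {
        a.set(a.get() + 1);
        b.set(b.get() + 1);
    }

    fn main() {
        let x = Cell::new(0);
        bump_both(&x, &x); // aliasing on purpose: perfectly fine with Cell
        println!("{}", x.get()); // 2
    }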
MaybeUninit<> is another common example: C and Zig code often works with possibly-uninitialized data, but this possibility has to be accounted for explicitly in a safe Rust interface. It's always insta-UB if a safe Rust function is passed uninitialized data, even when the equivalent would work idiomatically in C.
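A sketch of what that looks like in practice (the function name is made up): a C-style "callee fills this buffer" interface has to carry the possibly-uninitialized state in the type rather than handing out a plain &mut [u8].

    use std::mem::MaybeUninit;

    // The uninitialized state is explicit in the signature; only after every slot
    // has been written is it sound to view the storage as initialized bytes.
    fn fill_with(buf: &mut [MaybeUninit<u8>], byte: u8) -> &mut [u8] {
        for slot in buf.iter_mut() {
            slot.write(byte);
        }
        // Safety: the loop above initialized every element of `buf`.
        unsafe { std::slice::from_raw_parts_mut(buf.as_mut_ptr().cast::<u8>(), buf.len()) }
    }

    fn main() {
        let mut storage = [MaybeUninit::<u8>::uninit(); 8];
        let filled = fill_with(&mut storage, 0xAB);
        println!("{filled:?}");
    }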
Though I can imagine that unsafe Rust still has to follow many of safe Rust's rules. So there could be a better unsafe language that has fewer restrictions.
> If anything, the Rust language would benefit from adding as much friction to unsafe code as possible
The friction is there already. I'm not advocating for getting rid of explicit 'unsafe' markings, 'SAFETY' comments or even clunky boilerplate, just for closing a very real gap in capability.
Maybe it's just C++ teams being conservative, but a lot of them really seem to hate Rust when you talk with them for whatever reason. I can't imagine why though, but then I've only ever worked with C when I had to, and I have never worked with C++. From having played around with both C++ and Rust, I'd personally pick Rust, but I'm sure it's because I don't know enough. Either way I doubt I'll ever see Rust in a real world job in my corner of the world.
Especially because it's not a drop-in replacement for C++, as Zig is for C.
So when Zig hits 1.0 these companies will probably consider Zig much more than they do today. Understandably.
Secondly, let us know when Amazon rewrites Firecracker in Zig, Android replaces Rust with Zig, or Mark Russinovich goes to some Zig conference explaining why Azure is dropping Rust for Zig.
"Mark Russinovich, Microsoft Azure CTO tells Rust Nation UK 2025 why Azure is moving to Rust from C++"
But in the video he says they are starting with critical systems first.
Emphasis on starting.
So yeah, not as pervasive as you, the evangelists or that video title implies.
"Microsoft is Getting Rusty: A Review of Successes and Challenges - Mark Russinovich"
https://www.youtube.com/watch?v=1VgptLwP588&t=1903s
The only Microsoft divisions that are still all-in on C++ are Windows and Xbox, and even there, C++/WinRT is now in maintenance, the team has moved on to windows-rs, and the WDK now has Rust bindings available.
Given that Rust is quite an old language now and its adoption is still so low, I don't think that should be much of a worry, although that doesn't mean Zig will be the option of choice, and not stabilising is certainly not a good sign. At Rust's adoption rate, a language that hasn't been invented yet and that would show a more normal rate of adoption for a popular language could easily overtake it.
> There is also the issue of will people actually code by then.
Now that could be a bigger issue. :)
So if being part of 3 major OSes (Windows, Android and now Linux), the big 3 cloud providers having SDKs for the language, being used by so much tooling (JS + Python), and being part of major internet infrastructure counts as “slow” adoption, then wow…
There's no denying Rust's popularity in open-source CLI dev tools for Python and JS/TS, but when you talk to C/C++ shops who've evaluated Rust and see how many of them end up using it (and to what extent) you see it's not like it's been with languages that ended up achieving real popularity (which includes not only super-popular languages like C, C++, and Java, but also mid-popular languages like Go).
So even though C++ is the language I reach for outside Java, C#, TypeScript, I would assert that downplaying Rust adoption by Amazon, Adobe, Microsoft, Google, is losing track where things are going.
> that when the time gets to be rewritten it won't surely be C++.
It looks like it won't be Rust, either. I mean, some may be written in Rust, but not the majority. My point is just that as much as some erstwhile Haskell fans have taken to Rust, comparing Rust's adoption to Haskell's - a language whose joke motto was "avoid success at all costs" - is not the right metric. Given that Rust's goal was to replace C++, its success should be compared to C++ and other languages that ended up achieving similar success. I'm saying that compared to them Rust's success has been mediocre, if that, and it's not a young language anymore by any stretch of the imagination.
So many language designers would dream to have such adoption numbers by tech giants for their hobby language.
All major cloud vendors deploying my Java, .NET and nodejs containers do so, in infrastructure that has various layers of Rust code in it.
To value that as mediocre is quite strange.
C++ came out in 1985 and competed with C, COBOL, Pascal and FORTRAN. It was an overall improvement over those, and therefore there was a legit reason for it to take off.
> how many of them end up using it (and to what extent) you see it's not like it's been with languages that ended up achieving real popularity
I assume many places that have a huge codebase in C++ wouldn't just do a port to Rust. That would almost always cause problems but for greenfield projects it's a no brainer IMO.
Of course. The rate of adoption is related to the increase in value compared to the status quo, much like how genes spread. But Rust's adoption is slow precisely because its "fitness benefit" is low.
> That would almost always cause problems but for greenfield projects it's a no brainer IMO.
It would have been a no brainer if, when writing a new codebase expected to last 20 years or more (which is often the case with software written in low-level languages), you'd believe the chosen language to be very popular over the next few decades. But given its slow adoption compared to languages that ended up achieving that status, despite its rather old age, it's not looking like a safe bet, which is why Rust's adoption for important greenfield projects is also low (again, relative to other languages).
No, this completely overestimates how quickly languages gain prominence.
C came out in 1972 and didn't gain its current dominance until approximately the release of the ANSI C spec in 1989/1990, after 17 years.
C++ came out in 1985 and didn't become the dominant language for gamedev until the late 90s (after it had its business-language-logic niche completely eaten by Java), after 14 years or so.
Python came out in 1991 and labored as an obscure Perl alternative until the mid-late 2000s, after about 16 years (we can carbon-date the moment of its popularity by looking at when https://xkcd.com/353/ was released).
Javascript came out in 1995 and was treated as a joke and/or afterthought in the broader programming discourse until Node.js came out in 2009, after 14 years.
Rust is currently 11 years old, and it's doing quite excellently for its age.
While it kept growing in popularity later, by 1983-5 C was already one of the top programming languages in the world.
> C++ came out in 1985 and didn't become the dominant language for gamedev until the late 90s
Major parts of Windows and Office were being written in C++ in the early-mid 90s, before C++ turned 10. Visual C++, one of Microsoft's flagship development products, came out in 1993. Huge mission-critical, long-term, industrial and defence projects were being written in C++ during or before 1995 (I was working on such a project).
> Python came out in 1991 and labored as an obscure Perl alternative until the mid-late 2000s
Even in 2002 Python was widespread as a scripting language. But it is, indeed, the best and possibly only example of a late bloomer language.
> Javascript came out in 1995 and was treated as a joke and/or afterthought in the broader programming discourse until Node.js came out in 2009
AJAX (popularised by Gmail) pretty much revolutionised the web in 2004. When jQuery came out in 2006, JS was all over the place.
Major parts of Windows, Android, and Linux were being written in Rust before it turned 10. Major parts of AWS were being written in Rust before it turned 4. Major parts of Dropbox were being written in Rust before it turned 1. So you agree by your own criteria that Rust is a major language?
> While it kept growing in popularity later, by 1983-5 C was already one of the top programming langugages in the world.
In the mid-80s, C still had plenty of major and healthy competitors, as pjmlp will imminently arrive to remind you. By the criteria of mid-80s C, Rust is already one of the top programming languages in the world.
> AJAX (popularised by Gmail) pretty much revolutionised the web in 2004. When jQuery came out in 2006, JS was all over the place.
No, despite the existence and availability of freestanding interpreters (e.g. Rhino), Javascript was an also-ran everywhere except the web; which is to say, nobody was choosing to use Javascript except the people forced at gunpoint to use it. There are infinitely more people choosing to use Rust at the age of 11 than were choosing to use Javascript at the age of 11, which means that, once again, by your criteria, you must consider Rust a major language. You can just admit it instead of being a tsundere.
No, because I was merely responding to your specific points and the extent is nowhere near the same. C++ had a huge market share before it turned 10. In the late nineties I was working on a critical air traffic control system, first used in 1995, written half in Ada half in C++. Around 1995-6 C++ was already ubiquitous in serious software.
> In the mid-80s, C still had plenty of major and healthy competitors, as pjmlp will imminently arrive to remind you.
He doesn't need to remind me because I was there. Yes, there were major competitors, yet C was already very near the top.
> Javascript was an also-ran everywhere except the web
That's pretty much where it is today, yet it's still #1 (if we count TS as part of JS).
> There are infinitely more people choosing to use Rust at the age of 11 than were choosing to use Javascript at the age of 11
The number of people using Rust professionally is a fraction of that of any of the languages we mentioned, plus many more (C#, PHP, Ruby, Go, Kotlin) at the same age. It's more similar in size to Ada's adoption at that age (of course, there weren't many small OSS projects then at all, let alone written in Ada, but there were many more big, mission- and even safety-critical systems being written in Ada at that age than in Rust). Is it a serious language? Absolutely! But so far its trajectory doesn't resemble that of any language that's achieved wide popularity.
And see which compiler toolchains had more ads, articles submissions, or source code listings.
What's the conversion ratio of 1 article in Dr.Dobbs to YT video or Twitter post by a dev-celebrity?
Pretty bad situation for Rust.
> the rate of release of projects written in that language
I guess back in 90s they didn't do grep clones every weekend, so Rust wins here definitely.
> the rate of release of books and learning materials
And what this data tells us?
> the rate at which universities begin teaching the language
Pretty bad situation for Rust.
> the volume of discourse devoted to that language in magazines and the online venues which did exist (e.g. Usenet)
But you need to normalize this volume against total volume of content out there produced every second, and then situation becomes complicated.
> crucially the declining metrics of all of the above for the direct competitors of that language
Why ignore job offerings?
It's true that the total market share for low level languages (C + C++ + Rust + Zig + others) continues declining, as it has for a couple of decades now (that may change if coding agents start writing everything in C [1] but it's not happening yet), but that's all the more reason to find some "safe bet" within that diminishing total market. Rust's modest success is enough for some, but clearly not enough for many others to be seen as a safe bet.
GitHub's survey did not say much about Rust I think, despite Rust projects often having lots of stars. Rust projects might have a greater ratio of stars to popularity than projects in other languages, though.
StackOverflow's survey was much more optimistic or indicated popularity for Rust.
Redmonk places Rust at place 19th.
https://www.devjobsscanner.com/blog/top-8-most-demanded-prog...
https://uk.indeed.com/career-advice/career-development/codin...
https://www.itransition.com/developers/in-demand-programming...
https://www.hackerrank.com/blog/top-developer-skills-in-2025...
Viewing these numbers through an optimistic or pessimistic lens is a matter of perspective and, of course, no one knows the future. But when you compare Rust, which is a middle-aged language now, to how languages that ended up "making it" were at the same age, the comparison is not favourable.
Zig OTOH is clearly, to me at least (opinion alert), a "better C". It even compiles C!
I expect LLMs to be really good at converting C to Zig.
> There is also the issue of will people actually code by then.
LLMs don't take responsibility. So even if code is generated, a human will have to assess it. I think assessing Zig is easier than assessing C, which gives this language a selling point that holds out in the AI assisted programming future.
Or should I say, I've not written a single line of Zig because I've been managing AI's coding in Zig.
Turns out Zig is a fantastic language to "centaur" on. (Reference is to "centaur chess" and which is also sort of becoming a term indicating close code design cooperation with an LLM.)
All of that C code that the LLM trained on ends up helping it work out algorithms in Zig, except that Zig has waaaay more safety guarantees and checks. And is often faster compiled code than the equivalent C, which is astonishing.
The same can be said about Zig's comptime. It's entirely unlike anything C, C++ or Rust has to offer.
> I expect LLMs to be really good at converting C to Zig.
While it's possible to translate C to Zig code - and you don't need an LLM for that, it's a Zig compiler/build-system feature - the result will be quite different from a project that's developed in Zig from the ground up since the translation output wouldn't make use of Zig's unique features (and Zig isn't really unique as 'C translation target', C can also be translated to unsafe Rust, or even to Javascript - see early Emscripten versions).
Also, the 'C compatibility' of Zig is implemented via a separate compiler frontend, Rust toolchains could do exactly the same thing by integrating the Clang frontend into the Rust compiler.
OTOH going from C++ (OO) to Rust (not OO, borrow checker) is a big leap.