Nim fixes many of the issues I had with Python. First, I can now make games with Nim because it's super fast and easily interfaces with all of the high-performance OS and graphics APIs. Second, typos no longer crash in production, because the compiler checks everything: if it compiles, it runs. Finally, refactors are easy, because the compiler practically guides you through them. The cross-compiling story is great: you can compile to JS on the front end. You can use PyTorch and numpy from Nim. You can write CUDA kernels in Nim. It can do everything.
See: https://www.reddit.com/r/RedditEng/comments/yvbt4h/why_i_enj...
It heavily encourages namespace pollution and `import *`, making it very hard to figure out where a given function is coming from and hence what it does.
Has the LSP situation improved yet? Similar issue with Crystal lang, which I enjoy even more than Nim.
Unfortunately it may not improve until Nim 3, based on the Nimony rewrite, comes out. It supports incremental compilation and caching, which will make the underlying LSP tooling much better.
However I find with Cursor that I don’t mind so much the missing autocomplete. I’ve actually thought about writing an alternative LSP just for quick find and error inlining…
But LSP as a major concern? For me these little helpers are useful to catch small typos but I could happily do without them.
I can work without an LSP, but when I'm searching for a new language that would be used by a team (including Junior devs) it's hard to justify something missing the basics of good DX. I haven't tried it with Cursor though, it might be less of a dealbreaker at this point.
You can do it with just rg or something similar, but it will give you many false positives and you are going to waste quite some time.
Gentle correction: Python is typed now too and you can get the benefits of typing both in your IDE (via LSP) and before deploying to production (via mypy and the like). This happens both by type inference as well as explicit type annotations.
Not to mention, if a library does not or does sloppily use type annotations, you would not get reliability even with a perfect type checker.
I write 95% of my code in Python, I love it, it's my go-to language for many things. But for a huge code base which I don't master, the lack of a type system makes me more than nervous.
"Can be typed now" is somewhat correct.
"Can be partially typed now" is actually correct.
PHP even seems to have decent static types these days, but do my coworkers use them? Hell no.
No it's not and you should lose your union card for lying like this.
I guess a lot of languages are kind of fungible. If you want a fast, cross platform, GC-based OOP language, the truth is, there are many choices. I'm not saying they are the same, but for 80% of the use cases they kind of are, and there are always good reasons to use established languages rather than new ones.
The ones that make it offer something very unique, not merely better than peers. So Rust, as a memory-safe non-GC language, has a clear use case with little or no competition.
Nim doesn't have this luxury. I wish it well, I like the language and often contemplated learning it properly. But I fear the odds are against it.
I program a lot in Nim, including professionally, and strongly prefer it over Rust or even Zig.
Primarily because I just really enjoy programming in Nim and getting things done that I otherwise wouldn't have, or wouldn't have been capable of.
For example recently I needed to automate a GUI app. I tried the Python libraries but found they kinda sucked. Despite pulling in opencv they were slow at finding buttons in a screenshot. Then the one I tried also broke on hidpi displays.
Instead I got Claude to write me up a Nim library to find images in a screenshot. Then had Claude add SIMD to it.
It’s far faster than the python libraries, supports hidpi, and is far easier to install and use. I still use a small Python app as a server to take the screenshots but it’s a nice balance.
> I guess a lot of languages are kind of fungible. If you want a fast, cross platform, GC-based OOP language, the truth is, there are many choices.
It’s true, in many cases they are fungible. Though much less so for languages which compile to native code. LLMs do lower the barrier to switching.
Nim isn’t really a GC’ed OOP language though it supports some bits of that.
It’s really a systems language that can also run anywhere from an embedded device to a web server and as a JavaScript app.
The new default memory management is based on reference counting, with or without a cycle collector. So it's great for latency-sensitive settings.
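For reference, the mode is just a compiler switch (`app.nim` here is a stand-in filename); ORC is reference counting plus a cycle collector and is the default in recent Nim, while ARC drops the collector:

    nim c --mm:orc app.nim   # RC + cycle collector (the default)
    nim c --mm:arc app.nim   # plain RC, no collector, for latency-sensitive code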
To my mind, to the contrary :( LLMs get trained on corpora of existing texts, including source code. They become much better at massively used languages because of the plethora of training data, and struggle with more niche languages because of the dearth of good examples. It's a positive feedback loop for mainstream languages, because more stuff gets vibe-coded, or code-assisted when the AI does a better job at it. It's a negative feedback loop for niche languages because the opportunity cost of choosing them grows :(
On top of that I've been able to "vibe" code a couple of different libraries for Nim. The bigger limits with LLMs are still going "deeper" and losing track of what they're doing.
It helps that Nim doesn't have many unique syntax requirements or paradigms like, say, Rust lifetime annotations. Some of the rarer Nim idioms, like `set[Enum]` being castable to a cint, aren't used.
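To illustrate that idiom (a minimal sketch with made-up names): a set over a small enum is a bit set under the hood, which is what makes the cast to an integer legal.

    type Flag = enum fRead, fWrite, fExec

    let flags: set[Flag] = {fRead, fExec}
    # the set occupies one byte here, so the cast is well-defined
    echo cast[uint8](flags)   # bits 0 and 2 set -> prints 5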
Generally, what you'd do in most C++ code works well in Nim, but with simpler syntax.
I'm not sure about the OOP part, but last time I checked the standard library assumed the GC was enabled, so on that side I believe it's much closer to those languages than to C/C++/Rust/Zig
C++, Swift, and even Rust rely on reference counting quite a bit.
A Nim object is a stack-allocated value type, without refcounting. You can put that in a heap-allocated seq (pass-by-value with move semantics), ref (refcounting), a custom container (you decide), or ptr (unsafe, like C).
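A minimal sketch of those four options (the `Vec` type is made up):

    type Vec = object
      x, y: float

    var a = Vec(x: 1.0, y: 2.0)  # plain object: a stack-allocated value
    var s = @[a, a]              # seq: heap container, pass-by-value with moves
    var r = new Vec              # ref: reference-counted heap allocation
    r.x = 3.0
    var p = addr a               # ptr: unmanaged and unsafe, like C
    p.y = 4.0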
Yeah much of the basic stdlib only uses vector, string, and map types which are similar to C++ & Rust ones with single ownership and move semantics. Static arrays can be used with many basic functions like sort.
The JSON module in Nim’s stdlib does use ref ARC types. However neither C++ nor Rust provide a JSON module by default.
Actually, a fair bit of Nim's stdlib can even be used without allocators! Unlike Rust's #![no_std] for embedded, which is just a pain.
Also, contrary to systems where reference counting is part of the type system rather than of library types, they aren't able to optimize counts away.
When that does happen, it is a side effect of the compiler optimizing inlined code away, not of pairing inc() with dec() operations that never leave scope.
It was dead prior to this. A subset of programmers think it's hard to program in any language other than the one or two they studied.
They are wrong. Most programming languages are very, very similar, and learning one means you have learned almost all of them. I learned new languages on the regular pre-LLMs and it required barely any effort.
Most company interviews are also language-agnostic for this reason: languages are trivial to learn once you "get" how to program.
For the most part, yes. I'd add the caveat of needing to differentiate significantly between imperative and functional programming, though. I've used Python, Java, C, C++, C#, PHP, and a TINY bit of Perl (Enough to know it's a terrible language -- seriously, why would anybody want to use a language that people refer to as "write-only" because it's so hard to read?), and Haskell just makes no damn sense to me. It seems its only use is to show off how Quicksort can be written in two lines with it, or to start fights over whatever the hell a Monad is.
Also, if you've only ever written in memory-safe languages like Python, Java, and C#, then switching to an unsafe language like C or C++ will probably result in writing a ton of memory leaks, buffer overflows, and segfaults.
It’s probably easier than it’s ever been to create a high quality new language but to get as good as Rust has become just takes so much time and collective effort that it can be out of reach for most ecosystems/groups of builders. Getting the language features just right (and including novel stuff) is just the beginning.
Remember when Rust and Go were comparable? Feels like forever ago when they both just looked like "new systems programming languages", and we know how that turned out.
For example Zig is probably the most impressive new language, and it’s got a ton of awesome stuff but the chance that I’m going to adopt it over a language with often comparable performance that is ALSO much safer? Rounds to zero.
Maybe some day I'll have the brain cells and skill to write code in Zig and be confident I'm not introducing a whole class of gnarly bugs, but it seems like I should just focus my limited brain power on writing high-quality Rust code.
They were never intended for the same niches. Go is a Java/Python replacement. Rust is more of a C/subset of C++ replacement. They were compared mostly because they had usable versions released at approximately the same time, but you (correctly) don’t see those comparisons anymore.
The definition of "same niche" is doing a lot of heavy lifting.
There's a marked difference between what these tools became and the impression of them along the way and the axes along which they were compared in actuality. In the simple "I need to build a backend" conversation, Rust and Go were comparable long after Rust changed to a "real" systems language -- I'd go as far as to say people still compare them to this day (and Go probably wins that larger niche for many contextual reasons).
I think the point is partially reflected in your own statement -- there are many axes along which Go is not a Java replacement, and is not a Python replacement either. The definition of the "niche" vastly changes the answer.
For example, if you pick the right niche (e.g. HFT) Java meaningfully outperforms Go because high performance Java is... actually high performance -- those millions of man years into the compiler, runtime, etc produced progress. That said both of those languages get trounced by the FPGAs and custom hardware in the end.
I'm not convinced it was always wrong to compare Rust and Go, but the modern understanding of how different they are today is definitely better/I am glad they're more separated/more people know not to directly compare them.
Just because Python is dynamic doesn't mean it lacks the notion of a type system, or the whole set of concepts that Go has no way to express, like meta-classes, multiple inheritance, comprehensions (eager or lazy), and so on.
Many folks going from Python to Go would never have done it if Python had a modern JIT on CPython, or if the package management wasn't the way it is; it hardly has anything to do with the language itself.
Same applies to all other remaining types.
That is why, even though people see Lisp as a dynamic language using only lists, in reality most Lisps (and Schemes) since the 1980s have had all the types necessary to write machine-friendly code (aka mechanical sympathy). There even existed graphical workstations that failed in the market only because UNIX and C were, for all practical purposes, as cheap as free beer.
Static type checking is nice and is certainly my preference, but dynamic type checking doesn’t mean no types. It means the types are checked at runtime.
Pity that it has taken all these years for folks to finally embrace Modula-2/Object Pascal safety as the bare minimum in systems languages.
Not only was I aware that there were other ways to do systems programming, I was educated that there were safer ways to achieve the same goals.
My transition to C++ came thanks to Borland having the TP/C++ symbiosis, which later became Delphi/C++.
Strings were indeed fat pointers, and as for the usual criticism of being limited to 255 characters, you could eventually use fat pointers to unbounded arrays, with functions to get the lower and upper bounds.
Since Pascal supports pass by reference (var parameters), there was no need to mess with pointers for in/out or out parameters, thus one less scenario where pointers could go wrong.
Memory allocation has special functions New() and Dispose() that are type-aware, thus no need to do sizeof() math that might go wrong. And there was Mark()/Release() for arena-style programming as well.
Numeric ranges with bounds checking were another one.
Enumerations without implicit conversions.
Bounds checking, checked arithmetic (this one depended on the build mode), IO checking, and a few other checked modes; naturally they could be disabled if needed.
Mapped arrays, so you could get the memory address directly, while having bounds checking.
Unbounded arrays provided a way to do slices, even if a bit verbose.
Pointers were typed, with no decay from arrays, although you could do exactly the same clever tricks as C by calling specific functions like Addr, Succ, Pred, ...
Record variants required a specific tag, so while they weren't tracked, it was easier to spot if "unions" were being used properly.
Units provided much better encapsulation than header files + implementation, with selective type visibility.
Yes, use after free, doing the wrong thing in mapped memory, or inline Assembly were the main ways something could go wrong, but much less so than with plain C.
Which is why I tend to make my remarks about Zig: you will notice quite a few similarities with 1980's safety from the likes of Pascal and Modula-2, the latter even better than those Pascal dialects, although Turbo Pascal eventually grew most of the Modula-2 features, plus plenty of C++-inspired ones as well.
Hence, given the prices Ada compilers were being sold at versus the "cheaper" Pascal dialects and Modula-2, it was yet another reason why Ada did not take off back then.
Ironically GNU Pascal is no longer relevant, while GCC now supports Ada and Modula-2 as official frontends.
You're running out of brain for Rust?
1) Rust, in practice, is "safer" than Zig but doesn't seem to be "much safer".
See: https://mitchellh.com/writing/ghostty-gtk-rewrite
> Our Zig codebase had one leak and one undefined memory access. That was really surprising to me (in a good way). Our Zig codebase is large, complex, and uses tons of memory tricks for performance that could easily lead to unsafe behaviors. I thought we'd have a lot more issues, honestly. Also, the one leak found was during calling a 3rd party C API (so Zig couldn't detect it). So this is a huge success.
Take that as you will. And what Ghostty does probably requires a decent chunk of "unsafe" that would likely hide a bug anyway.
To me, the tradeoff of a demonstrably small number of bugs in a system software language in return for the demonstrably faster developer velocity (Zig compiles WAY faster than Rust, Zig wraps C/C++ libraries way easier than Rust, Zig cross compiles way more easily, etc.) is a decent tradeoff for my use cases.
For me, programming is about corralling motivation more than anything else. Rust saps my motivation in ways that Zig does not.
I love what the Oxide folks are doing. Having someone pwn your big server because of a memory bug? Yeah, I'd reach for Rust in their use cases. Or, if I'm handling credit cards, I'll have a very different set of tradeoffs that swing against Zig. (Although, in that case, I'll probably reach for a GC language so that I don't have to think about memory management at all.)
2) Rust is to C++ as Zig is to C.
Zig is simply a much smaller language than Rust. I use Zig because my brain isn't big enough for either C++ or Rust.
I'm not a 10x programmer, but I still want to get stuff done. Zig and C enable me to do that without bending my brain.
> For me, programming is about corralling motivation more than anything else. Rust saps my motivation in ways that Zig does not.
Yes, I agree with you a lot! Maybe our brains are just wired differently; for me, no other language (until now) gives me as much motivation as Rust, as its type system and docs make me feel really good.
> Zig is simply a much smaller language than Rust. I use Zig because my brain isn't big enough for either C++ or Rust.
Disclaimer: I haven't really tried Zig yet. IMO you don't need to keep the whole of Rust in your brain to use it; I can usually just focus on the business logic, because if I make a stupid mistake, the compiler will catch it. That (and the type system) is what makes me more efficient with it than other langs. I also stay clear of lifetimes unless I really need them (which is almost never for application code). An extreme example of the opposite is C, which will just accept anything (e.g. auto-casting types), so I need to be vigilant about everything I do.
All of that said, there are patterns that will just be harder to reason about in Rust, mostly with self-referential things, and if you are already used to using them a lot, this can be a hassle. If you're used to smaller data structures and data-oriented programming, it will be a lot easier.
This is not trying to convince you or anyone else, just showing a different perspective. If you feel better with Zig, use it! Everyone has their own experience.
Also, an underrated perk: most of the time you're forced to learn about arenas.
Yeah, not everyone is mitchellh -- I'd argue that his intuition was right (expecting to find more issues), there's a reason we don't all write C.
That said, it's great to hear that others are having success with Zig and aren't writing very many bugs.
> To me, the tradeoff of a demonstrably small number of bugs in a system software language in return for the demonstrably faster developer velocity (Zig compiles WAY faster than Rust, Zig wraps C/C++ libraries way easier than Rust, Zig cross compiles way more easily, etc.) is a decent tradeoff for my use cases.
Reasonable points! This is one of the reasons I have on my list to consider Zig at all (that and wanting to write more C), it seems people that like it are wonderfully productive in it.
I really like a lot of the design decisions and watch the streams that Andrew and Loris, the TigerBeetle folks, and others do to see the choices they make. Their managed IO approach is certainly interesting and quite powerful.
Dislike some small petty syntax things (whyCamelCaseFunctions()!?), but that's a me thing.
> For me, programming is about corralling motivation more than anything else. Rust saps my motivation in ways that Zig does not.
Agree with this, anecdotally.
> I love what the Oxide folks are doing. Having someone pwn your big server because of a memory bug? Yeah, I'd reach for Rust in their use cases. Or, if I'm handling credit cards, I'll have a very different set of tradeoffs that swing against Zig. (Although, in that case, I'll probably reach for a GC language so that I don't have to think about memory management at all.)
Yup, Typescript works for me here, and usually the hard parts are offloaded to an external service anyway (e.g. Stripe)
> Zig is simply a much smaller language than Rust. I use Zig because my brain isn't big enough for either C++ or Rust.

> I'm not a 10x programmer, but I still want to get stuff done. Zig and C enable me to do that without bending my brain.
I wouldn't put Rust and C++ in the same league there as far as brain size requirements, but point taken!
I'd argue this is also down to the debug memory allocators in Zig which run a bunch of checks for things like use-after-free. Again, one of the nice things about Zig is that debug mode isn't dramatically slower than release so you're not inclined to turn it off too soon.
However, one thing that does shock me is that no "reference to stack allocated thing escapes" errors existed. Zig does not help you very much on this front, and I'm stunned that none existed.
Ah another good point -- also excited for what opens up with regards to this approach with managed/explicit I/O. Pretty obvious that permutation testing of async behavior gets unlocked which is nice. The equivalent in Rust would be loom.
> However, one thing that does shock me is that no "reference to stack allocated thing escapes" errors existed. Zig does not help you very much on this front, and I'm stunned that none existed.
TIL -- is everyone just bolting on asan here and sticking to the clang backend for now?
Hopefully the zig strike force will drop by.
There are domains where performance is critical and safety isn't so important (e.g. video games). Zig has an advantage in such domains because of the pervasive support for passing around allocators in the standard library (avoiding hidden allocations, and allowing efficient memory reuse via block allocators), while in the Rust stdlib custom allocators are relatively cumbersome to use (and not easy to prove safe to the compiler).
The new "managed" async strategy (I was previously mistaken thinking it was the same as sans-io) is also really intriguing IMO, and feels like a fantastic balance (people in Rust are doing this too, but for the unrelated reason of trying to support various async runtimes).
LLMs solve this problem.
I think LLMs are doing the exact opposite
I actually enjoyed zig because it prevented me from using LLMs to code in this way
In Nim it's very fast and ergonomic to write an MVP, but in Rust you have to think the problem through first.
I guess it's a double-edged sword: Rust is much worse for exploratory programming compared to Nim, but that also forces people to build proper abstractions from the start. Which seems to feed directly into the quality of the produced libraries.
Just a hunch, not something I've thought through.
Forget about syntax or semantics or unique features or whatever. Having money and resources are the most important factor for a successful language.
So, I think I watched the video, and honestly I will watch it again now since I don't remember it clearly, but if I remember right, I really liked it.
A quick search points me to the video, though I am not sure https://www.youtube.com/watch?v=XZ3w_jec1v8
Sure, you can scrape by with a rag tag group of volunteers, but to ever get to mass adoption requires a lot of work and dedicated resourcing. Zig has been in development for how long now, and that is with the designer now working on it full time.
$$$money$$$
Rust (backed by a foundation) won over Go (backed by Google). Oh, and remember Dart (backed by Google)?
I wouldn't say either of them have won. I'd say both are amazing tools, too. We're spoiled for choices these days.
Go and Rust don't compete, Go has a GC, they're in fundamentally different domains. Google also uses Rust, among other languages (C++, Java, Python, Go, JS, etc...).
Unfortunately, a lot of this is just enterprise sponsorship. Rust had Mozilla's backing. React had FB's backing, while Vue doesn't. And so on.
There are of course, many other reasons as well, but to me this seems to be a critical one.
IME language success has very little to do with syntax, and much more to do with stdlib, tooling, library availability, coverage on StackOverflow and all the other things that make solving problems simple.
In practice, Mojo feels like it pivoted to being a pythonic scripting language for CUDA
I like D, but it's a bit of a mess at the moment with the DIP-1000 (Rust-inspired stuff for memory management) half-baked changes, the new experimental allocators (probably coming from Zig), it keeps breaking (on MacOS at least). With the very advanced metaprogramming, it also suffers from lack of good tooling, just like Nim... Nim was going to "solve" that a few years ago, but as far as I can see from the outside, nothing really "shipped" recently to improve the situation. Without good tooling, I am quite sure I am more productive even in Rust (and I am no expert on it), let alone OCaml (tooling is decent, and the language is just really powerful) and Dart (which has absolutely awesome tooling), for example.
I wanted to check those languages out at some point, didn't have time, but I doubt they offer the performance for the stuff I am doing. It is also another reason I discard Java and Go (C# seems slightly better than those).
I deal primarily with scientific/engineering calculations, mostly ODE simulations, so I have some demands regarding performance. Python is of course important, but not for the core production code. Fortran is nice for math stuff, but not much else. C/C++ are as performant as awful. Rust is viable, has nice tooling and ecosystem, but introduces too much hassle that is not that important in my domain. Julia has nice ideas, but poor execution, low quality tooling, AOT delegated to some package that may or may not work with your code etc.
I played with D and it strikes a nice balance. With LDC and careful allocation strategies it can be as performant as Rust; with DMD it compiles much faster; and outside performance-critical code parts it is more ergonomic because of the GC. Tooling is decent. Because, contrary to sentiment popular on this website, I actually think Visual Studio is still the best IDE out there, I am glad that Visual-D exists.
My main issue with D is exactly that - it didn't take off. As a consequence the knowledge-base is small, a lot of things are poorly documented etc. With Nim it is similar, but worse.
I ask because I've seen Dart beat Java/Go/D (very high memory throughput, no idea what Dart is doing differently but it's amazingly good at it)... but I can imagine that if you only perform numerical computation, Dart may miss optimisations that will make it slower (normally, it will be optimised to approximately the same as if you'd written it in C, but in some cases it can hit the "slow path"; those are getting fewer with every release, and if you provide an example the Dart team would likely be happy to fix it).
Some very good frameworks/services to look at for backend development:
* Serverpod - https://serverpod.dev/
* DartFrog - https://dart-frog.dev/
* DartStream - https://dartstream-prod.frb.io/
* Appwrite - https://appwrite.io/docs/quick-starts/dart
It's just a really good language to work on; try the tooling and you'll see it for yourself (either with Flutter or just Dart).
In Go you get:

    type StatusCodes int

    const (
        Success StatusCodes = iota
        Ongoing
        Done
    )

instead of (in Java and C#):

    enum StatusCodes { Success, Ongoing, Done }

or, if you prefer something from 1976's Pascal:

    type StatusCodes = (Success, Ongoing, Done);

The last thing one can say about Go is that it is expressive.
That is, they are going to have to appeal to people who want that style of programming, and they have the additional hurdles of A) having more stuff one has to learn to use the language, B) dealing with a community of people who all want the magic from their previous language ecosystem imported into this new one, and C) the additional burden of waiting for libraries to come about which use these features.
Note: I have not used Nim, so please take the above with a lump of salt.
I'd also argue that high expressiveness actually allows you to avoid magic. The actual library implementations are rarely very scary. We're ahead-of-time compiled and type-checked, so it's not like some Rails codebase where a clockwork goblin can just explode out of nowhere.
It's just important to keep in mind that library / framework authors, library / framework users, and final end users are not all the same person or even type of person. Metaprogramming enables the first group the most, but can deliver benefits to the latter two groups by easing their lives quite a bit. This is easy to lose track of in a niche language like Nim where all 3 groups might well be the same human for most of the life of some project.
This partly also relates to how much of a "full stack debugger (of either correctness or performance or both)" personality one has. Then there are even more layers -- like all the levels in the language / compiler itself down to the CPU and beyond (such as "u-code" & micro-architectural resource sharing).
What macros mostly do is enable fluid, simple syntax for things either too rare, too niche, too unanticipated or too disagreed upon to be in "the main language", whatever that is. As a shibboleth (or marker for group inclusion), the existence of this expressivity selects for programmers who want to fix rare, niche, unanticipated or disagreed upon things, and these people can be even harder "cats to herd".
Nim has other very flexible features - such as the rare "user-defined operators" that enable very nice looking custom syntax. But you can also SCOPE their use very simply. E.g., put their definitions inside a template so that you can say `ptrArithmeticNow: expression` and only have pointer arithmetic within `expression` (maybe with +! to look a little diff from ordinary arithmetic), and (ptrArithmetic itself can be scoped within a module name). It also has even more powerful (and more rare) term-rewriting macros, used most commonly in Lisp systems, perhaps doing computer algebra.
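As a tiny illustration (the `+!` operator is made up, per the convention mentioned above): a user-defined operator is declared like any proc, and in real code you would hide the definition inside such a scoping template.

    # hypothetical pointer-bump operator for scoped pointer arithmetic
    proc `+!`[T](p: ptr T, off: int): ptr T =
      cast[ptr T](cast[uint](p) + uint(off * sizeof(T)))

    var arr = [10, 20, 30]
    let p = addr arr[0]
    echo (p +! 2)[]   # prints 30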
Then around 2019 Nim started to gain momentum (preparing for 1.0 release) and when I looked into it a bit deeper it became evident that for most of the code I usually write for myself Nim is just a more pragmatic choice than Rust. It gets me there faster.
Zero job prospects both times, but I scratched my own itch twice.
Glad I haven't gone with Go. Nim is not a perfect overlap with Rust, but it definitely covers everything Go can do and more and is a better design in my opinion.
I mean, compared to Zig, which is starting to have a lot of hype and libraries/help: let's say we now wish for Nim to have this too; are the feature differences between Zig and Nim worth the revival of Nim in that sense? (Pardon me if this doesn't make sense.)
I keep hoping nim will get better, because it's a beautiful language that produces absolutely tiny, efficient binaries
Compare to other communities where you need to stand out from the noise.
Personal anecdote: I was exactly there a decade ago when I was working on Chrono, now one of the best-known date and time libraries in Rust. Since I was simply on my own, my design was not as good as it could have been. Of course it is still much better to have a suboptimal Chrono, but I can't deny that it remains, and probably will remain, suboptimal due to my design a decade ago.
Please, file bugs or complain on the official matrix room. The community tries its best to keep up the official documentation in sync with the changes.
I mean, personally I really like Golang. It's actually got an ecosystem that functions, while being able to cross compile and actually being easy enough to learn, unlike Rust.
I also sometimes exclusively use TypeScript/Python just for their ecosystems, as a junior programmer. For me the ecosystem matters a lot, and this is a reason why I wish more languages could grow more libraries that just work.
Like the article pointed out, Nim is a really solid language. But I'd like to see more libraries being made for it.
It makes much more sense to compare Nim to, say, Swift, D, or other modern compiled languages with lots of syntax sugar.
https://nimble.directory/

I'd pick Nim if my concern is general app development, not specifically system programming.
Can one use the zig compiler after nim has compiled to C?
* It doesn't have many libraries.
I recently compiled some stats on Nims popularity on GitHub: https://code.tc/nim-stats-august-2025/
It’s growing steadily but I do qualitatively feel like the ecosystem has lost a bit of steam over the past year. Excited for the ongoing Nim tool chain work though.
Nim is a statically typed programming language that compiles to C, C++ or JavaScript, producing native dependency-free executables.
So you can easily combine codebases written in different languages. I guess something like wasm and llvm.
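As a sketch of how that combination can look (the pragma and flags are real; the file and proc names are made up), a Nim proc can be exported with a C-friendly symbol and built into a static library that a C, C++ or Zig project links against:

    # mylib.nim
    proc add(a, b: cint): cint {.exportc, cdecl.} =
      ## visible to C code as plain `add`
      a + b

Built with something like `nim c --app:staticlib --noMain mylib.nim`.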
But significant whitespace has always made sense to me. You are going to indent your code anyway, so you might as well give the indentation some meaning. I write Python, JavaScript and Lua most of the time, and I never waste any thought on whitespace VS braces VS keyword delimiters.
As for the "you're going to indent your code anyway" claim, this seems to completely ignore one-liners, either embedded in other documents (onClick=...) or fed to the shell as a quick way to get some work done without spawning new tools.
They can also always be shown in a very muted color if, for some reason, they cause frequent problems.
This is a minor inconvenience, and I still use Python regularly when it makes sense, but from a language design perspective, I think it's one of those decisions that makes your users' lives that slight bit harder with no real upside.
The benefit is not having to constantly read/write delimiters. Imagine a shell where all arguments had to be delimited by , or | chars instead of whitespace.
I don't really get the readability argument - like the lisper I was replying to said, it's all much of a muchness. A shell where arguments are delimited by commas is a function call in most programming languages, and people don't struggle with readability there. If anything, I find having an explicit "close block" symbol a useful visual marker on the page, but I write Python fairly regularly and don't really notice much readability differences compared to any of the brace-using languages I work with.
Autoformatters came and fixed that problem. Nobody (except Python, Haskell and Nim coders, probably) wastes time indenting code anymore: you save, and your code is indented. Nowadays, code that relies on whitespace and has no ending delimiters (like Nim and Python) gets awkward because the formatters have less information to go on. If your Rust or TypeScript code formats to bad indentation, or none at all, you have an error.
2, I sometimes want to fuck up the indentation on purpose, if I'm just quickly pasting some code to test if it even works or to mark it as "look at it in 5 minutes from now". Also I can mix spaces and tabs
quick edits in notepad/vim/whatever become painful
also converting tabs to spaces or vice-versa is tiring. A language shouldn't force me to a particular code style
This is always puzzling to me.
I've always felt that the indentation is what told me a code block ended, not a closing brace. It's graphical rather than textual.
Maybe that's why Python feels more natural to me. The indentation tells me a code block ended, and the closing brace used by other languages is just a redundancy. Languages that use things like "endif/endfor/etc" end up looking downright cluttered.
> also converting tabs to spaces or vice-versa is tiring.
Get a better IDE. I've never had this be an issue in PyCharm.
    loop do
      case message = gets
      when Nil
        break
      when ""
        puts "Please enter a message"
      else
        puts message.upcase
      end
    end
loop do
case message = gets
when Nil
break
when ""
puts "Please enter a message"
else
puts message.upcase
end
end
(Scrambled on purpose.) Maybe we both just got a bad first impression from the most popular, but unfortunately not good, language? =)
Back when most of my code was in C or C++ (or Java), I was told all the time, sure you can omit these braces for a single-statement block, but you shouldn't, you'll regret it later. You can leave all the code unindented or irregularly indented, it won't matter to the compiler, but you'll appreciate lining it up in the long run. And all that advice was correct; I was better off in the long run, and I saw others come to regret it all the time. But then, over time, I started to wonder why I had to scan past all these diagonal lines of close braces as I read the code. And I cursed that these languages made it too difficult to pull out the inner parts into separate functions, disentangle setup and teardown from the main loop etc. But I also cursed that after putting in the effort (even if was just "using a proper text editor") to make things line up beautifully and in agreement with the logical structure, I still had to take up vertical space with these redundant markers of the logical structure.
Python was my first language using significant whitespace, and it was a breath of fresh air. That was a bit over 20 years ago, I think. I've learned several other programming languages since then, but I never "looked back" in any meaningful way.
I'm now certain that significant whitespace is simply wrong.
As far as I'm aware, none of the new languages that have seen success in the last ten years (Go, Rust, Swift, Dart, Kotlin) rely on the author to format code correctly - instead, they do it for you. And that's good! That's one less thing the programmer has to worry about!
Sorry, I don't follow. There's nothing preventing a tool from re-indenting code when it's pasted (i.e.: considering indentation within the pasted text relative to its first line, and then applying an indentation offset according to where it's pasted), and many already do. It's the same kind of logic that's used to auto-format code in braced languages; arguably simpler unless it's also validating the existing indentation of the pasted text.
And actually, who even is relying on "autoformatters that run on save"? I want the code to look right as I'm writing it, not after. The tools you describe are, to me, about maintaining project standards when multiple people are involved, but fundamentally each person is still making the code look locally, personally right before saving (or committing, since these things are also often done with pre-commit hooks). I can't imagine just typing out whatever slop is syntactically correct and waiting to save the file to fix it. Not when I have a proper text editor that facilitates typing it out the way I want it, taking advantage of awareness of the language syntax.
> none of the new languages that have seen success in the last ten years (Go, Rust, Swift, Dart, Kotlin) rely on the author to format code correctly - instead, they do it for you.
Languages do not and cannot format code. Text editors (including the ones built into IDEs) do. If I type a } and the text changes position, it's not the programming language that did this.
And this is also not better in braced languages. Just as I can input a } on a new line in Vim in a braced language and have it dedent, if I want to write more code in Python that's outside the block, I just press the backspace key and it removes an entire level of indentation. And then I start typing that code, and I don't have to input a newline because I'm already on the line where I want the code to be, because I'm not expected to have a } on a separate line from everything else.
I have not had this work reliably for me - my relatively stock VSCode does not indent the pasted code correctly - but I will freely admit that it could, and that this is a point in favour of good tooling.
> And actually, who even is relying on "autoformatters that run on save"? I want the code to look right as I'm writing it, not after. The tools you describe are, to me, about maintaining project standards when multiple people are involved, but fundamentally each person is still making the code look locally, personally right before saving (or committing, since these things are also often done with pre-commit hooks). I can't imagine just typing out whatever slop is syntactically correct and waiting to save the file to fix it. Not when I have a proper text editor that facilitates typing it out the way I want it, taking advantage of awareness of the language syntax.
Most people who write code in these languages rely on them! Format-on-save is one of the first things one sets up in an ecosystem with high-quality formatters. You can write code that is sloppily formatted but conveys your intent, then save and have it auto-format. It completely removes formatting as a concern. As they say in Go land: "Gofmt's style is no one's favorite, yet gofmt is everyone's favorite."
> Languages do not and cannot format code. Text editors (including the ones built into IDEs) do. If I type a } and the text changes position, it's not the programming language that did this.
These languages ship with first-class robust and performant formatters that encode the language's preferred style; much effort has gone into developing these formatters [0]. For all intents and purposes, they are part of the language, and users of these languages will be expected to use them.
> And this is also not better in braced languages. Just as I can input a } on a new line in Vim in a braced language and have it dedent, if I want to write more code in Python that's outside the block, I just press the backspace key and it removes an entire level of indentation. And then I start typing that code, and I don't have to input a newline because I'm already on the line where I want the code to be, because I'm not expected to have a } on a separate line from everything else.
I just don't think about it. I write my code in whatever way is easiest to type - including letting the editor auto-insert closing braces - and then hit save to format.
In general, you are freed from formatting as a matter of concern. It's just not something you have to think about, and that's liberating in its own way; it makes bashing some code out, or pasting some code in, trivial.
[0]: https://journal.stuffwithstuff.com/2015/09/08/the-hardest-pr...
I find that a little hard to believe. There is a universe of programmers out there who are basically invisible to you.
> I just don't think about it. I write my code in whatever way is easiest to type - including letting the editor auto-insert closing braces - and then hit save to format.
Don't you want to see neatly formatted code while you're writing it?
These tools are very standard and very widely used.
> Don't you want to see neatly formatted code while you're writing it?
Every time I pause, I press ctrl-S or an equivalent. So I really am seeing neatly formatted code while I'm writing it. I would guess that 90% of the time, if my code is syntactically valid, it's also neatly formatted. And even if it's not valid code, it's probably very close to being neatly formatted.
I use Vim, so that would be really unnatural and disruptive.
I mean, Nim looks cool, but I’m not sure what it does that is substantially new. Niceties are generally not enough to foster adoption - something real has to be on the table that meaningfully opens up new avenues, unlocks new paths, enables new use cases.
I have the same criticism of Zig.
Nim compiler has an embedded VM, so any Nim code (that doesn't rely on FFI) can run at compile time.
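A minimal illustration: any plain proc can be evaluated by that VM at compile time.

    proc fib(n: int): int =
      if n < 2: n else: fib(n - 1) + fib(n - 2)

    const f10 = fib(10)              # computed in the compiler's VM
    static: echo "fib(10) = ", f10   # echoes during compilation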
These are all the same thing.
Nim is in the awkward middle area where it's got a GC so it'll never replace C/C++ but it also isn't as productive as something like Ruby.
It is recommended to use ARC, which has similar semantics to C++, i.e. moves, copies, etc.
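For instance (a minimal sketch), `sink` parameters give you C++-style moves:

    proc consume(s: sink string) =
      echo s          # takes ownership of its argument

    var msg = "hello"
    consume(msg)      # last use of `msg`: the compiler moves instead of copying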
Unlike D, the GC is not mandatory for the standard library.