Pros and Cons of Nim
While I rather like Nim as a language, there are a few more cons that need to be considered for any real use of the language:
- It has a bus/lottery factor of 1. The vast majority of all the changes were done by Araq and I have very little faith that the language would survive without him. This is even more pronounced with Zig (mentioned in comments here).
- It has had some very embarrassing bugs after the 1.0 milestone. Most of them were specific to Windows (e.g. [0]), which casts a lot of doubt on its cross-platform promise. Multiple times in the last year, when debugging a nim program, it turned out that the problem was in the language/standard library.
Now, these might not be reasons enough to not use nim, since it's a lovely language when it works, but a pro/con list should be honest.
While Araq is BDFL and is still doing most of the implementation for Nim, there are nontrivial contributions from others, and commercial sponsorship at this point.
And with respect to the ecosystem-at-large, there are tens of contributors and a very healthy package repository: nimble (package manager) written and maintained by dom96; arraymancer (tensor+array+nn) written and maintained by mratsim; an up-and-coming thread runtime by mratsim (called weave) which is better than just about any existing thread runtime for any language; nimpy for seamless Python integration (... which produces one DLL that works with every Python version; can your C++ do that?) by yglukhov; and many more.
And most libraries you'd need already have a Nim wrapper (and one is extremely easy to generate if not), though the body of pure-Nim libraries is growing every day - have a look at https://nimble.directory/
I see one of the top jobs of the dictator as exercising reputation and credibility to gather and delegate expert work to expert lieutenants. If the years pass and you're still the lonely dictator...
But duties are being delegated and Araq isn't alone. For example, multithreading has been delegated to the amazing mratsim: https://github.com/mratsim/weave . There are multiple people helping with the new move-semantics-based runtime: https://github.com/nim-lang/Nim/issues/14224 . Documentation is delegated to narimiran, who also has his fair share of commits: https://github.com/nim-lang/Nim/graphs/contributors?from=201...
> - It has a bus/lottery factor of 1.
What were Rust's early years like? Was it one developer for the first part?
I'd imagine this is not a big deal in the early days, when the benevolent dictator is as much the language as the project itself; not all technology adoption happens on the same timeline. Matz's Ruby took a long time to become super popular, while Rich Hickey's Clojure seemed to be a powerhouse even as it found quick adoption before stalling.
When Rust 1.0 was released it had in the order of hundreds of developers doing work here and there.
When Rust started as a hobby project it was a one man effort, but it was also a project with ~1 user. It grew developers before actually growing users, and for a while, it had more developers than users.
Rust is design-by-committee, with a large and complicated syntax surface, and to solve the one problem of memory safety it created a mountain of borrow-checker syntax and a steep learning curve.
So only time will tell whether a BDFL-led language comes out on top, as Linux did, or a design-by-committee language does. Right now Rust is not yet hugely significant, much like Nim and Zig, so all of them still have a chance to come up.
Only once there is substantial software written in them, as there is for C or C++, will we know. Right now, among modern languages, only Go and Swift have reached that stage as significant systems programming languages, in spite of having GCs. Indeed, I doubt Rust will be as revolutionary as Lisp, Haskell, or Smalltalk in terms of contributions to compiler and language design.
Not bad for a bot, tell your master that your coherency levels are well down.
> What was Rusts early years like? Was it one developer for the first part?
In the very beginning it was a one-man project, but after some time it was picked up by Mozilla Research as an official research project, with several developers working on it (brson and pcwalton, in addition to the language creator), and they also started research on a new browser (in partnership with Samsung) using this experimental language. That's when people started to hear about Rust (and it was still very far from 1.0 at that point).
Certainly. And I hope the same thing happens with Nim. All I'm advising against is people betting their livelihood on Nim reaching critical mass before a single unexpected event removes the benevolent dictator from the picture. Or at the very least, be aware of it and make an informed decision.
While the lottery factor is concerning, many corporate-driven languages carry a similar risk: the company can drop the language or bend it out of shape to satisfy business needs. It has happened many times.
Which languages are you referring to where it's happened many times? Racking my brains but falling short.
Visual Basic
HTML, CSS, JavaScript, and the Browser Wars come to mind
> Most of them were specific to Windows, which casts a lot of doubt on its cross-platform promise.
Oh. That really surprised me, as I had assumed the bugbears I have as an occasional nim user were because it was developed for/on Windows primarily. Actually bothering to take a look seems to show me that isn't the case at all.
Bugbears such as the linking story on Linux¹, the argument handling², the style and verbosity of the compiler output, [a bunch of others]. Nothing show-stopping, to be fair, but a bunch of things that just seem out of place (and that always seem to require explanation when co-workers see a Nim tool).
1. https://github.com/nim-lang/rfcs/issues/58
2. https://nim-lang.org/docs/parseopt.html , although alleviated by argparse to some extent.
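For context, parseopt's interface is quite bare-bones compared to argparse-style libraries; a minimal sketch of std/parseopt usage (the option string here is made up):

```nim
import std/parseopt

# Iterate over a made-up command line; getopt yields each token
# classified as an option or a positional argument.
var p = initOptParser("--out:result.txt --verbose input.nim")
for kind, key, val in p.getopt():
  case kind
  of cmdLongOption, cmdShortOption:
    echo "option: ", key, " = ", val
  of cmdArgument:
    echo "argument: ", key
  of cmdEnd:
    discard  # getopt itself terminates; cmdEnd is never yielded
```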
Does anyone know the interpretation/etymology of 'lottery factor'? I assume it's the risk of the Key Person winning the lottery and abandoning the project? I guess it makes more sense in the context of rank-and-file employees, rather than passion projects...
I assume it only applies in the cases where the project is being produced by a company, and the lottery winner would then retire. For a hobby project, Key Person winning the lottery and quitting their day job would be a good thing.
It's a less morbid variant of "bus factor". Agree that it makes less sense than bus factor in this particular case.
It doesn't make any sense: you win the lottery and stop working on your baby?
I know it's not a big deal but it seems borderline superstitious to me.
He missed two that I think are really important.
1. Arraymancer https://github.com/mratsim/Arraymancer
2. The new version of the garbage collector understands move semantics to optimize its reference counting, so unlike Rust, where you have to deal with it yourself, Nim will handle it for you at the expense of a reasonable amount of memory. https://youtu.be/yA32Wxl59wo?t=855 Watch the whole video for more context if you care.
I think it's a really nice language, too many pragmas but still really nice.
I love Nim as a language. I do not love the Nim ecosystem. It is too sparse. Even now there isn't a good web framework that folks can use in production. Jester is OK, but it doesn't inspire the same kind of confidence that Echo, Fiber, etc. do. Jester still doesn't have its own web page marketing it.
I really think that the folks behind Nim need to focus on getting some killer apps in the ecosystem and in marketing them. That's all Nim needs. Just some useful tools wrapped in a nice package.
It occurs to me that although they were aimed at different use cases, the language that's actually closest to Nim today is Julia: Python-like syntax, compiled to native code, significant metaprogramming capabilities, some native support for concurrency. The biggest difference seems to be the approach to types, since Julia is a dynamically typed language (with optional type annotations) and Nim is statically typed, but Julia's type system is powerful enough that in practice the difference may not be so big.
Anyone out there who has used both and has more observations?
Julia's ahead-of-time compilation story is not great. If you just want a single executable, it's kind of a pain, and there are all sorts of caveats. I think it's slowly getting better, though, but the language really wasn't designed for that use case. If I needed run-time metaprogramming/JIT compilation/a REPL, I'd go for Julia. For a single executable, I'd choose Nim.
Last I checked, Julia is not AOT compiled, and some people don't like Julia's startup time.
For numerical computing, however, Julia wins hands down, purely based on the community and libraries.
Julia can be AOT compiled, though it's still experimental; progress is being made quickly.
Oh!
I have another use-case; first-class (easier-than-others?) cross-compilation support.
I had to build a one-off tool that was a glorified "curl wrapper" with validations, for Windows, on a Mac. Wrote a simple Nim script, cross-compiled for Windows, and it's been fine and dandy for a year now :)
I'm sure other languages support this (golang?), but Google's SEO suggested nim-lang.org
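For the curious, the workflow is roughly this (a sketch; the file name is made up, and it assumes mingw-w64 is installed, per the Nim compiler user guide's `-d:mingw` cross-compilation support):

```nim
# hello.nim -- compiled natively with `nim c hello.nim`, or
# cross-compiled to a Windows executable from macOS/Linux with:
#
#   nim c -d:mingw hello.nim    # produces hello.exe via mingw-w64
#
import std/os

echo "compiled for ", hostOS, "/", hostCPU   # e.g. windows/amd64
echo "args: ", commandLineParams()
```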
This, for sure. For my work I wrote a dynamic library in Nim. The library is loaded by an application written in C, and I needed it to work on both Windows and Linux (Windows for customers, Linux for myself and the server back-end). Setting up a Docker container that runs the entire build and integrates well into the existing build system we had was fairly straightforward. All Nim needs to cross-compile is a C compiler that can do it (in my case that was MinGW). So now I'm happily running the same Nim dynamic library compiled to both a DLL and a .so.
To confirm, that’s two dynamic libraries, right?
Yes, it's compiled into two different libraries. Nim hasn't magically made cross-platform dynamic libraries. But the code for both libraries is the same, just compiled for two different platforms.
Also, interfacing with C and JavaScript libraries is as easy as it gets. Do you need a .dll/.so/.dylib? No problem; the same Nim code deals with it in a few clear lines.
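As an illustration, wrapping a C function is a one-liner with the `importc` pragma (a minimal sketch using libc's `strlen`; the Nim-side name is made up):

```nim
# Bind C's strlen directly; cstring maps to C's char*, and no
# wrapper generator or glue code is needed.
proc cStrlen(s: cstring): csize_t
  {.importc: "strlen", header: "<string.h>".}

echo cStrlen("hello")  # 5
```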
I agree with most of the OP's shortcomings. Cross-compilation hasn't been a huge problem for me; it's pretty easy to pull off. But they've been looking into using Zig as a cross-compiler. Kind of confused as to the current status of the effort, though: https://github.com/nim-lang/Nim/pull/13757
Carp also has an open pull request for using Zig as a cross-compiler, FWIW.
Thanks, I'll add that.
I'm a big fan of Nim, but I really wish it supported cyclic type declarations in separate files and out-of-order functions without forward declarations. As it is, I'm constantly structuring things around those limitations. Big projects often end up squeezed into a single huge file (or a few huge files).
You can re-order functions by using the {.experimental: "codeReordering".} pragma now.
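A quick sketch of what that pragma enables — mutually recursive procs with no forward declarations (proc names made up):

```nim
{.experimental: "codeReordering".}

# `even` calls `odd` before `odd` is declared; without the pragma
# this would need a forward declaration of `odd`.
proc even(n: int): bool =
  if n == 0: true else: odd(n - 1)

proc odd(n: int): bool =
  if n == 0: false else: even(n - 1)

echo even(10)  # true
```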
Consider `include` instead of `import` if you want to break things down into smaller files.
Also, I'm with Araq on this one -- in my experience, every time I reached for a cyclic-cross-file-type-declaration, there was a much simpler acyclic solution I found later.
Include doesn't work for cyclic types; they need to be in the same "type" block. I do not agree with Araq's stance here: there are often cases when closely-related types need to reference each other, and it's not always convenient to put them in the same file. That's why you see the "types.nim" file in larger Nim projects -- no one has a good solution, so everyone just shoves them all together. And if you use types.nim, you no longer have proper visibility control, since fields cannot be private and there is no package-level visibility. Most languages understand this: C/C++/Rust all allow at least the ability to forward-declare a type (and Rust allows cyclic, out-of-order types).
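To make the constraint concrete: mutually referencing types must live in a single `type` section, which is what forces the catch-all types.nim pattern (type names here are made up):

```nim
# Both declarations must sit in the same `type` section for the
# mutual references to resolve; splitting them across modules fails.
type
  Parent = ref object
    children: seq[Child]
  Child = ref object
    parent: Parent

let p = Parent()
let c = Child(parent: p)
p.children.add c
echo p.children.len  # 1
```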
yes please!
I had forgotten how irritating creating header-like declaration files was until I tried Nim.
What is the current state of REPLs for Nim? If I remember correctly, Nim had an official REPL a long time ago, but it was abandoned a few years back.
There is one bundled with a standard nim installation, and I use it regularly (when working in nim). It works, but it's not very good.
I expect the developers to know this, since it's accessible only through the `$ nim secret` command.
inim [0], which already works pretty well, recently got an enthusiastic new maintainer, so things are good and improving.
The thing that put me off was the allowed inconsistent name formatting: snake_case, camelCase (and it is case-insensitive too). This makes it hard for text editors to find all references, besides feeling like you're reading multiple codebases.
This feature is meant to make linking easier but it shouldn't have leaked into variability of a single module's sources.
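Concretely, Nim compares identifiers case-insensitively except for the first letter, and ignores underscores, so all of these name the same variable:

```nim
let myLongName = 42

# All of these resolve to the declaration above: underscores are
# ignored and only the first letter's case is significant.
echo my_long_name  # 42
echo myLongName    # 42
echo mylongname    # 42
```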
If you like Nim also check out Crystal.
I really like Nim and have been eager to try for a project -- though for anything other than a simple script/wrapper I'd really want to see source-level debugging support in VSCode for one or two of the backends.
Neverwinter Nights nwsync tools are written in Nim and the source is available to tinker with. I've compiled and used the tools on my FreeBSD box to break apart the modules and prepare them for nwsync distribution (nwsync is used for client/server asset management mostly for persistent worlds, but is handy for just running a one-off server with your buddies).
Now you can learn a bit about Nim and have fun doing it :)
A couple of links:
https://docs.google.com/document/d/1eYRTd6vzk7OrLpr2zlwnUk7m...
Why use Nim over Go?
Nim has an actually decent type system, with support for algebraic data types and generics. Moreover, Nim is a much higher-level language and doesn't sacrifice performance to get there.
In short, while Go has deliberately shunned all modern developments in PL design, Nim has embraced them. Also Nim has real macros, while Go does not.
It's clear to me that though immature, Nim is a much better and more expressive language than Go.
Nim seems to be almost the anti-Go. While Go is all about simplicity and "less is more" [0], Nim's list of features [1] is huge.
Therefore, I think the two languages appeal to different sets of people.
[0] https://commandcenter.blogspot.com/2012/06/less-is-exponenti...
People complain about lack of generics and proper try/except/finally in Go. Nim has both.
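A small sketch of both features together (the proc and error message are made up):

```nim
# A generic proc (works for any comparable T) plus ordinary
# try/except/finally exception handling:
proc largest[T](xs: openArray[T]): T =
  if xs.len == 0:
    raise newException(ValueError, "empty input")
  result = xs[0]
  for x in xs:
    if x > result: result = x

try:
  echo largest([3, 1, 4, 1, 5])   # 5
  echo largest(["ant", "bee"])    # bee
except ValueError as e:
  echo "error: ", e.msg
finally:
  echo "done"
```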
And no native concurrency.
Nim has thread support natively, and an async implementation in its standard library that is made entirely as a library, so you're free to write your own; in fact, there exist alternative implementations of async/await in Nim. Just the fact that a feature like this _can_ be implemented as a library is a testament to how powerful a language Nim is.
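A tiny sketch of the library-level async in std/asyncdispatch (the proc here is made up):

```nim
import std/asyncdispatch

# `async` is a macro that rewrites the proc into a state machine;
# nothing here is a compiler built-in.
proc greet(name: string): Future[string] {.async.} =
  await sleepAsync(10)  # pretend to do async I/O for 10 ms
  return "hello, " & name

echo waitFor greet("nim")  # hello, nim
```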
It's not my area, but surely concurrency can be added; and why does it need to be native? Surely a library is fine.
(I chose not to learn Go because of its lack of generics and exceptions. It seemed like a huge step backwards.)
Related article on Nim concurrency: https://onlinetechinfo.com/concurrency-and-parallelism-in-ni...
Why do you say that?
It's far from perfect, but it does work (and has for a few years), and it helps you by requiring proof that actions are disjoint. There is work now on including Z3, which would make this much smarter.
https://nim-lang.org/docs/manual_experimental.html#parallel-...
Then let's ask instead, why use Nim over Pascal?
Actually type safe, with support for various kinds of automatic memory management.
Pascal dialects, while much safer than C, suffer from use-after-free and possible memory leaks, also you don't need to mark unsafe code as such.
This includes any modern Pascal variant.
Why use nim over Kotlin with AOT?
I'm not familiar enough with Kotlin to give a real answer, but from what I do know the following might be relevant:
* Backends: Nim compiles to JS both directly and indirectly (emscripten), with various trade-offs (e.g. 64-bit ints require emscripten). It also compiles to C, C++, and Objective-C, giving you the simplest, most efficient FFI to those languages one can hope for (including e.g. exception handling) while at the same time addressing the largest set of platforms (got a C compiler? You can use Nim). And you also have an (almost but not yet quite production-quality) native code compiler, NLVM. What's the platform range of Kotlin's AOT?
* Metaprogramming: Nim's metaprogramming ability is second only to Lisp[0], I think, and only because Lisp has reader-macros (whereas you can't ignore Nim's syntax with macros, as flexible as it is). For example, Nim's async support (comparable to Python and C#) is a user-level library. So is, for example, pattern matching. Can Kotlin do that?
* Size: Nim compilation through C produces standalone and (relatively) tiny executables; this matters for embedded platforms. How does Kotlin fare in this respect?
[0] Lisp, scheme and other Lisp derived languages, of course.
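To make the macro point concrete, here is a tiny user-level macro that needs AST access (the macro name is made up):

```nim
import std/macros

# The macro sees the expression as an AST before evaluation, so it
# can render the source text alongside the computed value.
macro showExpr(e: untyped): string =
  let s = e.toStrLit
  quote do:
    `s` & " = " & $`e`

echo showExpr(2 * 21)  # 2 * 21 = 42
```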
>> Metaprogramming: Nim's metaprogramming ability is second only to Lisp[0], ...
There are some other languages that have metaprogramming abilities equal to Lisp's, e.g. Rebol, Red, and Forth.
I think some people would also consider Prolog and Smalltalk to be in the same ballpark.
And there are languages that would also claim to be second only to Lisp, e.g. Julia, Elixir, Raku/Perl 6, and probably some others too!
> Nim compiles to JS both directly, and indirectly (emscripten) with various trade-offs (e.g. 64 bit ints require emscripten)

Kotlin can also compile to JS directly, and can autogenerate TypeScript definitions for the generated code.
> It also compiles to C, C++ and Objective C giving you the simplest most efficient FFI to those languages one can hope for (including e.g. exception handling) while at the same time addressing the largest set of platforms (got a C compiler? you can use Nim).

Wow, Nim has implemented that many transpilers? This is kind of impressive, but I would rather want a language that compiles to a binary instead of compiling to another language, AND that offers nice FFI interop. Kotlin has state-of-the-art language interop through GraalVM, but here it does not qualify as native. Native interop can be done, but it is subpar. But OpenJDK is working on a new API, luckily.
> Metaprogramming

Interesting topic for sure! I've never learned a Lisp. I did use macros when I was doing C/C++, and honestly I don't get their advantages vs @Decorators() (pre-compile-time codegen), and they have a reputation for breaking IDEs. Kotlin, like Java, can generate code/classes at runtime and has full support for reflection. Tangent: Kotlin can mark any function as infix, which makes it easy to create DSLs. Thanks to this, the Kotlin community has created a lot of elegant, declarative DSLs, such as for testing.
> Nim's async support (comparable to Python and C#) is a user-level library.

Kotlinx.coroutines is the official library; the language itself only has to expose the `suspend` keyword.
> So is, for example, pattern matching. Can Kotlin do that?

Kotlin has some pattern-matching features in its when keyword, but no, it cannot currently destructure in when. The subject is active, though, and it should come in a following release, especially since Java is getting pattern matching. https://github.com/Kotlin/KEEP/pull/213
But if your point was that macros allow you to modify the abstract syntax tree, the Kotlin compiler plugin API offers much more power (you can modify anything, anytime: the AST, the IR, the bytecode). It is arguably far harder to use than powerful macros, but that does not prevent experienced people from Scala-ifying Kotlin through https://github.com/arrow-kt/arrow-meta/issues . They are bringing, for example, union types as an unofficial extension to the language. They could bring pattern matching earlier, in theory.
> Size: Nim compilation through C produces standalone and (relatively) tiny executables; it matters for embedded platforms. How does Kotlin fare in this respect?

I'm afraid Kotlin is not made for such a use case, but today even embedded platforms should have a few dozen free MBs.
Kotlin's stdlib is seriously lacking; at least the last time I looked at it, you couldn't even read a file without java.io.
You can, and you should, use Java libs where appropriate. It's always nice to have pure idiomatic Kotlin, but calling Java from Kotlin is already idiomatic.
As for multi-platform IO, they are working on it: https://github.com/Kotlin/kotlinx-io
But the discussion was about Kotlin AOT, not Kotlin JVM.
Right, but using Graal native image instead of Kotlin/Native, you can have Kotlin calling Java code and compiling to a binary.
I'm hopeful that Kotlin/Native will improve in these areas (excepting the last one) but:
* Faster compilation.
* Tooling is less memory-hungry.
* C++ interop.
* Macros.
Tooling as in IntelliJ/Gradle? I doubt they'll ever use less memory.
Actually intellij since 2020.1 release is kinda fast :0
Why macros vs decorators?
I think you basically covered it in your other comment. Macros are much quicker and easier to write than compiler plugins or annotation processors, but they are difficult for the IDE to understand. So it's a trade-off where Nim and Kotlin both make reasonable, but quite different, choices.
Among other things, Nim is programmable using templates and macros and very powerful at that.
Also it's faster than Go on most benchmarks.
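As a small illustration of the template side (the template name is made up):

```nim
import std/times

# A template splices the caller's statements in at compile time,
# so users can define their own control-flow-like constructs.
template benchmark(name: string, body: untyped) =
  let start = cpuTime()
  body
  echo name, " took ", cpuTime() - start, " s"

var total = 0
benchmark "summing":
  for i in 1 .. 1_000_000:
    total += i
echo total  # 500000500000
```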
Some people prefer Nim because of small things like syntax. However, Go is much more popular. I will do more research on this subject and see if I can list the differences in detail. However, any such list could soon be made redundant when Go 2 comes out.
> small things like syntax
This is no small thing. The expressiveness of a language makes the difference between happy productivity and tedious typing and swearing.
Nim is in the same ballpark as Python here, while many other languages require a lot of boilerplate.
Most of your list would probably remain intact. The main issue is that Go has a traditional garbage collector, and Nim has ARC, ORC, nogc, gogc, deferred reference counting, and more options for memory management.
Why are options a bad thing? What's the default? Is it reasonable?
I'm using the literal definition: "an important topic or problem for debate or discussion." By "the main issue" I mean the main matter of discussion / point of interest / topic to cover, not a suggestion that something is wrong with Nim having various means of memory management. The default currently is refc (deferred reference counting), and it's reasonable. ARC is better for some use cases but not as ironed out as refc.
Nim currently in uber-cool phase. I love it.
After adding a million requested features it will end up bloated like all the others . . .
Nim has been trimming stuff out of its stdlib instead of adding to it lately. And because of its huge focus on metaprogramming, almost anything can fit in an external package anyway (even async is a module in Nim, without any specific support in the compiler). The Nim language as such seems to be "complete" as it is now, and I'd say it's quite unlikely to gain bloat in any way that would affect people not using the bloated parts (i.e. the compiler will still stay fast).
Great to hear. Just hope they can resist the inevitable flood of "good ideas" and stick to their design.
[EDIT] recognized your nick from IRC a while back, my "good idea" is still to add support for 80-bit floats ;)
> Memory leaks are not a concern as Nim uses one of several available garbage collectors. The new ARC option works with reference counting instead of a GC.
Using "no memory leaks" and "reference counting" in the same sentence is #fakenews. Reference counting leaks cycles unless accompanied with a tracing GC (at which point reference counting makes little sense).
> Reference counting leaks cycles unless accompanied with a tracing GC (at which point reference counting makes little sense).
Python proves otherwise. Reference counting gives you deterministic memory use and finalization except when a cycle is involved. The tracing GC helps for those cases (and libraries) that do introduce cycles.
If each one of your objects is in a cycle, then -- yes, reference counting makes no sense. If only 1% of your objects are in a cycle, it makes 99% sense.
Python is a trivial special case, as it has no concurrency. In a single-threaded language, that indeed makes sense. Multi-threaded RC is extremely tricky to implement efficiently, so unless you can prove there are no cycles (or you don't care if it leaks memory), it makes little sense.
Oh, so now it's about "efficiently". Goal posts have moved.
No, it's not hard at all to implement efficiently as long as objects don't cross a thread boundary (and e.g. Nim's older GC used to enforce that condition; an old version of K tracked it and switched to "lock; xadd" to count references when something did cross a thread boundary, IIRC, which made it inefficient only for those objects that crossed the boundary, which usually weren't many).
It's way simpler than multithreaded mark&sweep, for example. Regardless - it makes a lot of sense. It might not make a lot of sense to you, but it does in general in most contexts.
The ARC option does indeed leak cycles. But with ORC (which is ARC with a cycle detector) this is mitigated.
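A minimal illustration (the file name is made up; the `--mm:orc` flag spelling is from recent Nim releases — older ones used `--gc:orc`):

```nim
# cycles.nim -- build with `nim c --mm:orc cycles.nim`
# (`--gc:orc` on older Nim). Under plain ARC this cycle leaks;
# ORC's cycle collector reclaims it.
type Node = ref object
  name: string
  next: Node

proc makeCycle() =
  let a = Node(name: "a")
  let b = Node(name: "b")
  a.next = b
  b.next = a   # a -> b -> a: unreachable after the proc returns

for i in 1 .. 3:
  makeCycle()
echo "done"
```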
These pros and cons appear to have been curated from around the web; this is a secondary source. Whether the final selection has passed through the filters of passion and bitter experience, the author doesn't say. The parent site aims to fill itself with "Useful Tech Content", and this article reads like an assignment completed by a strong student.
So anyone can dress like Moses, come down off the mountain with tablets, and we'll debate the scriptures without considering the provenance? Good to know. California ballot initiatives often work that way.
It's a mix of research from the web (and I don't mean copy & paste), questions I and others have asked, and my own experience with Nim. The rest of the comment I won't even bother to address, as you sound like a troll.