Deno 1.10 Release Notes
deno.com

Still not tempted by Deno, to be honest.
All the problems that currently exist in Node are being ported straight over to Deno.
The built-in APIs provided by Node.js are almost non-existent, and the ones that do exist are mostly buggy and quite tedious to use; that's why the community has created thousands of packages to work around those issues.
Here I don't see how Deno solves this; all the APIs again seem so barebones. Instead of having 1000+ dependencies from npm you'll have 1000+ dependencies from remote URLs, with everything set to "read/write" because they need to read one file from your ".env" folder or perform an arbitrary post-install process...
The way Ryan managed Node, and how the ecosystem turned to chaos because of his lack of vision and strategy, makes me not want to try any of his tech again. Ryan is the kind of guy who gets obsessed over ONE THING and goes berserk for 5 years on that topic until he overdoses and quits abruptly.
I don't think that's how you manage a language. When I look at Zig, I'm far more confident in what's being done there than in the current state of Deno...
Node.js is one of my main languages, but the ecosystem around it is an absolute disaster.
It might catch on, it might not. Not everyone has to like it.
I personally do like it a lot. I think of it as Node.js with a better-organized core (with the benefit of hindsight), use of browser APIs whenever possible, and built-in TypeScript. I think it might catch on once we have some mature MySQL, Express.js, etc. libraries.
I know seeing popular tools be rewritten from scratch is tiresome, but I don't think it's unreasonable in this case given that Node.JS and Deno mostly get their JS implementation from a separate program: V8. In that sense, Deno isn't throwing all of Node.JS away. It's just a different attempt to make V8 a command line tool.
And of course, competition is good. Maybe Typescript will become more convenient in Node because of Deno.
Additionally: as someone who uses Linux in their day to day job, I think it's a phenomenal scripting tool and replacement / supplement for Perl / Python. I mentioned this in a comment here the other day, but with this short wrapper, you can execute a bunch of SSH commands simultaneously using the Promise.all JS function (familiar to web devs). Just an example of a cool thing you can do with Deno scripting. https://github.com/gpasq/deno-exec
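For illustration, here's a minimal sketch of that fan-out pattern in plain JavaScript. The `runOnHost` helper is hypothetical, standing in for an actual SSH invocation (which the linked deno-exec wrapper would provide); the point is how `Promise.all` runs all the calls concurrently.

```javascript
// Sketch of the fan-out pattern described above: kick off several
// long-running remote tasks at once and wait for all of them.
// `runOnHost` is a hypothetical stand-in for an SSH call; here it
// just simulates async work with a short delay.

async function runOnHost(host, command) {
  // Pretend each host takes a moment to respond.
  await new Promise((resolve) => setTimeout(resolve, 10));
  return `${host}: ran "${command}"`;
}

async function runEverywhere(hosts, command) {
  // Promise.all starts every call immediately and resolves once
  // all of them have finished (or rejects on the first failure).
  return Promise.all(hosts.map((host) => runOnHost(host, command)));
}

runEverywhere(["web1", "web2", "db1"], "uptime").then((results) => {
  for (const line of results) console.log(line);
});
```

Swap the body of `runOnHost` for a real `ssh` subprocess call and the structure stays the same.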
> use of browser APIs whenever possible
Correct me if I’m wrong, but isn’t Node.js aligning more and more with the browser APIs? For example, if you `import { URL } from 'url'` you get the WHATWG-standard URL object (it is also available as a global object). Node.js now has EventTarget and event listeners aligned with the DOM Event API. `crypto` is now a global object with the same API as the Web Crypto API. You have ArrayBuffer and Blob in Node.js just like in the browser.
What is it that Deno is doing differently than Node here?
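To make that point concrete, the WHATWG `URL` class behaves identically across modern Node.js, Deno, and browsers; a quick sketch:

```javascript
// The WHATWG URL class mentioned above, usable identically in recent
// Node.js (where URL is a global, no require needed), Deno, and browsers.

const u = new URL("https://example.com/path?tag=deno#top");

console.log(u.hostname);                 // "example.com"
console.log(u.pathname);                 // "/path"
console.log(u.searchParams.get("tag"));  // "deno"
console.log(u.hash);                     // "#top"
```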
fetch() is a notable web feature missing from Node.js core
Add WebSockets to that list.
Like I said, it is aligning more and more:
* https://github.com/nodejs/node/issues/19393
* https://github.com/nodejs/node/issues/19308
Web APIs are not something that Deno is doing and Node isn’t. It’s more that Deno has shipped a few that Node hasn’t yet.
It also looks like Node.js is going to get import maps.
> I think it might catch on once we have some mature MySQL, Express.js, etc libraries.
This resonates a lot with me. I first wrote nodejs apps not because of nodejs but rather because of Express. I could build a simple app very quickly, wire it up to a database, use Passport to secure it and call it a day. It was the libraries that drew me in.
I don't get the "built-in TypeScript" argument. Everything is running on V8 at the end of the day. With Node you run a one-line compilation step; Deno does the same thing, just under the covers.
Maintainability I suppose.
Totally this, and not just for basic shell tasks but also as a scripting tool for CI/CD pipelines.
> you can execute a bunch of SSH commands simultaneously using the Promise.all JS function
It's just as easy in Python 3.5.
Plus async semaphore for resource limiting
And cancel remaining operations when others fail.
Python continues to be the ideal language for scripts.
You can use ZX to have Node in your shell: https://github.com/google/zx
> Ryan is the kind of guy that gets obsessed over ONE THING and goes berserk for 5 years on that topic until he overdose and quit abruptly.
My problem with that statement is: if you knew him personally it's unlikely you'd have said it. And if you didn't, do you have enough samples to be sure he's 'that kind of guy'?
Or are you extrapolating from N=1?
What about Deno's API is still so barebone?
Are you comparing Deno to PHP? Should Deno have all the various database drivers baked in? SDKs to call out to Salesforce/Stripe/Zoom? Should I pay a performance penalty for my app because you need to be able to read XML and make SOAP calls and want a tool that does that out of the box? Would you be happy to take a hit if the roles were flipped and I wanted Excel read/write?
I won't say Deno is perfect; I'm still concerned/uneasy enough about how the whole dependency tree resolves and is managed for complex dependencies that I don't want to try it in production right now. But with respect to the ecosystem, I'd say it's as good as Node's if not better, albeit smaller, given the focus on building a stdlib within this ecosystem for common use cases (HTTP middleware, database drivers, common file formats, etc.).
> everything set at "read/write" because they need to read one file from your ".env" folder or perform an arbitrary post install process...
The CLI args allow for tighter scoping than that (e.g. `--allow-read=.env` instead of blanket read/write), and at least the topic of sandbox-by-default is being discussed, with the pain points and edge cases coming to light, rather than the node/npm model of "execute as the user and done."
> Here I don't see how Deno is solving this , all the APIS seems again so barebone.. instead of having 1000+ dependency from NPM you'll have 1000+ dependency from remote URL
With Deno you can already do a lot with only what is provided by the main executable. Here's a subset of the available subcommands:
* bundle: Bundles JS. While it doesn't do everything that webpack does, it already provides enough to deploy SPAs.
* coverage/test: Built-in test/coverage framework.
* fmt/lint: Built-in formatter/linter.

IMO these provide basic tools that are likely necessary for any JS project, yet with Node.js you need a few hundred npm deps to achieve the same functionality. Not to mention the built-in TypeScript compiler: starting a few years ago, I don't even consider creating a JavaScript project without TypeScript as the main language. With Deno you have it built in.
Oh yeah, forgot about the built-in TS; didn't even know about the build tools. Interesting, might have to give it another run.
> Starting a few of years ago, I don't even consider the possibility of creating a Javascript project without using Typescript as the main ...
The truth value of this statement is suspect, but...
In any case, why should typescript be the default when it compiles to JS? Why shouldn't js be the default in a js framework?
I don't understand what you mean by "default". Deno simply provides a builtin compiler which allows it to run Typescript transparently. Clearly Deno also supports JS out of box.
Isn’t typescript a superset of javascript so any ts compiler must also run js?
I personally like to have few dependencies in my projects and a simple runtime, versus having a monster of a language with a huge amount of unrelated and specialised APIs (Java) and still needing to install some additional dependencies.
I think the barebone nature of nodejs and javascript is what makes it great. If you don't like it, don't use it, there are other languages and runtimes out there and node is a really good fit for a lot of people.
I don't. I'd rather trust the tens of thousands of developers working on the core language and core stdlib than a dependency some random guy in Albania maintains in his spare time.
Or in the case of the JS ecosystem, you might only use established dependencies, which in turn use dozens more which in turn use dozens more, and the probability that there's a dependency some random guy in Albania maintains very quickly approaches 100%.
Have we already forgotten left-pad?
If you have a problem with a library written and maintained by a random guy in Albania, you have the option of not using it. If a functionality is so niche that you can only find one fit for it in the entire npm ecosystem, I doubt it will ever make it into the standard lib of a non-Node runtime.
Personally, I like dependencies written and maintained by a random guy in Albania in his spare time. And I would use them when making fun stuff at home. I might even open an issue or a pull request. Dealing with a random guy in Albania sounds way more fun than dealing with a language committee in Silicon Valley.
How many people audit their dependency authors more than one level deep? That's the problem: I know who wrote all of my first-level dependencies (react, react-router, redux, reactstrap, etc). I don't know who wrote _their_ dependencies, or the 3rd level, or the 4th. And I don't think anyone has the time to adequately evaluate that every time a dependency's version gets bumped, given how deep the graph goes.
I'm like you. I want as few dependencies as possible. I liked this about PHP: You could get very far without any dependencies at all.
For a toy project, pull in some dependencies. For a more serious project that requires to do due diligence on every dependency you pull in, it gets annoying very quickly.
Also, I kinda dread all the fast-changing version numbers in the JS ecosystem.
There is nothing stopping you from looking at their code and, after vetting it, copying the code and pasting it into your own local JS files. Now you don’t have to worry about anyone tampering with it after you have vetted it.
Sure, I'll do that next time I'm at work, I'll tell the frontend dev running `npm install next` to spend the next 6 months doing a code review of the 258 dependencies in the tree. Boss will have to wait.
https://npm.anvaka.com/#/view/2d/next
There's dependencies like webpack, and "dependencies" like lodash-sortby, is-number, isarray, diffie-hellman, encoding, is-negative-zero or assert. Who in good faith can argue that those are better served as standalone dependencies maintained by who-knows-who instead of being in a standard library?
I so wish someone had the balls (and good enough OpSec) to inject malware into one of those 5 lines long dependencies, causing hundreds of billions of dollars in damages, and then we'll perhaps do something about it.
In practice, you rely on libraries that are popular and/or written by someone trustworthy. "Vetting" libraries amounts to thorough tests of the complete application.
Java is not really a good benchmark for having lean build artifacts. It is a monster but it has other desirable qualities.
In .Net land, the dotnet core runtime weighs in at about 30mb (what you'd need to run a production server), the sdk is about 140mb (for development machines and build servers)(once off setup). If you compile a project that depends on 5 other packages, it will only include those 5 packages and a package for your actual project (assuming 1 project per solution, else n packages for n projects per solution). It boils down to having build artifacts that are super lean, provided you are using the installed runtime on the target machine. You also have the option to package the framework along with your own package, then you don't have to install the runtime on the target machine, and these typically compile down to less than 100mb. It is probably less by now, but I don't use it. You also have the option to bake everything into a single file, much like how Rust does it.
So yeah, I wish more people would play with .NET Core and its tooling a bit; it's bloody great at the moment. Java and its tooling feel like a behemoth once you get used to the new dotnet tooling.
Take it from somebody who builds build servers and custom tooling (cloning git repos, building prod binaries, packing them if needed and then moving them around (deployment, NuGet server, etc.), all on Linux, with C# code): it is a dream. Calling the dotnet build tools from my own console apps is a no-brainer. My build tools can then be called by other processes in Linux like any other CLI app, or, if I build them as ASP.NET projects, a simple middleware to intercept calls from nginx can trigger workflows remotely... easy peasy. All while the build tools can talk to Digital Ocean, Azure, and AWS via their APIs...
Simple and lightweight sound great in theory, but some problems are complex enough that they require a lot of logic, and that complexity needs to live somewhere, whether it’s your code base or a dependency. Node.js could provide its libraries in a modular way so that you only install what you need. Essentially, that’s how npm works now, except that the packages are provided by random people, and oftentimes the work of getting them to play together to form a complete solution to a problem is left as an exercise to the reader.
When everything is broken down into very simple packages, you often end up in a situation where your dependency tree is very deep, and now keeping track of which packages you use, and vetting them, becomes a complex task. Many devs are too trusting of the packages they take a dependency on. Remember that npm package everyone used, but buried three levels deep in people’s dependency trees? The maintainer got tired of working on it, so he handed it over to someone else, who then purposefully injected a vulnerability into it that affected a lot of projects.
> If you don’t like it, don’t use it
Again, easier said than done. My guess is most of us are working on projects where we don’t/didn’t get to choose the tech stack.
I don't understand all the hate towards .Net. The included libraries are amazing.. about 80% of what you will ever need is provided by the framework. It basically provides you with a massive selection of tools, ready to go. The rest you can either build yourself or pull in a (precompiled) nuget package.
It is my main gripe with JavaScript (and with TypeScript): the lack of a standard library that everyone uses and trusts. Something that should be predictable and boring is absolute chaos in the JS world. I think that's half the reason things like jQuery, Moment.js, lodash and others exist: people got frustrated with the lack of built-in functionality, and npm has just made everything worse. Can't we have a .NET-type framework for JavaScript? Minus the CLR and compilers of course, just the framework bits. Or if we need some kind of CLR-type layer, why not build it with WebAssembly? Then all flavours of JavaScript could call into that. There has to be a clean way forward.
> It is my main gripe with JavaScript (and with typescript): a lack of a standard library that everyone uses and trusts. Something that should be predictable and boring, is absolute chaos in the js world
This is something that Deno is attempting to build with its stdlib: https://deno.land/std
While the stdlib is not shipped with Deno (it is downloaded like any other third party dep), the code there is reviewed and audited by the core team.
> I don't understand all the hate towards .Net.
Because it's the same over-abstracted over-engineered life-sucking ecosystem as the Java world.
I think that's more a property of the C# code that's out there rather than a property of .NET or C# themselves. A lot of (most?) C# and Java code in the world is over-abstracted over-engineered and life-sucking just by virtue of them being popular corporate languages and most code being boring. Using C# with Unity, for instance, is a pretty good experience though.
That really depends on the developers involved. Some people like building over complicated complex nonsense to justify their existence, while others build very lean/shallow code and go on with their lives.
The code I write for production in the "real world" is maybe 1/4 as complex/convoluted as the academic stuff we did in university. And lines-of-code-wise, maybe 1/10.
You don't need to build an excavator to add some dirt into potted plants, but I can bet there will always be people who build a space-grade shovels with redundant enterprise level handles that guarantees maximum soil filling rates, even when under water... but not everyone is like that. C#/.Net Core definitely doesn't throw you down that path.
I personally don't like ASP.NET Core, as plenty of the old mistakes are being repeated; some of the same patterns exist, which I'd argue Microsoft had the opportunity to move away from, but didn't. But .NET Core itself is pretty great (and lean, with no over-abstraction in the core system).
Yep, was exploring ASPNet Core for a new project and was like nope ;)
But the bad ecosystem around npm is one of the main reasons he made Deno; that's what deno/std is for.
While not stable yet, they are working to address this very issue
Zig is looking great, but it's also solving a different problem.
Excuse me, but what's this "bad ecosystem in npm" you're talking about? Every single JS lib, pipeline tool, framework is on npmjs.com (react, webpack, bootstrap, expressjs, and 100'000s others). It's the ecosystem that every contender would love to be.
And the lack of a "stdlib" is exactly how and why npm started over ten years ago, via the community-driven CommonJS initiative (JSCI, connect/express.js, the package.json format, middlewares, etc.). The idea was that the core packages on npmjs.com would be the stdlib on top of what Node.js/CommonJS provides.
> Every single JS lib, pipeline tool, framework is on npmjs.com (react, webpack, bootstrap, expressjs, and 100'000s others). It's the ecosystem that every contender would love to be.
This is only a strength if you accept that those libs (and their dependencies, and their dependencies' dependencies, and so on...) are adequately scanned for malicious behavior. If you don't accept that, then the incredibly deep dependency graph that is typical of frontend projects these days is a liability.
While that's true, this is really orthogonal to the argument. Especially since Deno's API is also anemic, as complained about elsewhere in this thread.
> It's the ecosystem that every contender would love to be.
Trying to clarify - do you mean other JS ecosystems? Outside of JS, NPM is usually used as what not to do, not as an aspiration.
Could you share some examples of ecosystems that are 1) vibrant and active 2) have working, open source, ergonomic tooling of a comparable caliber to VSCode, typescript and friends 3) can target almost any platform, including but not limited to server, mobile, desktop and web?
I’m trying hard to think of any, Java and Python come closest but both fall short.
There are vibrant and active communities around good projects, but npm is the greatest known repository of abandoned, obsolete, not very good and potentially malicious libraries. The bad scales up along with the good; great tools on npm don't make the Leftpad fiasco more forgivable or technical shortcomings less bad.
Fair enough, but I have no idea how that can be avoided if we take Sturgeon’s Law as a given: 90% of everything is garbage.
I’d argue an essential quality in a modern software engineer is ‘good taste in dependencies’, if you will. Adding a dependency for padding a string with whitespace would have gotten you a friendly but stern lecture from a senior dev, in every good team I’ve been a part of so far.
npm together with the ergonomics of JavaScript/TypeScript is what keeps me in the Node.js ecosystem. I've never understood the hate for a massive ecosystem of community-built libraries that you can contribute to, fork, or modify at will.
Deno has passed 1.0. Having the standard library still be "not stable yet" doesn't spark much confidence in me for something meant to replace node.
Yes, I would have wanted to see a decent stdlib to be a goal for 1.0.
I recently wrote a 10-line script that had to do some date arithmetic*, and while importing URLs is pretty cool, it was still the same shit: spending 80% of the time it took to write the code browsing and evaluating multiple third-party date libraries to find one good enough for my use case. So in practice the only improvement Deno had over Node in that example was that I didn't have to run `npm install`. Yay, great.
*: adding two dates together, converting to and from UTC, and, given a Date, finding the next midnight. And as expected, the most popular JS library couldn't even get one of these simple tasks right; there's a bug report open since 2017. Incredible stuff from this ecosystem.
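For what it's worth, at least the "next midnight" task from that list can be sketched with nothing but the built-in `Date` (this version works in local time; a UTC variant would use the `setUTCHours`/`getUTC*` methods instead):

```javascript
// Find the next midnight after a given Date, using only the built-in
// Date object. Relies on setHours(24, ...) rolling over to 00:00 of
// the following day, which the spec guarantees.

function nextMidnight(date) {
  const d = new Date(date.getTime()); // copy so the input isn't mutated
  d.setHours(24, 0, 0, 0);            // hour 24 rolls over to next day, 00:00
  return d;
}

const d = nextMidnight(new Date(2021, 4, 11, 15, 30)); // May 11, 2021 15:30
console.log(d.getDate());  // 12
console.log(d.getHours()); // 0
```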
I have yet to find a datetime library in any language that was adequate for all purposes I've needed. That's not a particularly damning example in my opinion. I've had to write very weird datetime code in JS, Python, and Ruby. I don't even want to imagine the horrible things you could find in some other languages.
In Java, Joda-Time was a third party library for ages. It was the dominant date-time lib for enterprise development. Then, it was standardised and added to the stdlib: java.time (JSR-310). The Joda-Time project lead, Stephen Colebourne, ran the standardisation process. It was well received by most, even the breaking API changes that SC was adamant were flaws in the original Joda-Time API. I can vouch for it: Both Joda-Time and the java.time libs are excellent. (In my job, I regularly need to perform complex date-time transformations, including time zones.)
I have also used Howard Hinnant's C++ date-time lib: https://github.com/HowardHinnant/date It is also very good. (He was the guy behind move semantics in C++ 11.)
In Python, the stdlib has a function to get UTC now that does not include the UTC timezone... so it weirdly and surprisingly acts like the local time zone! There is endless shit-posting about it. I feel bad for Guido van Rossum et al. To be fair, the original Java date-time lib was horrible, which is why Joda-Time was created. And JDBC (Java database) dates are still horrible.
There is a difference, however, between "does all the things my snowflake app needs" and "does the top 20 most common things". The latter is easy, and even quantifiable if you have the time (to scour open-source repos for use cases). When you write dozens of utilities, a few libs, and several apps, it's likely the common date-time use cases will be suitable for 80% of them. Rinse and repeat across numerous other dependencies, and the result is a std library you learn once that serves 80% of your use cases just fine. It's not about having no dependencies; it's about dependencies focused on novel or niche problems, not common and already-solved ones.
Just because they are attempting to address it does not mean they will succeed.
So what? Should we all give up now and stop doing whatever we're doing now, just because we might not succeed? That's how people learn, advance and improve the world around us - through failure and mistakes.
> Just because they are attempting to address it does not mean they will succeed.
I strongly agree with this statement, hence I don't see how switching from "npm" to "raw URLs" will solve anything...
The problem with Node dependencies is bigger than just "npm is not a good package manager"... Honestly, in that case just fork Node and replace npm with something else...
Here the problem lies in a mixture of poor built-in APIs, which are buggy, and a lack of vision for the language, both of which have been there since its origin.
Deno doesn't seem to address those at all...
Again, it just seems to be "npm is bad, and I want to use TypeScript natively with web APIs"...
I just know very well that Ryan is redoing exactly the same mistakes as Node with the same obsession he had on "EPOLL"[0] back then that will end up in a new fiasco.
[0] https://youtu.be/M3BM9TB-8yA (can't find the specific part where he mentioned "EPOLL")
> I just know very well that Ryan is redoing exactly the same mistakes as Node with the same obsession he had on "EPOLL"[0] back then that will end up in a new fiasco.
"epoll" is a Linux API for listening on multiple file descriptors. Different platforms have equivalent APIs, and these are normally core of any scalable non-blocking I/O network program.
Can you elaborate on why you think he had an obsession with "epoll"? More importantly, can you elaborate on what the fiasco was? epoll is still used under the hood by Node.js (through libuv) and by many other network servers, such as Nginx.
npm may be bloated, but I'd say it's better than pip. pip is insane: you have to set up a virtualenv (hurts UX), while with Node you don't have to.
> (...) will end up in a new fiasco.
Are you saying nodejs is a fiasco?
re-[0]: 15:43
If they don't attempt to address it they'll certainly fail.
Yes but let’s not get that cynical about it. People are working hard on these problems.
Deno is definitely better in the sense that it's way more compatible with the browser, and I think that's a really big thing in the long run (but Node.js could do that too, eventually).
Regarding 1000+ deps: yes, that's a bad thing, but it's not really about the language, it's rather about people. When Node started, the usual number of dependencies was low.
I know because I was there, making fun of Maven and how it pulls in half the universe for a simple thing. Now Node.js pulls in the whole universe.
Yet the problem, in my opinion, is not the package manager but rather "look, I made a package, it does one small thing and it does it well, and I don't want it to do more", which leads to many more packages. Because you really need that thing, so what are you going to do? You'll add a package on top of a package. Rinse and repeat, and there we are.
If the standard library were richer, you'd just ignore half a million of those packages. They'd die a quiet death.
Look at the python ecosystem and you'll see that it's not the case. Because of its compatibility commitment, a standard library cannot evolve much and its features end up being replaced with external libs.
“The standard library is where modules go to die”
Isn't that one of the big justifications for not bundling the standard library?
It seems to me like not shipping the std library with the runtime is potentially one of the biggest language innovations we've seen in a while, because it should allow the std library to evolve over time in a much more graceful way: you'll actually be able to make breaking changes to the std library, as folks who are unwilling to update their code can just keep using the older version (until the ossified code becomes irrelevant, which code that is never changed eventually will).
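A sketch of what that looks like in practice (the version numbers and module here are just illustrative): Deno std imports carry their version in the URL, so old code keeps resolving the old release while new code opts into a newer one.

```javascript
// Illustrative fragment: in Deno the std library is imported like any
// other module, pinned to a version in the URL, so two modules in the
// same project can track different std releases independently.
import { parse } from "https://deno.land/std@0.97.0/flags/mod.ts";

// Another module elsewhere could import
// "https://deno.land/std@0.120.0/flags/mod.ts" without conflict.
const args = parse(["--port", "8080"]);
console.log(args.port);
```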
How does it work when you want to use third-party libraries, though? (Because even if you have the biggest stdlib in existence, you're still going to end up using some external libraries no matter what.)
Python packages, in general, are much bigger though.
And you'd probably be surprised how much the stdlib is used. In many environments third-party libraries have to be vetted by security, or the developers are junior and probably can't check/understand a third-party library, so they just take the safe option: use the stdlib and hand-code a bit to make it do what they want, etc.
Plus, the third party packages that are used generally have to offer much higher convenience or quality or scope (or all three) to be adopted over the stdlib alternative.
So the bar is much higher than leftpad or is-odd.
Not really. I mean, what do you think is missing from the standard library's json package? It obviously solves most use cases, since just yesterday Flask dropped simplejson. The standard library is just not great for libraries that are not yet stable.
Exactly this. If anything, Deno can help introduce the use of standard libraries as a source of truth to webdevs who may not be familiar with the concept to begin with.
That does happen (and is happening); it's just not often big news: https://twitter.com/sindresorhus/status/1320788906888089600
I think you're underestimating just how passionate the Node crowd is on customization and reusability. There are feature-rich, extremely popular packages which act as a stdlib in many ways for particular functions - yet there are constantly alternatives to ecosystem-dominating packages that spring up. Some gain traction, some do not. I don't see this changing, even with a robust stdlib. It's the culture around the toolset that drives this.
That's just a post hoc rationalization.
The whole "culture" popped up because people wanted to share code between browsers and backends, and there was no tree shaking in JavaScript, so libraries had to be super small and modular to keep the code small for the front end, where download/unzip/parse/compile speed matters.
If browsers get a big stdlib, many of these libraries will just go away (bye, leftpad!).
> regarding 1000+ deps, yes that's a bad thing but it's not really about language, it's rather about people.
Not sure that I agree that it's about "people", except in the sense that every problem with languages and their ecosystems is a people problem, because people created them; but I 100% agree that it's not about the language.
My take on the situation is that we have 2 separate issues:
1) Auditing, which is basically an economics issue. It'd help a lot if someone with pockets full o' money were willing to fund a couple mil of auditing infrastructure for npmjs.

2) Devs pulling in lots of packages (which pull in packages, all the way down), which _may_ be partially mitigated by a better base language (no more leftpad, etc). Personally I'm skeptical of the better runtime/language solution.
I think one thing that might help is if there was some automatic way of marking packages as 'safe' in the sense of no side effects, no writing to files, no network activity guaranteed. Such packages could be installed with confidence, and have a lower priority for auditing.
Another possible solution would be a cultural shift among developers to prioritize reducing dependencies with every release. I'd love to see that in a release notes, how many packages were added/removed!
> because you really need that thing so what you are going to do?
When that thing is as simple as left_pad, I’d just copy and paste it into my own code. Or just write it myself.
When did so much of development become gluing other people’s code together? Don’t we all know how to write something as simple as left-pad? Why was it ever a good idea to pull it in from somewhere else?
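Indeed, the specific example is now a one-liner in the language itself: `String.prototype.padStart` (in the spec since ES2017) does what left-pad did.

```javascript
// What a left-pad dependency used to buy you is now one built-in call.

console.log("5".padStart(3, "0")); // "005"
console.log("42".padStart(5));     // "   42" (pads with spaces by default)
```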
Comparing node, deno and zig is like comparing red apple, green apple and sushi.
But deno is written in Rust, so obviously the Zig squad had to chime in. /s
I am trying to expand my horizons from only Node to other frameworks. I've seen so much hype around Deno, so I looked into it the other day, and I feel the same as you. It doesn't seem different enough from Node from a user standpoint. I also agree that the package system seems messy. For right now I'm going to steer clear and check out Django and Rails.
Try Elixir and thank me later.
Even as a long-time Erlang user, the rise of node.js was just sad to watch. I understand why it happened, but watching from the outside what kind of nonsense they were doing, when there were such better solutions, was sad.
Elixir's standard library is so pleasant to use. Coming from the JavaScript world, it is such a breath of fresh air.
Elixir's stdlib is great. It's small, based on a just a handful of concepts, but thanks to how powerful the concepts are it covers a lot of use cases. Every module contains pretty much everything you'd ever need to work with the concept that module implements, be it a String, an Enumerable, a Stream, or anything else in the stdlib.
But, Elixir is cheating. It can stay clean and compact in part because it sits on top of 30 years of development of Erlang stdlib. Erlang stdlib is messy, spread across multiple applications and modules, with module interfaces inconsistent with each other, not to mention parts of it still include compatibility layers for Erlang/OTP version so old that they didn't have lambdas yet. But, the functionality is there, which enables Elixir to have a small, focused stdlib - because when something is missing, you can just grab an Erlang equivalent.
This is similar to what Clojure does on the JVM. JS doesn't have the luxury of sitting on top of a battle tested stdlib, and trying to cover all the functionality of such stdlib is what results in incomplete and unstable APIs and reliance on so many external packages.
How’s the lack of static types treating you? Honest question as someone looking at Gleam.
I've generally had fewer typing issues in Elixir than I had when working in Python (which generally wasn't a ton, but varied a lot depending on the library).
I've found that Elixir's functional programming model generally alleviates type issues. You're always thinking about what you are passing into a function or what function you are calling and with what data structures.
A lot of type issues in JS, Ruby, Python, etc. seem to come from the mutability and object oriented parts.
> all the APIS seems again so barebone..
What is an example of something you are hoping for?
I set up my business entirely around those remote URLs.
> 1000+ dependency from NPM
I've been working with Node in hobby projects since around 2012, been paid for it since 2016, and still don't understand why that is such a problem.
Compared to other language ecosystems, each of those dependencies is smaller and more atomic. If anything, it's closer to the "unix way" of small tools that do one thing and do it well, rather than developing huge mega-libraries. Since these libraries are smaller, it's easier to swap one for another.
Because of that, the community is much less likely to settle on one standard way of doing things just because of "how things are done here", and the ecosystem continues to evolve and find better ways of writing code. Would any other language ecosystem that is widely used in production go from callbacks, to different promise libraries, to the standard Promise API, to async/await? I don't think so. (Edit: strike that, Rust seems to have done it too. Well, Rust is also awesome.) Of course, it means that you have to learn more; but it also leads to things actually becoming better, and not because of some central mandate by a language committee, but as a result of a more decentralised gradual evolution. (Not completely decentralised, just compared to the alternatives.)
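That progression can be sketched with a hypothetical `loadUser` operation written in each of the three styles the ecosystem moved through (the names and signatures are illustrative, not from any real library):

```typescript
// 1. Callback style (classic early Node, error-first callbacks).
function loadUserCb(id: number, cb: (err: Error | null, name?: string) => void) {
  cb(null, `user-${id}`);
}

// 2. Promise style: wrap the callback API in a Promise.
function loadUserPromise(id: number): Promise<string> {
  return new Promise((resolve, reject) =>
    loadUserCb(id, (err, name) => (err ? reject(err) : resolve(name!)))
  );
}

// 3. async/await style (standard since ES2017): same Promise underneath,
// but the call site reads like synchronous code.
async function loadUser(id: number): Promise<string> {
  return await loadUserPromise(id);
}
```

Each step builds on the previous one, which is why the migration could happen gradually rather than by decree.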
In any other ecosystem, opening a pull request against any framework or library feels like something you would do only after spending a couple of days learning all the ropes of the codebase; in NPM, I've made a meaningful contribution to a library less than an hour after learning about its existence.
> still don't understand why is that such a problem
Some reasons it's a problem:
- it slows and disrupts the development process
- packages get abandoned very easily; not many packages are highly popular/active
- security audits are essentially impossible
> Of course, it means that you have to learn more
JavaScript takes this to an extreme. It literally takes daily effort to keep up.
> In any other ecosystem, pushing a pull request to any framework or library feels like something that you would do only after spending a couple of days of learning all the ropes of this codebase
So you prefer an ecosystem created by amateurs? After years of working with PHP and JavaScript, I don't.
If only they had cleaned up the unnecessary promises from the async/await implementation: https://es.discourse.group/t/callback-based-simplified-async...
I can see why npm is annoying, but I wonder what would be a good role model for a better ecosystem. I can personally only compare npm to PHP, Python and Java, and I think npm is far superior. Do you have an example of a better ecosystem/package manager? I'm genuinely curious.
I think some people like to compare Deno’s package management to go’s. I think that is a good comparison since dependencies live (sort of) on a URL in both cases. However I think Deno has improved significantly on the go approach.
When I look for a better ecosystem though, I like to look towards Rust's Cargo. However, that is a bit of an unfair comparison, since Cargo was heavily inspired by npm, learned from npm's past mistakes, and was able to improve on it significantly.
Cargo is one of the nicest build systems I've used. Coming from maven, I have nothing but love for what cargo brings and how easy it makes it.
The key thing for me with Cargo (and Rust) is the documentation. I'm able to quickly glean what I need to do from the docs, and often with useful examples that are close to my use case.
I do wish the package ecosystem was set up with namespaces. Abandoned crates, name squatting, etc. should really be a thing of the past. But I guess this fosters creativity in names.
I use NodeJS daily, but I do agree some fundamental APIs are missing. For example, calling remote URLs is critical for most apps, yet Node only offers the "http" and "https" packages (and why separate packages!). It should at least have "fetch" support rather than requiring the "node-fetch" dependency.
I'm glad Deno is here trying to push the ecosystem forward, at least with Typescript, and hopefully Node will learn from them.
Still not tempted by NodeJS or Deno
I was interested in Deno from the start. It has a few very nice features. Notably the sandboxing model, native Typescript support and browser based API surface.
But I also was quite skeptical of their dependency model with plain url imports.
With the current implementation you end up with a half-baked import map that's essentially a poor man's package.json, but without any of the tooling you'd expect (like npm upgrade or npm outdated).
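For reference, a Deno import map is just a small JSON file mapping bare specifiers to URLs, passed via `deno run --import-map=import_map.json`. The module name and CDN URL below are illustrative:

```json
{
  "imports": {
    "lodash": "https://cdn.skypack.dev/lodash@4.17.21"
  }
}
```

There is no built-in command that tells you a newer version exists; bumping `4.17.21` is a manual edit, which is exactly the tooling gap compared to package.json.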
It recently dawned on me what they are going for: a "cloud native" computing platform that doesn't require builds and packaging. Targeting both Javascript and WASM.
I do believe there is quite a lot of promise in the project. The tooling can improve. It'll be interesting how things evolve.
IMO this is a terrible idea.
If I have learned anything from working a long time with NPM, it's that you can't trust a single command with updating your dependencies, and you can't trust developers to respect semver in the long run.
I can't stress how many times I had to review and undo automated dependency bumps because my app suddenly stopped working, and when I was forced to, I had to bump said dependencies in the lock file myself.
In a compiled language, where code is compiled once and valid from then 'til the end of time, this is not even a problem. In an interpreted language, where all code is evaluated every time you run your program, this makes automated dependency management an impossible task.
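To make the semver point concrete, here is a simplified sketch of how a caret range resolves (ignoring the 0.x special-casing; `satisfiesCaret` is a hypothetical stand-in for a real semver library): a `^1.2.3` dependency silently floats up to any newer 1.x release, so one misbehaving minor release can break a running app.

```typescript
// Simplified caret-range check: "^1.2.3" accepts any version >= 1.2.3
// that keeps the same major version. (Real semver libraries also
// special-case 0.x ranges; this sketch does not.)
function satisfiesCaret(range: string, version: string): boolean {
  const [rMaj, rMin, rPat] = range.slice(1).split(".").map(Number);
  const [maj, min, pat] = version.split(".").map(Number);
  if (maj !== rMaj) return false; // caret pins the major version only
  if (min !== rMin) return min > rMin;
  return pat >= rPat;
}

console.log(satisfiesCaret("^1.2.3", "1.9.0")); // true  - pulled in automatically
console.log(satisfiesCaret("^1.2.3", "2.0.0")); // false - blocked by the caret
```

Everything in the `true` band is installed without review unless a lock file pins the exact version.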
I didn’t know what deno was.
It's a TypeScript-native alternative to nodejs that adopts the browser security model, replicates the golang standard library rather than node's, and is written in Rust.
Not only adopts the browser security model, but more broadly adopts browser APIs.
V8 is written in C++, not Rust.
They're talking about deno, which _is_ written in Rust. https://github.com/denoland/deno
Deno is a simple, modern and secure runtime for JavaScript and TypeScript that uses V8 and is built in Rust.
So it's probably a Rust wrapper around a C++ project (unless they reimplement V8 in Rust...)
Yeah, the communication with V8 and all the API are in Rust; V8 is still in C++.
Yup, it uses rusty_v8 which are Rust bindings for V8's C++ API - https://github.com/denoland/rusty_v8
Sure, although the rust v8 bindings don’t appear to support FreeBSD as i just found out when i tried to install deno on FreeBSD 13:
> cargo install --locked deno

    Compiling rusty_v8 v0.22.2
    error[E0308]: mismatched types
     --> /home/craig/.cargo/registry/src/github.com-1ecc6299db9ec823/rusty_v8-0.22.2/build.rs:157:18
        |
    157 | fn platform() -> &'static str {
        |    --------      ^^^^^^^^^^^^ expected `&str`, found `()`
        |    |
        |    implicitly returns `()` as its body has no tail or `return` expression

    error: aborting due to previous error
    For more information about this error, try `rustc --explain E0308`.
    error: could not compile `rusty_v8`
    To learn more, run the command again with --verbose.
    warning: build failed, waiting for other jobs to finish...
    error: failed to compile `deno v1.10.1`, intermediate artifacts can be found at `/tmp/cargo-installwrTFWi`
    Caused by: build failed

That looks like a pretty simple problem. Deno probably doesn’t work on FreeBSD simply because nobody has done the work of making it compatible yet. Rust, V8 and nodejs all run great on FreeBSD.
If you care about FreeBSD support, I bet the community would be delighted to receive some pull requests patching the problem.
I gave it a whirl:
Make rusty_v8 build.rs aware of FreeBSD:

    > git clone https://github.com/denoland/deno.git denoland/deno
    > git clone https://github.com/denoland/rusty_v8.git denoland/rusty_v8
    > cd denoland/deno
    > vi Cargo.toml
    ...
    [patch.crates-io]
    rusty_v8 = { path = "../rusty_v8" }
    ...
    > cargo build --release
    ... as expected same failure - good ...
Have a quick squizz to see where this is used:

    > vi ../rusty_v8/build.rs
    ...
    #[cfg(target_os = "freebsd")]
    { "freebsd" }
    ...
Attempt to fix... and bang! It's using the platform() result to call a python script that pulls binaries from here:

    > rg "platform\(\)" ../rusty_v8
    build.rs
    157:fn platform() -> &'static str {
    180:        .join(platform());

https://github.com/denoland/ninja_gn_binaries/
And there's no FreeBSD build there. Too much yak shaving for idle curiosity on my part.
Hm. Looks like gn refers to: https://gn.googlesource.com/gn
Guessing it is inspired by the v8/chrome build system? Maybe have a look at nodejs on FreeBSD for inspiration? Or just provide ninja/gn some other way.
Little sad to see a build process import binaries from the net either way...
I'm pretty confident we'll see a JS engine written in Rust at some point in the future, it'll just take a very long time to get parity with V8 and will likely introduce its own slew of issues.
I don't know about a JS engine written in Rust. Well, I'm sure it will be attempted (no doubt it's being done right now), but I don't see a path to success.
Like you say, it would take a long time for a Rust JS engine to reach parity with V8. But long-running projects need real use-cases to succeed... they drive support and provide direction/feedback. How much will a half-baked Rust JS engine be adopted when a mature, stable V8 or SM is available? Will the projects that do so be successful themselves? A pure Rust JS engine might be destined to peter-out well before becoming viable.
A better approach (though less satisfying) might be to convert an existing engine incrementally. But even there, there probably needs to be continuous, compelling benefits along the way to justify the increased complexity and large amount of additional work. Imagine release after release where the main item in the release notes is "rewrote another subsystem in Rust"... followed by a bunch of bugs in the previously stable subsystem. I know SM has some Rust bits, but I'm not sure how far that is really going to go.
I don't see any benefits (for me, the user) if it's written in Rust. Yes, Rust has nice features for more security and stability for the developer, but just because a JS engine would be written in Rust wouldn't mean it's automatically better than V8. It's not the programming language that gives the edge in this case; it is the amount of time, sweat and grease that went into V8. And other engines would have a long way to come.
I rather imagine we'll see a wasm vm/runtime in rust, and a typescript/js to wasm compiler written in typescript...
> I rather imagine we'll see a wasm vm/runtime in rust
Like Wasmtime? https://github.com/bytecodealliance/wasmtime
Yes, but we would also need an event loop I guess. As I understand it nodejs is essentially a (c++) js runtime inside a c++ event loop. But I think we already have a rust event loop (or two?)
Parts of the Firefox JS engine (SpiderMonkey) are written in Rust (although nothing really significant I believe).
Here's our token-based (JWT) user management server using Deno - if you want to see an example.
I'm still waiting for the abortable fetch feature, since I have to deal with some services that have unstable network connections and often get stuck and never drop their connection.
Is there any progress on this one? Last time I heard, they still had to wait until rusty_v8 matured enough for this feature to be available.
We are still working on it, it just requires some work deep inside of Deno internals.
Oh, what happens now - resources not collected until next deno restart? Or just a non-adjustable timeout?
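For context, the pattern being waited on is the standard AbortController one from browsers (also in recent Node). A minimal sketch of its mechanics, with no network involved - in real code you would pass the signal as `fetch(url, { signal })` and call `controller.abort()` on a timeout:

```typescript
// Sketch of the standard AbortController cancellation pattern.
const controller = new AbortController();
const { signal } = controller;

let cancelled = false;
signal.addEventListener("abort", () => {
  cancelled = true; // e.g. tear down the stuck connection here
});

// Caller decides the request has hung and aborts it:
controller.abort();

console.log(signal.aborted, cancelled); // true true
```

The abort event fires synchronously, so cleanup runs immediately and the stuck request's resources can be released instead of leaking until restart.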
The addition of sandboxed storage (i.e. access to localStorage and sessionStorage APIs without requiring disk permission) is really interesting.
It seems very in line with the model of Deno as an edge compute runtime - essentially an ephemeral cache at the edge - I wonder what use cases will emerge for this?
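The API shape in question is the standard Web Storage one. A minimal Map-backed sketch (not Deno's implementation, which persists per-origin without disk permissions) shows what code written against it looks like:

```typescript
// Minimal in-memory stand-in for the Web Storage API surface
// (getItem/setItem/removeItem/length), illustrative only.
class MemoryStorage {
  private data = new Map<string, string>();

  getItem(key: string): string | null {
    return this.data.has(key) ? this.data.get(key)! : null;
  }
  setItem(key: string, value: string): void {
    this.data.set(key, value);
  }
  removeItem(key: string): void {
    this.data.delete(key);
  }
  get length(): number {
    return this.data.size;
  }
}

const storage = new MemoryStorage();
storage.setItem("session", "abc123");
console.log(storage.getItem("session")); // "abc123"
```

Because the surface matches the browser's `localStorage`, the same code can run unchanged in a browser, in Deno, or against an in-memory stub like this in tests.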
I like the ideas behind Deno, but I'm wondering whether testing should be included in the standard library, unless it is extremely flexible.
Each team will have different problems when it comes to scaling, running tests in parallel - sometimes on the same machine, sometimes on multiple machines, e2e tests vs unit-tests, etc. This looks like a problem you solve in a library, not something to add in the core.
There have been multiple test runners built on top of Deno's vanilla testing (through the JSON output feature); those may fit your use case better.
The current multi-threaded, module-isolated model, however, I think is good for the great majority of projects, so I'm glad it's built in.
How is the built in testing? I'm a bit wary because Jest is so good and wonder if it can compete on every front it's baking in.
It reminds me of Angular coming with its own Router, Forms, and Animations; since they're provided officially, alternatives don't get created, and then half of the team leaves and the packages are abandoned.
> How is the built in testing?
Much more minimalist than Jest. Very similar to Node's built-in testing [0].
- I don't believe the test environment is recreated between test files
- No way to mock imported modules, as far as I understand. I don't believe the built-in testing has any mocking or spying functionality at all
- No describe/it/expect syntax that Jest inherited from Jasmine (I am not sure where it came from initially — was it from RSpec?)
- The above means that there is no nesting of test blocks
- No setup or teardown functions (beforeAll, beforeEach, afterAll or afterEach)
On the other hand, it's fast. And it doesn't swallow up console logs, like Jest can do. No magic to it at all.
[0] - https://nodejs.org/dist/latest-v14.x/docs/api/assert.html
There are third party modules for providing many of those things. I created 2 modules for testing.
This module has describe/it functions with setup/teardown hooks. It supports nesting test blocks. I think the built in assertion functions are quite good so I didn't bother creating an expect function although there are other third party modules that provide that functionality.
https://deno.land/x/test_suite
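As a rough illustration of what such a module layers on top of flat tests - and explicitly not the actual test_suite API - a minimal describe/it with a beforeEach hook can be sketched like this:

```typescript
// Minimal describe/it sketch with beforeEach hooks; illustrative only.
type Hook = () => void;
const hooks: Hook[] = [];
let prefix = "";

function describe(name: string, body: () => void) {
  const oldPrefix = prefix;
  const hookCount = hooks.length;
  prefix = prefix ? `${prefix} > ${name}` : name;
  body();
  prefix = oldPrefix;
  hooks.length = hookCount; // hooks registered inside don't leak out
}

function beforeEach(fn: Hook) {
  hooks.push(fn);
}

function it(name: string, body: () => void) {
  hooks.forEach((h) => h()); // run hooks in scope before each test
  body();
  console.log(`ok: ${prefix} > ${name}`);
}

let counter = 0;
describe("counter", () => {
  beforeEach(() => { counter = 0; });
  it("starts at zero", () => {
    if (counter !== 0) throw new Error("expected 0");
  });
});
```

The real modules add async support, nested hooks, and integration with Deno's test reporter, but the core mechanics are this small.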
I had difficulty getting sinon to work early on, so I wrote a similar module in TypeScript for creating spies, stubs, and faking time.
I know a fella working on a Deno port of Ava (https://github.com/avajs/ava). When that and Koa are ported or compatible, I'll be giving Deno a legitimate go.
FYI https://github.com/oakserver/oak is a port of Koa for Deno.
Deno test has been fine. oak exists for koa like middleware oriented servers
Great to see Deno going from good to better and better steadily over the past year. It is such a pleasure when compared with Nodejs projects. Always appreciate products that make life easier.
Node is amazing, but it was Electron that really made Node something that every school kid had to learn.
As long as Electron has no plans to support Deno, it will be WAY behind in traction.
/useless prediction
We may have WebGPU-powered desktop applications in the near future though. Let's see how things play out, as you said.
Canvas is already a thing, but ultimately the biggest advantage of HTML-based UI is the layout engine. People want to throw together some <div>s and a stylesheet and call it a day.
What's the status of grpc, both as client and server? I seem to recall reading somewhere that some features (trailing headers?) needed to call grpc as a client were out of the fetch spec and thus would have to be implemented. It's also not clear from this issue either. https://github.com/denoland/deno/issues/3326
I have been following the progress of Deno for a little bit, and slowly but surely the project is evolving from a full-scale Node.js replacement to a serverless/edge compute platform. I get that it's the hot thing in tech these days, but to me that just isn't very appealing.
Has Deno failed to get adoption? I haven't seen it used anywhere to be honest.
...it’s only been around ~3 years.
Wrong: 1.0 was released exactly one year ago; before that it was mostly stabilization of the runtime, not really development of features.
Okay great. It was announced 3 years ago. Regardless of which version (pre/post 1.0), there really hasn’t been enough time for Deno to take hold. My point stands.
Deno is awesome, but ironically I haven't been able to use it because I always want something from npm.
You can probably use the package using https://www.skypack.dev/
SemVer != Decimal
Somehow, Deno doesn't excite me.
I love to explore new stuff in programming. Have done with Python, Java, JS (Node), React, Angular, Vue, etc.
Deno just doesn't cut it.
Uhm, so this submission was edited and renamed to Deno 1.1, but this is release version 1.10 as in version one dot ten.
For context, this 11-month old github release <https://github.com/denoland/deno/releases/tag/v1.1.0> is the Deno 1.1 release notes, not the linked post here which is for the 1.10 release notes.
It seems the person responsible for this edit mistook Deno's version for a decimal number.
Isn’t that the normal behavior of JavaScript numbers? Just do your own thing sometimes? Heh.
It's become somewhat the norm for JavaScript projects to adopt semver (https://semver.org/) for releases.
While it may be, in this case, we're dealing with strings.
Fixed now. Sorry!
It's Deno 1.10, not Deno 1.1.
Fixed now! Our mistake.
As an old webdev who was full-stack before Node even came out, and avoided it completely, all I can say is; Have fun, kids! jingle jingle