Deno Joins TC39
From deno.com:

> As TypeScript is a core part of the Deno ecosystem, we are also very interested in pushing for even closer alignment of TypeScript and JavaScript in the future.
I've wondered why the frontend community hasn't gotten together and said, "The next version of JavaScript - is TypeScript!" I've been using TypeScript for five years professionally now and cannot overstate how much easier it has made large frontend (not just Web, but mobile and desktop) projects. Surely enough thought and work has been put into TypeScript to make it the next standard.
If TypeScript becomes the next version of ECMAScript, then browsers will have to support it. The day that TypeScript is supported directly on end-user machines instead of going through a developer-controlled compiler pipeline is the day that almost all evolution of TypeScript stops.
There's not really much positive value out of having browsers run TypeScript natively. The main feature is static checking, but static checking doesn't benefit end users. When I go to my bank's website, if their front-end code has a type error, it's not like I can fix it right then and there.
The type system is mostly a developer-time feature, so it makes sense to leave it out of the core runtime environment.
In other words, think of JavaScript/ECMAScript more like the architecture that browsers support. That needs to be slow-moving since it's deployed across billions of devices. TypeScript then just targets that.
Adding TypeScript directly to JavaScript would improve the world to about the same degree that adding C++ features directly to x64 machine code would.
This is not true. There are a number of ways that TypeScript's type-checking can be subverted at runtime, particularly when dealing with APIs that return JSON. You have to trust that the API has returned exactly what you are expecting, or write your own very detailed validation scripts. It's similar to the sorts of testing one would do in vanilla JavaScript without TypeScript, but now executing all the time at runtime.
As a developer, in the case of an API change that violated my assumptions, I would personally prefer my applications to fail-hard at the point of the API call, rather than to have my scripts run merrily on and only error in some other code far away from the root-cause when one of those assumptions fails.
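Concretely, the failure mode looks something like this (a sketch; the `User` shape and endpoint are hypothetical):

```ts
interface User {
  id: number;
  name: string;
}

async function getUser(): Promise<User> {
  const res = await fetch("/api/user");
  // The compiler trusts this cast completely; nothing verifies it at runtime.
  return (await res.json()) as User;
}

// If the API starts returning { id: "42" }, no error occurs above; the
// failure surfaces far from the root cause:
//   (await getUser()).id.toFixed(2); // TypeError only here, and only sometimes
```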
However, I have a lot of hope that runtime type checking based on auto-code generation around TypeScript's interfaces could be developed in a future version of TypeScript.
This would be a big (but exciting) departure from the current TS goal of "Impose no runtime overhead on emitted programs." [1]
Recently, I've been using Zod [2] and find it to be a satisfying equivalent: you define a schema, and then you get both a TS type AND a JS parser/validator (which works as a TS typeguard).
[1] https://github.com/microsoft/TypeScript/wiki/TypeScript-Desi...
[2] https://github.com/colinhacks/zod
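For a picture of that workflow, here is a minimal sketch using Zod's actual API (the endpoint and shape are hypothetical):

```ts
import { z } from "zod";

// One schema definition...
const Point = z.object({
  x: z.number(),
  y: z.number(),
});

// ...yields a static TS type...
type Point = z.infer<typeof Point>; // { x: number; y: number }

// ...and a runtime validator that fails hard at the point of the API call.
async function getPoint(): Promise<Point> {
  const res = await fetch("/api/points/current"); // hypothetical endpoint
  return Point.parse(await res.json()); // throws ZodError on mismatch
}
```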
Zod looks super cool. I'll have to check it out in depth.
TypeScript is an optionally typed language. It's core to the design of the language that the type system is purely static and doesn't come into play at runtime.
You could argue that that's not the best kind of language for users. I wouldn't disagree. I work on Dart which used to be optionally typed but now has a fully sound type system with runtime checks.
But that's orthogonal to whether browsers should support TypeScript directly. If they supported a statically typed language that had the runtime checks to be sound, that might be a great language, but it wouldn't be TypeScript.
That's a strangely puritanical viewpoint.
I wouldn't say it's core to the design of the language. I'd say it's a key design decision for the development of the compiler, but those are two different things.
Also, I don't understand what you mean by "it wouldn't be TypeScript". Languages change. They change all the time. Did adding nullish coalescing before it became available in JavaScript make it no longer TypeScript?
It's also not true that TypeScript is purely static and doesn't have any runtime component. There are a number of helper functions that TypeScript can optionally emit, so there is some precedent for TypeScript generating runtime-oriented code, rather than just eliding type information after successful static checking.
So perhaps there could be a syntax for imposing runtime-checking as an optional element. Something like:
```ts
interface Point { x: number; y: number; }

async function getPointFromAPI(): Promise<Point> {
  const request = await fetch("/api/points/current");
  const point = await request.json<Checked<Point>>();
  return check point;
}
```

Type `Checked<T>` would signal to the compiler that type information for T needs to be made available at runtime, and the `check` keyword would perform the check and "unwrap" the type to a bare reference to T. I don't know what it would look like, but it would be a huge value add.
> I wouldn't say it's core to the design of the language.
It is absolutely core to the language. TypeScript's core value proposition is that you can take vanilla JavaScript and use it from TypeScript without any overhead, incrementally migrate to TS, or maintain a heterogeneous codebase as long as you want.
If you take away seamless zero-overhead JS interop, the language you have is radically different from TypeScript. To the degree that any language has any identity at all, that would be a pretty fundamental change in its identity. Like taking objects from Java or pointers from C.
To the best of my knowledge, no one has figured out how to have a language that allows mixing dynamic and static typing without either massive runtime overhead or giving up soundness. You basically have three options:
1. Allow dynamic types to flow into statically typed code
2. Soundness inside the statically typed code
3. Tolerable runtime overhead when using dynamic code from static code
But you only get to pick two. TypeScript, Dart 1.0 and other optionally typed languages give you 1 and 3 at the expense of 2. Dart 2.0 and other statically typed languages give you 2 and 3 at the expense of 1. Gradually typed languages like Typed Racket give you 1 and 2 at the expense of 3 (and are rarely used in practice because of it).
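To make the trade-off concrete, here is a sketch of 1 and 3 without 2 in today's TypeScript (the shapes are hypothetical):

```ts
function distance(p: { x: number; y: number }): number {
  return Math.sqrt(p.x ** 2 + p.y ** 2);
}

// A dynamically typed value flows straight into statically typed code:
const fromJson: any = JSON.parse('{"x": "oops", "y": 2}');
// This compiles, no runtime check is inserted, and it silently returns NaN.
distance(fromJson);
```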
You're asking for TypeScript to just add 2. People have been trying to figure out how to get all three for decades but no one has succeeded yet [1].
[1]: https://blog.acolyer.org/2016/02/05/is-sound-gradual-typing-...
No, I'm not. You're way overthinking this and your attitude is weirdly gatekeepy. I'm asking TypeScript to implement a shortcut feature for the tedious, boilerplate code we already write to use type guards to inspect objects from APIs before passing them on.
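The tedious boilerplate in question looks something like this today (a sketch with a hypothetical `Point` shape):

```ts
interface Point {
  x: number;
  y: number;
}

// A hand-written type guard that inspects the object at runtime.
function isPoint(value: unknown): value is Point {
  return (
    typeof value === "object" &&
    value !== null &&
    typeof (value as Point).x === "number" &&
    typeof (value as Point).y === "number"
  );
}

async function getPoint(): Promise<Point> {
  const data: unknown = await (await fetch("/api/points/current")).json();
  if (!isPoint(data)) throw new Error("API returned an unexpected shape");
  return data; // narrowed to Point by the guard
}
```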
I'm not sure where the "gatekeeping" accusation comes from. In your original comment, you wrote:
> There are a number of ways that TypeScript's type-checking can be subverted at runtime, ... As a developer, in the case of an API change that violated my assumptions, I would personally prefer my applications to fail-hard at the point of the API call
I interpreted that to mean that you would prefer TypeScript's type system to be sound: If a function expects a Foo, you want a guarantee that you'll never get into the body of the function at runtime with an argument whose type isn't Foo.
I can understand that that seems like a fairly simple request. But when you dig into soundness, you discover that it is anything but. Optionally-typed languages like TypeScript are unsound by design because soundness is a difficult requirement with very severe trade-offs around interop, runtime performance, and usability. Making TypeScript sound would give you a language that felt very little like TypeScript does today.
This is also why Deno running TypeScript directly is a bad idea.
The difference is that a developer has a lot more control over what version of Deno their server-side app runs on than they do over what version of a browser their client-side app runs on.
This isn't really true in the wider ecosystem. Yes, the app developer does, but the library developer does not. As Deno adopts newer TypeScript versions and config options, libraries can fail to compile. This is compounded by the lack of a package manager, so references to specific versions and CDNs are hardcoded into dependents.
All with essentially no benefit over running plain JS with associated typings.
They don't though. They transpile it behind the scenes
Same thing with the same problems. The difference being that Deno isn't going to have the same no-breaking-changes policy that the web does.
Types in TypeScript are great, but if JavaScript ever wants to add types, it has to be something like a real programming language's types, where you can use types at runtime as well. For example, in a catch clause I could assert the type of the error and do things with it once that assertion is done. Many other useful things become possible when type space and runtime space are not totally separate.
ES4 was the first shot at adding types to JavaScript, which failed due to how big the ambitions were. I'm not sure if there is any more appetite for adding types to JS, though.
In very very serious big applications like a 3D editor you can fall back to WebAssembly and use your favorite typed language. For smaller apps TypeScript is good enough. This way JavaScript stays simple and lean.
"real programming language types"
There are a few languages where types only exist (for the most part, though with exceptions and hacks here and there) at compile-time, like Rust, C++ or even Haskell IIRC.
JS objects already have runtime types, and you can use them in catch clauses.
```js
try { … }
catch (e) {
  if (e instanceof FooError) { … }
  else if (e instanceof BarError) { … }
  else { throw e; }
}
```

There was once a Mozilla extension (https://web.archive.org/web/20200111091805/https://developer...) that allowed you to abbreviate the above to:

```js
try { … }
catch (e if e instanceof FooError) { … }
catch (e if e instanceof BarError) { … }
```

It was never standardized, but since it's just syntactic sugar, if there were demand, it could be standardized without bringing in an entirely new type system.

There would be at least two problems with using TypeScript types for this. Firstly, TypeScript types are unsound in a number of intentional and unintentional ways, meaning that it's possible for the compile-time and runtime types to disagree, even in fully typed code. Secondly, TypeScript can express many types that cannot be tested at runtime; for example, there is no way to tell whether a function accepts a string as an argument, or to guess the inner type of an empty array.
*>...use types [at] runtime..."
Two things. First, TS conceives of itself as having no runtime component. If it did, I think people (including the TS devs) would be more confused.
Second, I'd say rather that we need a runtime type system. In fact I've tried my hand at writing one in the most minimalist way possible, and have been working on it recently [1]. The type system is explicit in that a type is a JSON-like object, similar to JSON Schema, but with 100x less code.
[1] https://github.com/javajosh/simpatico/blob/master/friendly.h... This is effectively the test harness for the module.
> it has to be something like a real programming language
It's a nice feature request but there's no one feature that makes a programming language "real".
TypeScript is great and everything, but Microsoft owns the standard and the single functioning checker, and hasn't published a specification or even a grammar.
The TypeScript checker/compiler is Apache 2.0 licensed, so I'm not sure there's room to complain, unless you disagree with the direction they're taking the project:
https://github.com/microsoft/TypeScript/blob/main/LICENSE.tx...
Nobody disputes that TypeScript is open source, but there’s still a vast difference between a single open-source implementation and a specification that’s suitable for standardization. The implementation inevitably has bugs, and there needs to be a way to decide which bugs are actually “features” that other implementations will need to emulate.
(Here’s the specification for ECMAScript, for example: https://tc39.es/ecma262/)
To be clear, I am not bashing Microsoft here—just pointing out a reason that TypeScript can’t be declared “the next version of JavaScript”, which is the context of this thread.
Would expanding wasm capabilities be a better long term option here instead of supporting TS directly? Opening the browser up to a whole lot of languages, including Typescript (AssemblyScript exists already).
TypeScript is still experimenting with its type system, and it has numerous edge cases. I would be more in favor of specifying and supporting a minimal subset of TypeScript rather than the entirety of TypeScript. Basically, we could start by enabling type annotations, enum declarations, and type aliases.
TS design goals and non-goals [1]
It's very much designed to be a layer of abstraction over JS.
I am as much a fan of TS, however, it's not likely ever going to be the thing that it's really close to being.
[1] https://github.com/Microsoft/TypeScript/wiki/TypeScript-Desi...
Because then TypeScript's ability to be agile would be destroyed. JavaScript can't be changed as easily as TypeScript.
Hey - I am Luca Casonato, Deno's new delegate at TC39. I am happy to answer any questions you all might have :-)
Hi Luca - congratulations! I have a quick question: have there been any proposals to add Subresource Integrity hashes (https://developer.mozilla.org/en-US/docs/Web/Security/Subres...) to the import syntax? I think this affects Deno more acutely than other projects, since Deno supports / (encourages?) directly importing from a URL with a precise version number encoded in the URL. It would be nice to add another layer of safety on top and be able to assert that the module received is exactly as expected. Thanks!
You can do that right now, albeit not directly in the import: it's done via an explicit `lock.json` file (https://deno.land/manual@v1.16.4/linking_to_external_code/in...). I'm tempted to agree that having some ability either to directly import, or even just to have that better integrated (right now, you have to ask for the lockfile to be used and pass an explicit path), would probably be a good idea.
Ok that makes a lot of sense, the link you shared helped explain things in denoland quite well (and reminds me that I really need to give it another go).
From the link I see this example:
```ts
// in src/deps.ts
// Add a new dependency to "src/deps.ts", used somewhere else.
export { xyz } from "https://unpkg.com/xyz-lib@v0.9.0/lib.ts";
```

Then essentially a create/update lock-file command is run. Then the lock file is checked into version control. Then another developer checks it out and runs a cache reload command.

As you mentioned, in practice it's definitely a bit too manual, but it should be one of those things that can be automated, so it's not the end of the world. Having said that, I think having it in the import syntax would provide a few benefits:
1. No extra steps need to be run & hopefully IDEs could auto-complete the hash.
2. Would hopefully be standardized with the browser allowing for native browser support as well (or perhaps lock.json could be standardized with something like import maps)
3. Having it right there provides an extra level of assurance that the integrity hash is going to be used (especially in files intended to be used in the browser and in deno ... not sure how common that is though).
I am not aware of any specific proposals right now. There was some talk a while back about supporting SRI hashes inside of an import map, but that sorta dissolved. For Deno at least you can use a `lock.json` file with the `--lock` and `--lock-write` flags: https://deno.land/manual/linking_to_external_code/integrity_...
There's an issue for that to be added as part of import assertions https://github.com/tc39/proposal-import-assertions/issues/11...
I don't think that's going to fly because it's in-band.
SRI really, really should be out of band, otherwise SRI digest changes invalidate the entire module graph (and import cycles become a major pain).
I think they're much better as a part of import maps, and later fetch maps so they can apply to non-JS resources like CSS.
I don't think it's clear cut that it should be out of band 100% of the time. I think there are use cases where inline is useful.
Cycles are definitely an issue; I am not sure there is even a way to work around that, except to pull the cycles apart (which may not always be possible, but is usually not a bad programming practice when it is). However, at the library level, libraries tend not to circularly import each other. If it's being done inside a project, the build tool would be generating it, so dealing with the module graph being invalidated may not be a hassle (or even necessarily a bad thing). In that case the tool could modify the files, or it could generate a lock file / import map (which I agree has benefits at that level, since it doesn't force every source file to be transpiled). Some of that probably still has to happen for module reloading anyway, e.g. search parameters appended to the module path for cache invalidation / module reloading during development, like vite.js does. And realistically, given the nature of the ecosystem, some transpilation is going to have to happen either because of .ts or just because of browser differences.
For a top-level deps.ts / dep.js file pattern there probably won't be any cycles. That pattern is to declare a root deps.js file for your project that locks things down and re-exports from third-party libraries, and then use the exports from that as the basis for other imports. For this pattern I think SRI would be extremely helpful and add enough benefit to justify it (even though SRI may not be used in the cases you listed).
Also for smaller projects or main modules having the SRI hash inline is really helpful.
Are you going to push for records and tuples?
Eich was pushing for them in 2011 and they still haven't arrived.
He doesn’t need to, the authors of the proposal are on the committee and are planning to see it through.
The proposal is humming along through the stages at a good pace.
Exactly. I am very much in favor of them though. They would be a great addition to the language.
Yes, Bloomberg has a big JS investment due to the Terminal (20MLOC of JS last I heard) and they have employed the records and tuples champions.
I haven’t read the full post, but what’s the benefit of adding language support for records and tuples at this point? My understanding is that engines already optimize the objects/arrays version of those concepts pretty well, and TypeScript enforces the semantics on the dev side.
> My understanding is that engines already optimize the objects/arrays version of those concepts pretty well
They don't; using libraries that guarantee runtime immutability has a heavy performance cost in JS currently.
By “object approach” I just meant “making a bunch of objects with the same shape”. I’ve read that V8 can optimize these into flat structs.
Is immutability part of the proposed standard? And if so, what’s the benefit over using Object.freeze?
Immutability is the whole point of records and tuples. You can read more about them on the proposal: https://github.com/tc39/proposal-record-tuple.
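To answer the Object.freeze question concretely, here is a runnable sketch of the two gaps the proposal closes:

```ts
// Why Object.freeze isn't enough: it's shallow, and frozen objects still
// compare by identity rather than by value.
const a = Object.freeze({ x: 1, inner: { y: 2 } });
a.inner.y = 99;          // succeeds: freeze does not reach nested objects
console.log(a.inner.y);  // 99

const b = Object.freeze({ x: 1, inner: { y: 99 } });
console.log(a === b);    // false: identity equality, unlike records per the proposal
```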
Ah so these are actual persistent data structures. Got it, that makes more sense.
The use of the term "tuple" here is odd; I've only ever seen it used to describe fixed-width, non-homogenous sequences. These look more like immutable lists (though I guess they can serve both purposes).
> I've only ever seen it used to describe fixed-width, non-homogenous sequences.
That's what they are here. Arrays are heterogeneous in JS, and tuples are too. And since they're immutable, their size is fixed.
Technically. I guess what I’m getting at is these are separate use cases that happen to be served by the same data structure in some languages, and “tuple” to me refers to the much less important/powerful use case, which is why I didn’t realize what the term referred to when I first heard about this proposal.
For instance: Rust has tuples that are fixed-length heterogenous sequences, but you can’t actually work with them as sequences in any meaningful way; you can’t map or loop or filter over them. The term “tuple”, to my mind, refers to that “multivalue” or “unlabeled-struct” kind of use case. The fact that we use arrays for that in JS always seemed to me like a historical accident/convenience.
Record and Tuple are actively being worked on: https://github.com/tc39/proposal-record-tuple -- currently Stage 2.
Or how about TCO (tail-call optimization)? Please pretty please!
TCO is something that specific JS engines need to implement. It is implemented in JSC (Safari), but not in V8 or SpiderMonkey. Also see https://v8.dev/blog/modern-javascript#proper-tail-calls.
As this is an engine feature rather than a spec thing, there is nothing me (or any other TC39 delegate) can do.
Oh yes, I see, that's true. Thanks for the explanation.
Proper tail calls are ALREADY part of the spec.
Google and Mozilla simply chose to ignore the spec.
More like Google, Mozilla just waits to see if Google is going to implement it since they can't think for themselves.
It would be so nice to have it though! It could do a lot to popularize recursive problem-solving and function writing, without having to go in and modify these functions to make them stack-safe.
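For illustration, a sketch of what proper tail calls buy you (engine behavior noted in the comments):

```ts
// A call in tail position: with proper tail calls (part of the ES2015 spec),
// this runs in constant stack space.
function sum(n: number, acc = 0): number {
  if (n === 0) return acc;
  return sum(n - 1, acc + n); // tail call: nothing left to do after it returns
}

// In JSC (Safari) this succeeds; in V8/SpiderMonkey it overflows the stack:
// sum(1_000_000);
```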
What are your thoughts on the do expressions proposal? Cause I think they'd be awesome for cleaner scoping of temporary variables.
I think they are great! Especially in combination with pattern matching (https://github.com/tc39/proposal-pattern-matching), or when using JSX.
Do you have any insight into how the TC39 process works, and how one might drive this work forwards? I feel like expression-orientation is the main thing missing from JavaScript at this point, and I'd love to contribute. But it's not at all clear to me how to actually do so.
The TC39 process is well-documented, you can check out the stages and guidance for providing input here: https://tc39.es/process-document/
The do expression proposal is currently stage 1, so it's very early. You can check out the issue tracker there to see some of the related discussion. Standards work can be deceptively hard, even for simple things. I think the do expression proposal is a good example of that.
Edit: forgot to link to the issue tracker I referenced above https://github.com/tc39/proposal-do-expressions/issues
Do you have a link to the proposal?
https://github.com/tc39/proposal-do-expressions
Allows you to contain temporary variables to where they are needed, rather than having them remain active through the remainder of the current scope. You could do that with immediately invoked function expressions, but they are too verbose to be really viable for this. You could also use extra functions, but extra functions don't always make sense. Currently I'm frequently doing this:
Without do expressions:

```js
let result;
{
  let tmp = 123;
  result = tmp * 2;
}
```

With do expressions it would become this:

```js
let result = do {
  let tmp = 123;
  tmp * 2;
};
```
Will there be a Deno equivalent of Electron?
Aaron@Deno here, we've been exploring something with the Tauri team but don't have a concrete release on the roadmap since we're focusing on other priorities.
I believe an Electron alternative is an important part of the Deno stack, so hopefully we'll ship a first iteration next year.
Thanks, Aaron. I think people who have been using Node for years are skilled at the old Node ways and have huge inertia in their skills, their own code, and others' Node code. For them, the ideal platform would be Node plus some upgrades. I'm guessing they would rather extend their inertial frame of reference than leave it behind.
Then there are others of us who have been saying no to Node and legacy JS for years. We have no such legacy to maintain and no intention of ever creating any. But some of us (at least I) would reconsider platforms built from scratch on a new TypeScript foundation rather than layered on a pre-ES6 foundation. That would include a Deno-based Electron. You might have more luck converting people who don't use Node than getting Node users to abandon their legacy.
The state of cross-platform desktop apps is terrible. All attention is on mobile, and desktop OS makers have almost zero interest in supporting cross-platform desktop apps. (MS cares a little more than zero, Apple less than zero and barely tolerates their own Mac-only developers.) Only something browser/Chromium based seems realistic for the next few years.
On the server, there are a lot of alternatives to Node that are considered better by (and very popular with) large segments of the market. Deno will be one of them, I think. But for cross-platform desktop apps, Electron would be rejected completely if the alternatives weren't so bad and unlikely to get better. A better Electron, despite its inherent problems, could end up more popular than server-side Deno. Just a thought.
Glad to hear this is still on the table! I have been very keenly watching this space waiting for this to land here :)
Why not PWAs?
As modern JS gets closer to TypeScript in terms of new methods, the latest syntax, etc., what is the future of TS? Will it be there just for type checking?
Yes.
Could you imagine some JS proposal adding the ability for engines to ignore TS-like type annotations, so we don't even have to strip them?
This could help development heavily, also making the browser and Deno nearly identical environments.
Yes, this would be great, and it will probably happen eventually. This is something we want to work on soon - expect something in the coming weeks.
Since TS/Deno has native support for JSX syntax, do you think Browsers will eventually support it as well?
No, I don't think so. JSX is too proprietary and not specified well enough. It is also rather ambiguous.
If you want a "no compile" JSX:
```jsx
const x = <div color="red">hello</div>;
// is the same as
const x = h("div", { color: "red" }, "hello");
```
This is more true as a mental model/to the type system, but slightly more complicated when compiled.
First, there’s the “new JSX transform”, which involves auto-imports, has a different function signature, and defines fallback behavior for certain circumstances.
Second, JSX is only specified as a syntax extension. Some implementations—like SolidJS and its underlying dom-expressions compiler—don’t compile to hyperscript at all.
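For comparison, a sketch of what the new transform emits for the same element, assuming React's automatic runtime as the JSX implementation:

```ts
// Emitted by the compiler, not written by hand (React 17+ automatic runtime):
import { jsx as _jsx } from "react/jsx-runtime";

const x = _jsx("div", { color: "red", children: "hello" });
```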
It’s always been there just for type checking, hasn’t it?
Do you have discussions about not adding too many features to JavaScript? It's already quite complex for beginners.
This is always a consideration when adding new features. It is however also important to keep the language up to date with other modern languages. If you don't innovate, you die. There is always a cost/benefit calculation to be made.
Hey luca,
Huge fan of the achievements that Deno has made in recent years. Several questions:
How do you aim to promote Deno as a viable alternative to Node, considering its significant network effect and legacy?
Do you believe that Deno's 'more sane' defaults for security will appeal to developers in the long term? Do you think that the front end community will be receptive to these defaults?
Choosing TS as your language de jure, do you think that you will alienate any dyed in the wool JS devs? Can we expect that TS is now effectively the superset of JS going forwards?
Do you believe Deno's lack of support for NPM style package management will result in cleaning up the frontend community's over reliance on 'leftpad' style packages? Do you think that Deno's approach to dependencies fosters a more considered approach to transitive dependency bloat?
Again, huge fan of Deno, and happy to hear about this announcement.
> How do you aim to promote Deno as a viable alternative to Node, considering its significant network effect and legacy?
This is a great question, but not one I can answer in a small HN comment :-). I may write a blog post about it one day. The core of the argument is that Deno can save you an insane amount of time / discussion (OOTB linting, formatting, testing, standard library, etc). It aims to unify the ecosystem into a single style, like in Go.
> Do you believe that Deno's 'more sane' defaults for security will appeal to developers in the long term? Do you think that the front end community will be receptive to these defaults?
I think many developers do not care about permissions, and also will not in the future. This is a problem, but not something that can be tackled overnight. Security is often not emphasized enough in our industry unfortunately. Because of this I think sane defaults and opt ins are good - they push people to think about security at the most basic level. Maybe the log4shell attack also shows people that it is a good idea to sandbox server side scripts aggressively (something we have been pushing for), to prevent large scale system takeovers through a single vulnerable entrypoint.
> Choosing TS as your language de jure, do you think that you will alienate any dyed in the wool JS devs? Can we expect that TS is now effectively the superset of JS going forwards?
There is work being done on this. I don't have too much to share right now, but expect some updates on this early next year. JS has to evolve to support some form of type annotations first class to stay relevant.
> Do you believe Deno's lack of support for NPM style package management will result in cleaning up the frontend community's over reliance on 'leftpad' style packages? Do you think that Deno's approach to dependencies fosters a more considered approach to transitive dependency bloat?
Maybe, maybe not. I think it is still too early to tell. I do think that so far it is looking like it. People seem to be doing less weird stuff like "leftpad" with Deno so far. Ideally all these little helper modules should just be part of JS directly (hit me up with suggestions!)
> Again, huge fan of Deno, and happy to hear about this announcement.
Thanks, glad you like it :-)
> JS has to evolve to support some form of type annotations first class to stay relevant.
What are you thinking in this regard? Just to define type annotations that can be made but will not necessarily be type-checked at runtime? To have them serve as inputs to the interpreter for optimisations? Or even breaking at runtime if types don't match their annotations?
I don't want to go into details right now. Expect more in a couple of weeks :-)
Personally, I think front-end developers need to step up their game and learn how to develop, or else they have no business developing software, front-end or back-end. This whole idea of bringing in as many developers as possible at the cost of developer competency has been harmful for the developer and tech community.
In my opinion, what has been most harmful for evolution of the software engineering profession is the tendency to blame individual developers for not being infallible instead of fixing chronic failures in tooling, processes, and funding.
The medical profession, aviation, even rail transportation [1] have all progressed past the point where avoidable failures are entirely the responsibility of the individual.
Of course, there was resistance in those fields as well because some considered themselves an "above-average" doctor or pilot who didn't need safeguards, checklists, union rules, or laws. But it empirically improved outcomes.
While I agree in this case: at the edges, there still exist people who just should not be allowed to be doctors or engineers. The main difference with software is that the stakes tend to be drastically lower...
The stakes could be drastically lower or drastically higher depending on what the software is used for.
For example, we might want to make sure software used in avionics, aerospace, weapons systems, voting machines, medical devices, cryptography, power plants, policing, finance, etc is carefully engineered. But I agree with your main point that there's still a baseline of competence necessary -- we need both good tooling and good people.
That reminds me of a lament from a friend at the Software Engineering Institute that the profession missed the boat on the kind of licensure most engineering disciplines have. (That is, any software developer can refer to themselves as an "engineer" without taking any tests, accepting any liability, or meeting any other legal requirements.)
Congrats! Can you extend on your ideas for async iteration?
Mainly pushing proposals like https://github.com/tc39/proposal-iterator-helpers that make it easier to work with (async) iterators.
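As a sketch of the ergonomics that proposal targets for async iterators (the chained form follows the proposal README and had not shipped in engines at the time):

```ts
// What the pipeline requires today: a hand-written loop (runnable).
async function* numbers(): AsyncGenerator<number> {
  yield 1; yield 2; yield 3;
}

async function evensDoubled(): Promise<number[]> {
  const out: number[] = [];
  for await (const n of numbers()) {
    if (n % 2 === 0) out.push(n * 2);
  }
  return out; // [4]
}

// With the proposal's helpers it could become, roughly:
// await AsyncIterator.from(numbers())
//   .filter((n) => n % 2 === 0)
//   .map((n) => n * 2)
//   .toArray();
```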
When cross platform windowed WebGPU on Deno? It would be interesting for many kinds of apps.
We are not sure about windowed WebGPU right now. We will try to come up with some form of "Deno Desktop" next year, but I have no real ETA for that.
Hey Luca! What does
> Better support for explicit resource management
refer to?
Things like https://github.com/tc39/proposal-explicit-resource-managemen.... Essentially better language level support for objects which represent some IO resource that should be reliably closed when a user is done with it. Something like the `defer` statement in Go is really missing from JS.
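A sketch of the direction the proposal explores (syntax and symbol names were still in flux at the time, so treat this as illustrative rather than final):

```ts
class TempFile {
  constructor(readonly path: string) {}
  // Runs automatically when a `using` binding leaves scope, like Go's defer.
  [Symbol.dispose]() {
    console.log(`closed ${this.path}`);
  }
}

function work() {
  using file = new TempFile("/tmp/scratch"); // hypothetical path
  // ... use file here ...
} // file[Symbol.dispose]() is invoked here, even if an exception is thrown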
What do you think of more *-Scripts built on top of JS continuing to appear? Are you not afraid of TypeScript following CoffeeScript into obscurity because the WebDev community keeps chasing the next new thing?
All this while vanilla JS keeps very energetically absorbing new features from *-Scripts, thanks to TC39 seemingly intentionally picking them?
No, I think this will eventually stop. JavaScript is getting more mature. I would not be surprised if TypeScript syntax becomes legacy in a few years, because JS will have caught up.
Is there some proposal for making the type syntax valid? No type checking just JavaScript parsing code with types and ignoring it.
No such proposal exists right now, but this will be a point of immediate focus for me.
How will Deno react if that happens?
Deno will behave just like Chrome or Firefox would: we ignore the checks when running your code. We would have a `deno check` subcommand to perform a typecheck. You could optionally run this automatically before a `deno run` by passing a `--check` flag.
Congrats on your new role! Currently, I feel like addition to JavaScript is a lot slower than new CSS features, for example. What do you think are the chances that JS will one day get a larger batch of STL functions, instead of about a dozen each year?
I hope the pace will accelerate. The real question is what functions we need, though. Some candidates that I would love to see are better helper functions on iterators, and Uint8Array<->base64/hex. Most of the "standard library" in most languages is related to IO, which for JS is dependent on the host (the web, Deno, Node), so not something TC39 will touch directly. Do you have ideas for standard library functions that you think are missing?
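As one example, here is a sketch of the Uint8Array-to-hex helper that currently gets hand-rolled everywhere:

```ts
// What we write by hand today, absent a standard helper:
function toHex(bytes: Uint8Array): string {
  return Array.from(bytes, (b) => b.toString(16).padStart(2, "0")).join("");
}

console.log(toHex(new Uint8Array([0xde, 0xad, 0xbe, 0xef]))); // "deadbeef"
```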
'enumerate', as in Python.
'clamp' and 'sortBy', as in Lodash.
Set methods for union, intersection, difference, etc.
Re set operators: For completeness' sake there would have to be a way to specify how set elements are to be tested for equality.
'range' a-la python
Range a-la Python is a generator and is pretty straightforward in JS as-is. There is no need for a complicated solution and/or inclusion in the stdlib, imo:
```js
function* range(x, y) {
  while (x <= y) { yield x++ } // or '<' if you want a half-open one
}

for (let n of range(3, 5)) {
  console.log(n)
}

array = [...range(3, 5)]
console.log(array)
```

It's also straightforward to implement in Python. It is common enough to be standard so people won't reinvent the wheel. And it should be available to people learning the language before they learn generators.
Array.prototype.indexOf is also pretty simple as are many core functions.
Immutable data structures are on the agenda over in https://github.com/tc39/proposal-record-tuple and I would vote for that in Deno while I have the chance.
Yup, we'll ship them once they go Stage 3/4 (when Chrome ships them in stable).
I often miss map, reduce and filter on iterators, though this is already a stage 2 proposal: https://github.com/tc39/proposal-iterator-helpers.
This is definitely something I am interested in pushing too. Iterators have not gotten the love they deserve.
That's great to hear! Congratulations and thank you for your work.
I miss being able to create a hash from an array in a quick way. You always end up with
```js
let bar = [{id: zz, }, {id: y},...]
let foo = {}
bar.forEach(v => foo[v.id] = v)
```
I'd love to have something like:
```js
let foo = bar.toMap(v => [v.id, v])
```
```js
const foo = bar.reduce((hash, item) => { hash[item.id] = item; return hash; }, {});
```
new Map([[k1,v1], [k2, v2]]) can do that. Or Object.fromEntries with the same argument.
Thanks!
You still need to create the middle data structure ( [[k1,v1], [k2, v2]] ), but it is an improvement :)
Ah yes, true, you always need to map the things ID to the thing itself, so that middle data structure is actually quite annoying :)
```js
let foo = Object.fromEntries(bar.map((v) => [v.id, v]));
```

It is not necessary to create any extraneous data structures to do OP's one-liner. This creates n+1 new arrays that must be garbage-collected.
Object.fromEntries takes iterable, so if you have a map from a handy iterator library you could get rid of the parent array returned from `bar.map`.
```js
import { map } from "my-iter-lib";
Object.fromEntries(map((v) => [v.id, v], bar));
```

Or with the proposed iterator-helpers:

```js
Object.fromEntries(
  Iterator.from(bar).map((v) => [v.id, v]),
)
```

However, the inner arrays are harder to get rid of... In fact, even OP's one-liner defines the inner arrays. I would hope there are some engine optimizations, though, which could minimize their footprint.

Please see my other comment in this thread for the `reduce`-based solution that requires no extra data structures. They're not that hard to get rid of!
'zip' a-la python
I have often needed this too. Noted.
'reversed' a-la python
Currently in stage 2 as `Array.prototype.toReversed`: https://github.com/tc39/proposal-change-array-by-copy
Nice, I didn't know about that, but in any case note that IMO it should be part of the Iterable protocol. Also note that the proposal above creates a copy on reverse, AFAICT.
Please have someone on your team take a good hard look at the current proposal-pipeline-operator: https://github.com/tc39/proposal-pipeline-operator
It started as a Function Composition proposal (using the pipe operator |>) but after a change of leadership it has turned into something much different. We might need another perspective on the current trajectory of this proposal, as in its current form it seems to many in the community it might take JS in the wrong direction.
Thanks!
If you want feedback: I do not find the |> versions much (if at all) easier to read. The noise is as high as in regular function-based chaining:

```js
await chain(
  Object.keys(envars)
    .map(envar => `${envar}=${envars[envar]}`)
    .join(' '),
  text => `$ ${text}`,
  text => chalk.dim(text, 'node', args.join(' ')),
  colored => console.log(colored),
)
```

I don't want to sound too negative, but this seems like a pointless extension that only brings the burden of supporting some syntactic flavor.

The React example is special, though. It feels very helpful in an expression-only context, but in-array ifs and fors would do better there:

```jsx
<ul>
  {[
    for (v of vs) {
      const text = foo(v)
      yield <li>{text}</li>
    },
    if (vs.length == 0) {
      yield <li>No items</li>
    },
  ]}
</ul>
```
Something I’d like to see, in browsers, Cloudflare Workers, Deno, etc: explicit network firewall in the software stack.
An example with Workers, one script might only need to fetch from Backblaze. I’d like to set their host as a whitelisted address, and so even if a log4j type vuln happens, it can’t go anywhere except Backblaze.
I think this could even work in browser-land? If you don’t need to pull in any resources outside the original host, deny any fetch made unless it’s added to a whitelist. For browsers this would need to be opt-in for backwards compatibility, but an ideal state would be opt-out (to allow all).
You want a Content Security Policy[0]
[0]: https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/Co...
Ah yes, I forgot about that on the browser side, but is that a thing server-side?
Deno has permissions to do this: https://deno.land/manual@v1.11.3/getting_started/permissions. Deno Deploy (our serverless offering) has no support for permissions yet (still in beta), but we are expecting this to happen soon.
Nice!
FWIW, in Cloudflare Workers, a log4j-type RCE vulnerability would be impossible because Workers does not allow dynamic code loading (eval() and similar are disabled).
Of course, a lesser form of the vulnerability -- data leaks rather than RCE -- would still be possible. I agree that being able to restrict outbound traffic would be useful to mitigate that.
As a hack that works now, you could monkey-patch `fetch()` to intercept calls and deny them based on URL.
(I'm the tech lead of Cloudflare Workers.)
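For anyone curious, that monkey-patch could look roughly like this (the allowlisted host is hypothetical):

```ts
// Wrap fetch with a hostname allowlist; everything else is rejected.
const allowedHosts = new Set(["f002.backblazeb2.com"]); // hypothetical host

const realFetch = globalThis.fetch;
globalThis.fetch = ((input: RequestInfo | URL, init?: RequestInit) => {
  const url = new URL(input instanceof Request ? input.url : String(input));
  if (!allowedHosts.has(url.hostname)) {
    return Promise.reject(new Error(`fetch blocked: ${url.hostname}`));
  }
  return realFetch(input, init);
}) as typeof fetch;
```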
Thanks for thinking about this Kenton. I agree the data leak is more the concern here, or even accidental use as a ddos attack agent. The scenario is an import of something like worktop, if worktop released a malicious version.
Monkey patching is an option of course, but a native solution would be nice.
I think it's probably worth clarifying whether you mean an ACL or a firewall here. The former seems more feasible given that it's stateless and the latter is not, at least conventionally. Those implementation details matter here, I think.
But speaking more broadly, do you have any examples of this kind of behavior being defined at the language specification level (and not in a platform API)? I can't think of any presently.
It seems problematic for a number of reasons, but if there's other examples to work backwards from that might be helpful for me to grok how this would work in a general sense.
This is great news! Good luck, Luca!
> Better support for explicit resource management
+1
Since everyone is making feature requests, I'd like to point out `ArrayBuffer.transfer`[1] -- ability to effectively move data without copying would do wonders for low-level/high-performance code in JS.
Yeah, this is something we have been thinking about too. Something else along those lines would be read-only buffers, and with that a copy-on-write operation for buffers. They could result in a significant speedup for many operations.
* a copy-on-write copy operation of buffers (that returns a read only buffer)
I'm surprised not to see Cloudflare amongst the TC39/ECMA members, considering their current downline entirely depends on JavaScript.
> More extensive standard library functions for (async) iteration
Great news! I wrote an open source library called axax that adds a number of utility methods to async iterators - map, filter etc.
I think having them as part of the language would be awesome.
Standardising async cancellation would be neat too if Deno wants a challenge....
I think async cancellation is pretty well covered by `AbortSignal` now. It is a Web API (not JS), but it is supported in all major runtimes.
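A minimal sketch of that pattern, using only the standard AbortController/AbortSignal API:

```ts
// Abort a fetch if it takes longer than the given timeout (timing arbitrary).
async function fetchWithTimeout(url: string, ms: number): Promise<Response> {
  const controller = new AbortController();
  const timer = setTimeout(() => controller.abort(), ms);
  try {
    // fetch rejects with an "AbortError" DOMException once abort() fires.
    return await fetch(url, { signal: controller.signal });
  } finally {
    clearTimeout(timer);
  }
}

// Usage (hypothetical URL):
// const res = await fetchWithTimeout("https://example.com/slow", 1000);
```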
> Better support for non-JS assets in the ES module graph
Whatever happens please never give into any misguided pushes to support commonjs/amd/umd or any of the other non-standardised disaster module formats that cause Node and npm etc to be so painful! It's only very recently that modern build tools are managing to overcome such poor foundations...
No no, this is about things like importing WASM through the `import` keyword, or referencing assets statically through syntax:
Asset References: https://github.com/tc39/proposal-asset-references
Alternative module reflections (wasm imports): https://github.com/tc39/proposal-import-reflection
What are the problems with commonjs/amd/umd? I'm only vaguely familiar with the JS ecosystem. Is it mainly that the dependencies can't be statically analyzed?
Yeah I have been happy with require/commonjs for the last ~10 years, I'm not sure why suddenly it seems to be a problem.
The main problem to me is this push to this ESM thing, which I don't know what it brings me. I understand it's a frontend thing, so I'm not sure why Node.js and npm need to be impacted.
ESM is not a frontend thing. ESM is the standardized way of doing module imports in JS. It has a lot of benefits for server side developers too:
- It has language syntax for importing and exporting instead of relying on an implicit global.
- It is asynchronous, allowing for top level await.
- It is reliably statically analyzable.
- Because it is asynchronous, module asset loading can happen in parallel which can mean a significant startup speedup for larger projects.
- It is standardized, so it behaves the same across Node, Deno, the web, bundlers, linters, and other tooling.
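A minimal sketch of two of those points together, import syntax plus top-level await (assuming Node's fs/promises API and a local config file):

```ts
// ESM: statically analyzable import syntax instead of a runtime require().
import { readFile } from "node:fs/promises";

// Top-level await works because ESM loading is asynchronous.
const config = JSON.parse(await readFile("./config.json", "utf8"));
export { config };
```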
Exactly this and I'm a little disappointed, though not surprised that the response was "well it works for me so what's wrong with it? it's only for frontend". It's actually part of the language.
You also forgot another super important point:
- It actually works. How many times have we all run into some weird error or stack trace because node/npm/babel/webpack/typescript/jest/regenerator-runtime/babel-core/quantum-flux-inverter tower of cards collapses when a commonjs module somehow leaks in from node_modules?
Precisely, aren't babel & webpack front end issues?
No
Why would you babel or webpack your nodejs backend code? (Genuinely asking, I have never needed to do it in the last 10 years, so I assumed this was all about browser compatibility and not having to download thousands of small file over the internet respectively)
Oh, 100% agree on that point :D
I'm not convinced :)
Admittedly the largest project I'm working on is mostly in TS now, and I probably get most of the benefits of the import syntax even if require is used under the hood (including linters, which I think worked even before we switched to typescript).
I wouldn't worry about that. Node is moving to ECMAScript modules.
Node is not "moving to ESM", they don't even plan to deprecate CJS. They will continue to support both for the foreseeable future. Their docs still use `require` in a lot of places even though ESM has been enabled on stable releases for almost 2 years.
I heard there was a movement internally for a while to block adoption of ES modules because the core team members "didn't like ES modules" (again, what I saw tweeted a long time ago and don't remember the source). So that certainly can't be helping adoption of it. This is one of the reasons Deno is superior - it makes use of language features not a badly designed module format.
PRs on the docs are always appreciated.
Just adding some interesting info: There is an ECMAScript Optional Static Typing Proposal.