Deno Joins TC39

deno.com

415 points by wongmjane 4 years ago · 157 comments

disease 4 years ago

> As TypeScript is a core part of the Deno ecosystem, we are also very interested in pushing for even closer alignment of TypeScript and JavaScript in the future.

I've wondered why the frontend community hasn't gotten together and said, "The next version of JavaScript - is TypeScript!" I've been using TypeScript for five years professionally now and cannot overstate how much easier it has made large frontend (not just Web, but mobile and desktop) projects. Surely enough thought and work has been put into TypeScript to make it the next standard.

  • munificent 4 years ago

    If TypeScript becomes the next version of ECMAScript, then browsers will have to support it. The day that TypeScript is supported directly on end-user machines instead of going through a developer-controlled compiler pipeline is the day that almost all evolution of TypeScript stops.

    There's not really much positive value out of having browsers run TypeScript natively. The main feature is static checking, but static checking doesn't benefit end users. When I go to my bank's website, if their front-end code has a type error, it's not like I can fix it right then and there.

    The type system is mostly a developer-time feature, so it makes sense to leave it out of the core runtime environment.

    In other words, think of JavaScript/ECMAScript more like the architecture that browsers support. That needs to be slow-moving since it's deployed across billions of devices. TypeScript then just targets that.

    Adding TypeScript directly to JavaScript would improve the world to about the same degree that adding C++ features directly to x64 machine code would.

    • moron4hire 4 years ago

      This is not true. There are a number of ways that TypeScript's type-checking can be subverted at runtime, particularly when dealing with APIs that return JSON. You have to trust that the API has returned exactly what you are expecting, or write your own very detailed validation scripts. It's similar to the sorts of testing one would do in vanilla JavaScript without TypeScript, but now executing all the time at runtime.

      As a developer, in the case of an API change that violated my assumptions, I would personally prefer my applications to fail-hard at the point of the API call, rather than to have my scripts run merrily on and only error in some other code far away from the root-cause when one of those assumptions fails.

      However, I have a lot of hope that runtime type checking based on auto-code generation around TypeScript's interfaces could be developed in a future version of TypeScript.
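      Until such generation exists, the boilerplate looks something like the hand-written type guard below (a sketch; `Point`, `isPoint`, and `parsePoint` are made up for illustration):

```typescript
interface Point {
  x: number;
  y: number;
}

// Hand-written type guard: the "very detailed validation script"
// you currently have to maintain alongside the TS type.
function isPoint(value: unknown): value is Point {
  return (
    typeof value === "object" &&
    value !== null &&
    typeof (value as Record<string, unknown>).x === "number" &&
    typeof (value as Record<string, unknown>).y === "number"
  );
}

// Fail hard at the API boundary instead of far from the root cause.
function parsePoint(json: string): Point {
  const data: unknown = JSON.parse(json);
  if (!isPoint(data)) {
    throw new TypeError("API response is not a Point");
  }
  return data;
}
```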

      • kipple 4 years ago

        This would be a big (but exciting) departure from the current TS goal of "Impose no runtime overhead on emitted programs." [1]

        Recently, I've been using Zod [2] and find it to be a satisfying equivalent: you define a schema, and then you get both a TS type AND a JS parser/validator (which works as a TS typeguard).

        [1] https://github.com/microsoft/TypeScript/wiki/TypeScript-Desi... [2] https://github.com/colinhacks/zod

      • munificent 4 years ago

        TypeScript is an optionally typed language. It's core to the design of the language that the type system is purely static and doesn't come into play at runtime.

        You could argue that that's not the best kind of language for users. I wouldn't disagree. I work on Dart which used to be optionally typed but now has a fully sound type system with runtime checks.

        But that's orthogonal to whether browsers should support TypeScript directly. If they supported a statically typed language that had the runtime checks to be sound, that might be a great language, but it wouldn't be TypeScript.

        • moron4hire 4 years ago

          That's a strangely puritanical viewpoint.

          I wouldn't say it's core to the design of the language. I'd say it's a key design decision for the development of the compiler, but those are two different things.

          Also, I don't understand what you mean by "it wouldn't be TypeScript". Languages change. They change all the time. Did adding nullish coalescing before it became available in JavaScript make TypeScript no longer TypeScript?

          It's also not true that TypeScript is purely static and doesn't have any runtime component. There are a number of helper functions that TypeScript can optionally emit, so there is some precedent for having runtime-oriented code generated by TypeScript, rather than just eliding type information after successful static checking.

          So perhaps there could be a syntax for imposing runtime-checking as an optional element. Something like:

              interface Point {
                  x: number;
                  y: number;
              }
          
              async function getPointFromAPI(): Promise<Point> {
                  const request = await fetch("/api/points/current");
                  const point = await request.json<Checked<Point>>();
                  return check point;
              }
          
          Type `Checked<T>` would signal to the compiler that type information for T needs to be made available at runtime, and the `check` keyword would perform the check and "unwrap" the type to a bare reference to T.

          I don't know what it would look like, but it would be a huge value add.
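          Something close to this is already expressible with TypeScript's assertion functions (`asserts value is T`), though the check body still has to be written by hand rather than generated from the type. A sketch (the endpoint and `assertPoint` are illustrative):

```typescript
interface Point {
  x: number;
  y: number;
}

// `asserts` narrows the static type once the call returns normally;
// the runtime check itself is still hand-written boilerplate.
function assertPoint(value: unknown): asserts value is Point {
  const v = value as Record<string, unknown> | null;
  if (typeof v !== "object" || v === null ||
      typeof v.x !== "number" || typeof v.y !== "number") {
    throw new TypeError("not a Point");
  }
}

async function getPointFromAPI(): Promise<Point> {
  const response = await fetch("/api/points/current"); // hypothetical endpoint
  const data: unknown = await response.json();
  assertPoint(data); // plays the role of the proposed `check` keyword
  return data;       // statically typed as Point from here on
}
```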

          • munificent 4 years ago

            > I wouldn't say it's core to the design of the language.

            It is absolutely core to the language. TypeScript's core value proposition is that you can take vanilla JavaScript and use it from TypeScript without any overhead, incrementally migrate to TS, or maintain a heterogeneous codebase as long as you want.

            If you take away seamless zero-overhead JS interop, the language you have is radically different from TypeScript. To the degree that any language has any identity at all, that would be a pretty fundamental change in its identity. Like taking objects from Java or pointers from C.

            To the best of my knowledge, no one has figured out how to have a language that allows mixing dynamic and static typing without either massive runtime overhead or giving up soundness. You basically have three options:

            1. Allow dynamic types to flow into statically typed code

            2. Soundness inside the statically typed code

            3. Tolerable runtime overhead when using dynamic code from static code

            But you only get to pick two. TypeScript, Dart 1.0 and other optionally typed languages give you 1 and 3 at the expense of 2. Dart 2.0 and other statically typed languages give you 2 and 3 at the expense of 1. Gradually typed languages like Typed Racket give you 1 and 2 at the expense of 3 (and are rarely used in practice because of it).

            You're asking for TypeScript to just add 2. People have been trying to figure out how to get all three for decades but no one has succeeded yet [1].

            [1]: https://blog.acolyer.org/2016/02/05/is-sound-gradual-typing-...

            • moron4hire 4 years ago

              No, I'm not. You're way overthinking this and your attitude is weirdly gatekeepy. I'm asking TypeScript to implement a shortcut feature for the tedious, boilerplate code we already write to use type guards to inspect objects from APIs before passing them on.

              • munificent 4 years ago

                I'm not sure where the "gatekeeping" accusation comes from. In your original comment, you wrote:

                > There are a number of ways that TypeScript's type-checking can be subverted at runtime, ... As a developer, in the case of an API change that violated my assumptions, I would personally prefer my applications to fail-hard at the point of the API call

                I interpreted that to mean that you would prefer TypeScript's type system to be sound: If a function expects a Foo, you want a guarantee that you'll never get into the body of the function at runtime with an argument whose type isn't Foo.

                I can understand that that seems like a fairly simple request. But when you dig into soundness, you discover that it is anything but. Optionally-typed languages like TypeScript are unsound by design because soundness is a difficult requirement with very severe trade-offs around interop, runtime performance, and usability. Making TypeScript sound would give you a language that felt very little like TypeScript does today.

    • spankalee 4 years ago

      This is also why Deno running TypeScript directly is a bad idea.

      • munificent 4 years ago

        The difference is that a developer has a lot more control over what version of Deno their server-side app runs on than over what version of a browser their client-side app runs on.

        • spankalee 4 years ago

          This isn't really true in the wider ecosystem. Yes, the app developer does, but the library developer does not. As Deno adopts newer versions of TypeScript and config options, libraries can fail to compile. This is compounded by the lack of a package manager, so references to specific versions and CDNs are hardcoded into dependents.

          All with essentially no benefit over running plain JS with associated typings.

      • Soremwar 4 years ago

        They don't, though. They transpile it behind the scenes.

        • spankalee 4 years ago

          Same thing with the same problems. The difference being that Deno isn't going to have the same no-breaking-changes policy that the web does.

  • msoad 4 years ago

    Types in TypeScript are great but if JavaScript ever wants to add types it has to be something like a real programming language types in which you can use types in runtime as well. Like in a catch clause I can assert the type of the error and do things with it once that assertion is done. Many other useful things when types space and runtime space are not totally separate.

    ES4 was the first shot at adding types to JavaScript which failed due to how big the ambitions were. I'm not sure if there is any more appetite for adding types to JS tho.

    In very very serious big applications like a 3D editor you can fall back to WebAssembly and use your favorite typed language. For smaller apps TypeScript is good enough. This way JavaScript stays simple and lean.

    • bvaldivielso 4 years ago

      "real programming language types"

      There are a few languages where types only exist (for the most part, though with exceptions and hacks here and there) at compile-time, like Rust, C++ or even Haskell IIRC.

    • anderskaseorg 4 years ago

      JS objects already have runtime types, and you can use them in catch clauses.

          try {
            …
          } catch (e) {
            if (e instanceof FooError) {
              …
            } else if (e instanceof BarError) {
              …
            } else {
              throw e;
            }
          }
      
      There was once a Mozilla extension (https://web.archive.org/web/20200111091805/https://developer...) that allowed you to abbreviate the above to

          try {
            …
          } catch (e if e instanceof FooError) {
            …
          } catch (e if e instanceof BarError) {
            …
          }
      
      It was never standardized, but since it’s just syntactic sugar, if there were demand, it could be standardized without bringing in an entirely new type system.

      There would be at least two problems with using TypeScript types for this. Firstly, TypeScript types are unsound in a number of intentional and unintentional ways, meaning that it’s possible for the compile-time and runtime types to disagree, even in fully typed code. Secondly, TypeScript can express many types that cannot be tested at runtime; for example, there is no way to tell whether a function accepts a string as an argument, or to guess the inner type of an empty array.
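      A classic example of intentional unsoundness is array covariance: the compiler accepts code whose runtime values no longer match their static types.

```typescript
interface Animal {
  name: string;
}
interface Dog extends Animal {
  bark(): string;
}

const dogs: Dog[] = [{ name: "Rex", bark: () => "woof" }];
const animals: Animal[] = dogs;  // allowed: arrays are covariant in TS
animals.push({ name: "Felix" }); // also allowed, but Felix is no Dog

const last = dogs[dogs.length - 1];
// The static type says Dog, but the runtime value has no bark():
console.log(typeof last.bark); // "undefined"
```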

    • javajosh 4 years ago

      > ...use types [at] runtime...

      Two things. First, TS conceives of itself as having no runtime component. If it did, I think people (including the TS devs) would be more confused.

      Second, I'd say rather that we need a runtime type system. In fact I've tried my hand at writing one in the most minimalist way possible, and have been working on it recently [1]. The type system is explicit in that a type is a JSON-like object, similar to JSON Schema, but with 100x less code.

      [1] https://github.com/javajosh/simpatico/blob/master/friendly.h... This is effectively the test harness for the module.

    • jonny_eh 4 years ago

      > it has to be something like a real programming language

      It's a nice feature request but there's no one feature that makes a programming language "real".

  • AprilArcus 4 years ago

    TypeScript is great and everything, but Microsoft owns the standard and the single functioning checker, and haven't published a specification or even a grammar.

    • TimTheTinker 4 years ago

      The TypeScript checker/compiler is Apache 2.0 licensed, so I'm not sure there's room to complain, unless you disagree with the direction they're taking the project:

      https://github.com/microsoft/TypeScript/blob/main/LICENSE.tx...

      • anderskaseorg 4 years ago

        Nobody disputes that TypeScript is open source, but there’s still a vast difference between a single open-source implementation and a specification that’s suitable for standardization. The implementation inevitably has bugs, and there needs to be a way to decide which bugs are actually “features” that other implementations will need to emulate.

        (Here’s the specification for ECMAScript, for example: https://tc39.es/ecma262/)

        To be clear, I am not bashing Microsoft here—just pointing out a reason that TypeScript can’t be declared “the next version of JavaScript”, which is the context of this thread.

  • jamil7 4 years ago

    Would expanding wasm capabilities be a better long-term option here instead of supporting TS directly? It would open the browser up to a whole lot of languages, including TypeScript (AssemblyScript already exists).

  • conaclos 4 years ago

    TypeScript is still experimenting with its type system and it has numerous edge cases. I would be more in favor of specifying and supporting a minimal subset of TypeScript rather than the entire language. Basically, we could start by enabling type annotations, enum declarations, and type aliases.

  • jollybean 4 years ago

    TS design goals and non-goals [1]

    It's very much designed to be a layer of abstraction over JS.

    I am as much a fan of TS as anyone; however, it's not likely ever going to be the thing that it's really close to being.

    [1] https://github.com/Microsoft/TypeScript/wiki/TypeScript-Desi...

  • dcgudeman 4 years ago

    Because then TypeScript's ability to be agile would be destroyed. JavaScript can't be changed as easily as TypeScript.

lucacasonato 4 years ago

Hey - I am Luca Casonato, Deno's new delegate at TC39. I am happy to answer any questions you all might have :-)

  • benjaminjackman 4 years ago

    Hi Luca - congratulations! I have a quick question: have there been any proposals to add Subresource Integrity hashes (https://developer.mozilla.org/en-US/docs/Web/Security/Subres...) to the import syntax? I think this affects Deno more acutely than other projects, since Deno supports / (encourages?) directly importing from a URL with a precise version number encoded in the URL. It would be nice to add another layer of safety on top and be able to assert that the module received is exactly as expected. Thanks!

    • gecko 4 years ago

      You can do that right now, albeit not directly in the import: it's done via an explicit `lock.json` file (https://deno.land/manual@v1.16.4/linking_to_external_code/in...). I'm tempted to agree that having some ability either to directly import, or even just to have that better integrated (right now, you have to ask for the lockfile to be used and pass an explicit path), would probably be a good idea.

      • benjaminjackman 4 years ago

        Ok that makes a lot of sense, the link you shared helped explain things in denoland quite well (and reminds me that I really need to give it another go).

        From the link I see this example (in `src/deps.ts`):

            // Add a new dependency to "src/deps.ts", used somewhere else.
            export { xyz } from "https://unpkg.com/xyz-lib@v0.9.0/lib.ts";

        Then essentially a create/update lock-file command is run, the lock file is checked into version control, and another developer checks it out and runs a cache reload command.
        
        As you mentioned in practice it's definitely a bit too manual, but should be one of those things that can be automated so it's not the end of the world.

        Having said that, I think having it in the import syntax would provide a few benefits:

        1. No extra steps need to be run & hopefully IDEs could auto-complete the hash.

        2. Would hopefully be standardized with the browser allowing for native browser support as well (or perhaps lock.json could be standardized with something like import maps)

        3. Having it right there provides an extra level of assurance that the integrity hash is going to be used (especially in files intended to be used in the browser and in deno ... not sure how common that is though).

    • lucacasonato 4 years ago

      I am not aware of any specific proposals right now. There was some talk a while back about supporting SRI hashes inside of an import map, but that sorta dissolved. For Deno at least you can use a `lock.json` file with the `--lock` and `--lock-write` flags: https://deno.land/manual/linking_to_external_code/integrity_...
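      For reference, the flow with those flags looks roughly like this (a sketch based on the flags mentioned above; the file paths are illustrative):

```shell
# Record integrity hashes for the current dependency graph:
deno cache --lock=lock.json --lock-write src/deps.ts

# Later (e.g. on another machine), verify downloaded modules
# against the recorded hashes before running:
deno run --lock=lock.json src/main.ts
```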

    • spankalee 4 years ago

      SRI really, really should be out of band, otherwise SRI digest changes invalidate the entire module graph (and import cycles become a major pain).

      I think they're much better as a part of import maps, and later fetch maps so they can apply to non-JS resources like CSS.

      • benjaminjackman 4 years ago

        I don't think it's clear cut that it should be out of band 100% of the time. I think there are use cases where inline is useful.

        Cycles are definitely an issue, and I'm not sure there is a way to work around that except to pull the cycles apart (which may not always be possible, but is usually not a bad programming practice when it is). At the library level, though, libraries tend not to circularly import each other. If it's being done inside a project, the build tool would be generating the hashes, so the module graph being invalidated may not be a hassle (or even necessarily a bad thing). In that case the tool could modify the files, or it could generate a lock file / import map (which, I agree, has the benefit at that level of not forcing every source file to be transpiled). Some of that probably still has to happen for module reloading anyway, e.g. search parameters appended to the module path for cache invalidation during development, as vite.js does. And realistically, given the nature of the ecosystem, some transpilation is going to have to happen either because of .ts or just because of browser differences.

        For a top-level deps.ts / deps.js file pattern there probably won't be any cycles. That pattern is to declare a root deps.js file for your project that locks things down and re-exports from third-party libraries, and then to use those exports as the basis for other imports. For this pattern I think SRI would be extremely helpful and add enough benefit to justify it (even though SRI may not be used in the cases you listed).

        Also for smaller projects or main modules having the SRI hash inline is really helpful.

  • hajile 4 years ago

    Are you going to push for records and tuples?

    Eich was pushing for them in 2011 and they still haven't arrived.

    https://brendaneich.com/2011/01/harmony-of-my-dreams/

    • frutiger 4 years ago

      He doesn’t need to, the authors of the proposal are on the committee and are planning to see it through.

      The proposal is humming along through the stages at a good pace.

      • lucacasonato 4 years ago

        Exactly. I am very much in favor of them though. They would be a great addition to the language.

        • BrendanEich 4 years ago

          Yes, Bloomberg has a big JS investment due to the Terminal (20MLOC of JS last I heard) and they have employed the records and tuples champions.

    • brundolf 4 years ago

      I haven’t read the full post, but what’s the benefit of adding language support for records and tuples at this point? My understanding is that engines already optimize the objects/arrays version of those concepts pretty well, and TypeScript enforces the semantics on the dev side.

      • Zababa 4 years ago

        > My understanding is that engines already optimize the objects/arrays version of those concepts pretty well

        They don't, using libraries that guarantee runtime immutability has a heavy performance cost in JS currently.

        • brundolf 4 years ago

          By “object approach” I just meant “making a bunch of objects with the same shape”. I’ve read that V8 can optimize these into flat structs.

          Is immutability part of the proposed standard? And if so, what’s the benefit over using Object.freeze?

          • Zababa 4 years ago

            Immutability is the whole point of records and tuples. You can read more about them on the proposal: https://github.com/tc39/proposal-record-tuple.
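            To make the difference from `Object.freeze` concrete: freezing is shallow and frozen objects still compare by identity, whereas records and tuples are proposed to be deeply immutable and compared by value (so `#{ x: 1 } === #{ x: 1 }` would be true).

```typescript
const a = Object.freeze({ x: 1 });
const b = Object.freeze({ x: 1 });
console.log(a === b); // false: frozen objects still compare by identity

const nested = Object.freeze({ inner: { x: 1 } });
nested.inner.x = 2;          // freeze is shallow, so this mutation succeeds
console.log(nested.inner.x); // 2
```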

            • brundolf 4 years ago

              Ah so these are actual persistent data structures. Got it, that makes more sense.

              The use of the term "tuple" here is odd; I've only ever seen it used to describe fixed-width, non-homogenous sequences. These look more like immutable lists (though I guess they can serve both purposes).

              • Zababa 4 years ago

                > I've only ever seen it used to describe fixed-width, non-homogenous sequences.

                  That's what they are here. Arrays are heterogeneous in JS, and tuples are too. And since they're immutable, their size is fixed.

                • brundolf 4 years ago

                  Technically. I guess what I’m getting at is these are separate use cases that happen to be served by the same data structure in some languages, and “tuple” to me refers to the much less important/powerful use case, which is why I didn’t realize what the term referred to when I first heard about this proposal.

                  For instance: Rust has tuples that are fixed-length heterogenous sequences, but you can’t actually work with them as sequences in any meaningful way; you can’t map or loop or filter over them. The term “tuple”, to my mind, refers to that “multivalue” or “unlabeled-struct” kind of use case. The fact that we use arrays for that in JS always seemed to me like a historical accident/convenience.

    • dten 4 years ago

      Record and Tuple are actively being worked on: https://github.com/tc39/proposal-record-tuple -- currently Stage 2.

    • adamddev1 4 years ago

      Or how about TCO (tail-call optimization)? Please pretty please!

      • lucacasonato 4 years ago

        TCO is something that specific JS engines need to implement. It is implemented in JSC (Safari), but not in V8 or SpiderMonkey. Also see https://v8.dev/blog/modern-javascript#proper-tail-calls.

        As this is an engine feature rather than a spec thing, there is nothing I (or any other TC39 delegate) can do.

      • hajile 4 years ago

        Proper tail calls are ALREADY part of the spec.

        Google and Mozilla simply chose to ignore the spec.

        • Scarbutt 4 years ago

          More like Google, Mozilla just waits to see if Google is going to implement it since they can't think for themselves.

        • adamddev1 4 years ago

          It would be so nice to have it though! It could do a lot to popularize recursive problem-solving and writing recursive functions, without having to modify those functions to make them stack-safe.
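          In the meantime, one common stack-safety workaround is a trampoline: each tail call is deferred as a thunk and run in a loop, at the cost of rewriting the function. A sketch:

```typescript
// A thunk either is a final value or a zero-argument step to run next.
type Thunk<T> = T | (() => Thunk<T>);

// Trampoline: run thunks in a loop so the call stack stays flat.
function trampoline<T>(step: Thunk<T>): T {
  while (typeof step === "function") {
    step = (step as () => Thunk<T>)();
  }
  return step as T; // safe: the loop exits only on non-thunk values
}

// "Tail-recursive" sum rewritten in thunk-returning style.
function sumTo(n: number, acc = 0): Thunk<number> {
  return n === 0 ? acc : () => sumTo(n - 1, acc + n);
}

console.log(trampoline(sumTo(1_000_000))); // 500000500000, no stack overflow
```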

  • mschuetz 4 years ago

    What are your thoughts on the do expressions proposal? Cause I think they'd be awesome for cleaner scoping of temporary variables.

    • lucacasonato 4 years ago

      I think they are great! Especially in combination with pattern matching (https://github.com/tc39/proposal-pattern-matching), or when using JSX.

      • nicoburns 4 years ago

        Do you have any insight into how the TC39 process works, and how one might drive this work forwards? I feel like expression-orientation is the main thing missing from JavaScript at this point, and I'd love to contribute. But it's not at all clear to me how to actually do so.

        • spmurrayzzz 4 years ago

          The TC39 process is well-documented, you can check out the stages and guidance for providing input here: https://tc39.es/process-document/

          The do expression proposal is currently stage 1, so it's very early. You can check out the issue tracker there to see some of the related discussion. Standards work can be deceptively hard, even for simple things. I think the do expression proposal is a good example of that.

          Edit: forgot to link to the issue tracker I referenced above https://github.com/tc39/proposal-do-expressions/issues

    • benmccann 4 years ago

      Do you have a link to the proposal?

      • mschuetz 4 years ago

        https://github.com/tc39/proposal-do-expressions

        Allows you to contain temporary variables to where they are needed, rather than having them remain active through the remainder of the current scope. You could do that with immediately invoked function expressions, but they are too verbose to be really viable for this. You could also use extra functions, but extra functions don't always make sense. Currently I'm frequently doing this:

        Without do expression:

            let result;
            {
                let tmp = 123;
        
                result = tmp * 2;
            }
        
        
        With do expression it would become this:

            let result = do {
                let tmp = 123;
             
                tmp * 2;
            }
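        Until the proposal lands, an immediately invoked arrow function gives the same scoping, just more verbosely, as noted above:

```typescript
// Same scoping today, via an immediately invoked function expression:
const result = (() => {
  const tmp = 123;
  return tmp * 2;
})();

console.log(result); // 246
```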
  • SiVal 4 years ago

    Will there be a Deno equivalent of Electron?

    • AaronO 4 years ago

      Aaron@Deno here, we've been exploring something with the Tauri team but don't have a concrete release on the roadmap since we're focusing on other priorities.

      I believe an Electron alternative is an important part of the Deno stack, so hopefully we'll ship a first iteration next year.

      • SiVal 4 years ago

        Thanks, Aaron. I think people who have been using Node for years are skilled at the old Node ways and have huge inertia in their skills, their own code, and others' Node code. For them, the ideal platform would be Node plus some upgrades. I'm guessing they would rather extend their inertial frame of reference than leave it behind.

        Then there are others of us who have been saying no to Node and legacy JS for years. We have no such legacy to maintain and no intention of ever creating any. But some of us (at least I) would reconsider platforms built from scratch on a new TypeScript foundation rather than layered on a pre-ES6 foundation. That would include a Deno-based Electron. You might have more luck converting people who don't use Node than getting Node users to abandon their legacy.

        The state of cross-platform desktop apps is terrible. All attention is on mobile, and desktop OS makers have almost zero interest in supporting cross-platform desktop apps. (MS cares a little more than zero, Apple less than zero and barely tolerates their own Mac-only developers.) Only something browser/Chromium based seems realistic for the next few years.

        On the server, there are a lot of alternatives to Node that are considered better by (and very popular with) large segments of the market. Deno will be one of them, I think. But for cross-platform desktop apps, Electron would be rejected completely if the alternatives weren't so bad and unlikely to get better. A better Electron, despite its inherent problems, could end up more popular than server-side Deno. Just a thought.

      • mattlondon 4 years ago

        Glad to hear this is still on the table! I have been very keenly watching this space waiting for this to land here :)

      • spankalee 4 years ago

        Why not PWAs?

  • jerrygoyal 4 years ago

    As modern JS gets closer to TypeScript in terms of new methods, syntax, etc., what is the future of TS? Will it be there just for type checking?

    • lucacasonato 4 years ago

      Yes.

      • egeozcan 4 years ago

        Could you imagine some JS proposal adding the ability for engines to ignore TS-like type annotations, so we don't even have to strip them?

        This could help development heavily, also making the browser and Deno nearly identical environments.

        • lucacasonato 4 years ago

          Yes, this would be great, and it will probably happen eventually. This is something we want to work on soon - expect something in the coming weeks.

          • ducaale 4 years ago

            Since TS/Deno has native support for JSX syntax, do you think Browsers will eventually support it as well?

            • lucacasonato 4 years ago

              No, I don't think so. JSX is too proprietary and not specified well enough. It is also rather ambiguous.

              If you want a "no compile" JSX:

                  const x = <div color="red">hello</div>;
                  // is the same as
                  const x = h("div", { color: "red" }, "hello");
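              For illustration, a minimal `h` could just build a plain virtual-node object (a sketch; real libraries like React or Preact do considerably more):

```typescript
type VNode = {
  tag: string;
  props: Record<string, unknown>;
  children: (VNode | string)[];
};

// Minimal hyperscript-style factory: no compile step required.
function h(
  tag: string,
  props: Record<string, unknown> | null,
  ...children: (VNode | string)[]
): VNode {
  return { tag, props: props ?? {}, children };
}

const x = h("div", { color: "red" }, "hello");
console.log(x.tag, x.props.color, x.children[0]); // div red hello
```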

              • eyelidlessness 4 years ago

                This is more true as a mental model/to the type system, but slightly more complicated when compiled.

                First, there’s the “new JSX transform”, which involves auto-imports, has a different function signature, and defines fallback behavior for certain circumstances.

                Second, JSX is only specified as a syntax extension. Some implementations—like SolidJS and its underlying dom-expressions compiler—don’t compile to hyperscript at all.

    • brundolf 4 years ago

      It’s always been there just for type checking, hasn’t it?

  • speedgoose 4 years ago

    Do you have discussions about not adding too many features to JavaScript? It's already quite complex for beginners.

    • lucacasonato 4 years ago

      This is always a consideration when adding new features. It is however also important to keep the language up to date with other modern languages. If you don't innovate, you die. There is always a cost/benefit calculation to be made.

  • phist_mcgee 4 years ago

    Hey luca,

    Huge fan of the achievements that Deno has made in recent years. Several questions:

    How do you aim to promote Deno as a viable alternative to node, considering its significant network effect and legacy?

    Do you believe that Deno's 'more sane' defaults for security will appeal to developers in the long term? Do you think that the front end community will be receptive to these defaults?

    Choosing TS as your language du jour, do you think that you will alienate any dyed-in-the-wool JS devs? Can we expect that TS is now effectively the superset of JS going forward?

    Do you believe Deno's lack of support for NPM style package management will result in cleaning up the frontend community's over reliance on 'leftpad' style packages? Do you think that Deno's approach to dependencies fosters a more considered approach to transitive dependency bloat?

    Again, huge fan of Deno, and happy to hear about this announcement.

    • lucacasonato 4 years ago

      > How do you aim to promote Deno as a viable alternative to Node, considering its significant network effect and legacy?

      This is a great question, but not one I can answer in a small HN comment :-). I may write a blog post about it one day. The core of the argument is that Deno can save you an insane amount of time / discussion (OOTB linting, formatting, testing, standard library, etc). It aims to unify the ecosystem into a single style, like in Go.

      > Do you believe that Deno's 'more sane' defaults for security will appeal to developers in the long term? Do you think that the front end community will be receptive to these defaults?

      I think many developers do not care about permissions, and also will not in the future. This is a problem, but not something that can be tackled overnight. Security is often not emphasized enough in our industry, unfortunately. Because of this I think sane defaults and opt-ins are good - they push people to think about security at the most basic level. Maybe the log4shell attack also shows people that it is a good idea to sandbox server side scripts aggressively (something we have been pushing for), to prevent large scale system takeovers through a single vulnerable entrypoint.

      > Choosing TS as your language de jure, do you think that you will alienate any dyed in the wool JS devs? Can we expect that TS is now effectively the superset of JS going forwards?

      There is work being done on this. I don't have too much to share right now, but expect some updates on this early next year. JS has to evolve to support some form of type annotations first class to stay relevant.

      > Do you believe Deno's lack of support for NPM style package management will result in cleaning up the frontend community's over reliance on 'leftpad' style packages? Do you think that Deno's approach to dependencies fosters a more considered approach to transitive dependency bloat?

      Maybe, maybe not. I think it is still too early to tell. I do think that so far it is looking like it. People seem to be doing less weird stuff like "leftpad" with Deno so far. Ideally all these little helper modules should just be part of JS directly (hit me up with suggestions!)

      > Again, huge fan of Deno, and happy to hear about this announcement.

      Thanks, glad you like it :-)

      • Vinnl 4 years ago

        > JS has to evolve to support some form of type annotations first class to stay relevant.

        What are you thinking in this regard? Just to define type annotations that can be made but will not necessarily be type-checked at runtime? To have them serve as inputs to the interpreter for optimisations? Or even breaking at runtime if types don't match their annotations?

    • deepstack 4 years ago

      Personally, I think front-end developers need to step up their game and learn how to develop properly, or else they have no business developing software, front-end or back-end. This whole idea of bringing in as many developers as possible at the cost of developer competency has been harmful to the developer and tech community.

      • asoneth 4 years ago

        In my opinion, what has been most harmful for evolution of the software engineering profession is the tendency to blame individual developers for not being infallible instead of fixing chronic failures in tooling, processes, and funding.

        The medical profession, aviation, even rail transportation [1] have all progressed past the point where avoidable failures are entirely the responsibility of the individual.

        Of course, there was resistance in those fields as well because some considered themselves an "above-average" doctor or pilot who didn't need safeguards, checklists, union rules, or laws. But it empirically improved outcomes.

        [1] https://www.youtube.com/watch?v=A3AdN7U24iU

        • andrewflnr 4 years ago

          While I agree in this case: at the edges, there still exist people who just should not be allowed to be doctors or engineers. The main difference with software is that the stakes tend to be drastically lower...

          • asoneth 4 years ago

            The stakes could be drastically lower or drastically higher depending on what the software is used for.

            For example, we might want to make sure software used in avionics, aerospace, weapons systems, voting machines, medical devices, cryptography, power plants, policing, finance, etc is carefully engineered. But I agree with your main point that there's still a baseline of competence necessary -- we need both good tooling and good people.

            That reminds me of a lament from a friend at the Software Engineering Institute that the profession missed the boat on the kind of licensure most engineering disciplines have. (That is, any software developer can refer to themselves as an "engineer" without taking any tests, accepting any liability, or meeting any other legal requirements.)

  • robertvh 4 years ago

    Congrats! Can you expand on your ideas for async iteration?

  • bartq 4 years ago

    When is cross-platform windowed WebGPU coming to Deno? It would be interesting for many kinds of apps.

    • lucacasonato 4 years ago

      We are not sure about windowed WebGPU right now. We will try to come up with some form of "Deno Desktop" next year, but I have no real ETA for that.

  • Vinnl 4 years ago

    Hey Luca! What does

    > Better support for explicit resource management

    refer to?

  • baybal2 4 years ago

    What do you think of the *-Scripts built on top of JS that keep coming? Are you not afraid of TypeScript following CoffeeScript into obscurity because the WebDev community keeps chasing the next new thing?

    All this while vanilla JS keeps very energetically absorbing new features from *-Scripts, thanks to TC39 seemingly intentionally picking them?

    • lucacasonato 4 years ago

      No, I think this will eventually stop. JavaScript is getting more mature. I would not be surprised if TypeScript syntax becomes legacy in a few years, because JS has caught up.

      • megaman821 4 years ago

        Is there some proposal for making the type syntax valid? No type checking, just JavaScript parsing code with types and ignoring it.

        • lucacasonato 4 years ago

          No such proposal exists right now, but this will be a point of immediate focus for me.

        • pier25 4 years ago

          How will Deno react if that happens?

          • lucacasonato 4 years ago

            Deno will behave just like Chrome or Firefox would: we ignore the checks when running your code. We would have a `deno check` subcommand to perform a typecheck. You could optionally run this automatically before a `deno run` by passing a `--check` flag.

  • j-krieger 4 years ago

    Congrats on your new role! Currently, I feel like additions to JavaScript land a lot slower than new CSS features, for example. What do you think are the chances that JS will one day get a larger batch of STL functions, instead of about a dozen each year?

    • lucacasonato 4 years ago

      I hope the pace will accelerate. The real question is what functions we need, though. Some candidates that I would love to see are better helper functions on iterators, and Uint8Array<->base64/hex. Most of the "standard library" in most languages is related to IO, which for JS is dependent on the host (the web, Deno, Node), so not something TC39 will touch directly. Do you have ideas for standard library functions that you think are missing?

      • FragenAntworten 4 years ago

        'enumerate', as in Python.

        'clamp' and 'sortBy', as in Lodash.

        Set methods for union, intersection, difference, etc.
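
        Until such helpers land natively, each of these can be approximated in userland. A hedged sketch (illustrative implementations, not the eventual standardized APIs):

        ```javascript
        // Userland sketches of the helpers requested above (illustrative).

        // enumerate, as in Python: yields [index, item] pairs.
        function* enumerate(iterable, start = 0) {
          let i = start;
          for (const item of iterable) yield [i++, item];
        }

        // clamp, as in Lodash: constrain a number to a range.
        const clamp = (x, lower, upper) => Math.min(Math.max(x, lower), upper);

        // Set union and intersection, pending native Set methods.
        const union = (a, b) => new Set([...a, ...b]);
        const intersection = (a, b) => new Set([...a].filter((x) => b.has(x)));

        console.log([...enumerate(["a", "b"])]); // [[0, "a"], [1, "b"]]
        console.log(clamp(7, 0, 5)); // 5
        console.log([...union(new Set([1, 2]), new Set([2, 3]))]); // [1, 2, 3]
        ```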

        • tempodox 4 years ago

          Re set operators: For completeness' sake there would have to be a way to specify how set elements are to be tested for equality.

      • amitport 4 years ago

        'range' a-la python

        • wruza 4 years ago

          Range a-la python is a generator and is pretty straightforward in js as is. There is no need for a complicated solution and/or as part of stdlib, imo:

            function* range(x, y) {
              while (x <= y) { yield x++ }
              // or '<' if you want a half-open range
            }

            for (const n of range(3, 5)) {
              console.log(n)
            }

            const array = [...range(3, 5)]
            console.log(array)
          • amitport 4 years ago

            It's also straightforward to implement in Python. It is common enough to be standard, so people won't reinvent the wheel. And it should be available to people learning the language before they learn generators.

            Array.prototype.indexOf is also pretty simple as are many core functions.

      • wiredearp 4 years ago

        Immutable data structures are on the agenda over in https://github.com/tc39/proposal-record-tuple and I would vote for that in Deno while I have the chance.

      • Zababa 4 years ago

        I often miss map, reduce and filter on iterators, though this is already a stage 2 proposal: https://github.com/tc39/proposal-iterator-helpers.

        • lucacasonato 4 years ago

          This is definitely something I am interested in pushing too. Iterators have not gotten the love they deserve.

          • Zababa 4 years ago

            That's great to hear! Congratulations and thank you for your work.

      • tarjei_huse 4 years ago

        I miss being able to create a hash from an array in a quick way. You always end up with:

          let bar = [{id: zz, }, {id: y},...]
          let foo = {}
          bar.forEach(v => foo[v.id] = v)

        I'd love to have something like:

          let foo = bar.toMap(v => [v.id, v])

        • keville 4 years ago

          const foo = bar.reduce((hash, item) => { hash[item.id] = item; return hash; }, {});

        • wiredearp 4 years ago

          new Map([[k1,v1], [k2, v2]]) can do that. Or Object.fromEntries with the same argument.

          • tarjei_huse 4 years ago

            Thanks!

            You still need to create the middle data structure ( [[k1,v1], [k2, v2]] ), but it is an improvement :)

            • wiredearp 4 years ago

              Ah yes, true, you always need to map the things ID to the thing itself, so that middle data structure is actually quite annoying :)

        • runarberg 4 years ago

              let foo = Object.fromEntries(bar.map((v) => [v.id, v]));
          • keville 4 years ago

            It is not necessary to create any extraneous data structures to do OP's one-liner. This creates n+1 new arrays that must be garbage-collected.

            • runarberg 4 years ago

              Object.fromEntries takes an iterable, so if you have a map from a handy iterator library you could get rid of the parent array returned from `bar.map`.

                  import { map } from "my-iter-lib";
                  
                  Object.fromEntries(map((v) => [v.id, v], bar));
              
              Or with the proposed iterator-helpers:

                  Object.fromEntries(
                    Iterator.from(bar).map((v) => [v.id, v]),
                  )
              
              However, the inner arrays are harder to get rid of... In fact, even OP's one-liner defines the inner arrays. I would hope there are some engine optimizations, though, which could minimize their footprint.
              • keville 4 years ago

                Please see my other comment in this thread for the `reduce`-based solution that requires no extra data structures. They're not that hard to get rid of!

      • amitport 4 years ago

        'zip' a-la python
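
        For reference, a hedged userland sketch of what a Python-style zip could look like (illustrative, not a proposal):

        ```javascript
        // Pairs up items from several iterables, stopping at the shortest.
        function* zip(...iterables) {
          const iterators = iterables.map((it) => it[Symbol.iterator]());
          while (true) {
            const steps = iterators.map((it) => it.next());
            if (steps.some((s) => s.done)) return;
            yield steps.map((s) => s.value);
          }
        }

        console.log([...zip([1, 2, 3], ["a", "b"])]); // [[1, "a"], [2, "b"]]
        ```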

      • amitport 4 years ago

        'reversed' a-la python

    • aadams 4 years ago

      Please have someone on your team take a good hard look at the current proposal-pipeline-operator: https://github.com/tc39/proposal-pipeline-operator

      It started as a function composition proposal (using the pipe operator |>), but after a change of leadership it has turned into something much different. We might need another perspective on the current trajectory of this proposal, as in its current form it seems to many in the community that it might take JS in the wrong direction.

      Thanks!

      • wruza 4 years ago

        If you want feedback: I do not find the |> versions much (if at all) easier to read. The noise is as high as in regular function-based chaining:

          await chain(
            Object.keys(envars)
              .map(envar => `${envar}=${envars[envar]}`)
              .join(' '),
            text => `$ ${text}`,
            text => chalk.dim(text, 'node', args.join(' ')),
            colored => console.log(colored),
          )
        
        I don't want to sound too negative, but this seems like a pointless extension that only brings the burden of supporting another syntactic flavor.

        The React example is special, though. The operator feels very helpful in an expression-only context, but in-array ifs and fors would do better there:

          <ul>
            {[
              for (v of vs) {
                const text = foo(v)
                yield <li>{text}</li>
              },
              if (vs.length == 0) {
                yield <li>No items</li>
              },
            ]}
          </ul>
thegagne 4 years ago

Something I’d like to see, in browsers, Cloudflare Workers, Deno, etc: explicit network firewall in the software stack.

An example with Workers, one script might only need to fetch from Backblaze. I’d like to set their host as a whitelisted address, and so even if a log4j type vuln happens, it can’t go anywhere except Backblaze.

I think this could even work in browser-land? If you don’t need to pull in any resources outside the original host, deny any fetch made unless it’s added to a whitelist. For browsers this would need to be opt-in for backwards compatibility, but an ideal state would be opt-out (to allow all).

  • easrng 4 years ago

    You want a Content Security Policy[0]

    [0]: https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/Co...

  • kentonv 4 years ago

    FWIW, in Cloudflare Workers, a log4j-type RCE vulnerability would be impossible because Workers does not allow dynamic code loading (eval() and similar are disabled).

    Of course, a lesser form of the vulnerability -- data leaks rather than RCE -- would still be possible. I agree that being able to restrict outbound traffic would be useful to mitigate that.

    As a hack that works now, you could monkey-patch `fetch()` to intercept calls and deny them based on URL.

    (I'm the tech lead of Cloudflare Workers.)
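
    A minimal sketch of that monkey-patch; the allow-listed host is illustrative, and a real deployment would also need to cover any other network APIs the platform exposes:

    ```javascript
    // Wrap the global fetch so only allow-listed hosts can be contacted.
    const ALLOWED_HOSTS = new Set(["api.backblazeb2.com"]); // illustrative

    function isAllowed(url) {
      return ALLOWED_HOSTS.has(new URL(url).hostname);
    }

    const realFetch = globalThis.fetch;
    globalThis.fetch = (input, init) => {
      const url = String(input?.url ?? input); // Request object or string
      if (!isAllowed(url)) {
        // Deny by default: reject rather than silently dropping the call.
        return Promise.reject(new Error(`fetch to ${url} blocked by allow-list`));
      }
      return realFetch(input, init);
    };
    ```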

    • thegagne 4 years ago

      Thanks for thinking about this Kenton. I agree the data leak is more the concern here, or even accidental use as a ddos attack agent. The scenario is an import of something like worktop, if worktop released a malicious version.

      Monkey patching is an option of course, but a native solution would be nice.

  • spmurrayzzz 4 years ago

    I think it's probably worth clarifying whether you mean an ACL or a firewall here. The former seems more feasible, given that it's stateless and the latter, at least conventionally, is not. Those implementation details matter here, I think.

    But speaking more broadly, do you have any examples of this kind of behavior being defined at the language specification level (and not in a platform API)? I can't think of any presently.

    It seems problematic for a number of reasons, but if there's other examples to work backwards from that might be helpful for me to grok how this would work in a general sense.

maga 4 years ago

This is great news! Good luck, Luca!

> Better support for explicit resource management

+1

Since everyone is making feature requests, I'd like to point out `ArrayBuffer.transfer`[1] -- ability to effectively move data without copying would do wonders for low-level/high-performance code in JS.

[1] https://github.com/tc39/proposal-resizablearraybuffer
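
For context, a sketch of the gap the proposal targets: today the closest you can get is an O(n) byte copy, whereas the proposed transfer() (name per the linked proposal) would move the backing store in O(1) and detach the source:

```javascript
// Today: "moving" bytes between ArrayBuffers means an O(n) copy.
const src = new ArrayBuffer(8);
new Uint8Array(src)[0] = 42;

const copy = src.slice(0); // allocates a new buffer and copies all 8 bytes
console.log(new Uint8Array(copy)[0]); // 42
console.log(src.byteLength); // 8 -- the source buffer is still live

// Under the proposal, src.transfer() would instead return a new
// ArrayBuffer that takes over the same backing memory in O(1),
// leaving src detached (src.byteLength === 0) -- no copy at all.
```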

  • lucacasonato 4 years ago

    Yeah, this is something we have been thinking about too. Something else along those lines would be read-only buffers, and with that a copy-on-write operation for buffers. They could result in a significant speedup for many operations.

tmikaeld 4 years ago

I'm surprised not to see Cloudflare amongst the TC39/ECMA members, considering their current downline entirely depends on JavaScript.

scanr 4 years ago

> More extensive standard library functions for (async) iteration

Great news! I wrote an open source library called axax that adds a number of utility methods to async iterators - map, filter etc.

I think having them as part of the language would be awesome.

Standardising async cancellation would be neat too if Deno wants a challenge....

  • lucacasonato 4 years ago

    I think async cancellation is pretty well covered by `AbortSignal` now. It is a Web API (not JS), but it is supported in all major runtimes.
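
    A small sketch of the pattern; `sleep` here is a hypothetical cancellable task standing in for any AbortSignal-aware API such as fetch:

    ```javascript
    // A cancellable delay; AbortSignal-aware APIs follow this same shape.
    function sleep(ms, signal) {
      return new Promise((resolve, reject) => {
        if (signal?.aborted) return reject(new Error("aborted"));
        const timer = setTimeout(resolve, ms);
        signal?.addEventListener(
          "abort",
          () => {
            clearTimeout(timer);
            reject(new Error("aborted"));
          },
          { once: true },
        );
      });
    }

    const controller = new AbortController();
    setTimeout(() => controller.abort(), 10); // cancel shortly after starting

    sleep(1000, controller.signal)
      .then(() => console.log("finished"))
      .catch(() => console.log("cancelled")); // prints "cancelled"
    ```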

lloydatkinson 4 years ago

> Better support for non-JS assets in the ES module graph

Whatever happens, please never give in to any misguided pushes to support commonjs/amd/umd or any of the other non-standardised disaster module formats that cause Node and npm etc. to be so painful! It's only very recently that modern build tools are managing to overcome such poor foundations...

  • lucacasonato 4 years ago

    No no, this is about things like importing WASM through the `import` keyword, or referencing assets statically through syntax:

    Asset References: https://github.com/tc39/proposal-asset-references

    Alternative module reflections (wasm imports): https://github.com/tc39/proposal-import-reflection

  • ptx 4 years ago

    What are the problems with commonjs/amd/umd? I'm only vaguely familiar with the JS ecosystem. Is it mainly that the dependencies can't be statically analyzed?

    • forty 4 years ago

      Yeah I have been happy with require/commonjs for the last ~10 years, I'm not sure why suddenly it seems to be a problem.

      The main problem to me is this push toward this ESM thing, and I don't know what it brings me. I understand it's a frontend thing, so I'm not sure why Node.js and npm need to be impacted.

      • lucacasonato 4 years ago

        ESM is not a frontend thing. ESM is the standardized way of doing module imports in JS. It has a lot of benefits for server side developers too:

        - It has language syntax for importing and exporting instead of relying on an implicit global.

        - It is asynchronous, allowing for top level await.

        - It is reliably statically analyzable.

        - Because it is asynchronous, module asset loading can happen in parallel which can mean a significant startup speedup for larger projects.

        - It is standardized, so it behaves the same across Node, Deno, the web, bundlers, linters, and other tooling.
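
        A self-contained sketch of the first three points; the inline data: URL stands in for a separate module file (data: module specifiers are supported by Node, Deno, and browsers):

        ```javascript
        // ESM exports are syntax, imports are async, and both are
        // statically analyzable. The data: URL is just a stand-in
        // for a real module file so this sketch runs on its own.
        const source = "export const add = (a, b) => a + b;";
        const url = "data:text/javascript," + encodeURIComponent(source);

        const loaded = import(url).then(({ add }) => {
          console.log(add(2, 3)); // 5
          return add(2, 3);
        });
        ```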

        • lloydatkinson 4 years ago

          Exactly this and I'm a little disappointed, though not surprised that the response was "well it works for me so what's wrong with it? it's only for frontend". It's actually part of the language.

          You also forgot another super important point:

          - It actually works. How many times have we all run into some weird error or stack trace because node/npm/babel/webpack/typescript/jest/regenerator-runtime/babel-core/quantum-flux-inverter tower of cards collapses when a commonjs module somehow leaks in from node_modules?

          • forty 4 years ago

            Precisely, aren't babel & webpack front end issues?

            • lloydatkinson 4 years ago

              No

              • forty 4 years ago

                Why would you babel or webpack your nodejs backend code? (Genuinely asking, I have never needed to do it in the last 10 years, so I assumed this was all about browser compatibility and not having to download thousands of small file over the internet respectively)

          • lucacasonato 4 years ago

            Oh, 100% agree on that point :D

        • forty 4 years ago

          I'm not convinced :)

          Admittedly the largest project I'm working on is mostly in TS now, and I probably get most of the benefits of the import syntax even if require is used under the hood (including linters, which I think worked even before we switched to typescript).

  • nicoburns 4 years ago

    I wouldn't worry about that. Node is moving to ECMAScript modules.

    • akyoan 4 years ago

      Node is not "moving to ESM", they don't even plan to deprecate CJS. They will continue to support both for the foreseeable future. Their docs still use `require` in a lot of places even though ESM has been enabled on stable releases for almost 2 years.

      • lloydatkinson 4 years ago

        I heard there was a movement internally for a while to block adoption of ES modules because the core team members "didn't like ES modules" (again, what I saw tweeted a long time ago and don't remember the source). So that certainly can't be helping adoption of it. This is one of the reasons Deno is superior - it makes use of language features not a badly designed module format.

      • ecares 4 years ago

        PRs on doc are always appreciated.

zsolt224 4 years ago

Just adding some interesting info: There is an ECMAScript Optional Static Typing Proposal.

https://github.com/sirisian/ecmascript-types
