Building the Future of the Command Line

github.com

90 points by smartblondeva 3 years ago · 74 comments

Kukumber 3 years ago

"your server needs a GPU and an i5 to use our shell, as it provides a graphical interface and some shader animation because that's what attract the money people, they want shiny stuff y'know"

This trend of "new modern shells" that run and start as slowly as some JavaScript code (PowerShell) needs to stop

People forgot what shells are for, and what scripting is for

  • yazzku 3 years ago

    Indeed. These pseudo-authoritative blogs on Github from people who have no idea what they're talking about could also stop.

  • PurpleRamen 3 years ago

    > This trend of "new modern shells" that run and start as slowly as some JavaScript code (PowerShell) needs to stop

    Who cares, as long as it's not my shell. The old shells and terminals won't disappear, and if some new relevant standard emerges, people will port it.

    > People forgot what shells are for, and what scripting is for

    Shells are for the people, and scripting might be for people, depending on whether it's a one-shot script. Maybe you forgot what they are for, too?

  • RedShift1 3 years ago

    Count me in this camp as well. The hardware guys did their jobs; our hardware is now faster and more power efficient than anything that came before it. But now the software guys are letting us down, making everything slow again.

  • brujoand 3 years ago

    Agree. I actually wrote a multiline, dual-sided, Powerline-inspired prompt with modular segments (info from git/pyenv etc.) in bash exactly to get these fancy shiny things without having to introduce new dependencies. I log into any server, pull my config and bob’s your uncle. (Btw, that expression is so weird.) Ofc, to have the Powerline look I would need custom fonts on the client terminal, but I usually don’t enable them as I’ve added other “themes” instead.

  • rektide 3 years ago

    Doesn't seem like a real problem. Why are we grumbling about this? No one is making rxvt less of an option. Why the fear? Why the negativity?

    Does start-up time matter? Who cares? (I have three terminals that have been open for almost a year.) Are we concerned about only first start, or primarily the faster second starts once all the dynamic libraries have loaded? Does anything actually require an i5? What's wrong with requiring a GPU?

    Doesn't it get tiring, being so grumbly about other people having fun & doing cool things? Do you really think we should do as you say & just freeze time, insist on doing nothing?

    • nmz 3 years ago

      I care. I care about start-up time. I care because I just had a blackout, and that requires booting. I care because I mess with my system and I have to reboot sometimes. I care because I expect software to improve, not get worse. I care about engineering because bad engineering ultimately wastes my time and everyone else's. Everyone can have fun and nobody should prohibit others from doing so, but valid criticism is valid criticism, even though I hate it.

      With that said, the counterargument should've been that although PowerShell does start up slow (and many other things), it is better than bash in many cases and more performant.

      • rektide 3 years ago

        > I care because I just had a blackout, and that requires booting. I care because I mess with my system and I have to reboot sometimes.

        I have a very hard time believing you are talking about 5 minutes or more per year of wait time. Even two minutes feels suspect.

        Personally, kitty or Alacritty or gnome-terminal or terminator or any other graphical terminal I've tried... they are slower to start, but it's under 2s, and faster on second load (let's say 1s). It's hard for me to imagine the amount of agony & bitterness, the "I am being deprived of valuable time", for something that costs, let's generously say, 10 instances of 2s a month, not even a full minute.

        And no one is forcing you to switch off what you have. No one is forcing you to stop using serial console or whatever else.

        People need to dial down their outrage. This is a huge social problem online. People are vastly overconcerned. Y'all are not being reasonable. You are being absolutist & maximalist about very particular, narrow concerns.

        > I care because I expect software to improve, not get worse.

        You have an exceedingly narrow & particular view of progress. And it's conservative in that it recognizes & permits no other forms of growth or advancement. You have a high concern that trumps all other concerns & nothing but your own particular view matters.

        • phaedrix 3 years ago

          I'm guessing you simply don't live in the CLI, i.e. where you're constantly opening new e.g. tmux windows.

          Shell startup time _absolutely_ matters.

          E.g. slow startup interrupts flow for one thing.

        • nmz 3 years ago

          > I have a very hard time believing you are talking about 5 minutes or more per year of wait time. Even two minutes feels suspect.

          Well, the device I was whining about was a Pi 4 booting from a USB 3 HDD (so yes, it will not be instantaneous, because it's not an SSD), not an i20 device with an SSD and 100 cores. And I'm not outraged; I looked at it, saw it was a hog and moved on (because again, I care about that stuff). Though I am slightly outraged at you putting words in my mouth and making a caricature of myself. You don't know me, and I don't have an "exceedingly narrow & particular view of progress". I simply know that a device has limited resources, and software uses those resources. Allowing software to get less performant means the system will get less snappy, and that I do hate (mostly because it automatically alerts me to viruses (still have that paranoia) or a runaway process hogging the CPU).

    • mshockwave 3 years ago

      > What's wrong with requiring a GPU?

      I'm glad that you never encountered this before, and I sincerely wish you never will: getting an on-call page at 3 in the morning due to a server outage that you couldn't diagnose remotely. You rushed to the server room, which was only 50°F btw, connected to the machine and brought up a rescue shell. Oh, did I tell you that none of them has an integrated GPU?

      See, it's not about the time you sit in front of your M1 MacBook and have a nice cup of tea -- it's about the situation where everything goes south and your tools and infrastructure can still have your back

      • viraptor 3 years ago

        That's a weird strawman. Nobody aims to take away your standard framebuffer / textmode terminal. It's there for a reason and I don't think this post cares at all about that use case. (Also, you should really invest in iLO or your vendor's equivalent - it pays for itself if you do trips like that)

    • wyager 3 years ago

      > Does start-up time matter? Who cares?

      Yes, me

      > being so grumbly about other people having fun & doing cool things

      The downstream effect of people "doing cool things" (making insanely bloated crap) is that we often have to use it.

      I don't understand this attitude that every claim and endeavor is immune from criticism as long as you can frame it as someone "having fun" (I'm sure these corporate software projects are super duper fun) or being experimental.

    • bayindirh 3 years ago

      > Does start-up time matter?

      Yes.

      > Who cares?

      Me, and my colleagues.

      > I have three terminals that have been open for almost a year.

      I have tens of them open at times, and close and re-open at least a dozen every day, because what I do requires that.

      > Does anything actually require an i5?

      Yes "modern" terminals really require some heavy lifting, because "smooth scrolling!"

      > What's wrong with requiring a GPU?

      A lot. Showing text shouldn't need gaming-level hardware. Then people moan about their battery life.

      > Doesn't it get tiring, being so grumbly about other people having fun & doing cool things?

      Haha, no. Because they do the cool things, and then they backtrack and return to the roots as they move down the path. Watching this is delightful.

      > Do you really think we should do as you say & just freeze time, insist on doing nothing?

      Why not try to increase efficiency and try to do cool things without sucking the living light out of our systems, E17 style?

oceanplexian 3 years ago

> One of its modules attempts to translate natural language requests into the correct shell commands and syntax. For example, if you typed “compress Documents folder,” CLAI will recommend the corresponding Tar command.

This is such a bad idea I don’t know where to start. Shell commands are a dangerous but precise tool, somewhat like a scalpel or other surgical instrument. Dumbing them down so the shell can “guess what you want it to do” is going to result in more people (specifically, people who don’t bother to read the docs) breaking things.

  • mormegil 3 years ago

    Exactly.

    > Warren Teitelman originally wrote DWIM to fix his typos and spelling errors, so it was somewhat idiosyncratic to his style, and would often make hash of anyone else's typos if they were stylistically different. Some victims of DWIM thus claimed that the acronym stood for ‘Damn Warren’s Infernal Machine!'.

    > In one notorious incident, Warren added a DWIM feature to the command interpreter used at Xerox PARC. One day another hacker there typed delete *$ to free up some disk space. (The editor there named backup files by appending $ to the original file name, so he was trying to delete any backup files left over from old editing sessions.) It happened that there weren't any editor backup files, so DWIM helpfully reported *$ not found, assuming you meant 'delete *'. It then started to delete all the files on the disk! The hacker managed to stop it with a Vulcan nerve pinch after only a half dozen or so files were lost.

    The Jargon File http://www.catb.org/jargon/html/D/DWIM.html

  • none_to_remain 3 years ago

    What if it tells you to type `man tar`

    • mburee 3 years ago

      Then 99% will give up. I mean, just try to read the man page of any modern GNU utility; it's so long and filled with options that nobody will ever read it through.

  • bregma 3 years ago

    What we need to do is dumb down surgical tools. Anyone should be able to do a coronary artery bypass with just a quick google and some AI assistance. That way your analogy will match the goal of these new shells.

    • nmz 3 years ago

      If only there existed some thing, some sort of UI, that basically showed you all you could do with the tool, and there was some sort of checkbox thing where you wanted it to do this and that... gosh, we would be living in the future. Not a CUI where you have to memorize --options, but one that has all the options and you pick what you want, and then maybe it prints out the full command you want... we would be living in the future.

      (I'm talking about a TUI, this was solved 60? years ago)

  • bhedgeoser 3 years ago

    I guess that's why shell programs use regexes and globs.

an1sotropy 3 years ago

One thing not discussed is the libraries used for command-line parsing (parsing argv), and how that might get complicated by shells trying to make the command line into something effectively more than an array of strings.

Having written a non-trivial command-line parser in C, and having used a bunch of them in other languages, it seems to me that this task would benefit from some more standardization and maturation. What is the JSON of the command line? What can we do to increase the level of interoperability between how information is encoded on different tools' command lines? E.g. think of ImageMagick "convert" versus "find" versus "ffmpeg": totally different universes, but all of them in their own way turn command-line arguments into mini-DSLs.

  • orev 3 years ago

    Given the prevalence and longevity of GNU-style short and long options, pretty much anything that doesn’t follow that is “out of compliance”.

    However, you also called out some very specific commands that are the way they are for a reason. For example, the order of options for ffmpeg matters very much, as it's used to construct the processing pipeline. It does make sense for certain things to be custom, but that should only be done when there’s a good reason.
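
    To make that GNU-style baseline concrete, here's a minimal sketch in C using getopt_long (a toy program, not any particular tool's real parser):

       #include <getopt.h>
       #include <stdio.h>
       #include <stdlib.h>

       int main(int argc, char *argv[]) {
           static struct option longopts[] = {
               {"output",  required_argument, NULL, 'o'},
               {"verbose", no_argument,       NULL, 'v'},
               {NULL, 0, NULL, 0}
           };
           int opt;
           /* "-o json" and "--output=json" are both accepted, as are -v / --verbose */
           while ((opt = getopt_long(argc, argv, "o:v", longopts, NULL)) != -1) {
               switch (opt) {
               case 'o': printf("output format: %s\n", optarg); break;
               case 'v': puts("verbose on"); break;
               default:  exit(EXIT_FAILURE);  /* getopt_long already printed a diagnostic */
               }
           }
           /* everything left over is a positional argument */
           for (int i = optind; i < argc; i++)
               printf("positional: %s\n", argv[i]);
           return 0;
       }

    Everything beyond that pattern (ffmpeg's order-sensitive options, find's expression language) is where the per-tool mini-DSLs start.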

  • NateEag 3 years ago

    It doesn't scale especially well to UIs with tens of subcommands, but I'm a fan of Docopt as a reasonable way to write basic CLI interfaces in many languages with a minimum of fuss:

    http://docopt.org/

  • abathur 3 years ago

    I have a hard time imagining how we get out of the gravity well of CLI programs handling their own parsing.

    A tool I write has a use-case for understanding the syntax of at least ~common CLI tools well enough to pick out args that will be other executables (sudo cat, find blah -exec...), so I have been idly pondering whether there's a humane, declarative, descriptive grammar that can express nearly all CLI interfaces.

    It's probably not worth the work for my case, but it might get to be more tractable if it was also an input for better completion, help, linting, etc. tools.

    Ideally something that drives enough all-around value that projects would start upstreaming the grammars (and maybe adopting an associated parser?).

    • lgas 3 years ago

      You might find http://docopt.org/ to be of interest. (It's available in many languages https://github.com/docopt).

      • abathur 3 years ago

        Parsers designed for implementing CLI programs are generally too opinionated to handle ~strange commands. (In my terms I'd say it's a prescriptive parser as opposed to something that attempts to be flexible enough to describe nearly all existing CLIs).

        • ablob 3 years ago

          Wouldn't it be easier to have a convenient library/parser for almost all of the use cases instead of an immensely complex catch-all solution? Writing custom logic when required, for the cases where such a ~strange command has to be implemented, should almost always be less complex.

          • abathur 3 years ago

            As I said, my use case doesn't involve implementing the commands--it involves reliably identifying executables in the arguments to many different commands.

            I can't go rewrite awk, find, and sed with an opinionated cli module. I have to deal with the current reality.

            (you're roughly describing what I already do, and it scales poorly)

    • mschrage 3 years ago

      You might find Fig completion specs useful for this: https://github.com/withfig/autocomplete

      • abathur 3 years ago

        Completions have in general been of interest, though the shell-specific completions I've looked at so far were all too dynamic.

        I'd forgotten all about Fig since I saw your launch post here last year, so thanks for the reminder. (I don't think I had quite started to work on parsing specific external commands yet, so it wouldn't have clicked at the time. I was still focused on just identifying the likely presence of exec in the executables.)

        Are you familiar with the parse code? Are you handling painful stuff like combined short flags with a trailing option? (If I ferreted out some of the more painful cases I've had to wrangle, I am curious if you'd have a gut sense of whether your approach handles it. Would you mind if I reach out? I am working on this for https://github.com/abathur/resholve)

  • scj 3 years ago

    I've always wondered about expanding stdin, stdout, stderr. Say, stdjson that doesn't get visually displayed, but can be piped (and would only be generated if it is needed on the pipe stream).

    ls | cat </dev/stdjson | string_proc_the_json_for_some_reason

    With the direct ability to process in line:

    ls -a | json.files[0].last_modified

    I'd probably want multiple output formats (including s-expressions).

    • NotTheDr01ds 3 years ago

      You seem to be pretty close to Nushell, which was mentioned in the article. The Nushell `ls` (and `ps`, etc.) builtins generate structured data that can be sorted, queried, reduced, and then transformed to many different types of structured data.

      $ ls | get 0 | select modified | to json

      { "modified": "2022-08-16 16:38:28 -04:00" }

      The internal data format looks pretty JSON-like, with the added ability to keep Nushell types intact.

      While I'm not ready to replace Fish with Nushell, it's definitely taken the place of jq for me.

    • dotancohen 3 years ago

      I dream of stdmeta for e.g. header lines:

      https://unix.stackexchange.com/questions/197809/propose-addi...

      I see much potential in adding stdjson as well, but I do caution against opening the floodgates to std* being implemented for every pet format and insignificant corner case.

    • yjftsjthsd-h 3 years ago

      It sounds like you either want PowerShell or FreeBSD's libxo?

    • bawolff 3 years ago

      What about an ioctl that lets you query what formats are accepted by the file descriptor? We already sort of have a precedent for it with giving different output based on isatty().
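
      For reference, the isatty() precedent looks like this; the format-negotiation ioctl itself would be new, so this sketch only shows the existing check that ls and grep already use to switch output styles:

         #include <stdio.h>
         #include <unistd.h>

         int main(void) {
             if (isatty(STDOUT_FILENO)) {
                 /* stdout is a terminal: print for humans */
                 printf("name\tsize\noldmeme.png\t12345\n");
             } else {
                 /* stdout is a pipe or file: print for machines */
                 printf("{\"name\":\"oldmeme.png\",\"size\":12345}\n");
             }
             return 0;
         }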

    • an1sotropy 3 years ago

      (fully daydreaming here) For C, would it be useful to have main() look like:

         int main(int argc, char *argv[], char *envp[], JSON *json)
      
      for some JSON data type that is part of C, kind of analogous to a FILE stream? I'm not sure how the JSON info would get into that fourth argument (it has to be independent of argv), but it would keep std{in,out,err} as is.

      • 0x0203 3 years ago

        You would need to modify the execve system call (or the equivalent on non-Linux OSes) to take in the 4th JSON arg for the shell to pass when it exec's the new process. And then of course modify the OS kernel to parse/deal with it. But in reality, since you can't break the ABI like that for all existing software, it would end up being execve_2 or something, and you'd have to both convince everyone else it's worth using and deal with the inevitable incompatibilities when not everyone does. Not impossible perhaps, but certainly an uphill battle.
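
        For reference, here is everything a process receives today (the execve_2 above is purely hypothetical); only two flat, NULL-terminated string arrays cross the exec boundary:

           #include <unistd.h>

           int main(void) {
               char *child_argv[] = { "ls", "-l", NULL };
               char *child_envp[] = { "LC_ALL=C", NULL };
               /* Only these two string arrays (plus the path) survive the exec;
                  any structured side channel would need a new syscall and a
                  matching change in every shell that spawns processes. */
               execve("/bin/ls", child_argv, child_envp);
               return 1;  /* reached only if execve failed */
           }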

    • BirAdam 3 years ago

      You can always just use jq.

      Additionally, as JSON is text, you can use awk/sed/grep and so on.

      • scj 3 years ago

        The intent is that standard tools would output alternative formats.

        • an1sotropy 3 years ago

          Right; every file is just bytes but we get a lot of mileage out of libraries like libpng that parse those bytes into usefully structured info. And I was pondering what more could evolve to parse info from the command line.

          I think stdjson is insane yet awesome to ponder.

          • dotancohen 3 years ago

            I agree that a standardized structured format would be awesome, but I'm not convinced that JSON is it. And there should _not_ be more than one.

            One thing that I do like about JSON is that it is ubiquitous - and that makes up for a lot of its other faults. But I would like to see proposed use cases that JSON would not support beforehand, to clearly define where the limitations are, and what limitations the community is willing to accept.

        • nishs 3 years ago

          The tool should instead provide an `--output=<format>` flag, where <format> is one of text (default), json, xml, etc.

  • andy81 3 years ago

    It feels solved in raw PowerShell functions, but running external CLI tools inevitably returns text and ruins the workflow.

    “Crescendo” has been marketed as a solution and looks cool, but it means relearning the tool, or the documentation becoming less useful. The sheer amount of time people have already spent learning arcane git syntax means they’re not going to switch to a hypothetical “New-GitCommit” function, even if it accepts arrays or PSCustomObject as input.

    • an1sotropy 3 years ago

      Is this the Crescendo you meant https://github.com/PowerShell/Crescendo ? From your comment I initially thought Crescendo was some separate commercial software.

      • andy81 3 years ago

        That's the one.

        It's for wrappers around existing CLI tools.

        Still feels experimental, but it's a nice idea to get objects from e.g. robocopy without dropping into regex and parsing it yourself.

        • an1sotropy 3 years ago

          Cool, thanks for the info. It is very intriguing, and I admit that until this HN thread I didn't know about PowerShell, or appreciate this totally different model for how to connect programs together.

          This is probably way off-topic now, and my question surely shows my ignorance, but do you know if there is precedent for a program, once packaged to work within PowerShell (or maybe Nushell), to use the associated input and output specification as the starting point for making a web interface to the same code (as bridged by a webserver)? Or have I just described .NET?

          • andy81 3 years ago

            I'm no expert on web dev, mostly working in BI/DBA.

            Running commands over WinRM or SSH can return objects of any type from remote machines. In the background I believe it's converting them to serialized CLIXML over the wire.

            e.g. $RemotePSVersion = Invoke-Command -ComputerName 'SomeOtherComputer' -ScriptBlock {$PSVersionTable}

            Rather than the variable $RemotePSVersion being a string it's an object with the type "System.Management.Automation.PSVersionHashTable", just like if you ran $PSVersionTable locally.

            For anything that returns text (e.g. external tools like curl/robocopy) you'll usually convert to an object in your script before further processing. That way it can be passed into whatever next steps in a generic way.

            e.g. curl https://catfact.ninja/fact | ConvertFrom-Json | Export-Csv '.\HighlyImportantInfo.csv'

            curl https://catfact.ninja/fact | ConvertFrom-Json | Out-Gridview

            That's less important when working interactively, but one major difference between Powershell/Bash is the relative focus on scripting vs interactive terminal use.

  • nomel 3 years ago

    In addition to stdin and stdout, I want datain and dataout pipes, for passing some standard format.
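
    A crude version of dataout is already possible with an extra inherited file descriptor; here's a minimal sketch (using fd 3 is just a convention for this example, not a standard):

       #include <stdio.h>
       #include <string.h>
       #include <unistd.h>

       int main(void) {
           const char *json = "{\"files\":[\"oldmeme.png\"]}\n";
           printf("1 file\n");  /* human-readable output stays on stdout */
           /* structured output goes to fd 3 if the caller wired one up,
              e.g.:  ./tool 3>data.json   or   ./tool 3>&1 1>/dev/tty | jq . */
           if (write(3, json, strlen(json)) < 0)
               perror("no dataout descriptor (fd 3) provided");
           return 0;
       }

    It composes with existing tools, but nothing downstream knows to expect it, which is exactly the standardization gap being discussed.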

e3bc54b2 3 years ago

How about Arcan? [0]

It seems very well reasoned, has a stable API and excellent backwards compatibility, and does not require a GPU and an i5 as this one might. Its author also has a proven record and actual experience, which I'm not sure the authors of TFA have, judging solely from their writing.

0: https://arcan-fe.com/

  • carapace 3 years ago

    Um, how do you use it? I've tried like five times and I keep bouncing off the project.

    • e3bc54b2 3 years ago

      NixOS has a nice module that can be easily enabled. After that, the default DE, Durden, can be invoked with a single command and interacted with as a normal VM/window alongside your existing DE.

      Admittedly I haven't tried making it my default yet, but it seemed fairly adaptable to tiling WM/Openbox needs.

frob 3 years ago

I'm finding the text block with an overall left-to-right gradient surprisingly hard to read. Continuing from the end of one line to the beginning of the next takes more effort than it should. I'm guessing it's the abrupt change in color.

I've found the gradient text trend to be interesting for titles and single lines, but I don't think it works for multi-line text.

tambourine_man 3 years ago

The future of the command line is something along the lines of what these guys are doing:

https://www.textualize.io

  • ablob 3 years ago

    Isn't this still confined to what a terminal gives you right now (as it is using current terminal specifications)? While certainly colorful and fancy, it doesn't really introduce new concepts, does it?

    I mean, for starters:

       -> unix introduced text as a universal interface
       -> bash made reusing stuff a lot easier via file descriptors, etc. (think: <(input to regard as file))
       -> powershell allowed for object oriented scripting
       -> some older systems (name forgotten/unknown) even had interactivity in the cli: click on parts of command-output and stuff happens, even after other commands have been run already
    • tambourine_man 3 years ago

      Sure, the terminal primitives are fixed unless we expand the standard.

      But what they are doing is way beyond colorful only. It’s smooth scrolling of text boxes within text boxes. That alone is bonkers.

      But they also have an easy API for smart/partial redrawing, sensible UI components, I mean, their progress bars are exquisite. It’s truly impressive.

  • dragonelite 3 years ago

    The TUI example looks really nice; it would be cool if someone made something like that as a Markdown previewer without having to jerry-rig it to a browser.

rektide 3 years ago

The command line ought to hybridize some, please. Having better self-describing interfaces, machine-to-machine capabilities... humans are awesome enough to whip up super wild magic on the fly ("spellcasting on the fly"), but these same tools get much weirder to use as you descend into scripting, as you start bringing "real" programming languages in (which have their own alternate realities: "standard libraries").

The command line should/ought to bridge & integrate better. Making it more usable from these higher (more pre-baked/automated) levels is one side. And then, reciprocally, how wonderful it would be to see execution flow expressed less in terms of stack traces & more in terms of networks of communicating processes. Create boundary layers, make the CLI tools visible & known operations sequenced by (but still visible within) higher-level systems.

Cli on and on!!

  • jrm4 3 years ago

    Amen. Offhand, can we start with e.g. a 2 or 3 window deal, in which there is a file manager and a document viewer that one can easily copy and paste from? There's stuff out there that sort of does this but it could be done MUCH better.

none_to_remain 3 years ago

Has anyone done anything around just ... mixing images in with the terminal output? Let's say I wanted to check if I had any old memes lying around my home directory, and have a quick look so I can decide to delete or not.

  ~$ls *{png,jpg}
  oldmeme.png 
  ~$imgcat oldmeme.png 
  /----------------\ 
  | oldmeme.png    | 
  | appears right  | 
  | here in the    | 
  | terminal       | 
  \----------------/ 
  ~$rm oldmeme.png
The terminfo man page shows some evidence of support for "bit_image" commands, but none of the terminals in my terminfo files seem to have it. I have over 2000 terminfo files though; I like the idea that if I found some literal teletypewriter from 1973 and figured out some way to hook it up, I am probably prepared with the proper escape sequences.

gorgoiler 3 years ago

I love the future that fzf has given us. So many new ideas for selecting and viewing lines of space-delimited records — the bread and butter of the shell — are possible with fzf, and you get to build them in the tradition of small composable tools.

Junegunn Choi is a really talented designer. More please.

o-o- 3 years ago

To me this is just more of the same: added convenience to what we already have. Converting individual commands to TUIs is a step away from the Unix philosophy.

I think the future of the command line lies in the direction of flow-based programming and spatial representation of complex commands.

I would like to see a terminal that, as I type, _generates_ a flow-based view of my command. Every command would be visualised as a component: ls, awk, sed... Every |, < or > that I type would append a link and a new component to my flow, and ultimately I would be able to manipulate my flow instead of typing: click the ls component, have it output Creation Date instead of Modified Date, then click the awk component and add another output to a new sed component and so on.

bregma 3 years ago

It seems to me the solution to the problem posed in the article -- and to the wishes of many of the commenters here -- is to use JavaScript as your shell. The missing piece is an app that presents a text UI with a JavaScript REPL and that renders the DOM inline.

Missing file, device, and information manipulation applications that shell programmers string together would be replaced by JavaScript functions from a library. If you really want JSON, use the JavaScript Serialized Object Notation to serialize JavaScript objects in JavaScript.

Why do people write "what I want is..." articles and comments when they could be writing solutions that scratch their itch and meet their needs?

anigbrowl 3 years ago

Very well-written article, and a very impressive magazine generally.

XorNot 3 years ago

Everyone who builds one of these fails to fix the fundamental problem: they don't distinguish between "space" as a character and the next element in the argv array.

Stop making me have to engage in bizarre escaping rituals and let me just toggle between "string mode" and "array mode".

wodenokoto 3 years ago

Is this a blog post _by_ GitHub or something posted by a user _on_ GitHub?
