Strand Programming Language

call-with-current-continuation.org

117 points by QuinnWilton 5 years ago · 28 comments

skissane 5 years ago

Their implementation approach is interesting: they have their own custom Forth implementation, with a kernel written in assembly language (ARM, aarch64, x86_64 and ppc64le). Then Strand is written in a mixture of Forth and itself.

simplify 5 years ago

A sample from their user manual[0]:

    % print even numbers between 1 and 10

    main :- count(1).

    count(11).
    count(N) :- N =< 10 | even(N), N2 is N + 1, count(N2).

    even(N) :- M is N \\ 2, even(M, N).

    even(0, N) :- writeln(N).
    even(_, _) :- otherwise | true.
[0] http://www.call-with-current-continuation.org/strand/MANUAL
  • QuinnWiltonOP 5 years ago

    What isn't clear from this snippet, and what I think is incredibly cool, is that all of the expressions within a function (here called a process) actually run in parallel.

    In this snippet:

       count(N) :- N =< 10 | even(N), N2 is N + 1, count(N2).
    
    count(N) is defined as a process that takes an argument named N; if N is less than or equal to 10, the process changes state to become an arbitrary one of the processes on the right-hand side, and is also forked to become the other two processes.

    Where it gets weird is that the resulting three processes have data dependencies between them: if the third process, count(N2), were executed first, its dependency on N2 wouldn't be satisfied, so that process would suspend and a new process would be chosen for execution.

    It's easy to look at this line of code and think that the three expressions will execute in the order they're written, but any interleaving of them is possible, with that sort of non-determinism being built into the language by design.
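    Not Strand itself, but the suspension behaviour can be sketched in plain Python (every name here is illustrative): a single-assignment "dataflow" variable blocks its readers until some producer binds it, so the three goals in count(N)'s body can be started in any order and still produce the same answer.

```python
import threading

class DataflowVar:
    """A single-assignment variable: readers block until it is bound."""
    def __init__(self):
        self._bound = threading.Event()
        self._value = None

    def bind(self, value):
        # Logic variables are single-assignment, so rebinding is an error.
        if self._bound.is_set():
            raise RuntimeError("variable already bound")
        self._value = value
        self._bound.set()

    def read(self):
        self._bound.wait()   # suspend until some producer binds the variable
        return self._value

def demo(n):
    """Run the three goals of count(N)'s body in a deliberately 'wrong' order."""
    n2 = DataflowVar()
    results = []

    def goal_even():         # even(N)
        if n % 2 == 0:
            results.append(n)

    def goal_add():          # N2 is N + 1
        n2.bind(n + 1)

    def goal_count():        # count(N2): suspends while N2 is unbound
        results.append(("count", n2.read()))

    # Start count(N2) first: it simply suspends until goal_add has run.
    threads = [threading.Thread(target=g) for g in (goal_count, goal_even, goal_add)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return results

print(demo(4))
```

    Whichever thread the scheduler happens to run first, goal_count cannot proceed past n2.read() until goal_add has bound N2, which is the suspension described above.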

    I'm currently reading the Strand book, and it more or less describes the language as being Prolog, but without unification and backtracking: instead treating dataflow as being the basis for the computational model.

    • 082349872349872 5 years ago

      It would be interesting if current computers could take advantage of all expressions running in parallel; back last century, that exposed more parallelism (and incurred more coordination costs) than a better, coarser-grained approach.

      • harperlee 5 years ago

        I guess that this being a declarative language, it is the task of the compiler to determine the cutoff where you just reorder tasks or you spawn a thread, right?

      • kevin_thibedeau 5 years ago

        This is effectively what happens when HDLs are compiled into an ASIC or FPGA.

    • bitforger 5 years ago

      Can you share your copy of the Strand book? A cursory Google search makes it seem somewhat elusive...

    • Avshalom 5 years ago

      does the book explain why it doesn't use unification?

      • QuinnWiltonOP 5 years ago

        Not that I've seen yet, but the manual does include this passage:

        > Strand provides pattern matching but no general unification, instead logic variables must be explicitly assigned, which simplifies the semantics considerably.

      • chriswarbo 5 years ago

        It can be useful to tackle unification in two steps, even in Prolog. By pattern-matching first (a la functional programming) we quickly get either a failure or a set of distinct variables with constraints, which the second 'proper unification' step can try to solve.

        For example, unifying `plus(1, 2)` with `plus(X, X)` can start by pattern-matching, which turns the latter into `plus(X, Y)` with the constraint X = Y (to avoid variables occurring more than once). The match will succeed, producing the constraints X = 1 and Y = 2, along with the X = Y constraint from before. The second step will try to solve these constraints, and fail since they're inconsistent.
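        The two-step idea can be sketched in Python for the simple case where the term being matched is ground (tuples are compound terms, strings are variables; all names here are illustrative, not any real Prolog API):

```python
def linearise(pattern, constraints, seen):
    """Step 0: rename repeated variables apart, so plus(X, X) becomes
    plus(X, X_1) plus the equality constraint X = X_1."""
    if isinstance(pattern, str):                  # a variable
        if pattern in seen:
            fresh = f"{pattern}_{seen[pattern]}"
            seen[pattern] += 1
            constraints.append((pattern, fresh))
            return fresh
        seen[pattern] = 1
        return pattern
    if isinstance(pattern, tuple):                # compound term
        return (pattern[0],) + tuple(linearise(a, constraints, seen)
                                     for a in pattern[1:])
    return pattern                                # a constant

def match(pattern, term, bindings):
    """Step 1: structural pattern matching, functional-programming style.
    Variables in `pattern` each occur once, so matching never conflicts."""
    if isinstance(pattern, str):                  # variable: always matches
        bindings.append((pattern, term))
        return True
    if isinstance(pattern, tuple) and isinstance(term, tuple):
        return (pattern[0] == term[0] and len(pattern) == len(term)
                and all(match(p, t, bindings)
                        for p, t in zip(pattern[1:], term[1:])))
    return pattern == term                        # constants must be equal

def unify(pattern, term):
    """Two-step unification against a ground term: linearise, match, then
    check the collected X = Y constraints for consistency."""
    constraints, seen, bindings = [], {}, []
    linear = linearise(pattern, constraints, seen)
    if not match(linear, term, bindings):
        return None                               # structural mismatch
    env = dict(bindings)
    for a, b in constraints:                      # step 2: solve constraints
        if env.get(a, a) != env.get(b, b):
            return None                           # inconsistent, e.g. 1 = 2
    return env

print(unify(("plus", "X", "X"), ("plus", 1, 2)))  # None: X = 1 vs X = 2
print(unify(("plus", "X", "X"), ("plus", 2, 2)))  # succeeds
```

        The first call matches structurally but fails at the constraint-solving step, exactly as in the plus(1, 2) example above.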

        I've not done much logic programming, but I've encountered this two-step approach as a way to implement co-inductive logic programming, which lets us handle infinite data structures like streams.

  • rad_gruchalski 5 years ago

    Looks very similar to Erlang. The devil is surely in the details, but it's quite easy to read when applying an Erlang line of thought to it.

doublec 5 years ago

From the same author, and in the same family of languages, is FLENG: http://www.call-with-current-continuation.org/fleng/fleng.ht...

It lacks Strand's distributed features but includes Prolog's unification.

gnufx 5 years ago

I wasn't involved, but I remember considerable interest in Strand from parallel programming people in the lab about the time it appeared. I don't think that lasted long, but unfortunately I don't remember why it failed (if I knew then).

SISAL was probably a better bet as a dataflow-ish language, but I don't remember anyone else looking at it, despite Manchester being "just" down the road.

anentropic 5 years ago

No idea if related in any meaningful way, but https://github.com/rust-lang/chalk/tree/master/chalk-engine/... (a "PROLOG-like logic solver" for the Rust trait system) also makes use of "strands"

  • QuinnWiltonOP 5 years ago

    Oh, thank you for sharing! I've never come across this, but there are some great references in the README I'll have to go through.

chrisjharris 5 years ago

I find this fascinating, if a bit incomprehensible, but what might an intended use case be?

  • chriswarbo 5 years ago

    A nice property of logic programming is that its semantics don't depend on "time": it just turns data into data (unlike "imperative" programming, like C, Java, etc. which rely on state changing over time). This is why logic programming is sometimes called "declarative".

    Whilst concurrency and parallelism are a bit of a nightmare in imperative languages (e.g. multithreading), they turn out to be trivial in such "declarative" languages; in fact the difficulty is usually in reducing parallelism, to keep down the context-switching overheads.

    Traditional logic programming languages like Prolog work in a single-threaded, depth-first manner. To execute something like `main`, we look at its definition, which might have multiple "clauses" depending on the input data (a bit like doing a `switch` or `cond` to figure out what to return). Prolog will see if the first clause applies, and if so it will start calculating the corresponding definition, which will probably involve calls to other functions, executed in the same way. However, it will also "remember where it got up to" during the execution of `main`: if the definition fails (or if we've asked for more than one result!), it will "backtrack" and try the next clause. This is sort of like having a try/catch in each switch clause, and moving on to the next clause if there's an exception.
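    A minimal Python sketch of that depth-first, try/catch-style clause selection (the predicate and all names are made up for illustration):

```python
class Fail(Exception):
    """Raised when a clause's body cannot be satisfied."""

def solve(goal, clauses):
    """Try each clause for `goal` in written order, like Prolog's
    depth-first search. A failing body 'throws', and we move on to the
    next clause: the try/except is the backtracking."""
    for guard, body in clauses:
        if not guard(goal):
            continue          # head/guard doesn't match: next clause
        try:
            return body(goal) # run the body; it may raise Fail
        except Fail:
            continue          # backtrack: try the next clause
    raise Fail(f"no clause for {goal!r}")

# Toy predicate: numbers below 10 are "small", except 7, which falls
# through to the catch-all clause.
def small(n):
    if n == 7:
        raise Fail()          # body fails: forces backtracking
    return "small"

clauses = [
    (lambda n: n < 10, small),
    (lambda n: True,   lambda n: "big"),
]

print(solve(3, clauses))      # "small"
print(solve(7, clauses))      # first clause fails, backtrack: "big"
print(solve(42, clauses))     # "big"
```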

    Strand seems similar, but rather than trying one clause at a time, depth-first, it instead runs all of the clauses concurrently. This doesn't need to "remember" anything or backtrack; if a particular clause fails then its thread dies, leaving the others (if any) to keep going.

    It seems like a remarkably simple way to achieve massive parallelism, especially for those already familiar with logic programming.
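    For contrast, here is the same made-up predicate with every applicable clause raced concurrently, in the spirit described above: a failing clause's task simply dies, and nothing is remembered for backtracking. This is only a loose Python sketch of the idea, not a claim about Strand's exact semantics.

```python
from concurrent.futures import ThreadPoolExecutor

class Fail(Exception):
    """A clause body that cannot be satisfied."""

def solve_concurrently(goal, clauses):
    """Run every clause whose guard matches, all at once. A failing
    clause's task dies quietly; the survivors' answers are collected.
    There is no backtracking state because nothing backtracks."""
    results = []
    with ThreadPoolExecutor() as pool:
        futures = [pool.submit(body, goal)
                   for guard, body in clauses if guard(goal)]
        for f in futures:
            try:
                results.append(f.result())
            except Fail:
                pass          # that clause's task died; the others continue
    return results

def small(n):
    if n == 7:
        raise Fail()          # fails for 7, as in the sequential sketch
    return "small"

clauses = [
    (lambda n: n < 10, small),
    (lambda n: True,   lambda n: "big"),
]

print(solve_concurrently(3, clauses))   # both clauses apply: ['small', 'big']
print(solve_concurrently(7, clauses))   # first clause dies: ['big']
```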

ADavison2560 5 years ago

A good introduction to concurrent logic languages:

G. A. Ringwood, "Parlog86 and the dining logicians", Communications of the ACM, January 1988. https://doi.org/10.1145/35043.35044

https://dl.acm.org/doi/10.1145/35043.35044

pie_flavor 5 years ago

From the User's Manual:

    三個和尚沒水喝 ("Three monks have no water to drink.")

    (Chinese Proverb)
