Non-deterministic execution of Python functions

gitlab.inria.fr

55 points by cha42 2 years ago · 16 comments

Bjartr 2 years ago

> Magical computer that guess for you some data of course doesn't exists.

Prolog does something like this, no? You give it rules and then make a statement with a blank in it, and it'll fill in the blank given the rules you've described.

  • alephaleph 2 years ago

    At a fundamental level Prolog works the same way that Python library works -- guessing and backtracking.

    • cha42OP 2 years ago

      Yes, the cool thing here is using higher-order functions to provide the user with non-determinism seamlessly within Python, whereas in Prolog this is a built-in feature.
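The guess-and-backtrack strategy both commenters describe can be sketched in plain Python with generators. This is an illustrative sketch, not the library's actual API: the names `run_all` and `pair_summing_to_10` are invented here, and each branch replays the generator from the start because Python generators cannot be forked.

```python
def run_all(f):
    """Enumerate the result of every run of generator-function f,
    depth-first. Inside f, `x = yield options` nondeterministically
    picks x from options; `return value` ends one run. Each branch
    replays f from scratch, since generators cannot be cloned."""
    def replay(prefix):
        gen = f()
        try:
            options = next(gen)
            for choice in prefix:       # re-drive f along this branch
                options = gen.send(choice)
        except StopIteration as stop:   # f returned: one complete run
            yield stop.value
            return
        for choice in options:          # branch on every option
            yield from replay(prefix + [choice])
    yield from replay([])

# Example: 'guess' x and y so that x + y == 10; None marks a rejected run.
def pair_summing_to_10():
    x = yield [1, 3, 5, 7]
    y = yield [3, 5, 7, 9]
    return (x, y) if x + y == 10 else None

hits = [r for r in run_all(pair_summing_to_10) if r is not None]
```

The depth-first order means `hits` lists solutions in the order the branches are explored, just as Prolog would enumerate them.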

phoe-krk 2 years ago

The idea seems so similar to the classical AMB ambiguous operator introduced by John McCarthy [0]. The implementation seems similar to what I've done a while ago, too, just in Lisp and using Lisp primitives rather than function decorators.

[0] http://www.randomhacks.net/2005/10/11/amb-operator/

[1] https://github.com/phoe/amb/
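For readers unfamiliar with link [0], McCarthy's amb can be approximated in Python by brute-force search over the ambiguous choices. This is a hedged sketch of the concept only (the name `amb` and its signature here are made up for illustration; real implementations backtrack instead of enumerating everything up front):

```python
from itertools import product

def amb(constraint, *domains):
    """Return the first assignment (one value per domain) satisfying
    constraint, or None. A brute-force stand-in for McCarthy's amb,
    which conceptually 'guesses right' and backtracks on failure."""
    for values in product(*domains):
        if constraint(*values):
            return values
    return None

# Classic amb example: 'guess' the sides of a Pythagorean triple.
triple = amb(lambda a, b, c: a * a + b * b == c * c,
             range(1, 20), range(1, 20), range(1, 20))
```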

PheonixPharts 2 years ago

Makes me think about how non-deterministic computing is basically what you get when you view lists as a context (as opposed to a container) when thinking in terms of Monads in Haskell. Applicative functors basically form a minimal framework for applying functions to values in a non-deterministic context when working with lists.

Making this further relevant: perhaps the best-known example of monadic programming over lists is Python's list comprehensions, which makes me a bit surprised that this isn't explored/exploited more in this library.
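The list-monad reading of comprehensions is easy to see concretely. In this sketch, each `for` clause acts as a nondeterministic choice and the comprehension collects the results of every possible run (the variable names are illustrative):

```python
# Each `for` clause is a nondeterministic choice; the comprehension
# collects the result of every possible run -- the list monad's bind.
coins = [0, 1]
three_flips = [(a, b, c) for a in coins for b in coins for c in coins]

# The `if` clause models rejecting runs: keep only flips with two heads.
two_heads = [flip for flip in three_flips if sum(flip) == 2]
```

Three binary choices yield 2**3 = 8 runs, and the filter discards the rejected ones, exactly as a nondeterministic accept/reject would.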

Interesting nonetheless!

nyrikki 2 years ago

Note of clarification.

This author's use of 'non-deterministic' doesn't match the meaning of the nondeterministic Turing machine used to define the complexity class NP.

There are many types of non-deterministic Turing machines, like a probabilistic Turing machine which flips a coin at each step.

But the 'maximally lucky guesser' or 'run all possibilities in one step' version of NTMs that defines NP is physically unrealizable.

  • cha42OP 2 years ago

    It is exactly the same notion, actually.

    The code isn't magically resolving P vs. NP, but it does simulate a non-deterministic run through a potentially exponential exploration.
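The "exponential exploration" point can be made concrete: a deterministic machine emulates the NTM's single lucky guess by trying every possible guess. A hedged sketch for a tiny SAT instance (the function name and the clause encoding are made up here; the runtime is 2**n assignments in the worst case):

```python
from itertools import product

def brute_force_sat(clauses, n_vars):
    """Deterministically emulate the NTM step 'guess an assignment'
    by trying all 2**n_vars assignments. Each clause is a list of
    literals: +i means variable i is true, -i means it is false."""
    for bits in product([False, True], repeat=n_vars):
        if all(any(bits[abs(l) - 1] == (l > 0) for l in clause)
               for clause in clauses):
            return bits          # an accepting 'guess'
    return None                  # every branch rejects

# (x1 or not x2) and (x2 or x3) and (not x1 or not x3)
model = brute_force_sat([[1, -2], [2, 3], [-1, -3]], 3)
```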

  • alephaleph 2 years ago

    It's roughly the same, no? It emulates a non-deterministic Turing machine and gives the same results, just taking much longer. But runtime isn't everything! I think that if you're programming in a nondeterministic style, you're programming as if you have a NTM, whether or not you're actually running the code on one.

    Also, surely NTMs are only physically unrealizable if P!=NP, and so whether or not NTMs could exist is an open question.

__lm__ 2 years ago

Useful to explain non-determinism to students. I saw a similar idea before at https://github.com/aeporreca/nondeterminism, which uses fork() to (inefficiently) explore all possible guesses concurrently.
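The fork() trick can be sketched in a few lines. This is an illustrative sketch, not that library's actual API: the names `guess`, `accept`, `reject`, and `has_subset_sum` are invented here, it is POSIX-only (os.fork), and it needs Python 3.9+ for os.waitstatus_to_exitcode. Each guess forks a child per option; an accepting branch signals success through its exit status.

```python
import os

def guess(choices):
    """Fork to explore choices: each child continues with one option;
    if a child's subtree accepts (exit status 0), propagate success.
    The current process keeps the last option for itself."""
    for c in choices[:-1]:
        pid = os.fork()
        if pid == 0:
            return c                          # child explores this option
        _, status = os.waitpid(pid, 0)
        if os.waitstatus_to_exitcode(status) == 0:
            os._exit(0)                       # a descendant accepted
    return choices[-1]

def accept():
    os._exit(0)

def reject():
    os._exit(1)

def has_subset_sum(nums, target):
    """Run a nondeterministic subset-sum check in a child process."""
    pid = os.fork()
    if pid == 0:
        total = 0
        for n in nums:
            if guess([True, False]):          # 'guess' whether to take n
                total += n
        if total == target:
            accept()
        else:
            reject()
    _, status = os.waitpid(pid, 0)
    return os.waitstatus_to_exitcode(status) == 0
```

Every forked process eventually exits via accept() or reject(), so only the original process returns from has_subset_sum.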

  • saghm 2 years ago

    It doesn't help that CS students will potentially end up hearing the term "nondeterminism" used to mean different things in different contexts. In the context of the linked repo, it's used in the way they'll probably encounter it when learning about Turing machines, but in less formal contexts it also gets used a lot to describe stuff like "Heisenbugs", where running something more than once doesn't necessarily produce the same result.

    • dataflow 2 years ago

      > but in less formal contexts it also gets used a lot to describe stuff like "Heisenbugs"

      There are pretty formal contexts in which it also means things like that! It's not about formalism, it's just about context.

    • PhunkyPhil 2 years ago

      For me, the idea of nondeterminism centers on a computation tree whose branches correspond to possible states of computation.

      In the context of deterministic vs. nondeterministic finite automata, a DFA has a linear computation path from the root to an accept/reject state, while an NFA branches on epsilon transitions, continuing each potential computation (the "guesses") in so-called "parallel universes" until one branch hits an accept state.

      This definition follows Introduction to the Theory of Computation, 3rd edition, by Sipser (would recommend).
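The "parallel universes" view has a standard deterministic simulation: track the set of all states the NFA could be in after each input symbol. A hedged sketch (the function name, the transition encoding, and the example automaton are made up here, and epsilon transitions are omitted for brevity):

```python
def nfa_accepts(transitions, start, accepting, s):
    """Track every 'parallel universe' at once: `current` is the set
    of states reachable after consuming the input so far."""
    current = {start}
    for ch in s:
        current = {nxt
                   for state in current
                   for nxt in transitions.get((state, ch), ())}
    return bool(current & accepting)   # some universe hit an accept state

# NFA over {0, 1} accepting strings that end in "01":
# q0 loops on anything and 'guesses' where the final "01" starts.
delta = {
    ('q0', '0'): {'q0', 'q1'},
    ('q0', '1'): {'q0'},
    ('q1', '1'): {'q2'},
}
```

Since `current` has at most |Q| elements, this simulation is polynomial in the input length, which is why NFAs (unlike NTMs for NP) cost nothing extra to determinize at runtime.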
