Probability Chip

technologyreview.com

87 points by eli_s 15 years ago · 27 comments

tricky 15 years ago

The thesis on which this is based:

http://phm.cba.mit.edu/theses/03.07.vigoda.pdf

edit: p. 135 is where he starts talking about implementation in silicon

moultano 15 years ago

I'm curious how they deal with probabilities very close to 1 or 0. Usually when people are doing Bayesian things with probabilities they work in log-odds (logit) space, so that the precision of values close to 1 or 0 is effectively unbounded. That seems like a hard thing to do with an analog circuit.
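
(To make the log-odds trick concrete, here is a minimal Python sketch; the logit/inv_logit helpers are illustrative, not anything from Lyric's actual design:)

  import math

  def logit(p):
      # Map a probability in (0, 1) to log-odds in (-inf, inf).
      return math.log(p / (1 - p))

  def inv_logit(x):
      # Map log-odds back to a probability.
      return 1 / (1 + math.exp(-x))

  # Probabilities crowded up against 1.0 are nearly indistinguishable
  # as raw floats, but stay comfortably far apart as log-odds:
  print(logit(1 - 1e-12))  # ~27.6
  print(logit(1 - 1e-15))  # ~34.5

  # A Bayesian update is just addition in log-odds space:
  posterior = inv_logit(logit(0.01) + math.log(0.99 / 0.05))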

  • wwalker3 15 years ago

    The founder's thesis mentions that they use a linearizer in their analog circuit, but all that does is give uniform precision over the entire logic-value range from 0 to 1 (that is, the same amount of voltage swing corresponds to the same amount of "logic value" change anywhere in the range).

    I suppose they could use a "non-linearizer" to put more of the precision near 0 and 1, but it would come at the expense of precision in the middle. The less voltage swing is involved, the more susceptible you are to noise from various sources.
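
    (A rough Python sketch of that tradeoff; the gain value and decode functions here are made up for illustration, not taken from the thesis:)

      import math

      def linear_decode(v):
          # Linearized circuit: equal voltage swing means equal
          # "logic value" change everywhere in [0, 1].
          return v

      def sigmoid_decode(v, gain=10.0):
          # Hypothetical "non-linearizer": flat near the rails, steep
          # in the middle, so a fixed amount of voltage noise perturbs
          # values near 0 and 1 less, and mid-range values more.
          return 1 / (1 + math.exp(-gain * (v - 0.5)))

      noise = 0.01  # say, 10 mV of noise on a 1 V swing
      for v in (0.5, 0.95):
          lin_err = linear_decode(v + noise) - linear_decode(v)
          sig_err = sigmoid_decode(v + noise) - sigmoid_decode(v)
          print(f"v={v:.2f}  linear err={lin_err:.4f}  sigmoid err={sig_err:.4f}")
      # Near v=0.95 the sigmoid error is roughly 10x smaller than the
      # linear one; at v=0.5 it is about 2.5x larger.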

xtacy 15 years ago

Are we likely to see more domain-specific chips in the future? Something like what http://www.deshawresearch.com/ has created: Anton, a custom chip optimised for molecular dynamics simulations.

  • wwalker3 15 years ago

    I think it is likely we will see more of these. But this probability chip sounds more like an analog computer than what D. E. Shaw has done.

    The Lyric web site says that they "model relationships between probabilities natively in the device physics", whereas D. E. Shaw's Anton chip sounds like it uses traditional logic gates the same way a GPU does.

    P.S. Sorry, I downvoted you by accident -- I meant to upvote you.

  • gwern 15 years ago

    Domain-specific chips are a cyclical trend. They come and go; at some times they have advantages, and at others they don't. (Remember Lisp machines? Good initially, but vastly outperformed by the end of their lifespan.) See for example the classic 'wheel of reincarnation' paper on graphics: http://cva.stanford.edu/classes/cs99s/papers/myer-sutherland...

    The fundamental problem as I see it is that any domain-specific chip will receive a tiny fraction of the R&D, economies of scale, and amortization that a general-purpose one will, and so its advantage is only temporary. As long as Moore's law is operating, this will be true.

    • gwern 15 years ago

      To quote the thesis on probabilistic chips:

      > In practice replacing digital computers with an alternative computing paradigm is a risky proposition. Alternative computing architectures, such as parallel digital computers have not tended to be commercially viable, because Moore’s Law has consistently enabled conventional von Neumann architectures to render alternatives unnecessary. Besides Moore’s Law, digital computing also benefits from mature tools and expertise for optimizing performance at all levels of the system: process technology, fundamental circuits, layout and algorithms. Many engineers are simultaneously working to improve every aspect of digital technology, while alternative technologies like analog computing do not have the same kind of industry juggernaut pushing them forward.

  • preview 15 years ago

    You're already seeing domain-specific chips, but they're in the form of an FPGA rather than an ASIC. If it can be implemented with traditional gates, an FPGA is the way to go for low to medium volume.

    While Lyric may incorporate classic gates in their design, it sounds like the heart of their technology is something other than classic gates.

perplexes 15 years ago

One step closer. http://en.wikipedia.org/wiki/Technology_in_The_Hitchhiker%27...

snippyhollow 15 years ago

My Ph.D. advisor will go crazy; he had his European research project on a probability computer turned down a few months ago.

api 15 years ago

Isn't this just the revenge of the analog computer?

Not saying it's a bad idea... I'm all for revisiting assumptions in computer design.

  • copper 15 years ago

    Sure, if you can represent your problem using probabilities :)

    That said, I'm more excited about the use of Lyric's technology in ECC memory. I'm skimming through Vigoda's thesis, and it seems that another very interesting application ought to be making even lower-power mobile baseband chips.

ajb 15 years ago

I thought I'd heard something like this before. From 2004: http://www.eetasia.com/ART_8800354714_499488_NT_92255b4a.HTM

That's a turbo decoder rather than a generic probability calculator, but it's doing probability calculations in the analog domain.

This sort of thing may make sense for error correction, but I don't think people will run general probability calculations on it. Too difficult to debug :-)

Though I do wonder if they can simulate a neuron more efficiently than digital logic can.
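
(For a taste of what a decoder like that 2004 chip computes, here is the standard sum-product update for a single parity check, as a Python sketch; analog decoders of this kind realize the tanh product with transistor physics rather than explicit arithmetic:)

  import math

  def parity_check_update(llrs, j):
      # Extrinsic log-likelihood ratio for bit j, given the LLRs of the
      # other bits in one even-parity constraint:
      #   L_out = 2 * atanh( prod_{i != j} tanh(L_i / 2) )
      prod = 1.0
      for i, L in enumerate(llrs):
          if i != j:
              prod *= math.tanh(L / 2)
      return 2 * math.atanh(prod)

  # Two confident bits and one uncertain one: the parity constraint
  # pulls the uncertain bit toward whatever satisfies the check.
  print(parity_check_update([4.0, 3.5, 0.2], 2))  # ~3.0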

RiderOfGiraffes 15 years ago

Sounds a lot like the ByNase protocol that Ward Cunningham (inventor of the wiki) came up with:

http://c2.com/cybords/wiki.cgi?BynaseProtocol
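
(ByNase, as I understand it, conveys a value as the fraction of 1s in a stream of random bits. A minimal Python sketch of that idea, in the spirit of stochastic computing; the helper names are made up:)

  import random

  def bitstream(p, n):
      # A value in [0, 1] encoded as n random bits, each 1 with probability p.
      return [1 if random.random() < p else 0 for _ in range(n)]

  # Multiplying two probabilities is just ANDing independent streams:
  a = bitstream(0.8, 100_000)
  b = bitstream(0.5, 100_000)
  est = sum(x & y for x, y in zip(a, b)) / len(a)
  print(est)  # ~0.4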

RiderOfGiraffes 15 years ago

Printer friendly, (almost) no ads, no pointless images:

http://www.technologyreview.com/printer_friendly_article.asp...

siavosh 15 years ago

Similar to the fuzzy-logic chips of the '90s?

frisco 15 years ago

How does this compare to what Navia Systems is working on?

jon_hendry 15 years ago

But how do you connect it to the cup of no tea?
