Ultra-efficient machine learning transistor cuts AI energy use by 99% (newatlas.com)

28 points by 0xa2 2 years ago · 9 comments

heyitsguay 2 years ago

You have to go from this article to the Northwestern press release to the source research paper to learn that this is for simple SVMs in wearable devices, e.g. for healthtech.

Neat, but it doesn't really have any bearing on the deep learning and generative AI trends.
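
For a sense of scale, here is a minimal sketch of that kind of workload: a small SVM over a handful of wearable-style sensor features. The data, features, and labels below are invented for illustration and are not from the paper.

  import numpy as np
  from sklearn.model_selection import train_test_split
  from sklearn.svm import SVC

  rng = np.random.default_rng(0)

  # Invented stand-ins for wearable-sensor features, e.g. heart-rate
  # variability, skin temperature, motion energy.
  X = rng.normal(size=(1000, 3))
  y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)  # toy binary health label

  X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
  clf = SVC(kernel="rbf").fit(X_train, y_train)
  print("test accuracy:", clf.score(X_test, y_test))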

heisgone 2 years ago

Are those transistors analog? It seems that AI would be fine with analog weight evaluation, and a lot of energy is wasted doing numerical calculations.

  • variadix 2 years ago

    My understanding is that this doesn't work for backpropagation, because it needs to know the forward pass computation exactly to adjust the weights. Analog computation introduces error in the form of noise (thermal, environmental, etc.), which makes training on analog a non-starter. It probably works for inference if the error is low enough, but training is where most of the power is spent unless the model is massively deployed.
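
    A rough sketch of that mechanism (toy model and numbers, nothing from the paper): backpropagation differentiates the forward pass it actually observed, so noise in an analog forward pass shows up directly in the computed gradient.

      import numpy as np

      rng = np.random.default_rng(0)
      w = np.array([0.8, -0.3])   # toy linear model: y = w . x
      x = np.array([1.0, 2.0])
      target = 0.5

      def forward(w, x, noise_std=0.0):
          # noise_std > 0 stands in for analog error (thermal noise, device mismatch)
          return float(w @ x) + rng.normal(0.0, noise_std)

      def grad(y_pred):
          # gradient of the squared-error loss 0.5 * (y_pred - target)**2 w.r.t. w
          return (y_pred - target) * x

      exact = grad(forward(w, x))                  # digital: noiseless forward pass
      noisy = grad(forward(w, x, noise_std=0.05))  # analog: the gradient inherits the noise
      print("relative gradient error:",
            np.linalg.norm(noisy - exact) / np.linalg.norm(exact))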

    • heisgone 2 years ago

      Thanks, that's interesting. If I understand you correctly, analog cannot be used for training but might be used to bake in a trained static model. I think things are evolving too fast right now for pre-trained models to be baked into silicon, but it's only a matter of time before that becomes common.

      • variadix 2 years ago

        There’s probably a way to make an analog AI chip programmable, but yeah the wiring (model architecture) is fixed when you fabricate/design the circuit. I think we’ll start to see simple models deployed to edge devices soon where low power solutions will be a necessity.

    • tbalsam 2 years ago

      You will get error regardless of whether it is analog or digital.

      Digital trades bandwidth for error correction, but this does not mean it is better, as many ML algorithms are noise-tolerant and would benefit greatly from the increased throughput.

      Like everything, I believe there are simply tradeoffs.
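
      A quick sketch of that noise-tolerance point (synthetic data and a toy noise model, assumed for illustration): perturbing a trained classifier's weights, roughly the way an analog substrate might, often barely moves its accuracy.

        import copy

        import numpy as np
        from sklearn.datasets import make_classification
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(0)
        X, y = make_classification(n_samples=2000, n_features=10, random_state=0)
        clf = LogisticRegression(max_iter=1000).fit(X, y)
        print("clean accuracy:", clf.score(X, y))

        for noise_std in (0.01, 0.05, 0.2):
            noisy = copy.deepcopy(clf)
            # perturb the learned weights the way a noisy analog substrate might
            noisy.coef_ = clf.coef_ + rng.normal(0.0, noise_std, clf.coef_.shape)
            print(f"accuracy with weight noise {noise_std}:", noisy.score(X, y))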

  • kridsdale3 2 years ago

    I have an analog weight evaluation neural net in my own skull. It runs fine on a few Snickers bars.

smusamashah 2 years ago

This seems very exciting. How long will it take for this to see the light of day and actually be used in some form?

hulitu 2 years ago

> Ultra-efficient machine learning transistor cuts AI energy use by 99%

By using a diode they could cut the energy use by 99.999%
