Ultra-efficient machine learning transistor cuts AI energy use by 99%
newatlas.com

You have to go from this article to the Northwestern press release to the source research paper to learn that this is for simple SVMs in wearable devices, like for healthtech.
Neat, but it doesn't really have any bearing on the deep learning and generative AI trends.
Are those transistors analog? It seems like AI would be fine with analog weight evaluation, and a lot of energy is wasted doing numerical calculations.
My understanding is that this doesn't work for backpropagation, because it needs to know the forward-pass computation exactly to adjust the weights. Analog computation introduces error in the form of noise (thermal, environmental, etc.), which makes training on analog a non-starter. It probably works for inference if the error is low enough, but training is where most of the power is spent, unless the model is massively deployed.
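To put some rough numbers on that intuition, here's a toy numpy sketch — nothing to do with the actual device, and the model size, noise level, and step count are all made up. Zero-mean noise on the forward pass leaves an error floor that plain gradient descent can't get below:

    import numpy as np

    rng = np.random.default_rng(0)
    w_true = rng.normal(size=16)
    X = rng.normal(size=(64, 16))
    y = X @ w_true

    def mse_grad(w, noise_std):
        # forward pass, optionally corrupted by analog-style zero-mean noise
        pred = X @ w + noise_std * rng.normal(size=len(X))
        return X.T @ (pred - y) / len(X)  # gradient of mean squared error

    def train(noise_std, steps=2000, lr=0.05):
        w = np.zeros(16)
        for _ in range(steps):            # plain gradient descent
            w -= lr * mse_grad(w, noise_std)
        return np.linalg.norm(w - w_true)

    print(train(noise_std=0.0))   # clean forward pass: error keeps shrinking
    print(train(noise_std=0.2))   # noisy forward pass: stalls at a noise floor

The clean run keeps shrinking the error, while the noisy run stalls once the true gradient is smaller than the noise — which is exactly the regime you end up in near convergence.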
Thanks, that's interesting. If I understand you correctly, analog can't be used for training but could be used to bake in a trained, static model. I think things are evolving too fast right now for pre-trained models to be baked into silicon, but it's only a matter of time before that becomes common.
There's probably a way to make an analog AI chip programmable, but yeah, the wiring (model architecture) is fixed when you fabricate/design the circuit. I think we'll start to see simple models deployed to edge devices soon, where low-power solutions will be a necessity.
You will get errors regardless of whether it is analog or digital.
Digital trades bandwidth for error correction, but that doesn't mean it's better: many ML algorithms are noise-tolerant and would benefit greatly from the increased throughput (rough sketch below).
Like everything, I believe it just comes down to tradeoffs.
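Here's a made-up toy of what I mean by noise-tolerant: a linear classifier trained digitally, then evaluated with its weights randomly perturbed to mimic analog error. The sizes and noise level are invented for illustration:

    import numpy as np

    rng = np.random.default_rng(1)
    X = rng.normal(size=(2000, 32))
    w_true = rng.normal(size=32)
    y = (X @ w_true > 0).astype(float)        # linearly separable labels

    # train ordinary logistic regression with gradient descent (the "digital" part)
    w = np.zeros(32)
    for _ in range(500):
        p = 1 / (1 + np.exp(-(X @ w)))        # sigmoid
        w -= 0.1 * X.T @ (p - y) / len(X)

    def accuracy(weights):
        return ((X @ weights > 0) == (y > 0.5)).mean()

    noise = 0.05 * np.abs(w).mean() * rng.normal(size=32)  # ~5% weight perturbation
    print(accuracy(w))          # clean weights
    print(accuracy(w + noise))  # roughly the same accuracy despite the noise

A few percent of weight noise barely moves the decision boundary, which is why inference looks like an easier target for analog hardware than training.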
I have an analog weight evaluation neural net in my own skull. It runs fine on a few Snickers bars.
This seems very exciting. How long will it take to see the light of day and actually be used in some form?
> Ultra-efficient machine learning transistor cuts AI energy use by 99%
By using a diode they could cut the energy use by 99.999%.