
Microsoft open sources the inference engine in its Windows ML platform

zdnet.com

15 points by anirudhgarg 7 years ago · 1 comment


whitten 7 years ago

Apparently, the Open Neural Network Exchange (ONNX) Runtime is an API that lets you run models locally instead of on a remote machine.

I didn't see any details about the inference engine itself, so I assume this is an API for neural-network inference rather than a symbolic-AI inference engine.
