Microsoft open sources the inference engine in its Windows ML platform
(zdnet.com)

Apparently, the Open Neural Network Exchange (ONNX) runtime is an API that lets you run models locally instead of on another machine.
I didn't see any details about the inference engine itself, so I assume this is a neural-network inference API rather than a symbolic AI inference engine.