ModelScan – open-source scanning for unsafe code in ML models (github.com)

4 points by wolftickets 2 years ago · 1 comment

wolfticketsOP 2 years ago

I lead product at Protect AI and we just released ModelScan. It is an open-source project that scans models to determine whether they contain unsafe code. It is the first model scanning tool to support multiple model formats: ModelScan currently handles H5, Pickle, and SavedModel, so it protects you when using PyTorch, TensorFlow, Keras, Sklearn, and XGBoost, with more formats on the way.
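
To make "unsafe code" concrete: the main risk in Pickle-based formats is that Python's pickle deserialization can execute arbitrary code embedded in a model file. The snippet below is a minimal illustrative sketch of that attack vector, not code from the ModelScan repo; the class name and shell command are made up for demonstration.

    import pickle
    import os

    class MaliciousPayload:
        # pickle calls __reduce__ to learn how to reconstruct the object;
        # returning (os.system, (cmd,)) means that *deserializing* the
        # file runs the command.
        def __reduce__(self):
            return (os.system, ("echo this could be any shell command",))

    blob = pickle.dumps(MaliciousPayload())  # looks like an ordinary model artifact
    pickle.loads(blob)                       # executes the command as a side effect

A scanner inspects the serialized bytes for suspicious imports and opcodes like this instead of loading the file. Scanning is a one-liner after a pip install modelscan: modelscan -p /path/to/model.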
