FP8: Efficient model inference with 8-bit floating point numbers (baseten.co)
2 points by philipkiely 2 years ago · 0 comments
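The linked article's topic is FP8 inference. As background (not taken from the article), a minimal pure-Python sketch of rounding a value to the nearest representable FP8 number, assuming the common OCP E4M3 variant (1 sign bit, 4 exponent bits with bias 7, 3 mantissa bits, max finite value 448):

```python
import math

def quantize_e4m3(x: float) -> float:
    """Round x to the nearest representable FP8 E4M3 value.

    Illustrative simulation only, assuming the OCP E4M3 format:
    exponent bias 7, normal exponent range [-6, 8], 3 mantissa bits,
    largest finite value 448. Overflow saturates to +/-448.
    """
    if x == 0.0:
        return 0.0
    sign = -1.0 if x < 0 else 1.0
    mag = abs(x)
    # Saturating cast: clamp to the largest finite E4M3 magnitude.
    if mag > 448.0:
        return sign * 448.0
    # Exponent of the value, clamped to the normal range [-6, 8];
    # values below 2**-6 fall into the subnormal range and keep e = -6.
    e = max(min(math.floor(math.log2(mag)), 8), -6)
    # With 3 mantissa bits, adjacent representable values at this
    # exponent are spaced 2**(e - 3) apart.
    step = 2.0 ** (e - 3)
    return sign * round(mag / step) * step
```

For example, 0.3 is not representable in E4M3 and rounds to 0.3125 (binary 1.010 x 2^-2), while anything above 448 saturates. Real FP8 inference stacks pair a cast like this with per-tensor scaling factors so activations use the format's limited dynamic range well.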