Go apps can directly integrate llama.cpp for HW-accelerated local inference (github.com)
2 points by deadprogram 4 months ago · 0 comments