Go apps can directly integrate llama.cpp for HW-accelerated local inference (github.com)
2 points by deadprogram a month ago · 0 comments
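The linked project is about calling llama.cpp in-process from Go (via cgo) rather than talking to a separate server, so inference runs inside the Go binary with whatever hardware acceleration the llama.cpp build enables. The sketch below only illustrates that general cgo pattern; the model path, the run_completion wrapper, and its canned output are hypothetical stand-ins, not the linked library's actual API, and a real integration would include "llama.h", add #cgo LDFLAGS for the compiled library, and call llama.cpp's model-loading and decoding functions inside the wrapper.

```go
package main

/*
#include <stdlib.h>
#include <string.h>

// Hypothetical stand-in for a wrapper around llama.cpp's C API. A real
// binding would include "llama.h", link against the compiled library, and
// do model loading, tokenization, and decoding here; this stub just returns
// a canned string so the sketch builds without llama.cpp installed.
static char* run_completion(const char* model_path, const char* prompt) {
	(void)model_path;
	(void)prompt;
	const char* canned = "[generated text would appear here]";
	char* out = (char*)malloc(strlen(canned) + 1);
	strcpy(out, canned);
	return out;
}
*/
import "C"

import (
	"fmt"
	"unsafe"
)

func main() {
	// Hypothetical model path; a real build would point at any GGUF file.
	modelPath := C.CString("models/example-model-q4_k_m.gguf")
	prompt := C.CString("Why is the sky blue?")
	defer C.free(unsafe.Pointer(modelPath))
	defer C.free(unsafe.Pointer(prompt))

	// The call crosses into C in-process: with a real llama.cpp build this is
	// where GPU/Metal/CUDA-accelerated inference would run, no HTTP server
	// or external runtime involved.
	out := C.run_completion(modelPath, prompt)
	defer C.free(unsafe.Pointer(out))

	fmt.Println(C.GoString(out))
}
```

The trade-off versus shelling out to a local server is the usual cgo one: a single self-contained binary and no IPC overhead, at the cost of needing a C toolchain and a compiled llama.cpp at build time.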