MiniLLM: A minimal system for running LLMs on consumer-grade Nvidia GPUs
GitHub link: https://github.com/kuleshov/minillm
Good luck with this, looks good so far.