greenido/multi-LLM-at-once: Query a few LLMs with one query and see who is the best 🙌🏾


⛄️ Multi Llama Tool

A web application for querying multiple large language model (LLM) instances with a single query.

For a longer explanation of the why/how/when, see: https://greenido.wordpress.com/2024/04/08/the-power-of-many-why-you-should-consider-using-multiple-large-language-models/

Screenshot: the last version with the older models.

How to run the app

  1. Clone the repo.
  2. Install the dependencies:
     npm install
     npm i ollama
  3. Open a terminal and start the server (a sketch of what such a server might do follows these steps):
     node server.mjs
  4. Open localhost:3000 in your browser.
  5. Start querying the local models with the query input box.
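
The repo's server.mjs is not reproduced here, but the core idea is to fan one prompt out to several local Ollama models and compare their answers (and response times). Below is a minimal sketch of that fan-out using the ollama npm package installed above; the model list, the queryAll helper, and the example prompt are illustrative assumptions, not the repo's actual code.

// server-sketch.mjs: illustrative only; the model list and helper names are
// assumptions, not the contents of the repo's server.mjs.
import ollama from 'ollama';

// Assumed models: use whichever ones you have already pulled with `ollama pull`.
const MODELS = ['llama3', 'mistral', 'gemma'];

// Send the same prompt to every model in parallel and record how long each took.
async function queryAll(prompt) {
  return Promise.all(
    MODELS.map(async (model) => {
      const start = Date.now();
      const res = await ollama.chat({
        model,
        messages: [{ role: 'user', content: prompt }],
      });
      return { model, ms: Date.now() - start, answer: res.message.content };
    })
  );
}

// Print each model's answer side by side so you can judge which one did best.
const results = await queryAll('Explain the Node.js event loop in one paragraph.');
for (const { model, ms, answer } of results) {
  console.log(`\n=== ${model} (${ms} ms) ===\n${answer}`);
}

Running the models in parallel with Promise.all keeps the total wait close to the slowest model rather than the sum of all of them, which is what makes the side-by-side comparison practical.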

How does it look?

TODO

  • Add timers per model
  • Add more models and a way to select them
  • Add more query options / pre-defined queries
  • Enable exporting results to a file
  • Add support for leveraging llama_index

License

MIT

Got an idea? Questions?

🏂 Feel free to open an issue or contact: https://x.com/greenido
