Show HN: A web-app to explore topics using LLM

github.com

20 points by graphitout 2 years ago · 3 comments

Lately, I've been tinkering with llama.cpp and the Ollama server. The speed of these tools caught my attention, even on my modest RTX 4060 setup, and I was quite impressed with the generation quality of models like Mistral.

But at the same time I was a bit unhappy: whenever I explore a topic through a chat interface, there is a lot of typing involved. So I wanted a tool that not only gives a response but also generates a set of "suggestions" that can be explored further just by clicking.

My experience in front-end development is limited. Nonetheless, I hacked together a small web app to do exactly that. It is built with Vue.js 3 + Vuetify.

Code: https://github.com/charstorm/llmbinge/

pbronez 2 years ago

Interesting concept. Can you share some more detail about the implementation? How are you generating the different portions of the interface? It seems like you have a couple of canned prompts that trigger a few exploratory ideas in addition to a primary response.

graphitout (OP) 2 years ago

You are right.

- llm_generate() - the core function; it calls the Ollama API.

- get_related() - returns a sequence of related topics for a given topic and description.

- llm_get_aspect_query() - this one is trickier. The interface shows a fixed set of aspects for every response (history, related ideas, people, etc.). I wanted a way to build a new query from an existing query and a given aspect, and this function does that rephrasing.

- App.vue: handle_related() - there is one small subtlety here: when navigating to another "suggested" topic, some context from the parent topic has to be passed along. (A rough sketch of all four pieces follows below.)
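
A minimal TypeScript sketch of how these four functions might fit together. Ollama's /api/generate endpoint and its { model, prompt, stream } request shape are real; the prompt wording, the parsing, and the function bodies are illustrative assumptions, not the repo's actual code:

    // Assumed local Ollama endpoint and model.
    const OLLAMA_URL = "http://localhost:11434/api/generate";
    const MODEL = "mistral";

    // Core function: send one prompt to the local Ollama server.
    async function llm_generate(prompt: string): Promise<string> {
      const resp = await fetch(OLLAMA_URL, {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify({ model: MODEL, prompt, stream: false }),
      });
      const data = await resp.json();
      return data.response;
    }

    // Ask for related topics as a line-separated list and split it.
    async function get_related(topic: string, desc: string): Promise<string[]> {
      const prompt = `Topic: ${topic}\nDescription: ${desc}\n` +
        `List 5 closely related topics, one per line, without numbering.`;
      const raw = await llm_generate(prompt);
      return raw.split("\n").map((s) => s.trim()).filter((s) => s.length > 0);
    }

    // Rephrase an existing query to focus on one fixed aspect
    // (history, people, related ideas, ...).
    async function llm_get_aspect_query(query: string, aspect: string): Promise<string> {
      const prompt = `Rewrite the query "${query}" so it asks specifically ` +
        `about its ${aspect}. Reply with the rewritten query only.`;
      return (await llm_generate(prompt)).trim();
    }

    // When a suggestion is clicked, carry the parent topic along as context.
    async function handle_related(parent: string, suggestion: string): Promise<string> {
      return llm_generate(`In the context of "${parent}", explain: ${suggestion}`);
    }

The context passing in handle_related is the part the comment calls out: a bare suggestion like "origins" is ambiguous on its own, so prepending the parent topic keeps the follow-up query grounded.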
