Ask HN: How are you handling the Biggest Vendor Lock-in of the Decade?

7 points by furilo 2 years ago · 4 comments · 1 min read


Open source LLMs are amazing; they will get better, and maybe even better than OpenAI's. But using OpenAI's APIs is so easy (they just work, they're cheap, they'll get cheaper) that, with all the hype added, many of us just start using them. We know there are other options, but since this thing we're doing is just a test, we'll have time to change in the future, because changing is so easy.

But wait: is changing just updating the URL to the API you call?

For many applications, you need to create embeddings. Each new embedding you create with a given model digs you a little deeper into the vendor lock-in...

And if moving data out of S3 is expensive, imagine having to re-embed all your content with another model or provider...
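One way to soften this particular lock-in is to route every embedding call through a single provider-agnostic interface, so a later switch is a code change in one place rather than a hunt through every call site. A minimal sketch, with made-up provider names standing in for real SDK calls:

```python
# Sketch: hide the embedding provider behind one interface so the rest of
# the codebase never imports a vendor SDK directly. The provider names and
# returned vectors are illustrative stand-ins, not real API calls.

from typing import Callable, Dict, List

# Registry of provider name -> embedding function (text -> vector).
_PROVIDERS: Dict[str, Callable[[str], List[float]]] = {}

def register(name: str):
    def deco(fn: Callable[[str], List[float]]):
        _PROVIDERS[name] = fn
        return fn
    return deco

@register("fake-openai")
def _fake_openai(text: str) -> List[float]:
    # Stand-in for a hosted embeddings endpoint.
    return [float(len(text)), 0.0, 1.0]

@register("fake-local")
def _fake_local(text: str) -> List[float]:
    # Stand-in for a locally hosted model.
    return [float(len(text)), 1.0, 0.0]

def embed(text: str, provider: str = "fake-openai") -> List[float]:
    """Single choke point: switching vendors means changing this default,
    not rewriting every call site."""
    return _PROVIDERS[provider](text)
```

Note the limit of this trick: vectors produced by different models live in different spaces and aren't comparable, so a switch still means re-embedding the whole stored corpus. The interface only contains the blast radius in the code, not in the data.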

So, what are your thoughts about the Biggest Vendor Lock-in of the Decade?

ilaksh 2 years ago

It's not that it's so hard to use open models -- there are easily accessible APIs now.

It's that GPT-4 still blows everything out of the water in terms of ability.

  • runjake 2 years ago

    +1. The APIs and programmability are there on open models. Heck, my M3 Max is churning out answers far quicker than GPT-4 does.

    But the quality isn’t nearly as good as GPT’s; the “hallucinations” from the open models are often annoyingly worse.

    • hilti 2 years ago

      Absolutely interesting. Would you mind sharing which models and tools you are using on your M3?

      • runjake 2 years ago

        Sure. I’m not a computer scientist so I’m using ollama[1] along with a number of the most popular models[2].

        It publishes a REST API on localhost, and you can use “Modelfiles”, which are modeled on Dockerfiles, to create customized models (à la GPTs).[3]
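For reference, a Modelfile is only a few lines. This sketch assumes a `llama2` base model has already been pulled; the system prompt and temperature are illustrative:

```
FROM llama2
PARAMETER temperature 0.3
SYSTEM You are a terse assistant that answers in one sentence.
```

You then build the customized model with `ollama create mymodel -f Modelfile` and run it like any other model.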

        It took 10 minutes to get all this working — most of it waiting for model downloads. I had working code and Modelfiles in another 20 minutes after that. I also have it (with the llama uncensored model) hooked into my Raycast.

        Switching models on the fly is pretty seamless.

        If for some reason my links go over anyone’s heads, there are a number of thorough and approachable ollama walkthroughs on YouTube.

        1. https://ollama.ai/

        2. https://ollama.ai/library

        3. https://github.com/jmorganca/ollama
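The localhost REST API the comment describes listens on port 11434 by default. A minimal non-streaming call, assuming an ollama server is running and the named model has been pulled (otherwise the request will fail), might look like:

```python
# Sketch of a call to ollama's local REST API (default port 11434).
# Assumes `ollama serve` is running and the model has been pulled.
import json
import urllib.request

def build_request(model: str, prompt: str) -> urllib.request.Request:
    """Build the POST request for /api/generate (non-streaming)."""
    payload = json.dumps({"model": model, "prompt": prompt, "stream": False})
    return urllib.request.Request(
        "http://localhost:11434/api/generate",
        data=payload.encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )

def generate(model: str, prompt: str) -> str:
    """Send the prompt and return the model's full response text."""
    with urllib.request.urlopen(build_request(model, prompt)) as resp:
        return json.loads(resp.read())["response"]
```

Switching models here is just changing the `model` string, which is what makes swapping models on the fly feel seamless.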
