Locally run a ChatGPT-style LLM Trained from 800k GPT-3.5-Turbo Generations
Not exactly seeing great performance with this model. It gets tripped up by ridiculously silly questions.
For example "if you have a spoon, a knife, a fork, a carrot and a calculator, which one would you use for math?"
> None. The question is grammatically incorrect as it implies that the person has all of those items in their possession at once
or
"The bus was going so fast it passed the racecar". What vehicle was going the fastest?"
> It's impossible to tell which one went faster, as they were both traveling at high speeds
ChatGPT and Claude also get tripped up by both questions.
I think Claude is correct on the merits (Claude’s awesome in general)
> Without more context about the speeds of the bus and racecar, I cannot determine which vehicle was going fastest based on the given statement. Simply saying that the bus passed the racecar does not provide enough information to compare their speeds.
This is true: the statement isn't inconsistent with the bus and the racecar travelling in opposite directions; it's just implied that they're going in the same direction.
GPT-4 (Bing) solves them correctly.
Yeah, gonna need a few more millions of training examples for that, unfortunately.
Repo linked in the second tweet: https://github.com/nomic-ai/gpt4all
I can’t wait for “competing with OpenAI” to get more clearly defined.
For example, could I use ChatGPT to create training data for a (non-llama) language model that I then sell per-device licenses to end users? OpenAI doesn’t sell per-device licenses, so I wouldn’t be competing with them.
Right?