Grok, an AI Modeled After the Hitchhiker's Guide to the Galaxy
(twitter.com)

Lol, context length 160 chars. Answers with lol, yeah, kewl.
It was trained on Twitter data.
OK. I still have 160k SMS messages from the last decade. They would make a perfect training set that would outperform xAI with decent lol, no, and OMG answers. If well trained, of course.
(Funny-mode off)
Let's see. Twitter has a few properties that could be very good for training, and short messages might be even better: the shorter the message, the less explicit information it carries, so the model has to do more abstraction to make sense of it. Let's see what happens.
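As a rough sketch of that idea (the file name, the JSONL format, and the "text" field are my own assumptions, not anything xAI has described), here is how one might carve a short-message training set out of a larger dump, keeping only messages under the old 160-character limit:

    import json

    MAX_CHARS = 160  # the SMS / early-Twitter limit discussed above

    def load_messages(path):
        # assumes one JSON object per line with a "text" field
        with open(path, encoding="utf-8") as f:
            for line in f:
                yield json.loads(line)["text"]

    def short_messages(messages, max_chars=MAX_CHARS):
        # the shorter the message, the less explicit context it carries,
        # so the model has to infer more from very little
        return [m.strip() for m in messages if 0 < len(m.strip()) <= max_chars]

    corpus = short_messages(load_messages("messages.jsonl"))
    print(f"{len(corpus)} messages of {MAX_CHARS} chars or fewer")

Nothing clever, but it makes the point concrete: everything longer than a text message gets thrown away, and whatever structure the model learns has to come from those scraps.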
I asked ChatGPT to write me a short email and it returned six paragraphs. Some brevity would be nice, but as with Twitter, 160 characters could be limiting and fail to provide enough context for more complex questions.
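For what it's worth, you can already force brevity on the API side with a token cap; this is just a sketch using the openai Python client (the model name and prompt are my own choices, not anything from the article):

    from openai import OpenAI

    client = OpenAI()  # reads OPENAI_API_KEY from the environment

    # Ask for a short email and cap the reply length so it cannot
    # ramble on for six paragraphs.
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{
            "role": "user",
            "content": "Write me a short email declining a meeting. Two sentences max.",
        }],
        max_tokens=80,  # hard ceiling on output length, roughly a tweet or two
    )
    print(response.choices[0].message.content)

Of course a hard cap just truncates the reply; it doesn't teach the model to be concise, which is the part a 160-character training diet would actually be testing.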