ChatGPT API – How do I handle the 4000 token limit for keeping context?

3 points by johnnyyyy 3 years ago · 2 comments · 1 min read


I am coding a little Discord ChatGPT bot just for private use. Right now I feed the entire message history back to ChatGPT, but of course you hit the 4000-token limit quite quickly.

The simplest solution would be to just delete the oldest messages, but that could drop important ones.
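For what it's worth, here is a minimal sketch of that rolling-window idea, assuming messages are stored as OpenAI-style `{"role": ..., "content": ...}` dicts. The token count is a crude characters-divided-by-four estimate; a real bot would use a proper tokenizer such as tiktoken.

```python
def estimate_tokens(message):
    # Very rough heuristic: ~4 characters per token for English text,
    # plus a small constant for role/formatting overhead.
    return len(message["content"]) // 4 + 4

def trim_history(messages, limit=4000, reserved=500):
    """Drop the oldest non-system messages until the history fits.

    `reserved` leaves room for the model's reply. The system prompt
    (first message, if its role is "system") is always kept.
    """
    system = messages[:1] if messages and messages[0]["role"] == "system" else []
    rest = messages[len(system):]
    budget = limit - reserved - sum(estimate_tokens(m) for m in system)
    while rest and sum(estimate_tokens(m) for m in rest) > budget:
        rest.pop(0)  # delete the oldest message first
    return system + rest
```

You would call `trim_history` on the stored conversation right before each API request, so the request never exceeds the budget even as the channel history keeps growing.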

Another idea is to have ChatGPT summarise the conversation when I am close to the limit and then keep only that summary.

Maybe there's already a better approach that I am not aware of?
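The summarisation idea could be sketched like this. Note that `summarise` is a placeholder: in the real bot it would itself be a chat-completion call with a "summarise the following conversation" instruction, and the threshold and `keep_recent` values are arbitrary choices, not anything the API prescribes.

```python
def summarise(messages):
    # Placeholder: a real implementation would send these messages to the
    # chat API with a "summarise this conversation" prompt and return the
    # model's reply.
    return "Summary of %d earlier messages." % len(messages)

def compact_history(messages, max_tokens=3000, keep_recent=6):
    """Once the (roughly estimated) token count exceeds `max_tokens`,
    replace everything but the last `keep_recent` messages with a single
    summary message, so recent context stays verbatim."""
    def est(m):
        return len(m["content"]) // 4 + 4  # crude ~4 chars/token estimate
    if sum(est(m) for m in messages) <= max_tokens:
        return messages
    old, recent = messages[:-keep_recent], messages[-keep_recent:]
    summary = {"role": "system", "content": summarise(old)}
    return [summary] + recent
```

Keeping the most recent messages verbatim matters because summaries lose detail; the model still sees the exact wording of whatever the user just said.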

bob1029 3 years ago

Prompt engineering & fine-tuning are the strategic solution.

https://platform.openai.com/docs/guides/fine-tuning

It can be cheaper to use a model that is more expensive per token if it can be fine-tuned on the context you would otherwise have to send with every prompt.

Right now, gpt-3.5-turbo cannot be tuned, so you must fit everything in the 4k limit. There are 4 other models that can be tuned.
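For reference, the fine-tuning guide linked above describes training data as JSONL files of prompt/completion pairs; the example lines below are hypothetical content in that format:

```json
{"prompt": "User asks about the token limit ->", "completion": " Explain the rolling-window approach.\n"}
{"prompt": "User asks about summarisation ->", "completion": " Describe summarising older messages.\n"}
```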
