GPT-J is a self-hosted, open-source analog of GPT-3: how to run it in Docker (tracklify.com)

After reading this, it seems clear (for most users) that the paid REST API for GPT-3 is a good deal.
By my math, just the hosting cost for GPT-J’s TPU node puts it at about the same price per token as OpenAI’s Davinci. And that’s if you’re using it 100% of the time for a whole month.
Add in the cost of a server to manage a REST API and a proxy to forward completion requests to an available node, and you’re definitely going to pay more than you would to OpenAI. Add the personnel cost of maintaining your own GPT-J cluster and writing your own software to handle completion requests, and you’re paying way, WAY more than you would to OpenAI. All that subsidization from M$ is making its way to the customer.
This is still something we’ll probably look into at work. But it won’t be a cost-saving measure.
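The utilization argument above can be sketched as a quick back-of-envelope calculation. All figures here are assumptions chosen for illustration (a hypothetical TPU rental rate, a hypothetical sustained throughput, and Davinci’s historical list price), not measurements from the article:

```python
# Back-of-envelope: self-hosted GPT-J cost per 1K tokens vs. the OpenAI API.
# Every number below is an ASSUMPTION for illustration only.

TPU_COST_PER_HOUR = 8.00    # assumed on-demand TPU node rental, USD/hour
TOKENS_PER_MINUTE = 8_000   # assumed sustained GPT-J generation throughput
DAVINCI_PER_1K = 0.06       # Davinci's historical list price, USD per 1K tokens

tokens_per_hour = TOKENS_PER_MINUTE * 60
self_hosted_per_1k = TPU_COST_PER_HOUR / (tokens_per_hour / 1_000)

print(f"Self-hosted at 100% utilization: ${self_hosted_per_1k:.4f} per 1K tokens")
print(f"Davinci API list price:          ${DAVINCI_PER_1K:.4f} per 1K tokens")

# The node bills by the hour whether or not it is generating, so the
# effective per-token price scales inversely with utilization:
for utilization in (1.0, 0.5, 0.1):
    effective = self_hosted_per_1k / utilization
    print(f"  {utilization:>4.0%} utilization -> ${effective:.4f} per 1K tokens")
```

The point of the sketch is the last loop: even if the 100%-utilization price looks competitive, a node that sits idle 90% of the time costs ten times as much per token, before adding the proxy server and personnel costs mentioned above.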
The GPT-3 API from OpenAI is only open to you if they approve of you. Anyone even remotely concerned with freedom (of software, of ideas) should not be using it.
That hasn't been true for a while now: https://openai.com/blog/api-no-waitlist/
> That hasn't been true for a while now: https://openai.com/blog/api-no-waitlist/
Sure, but you can substitute "approve of your use case" instead:
https://beta.openai.com/docs/usage-guidelines/content-policy
Step-by-step guide to running a GPT-3 analog in Docker: from renting the cheapest server with a GPU to a fully running installation
Did anyone work out how the pricing compares at realistic usage levels?