OpenAI has revamped the API for its GPT models. The update adds new features, reduces costs and lets chatbots handle much longer conversations.
Everyone in the tech world is long familiar with OpenAI’s flagship ChatGPT, but for the AI giant’s revenue model, selling API access is crucial. Large language models (LLMs) such as GPT-3.5 Turbo and GPT-4 can be deployed for all sorts of purposes, from Copilots in Windows and GitHub to virtual assistants in Salesforce products.
Big context window, more applications
Firstly, the API update offers a larger context window for longer prompts, meaning the chatbot can take in more information and generate a response accordingly. A single request to GPT-3.5 Turbo can now contain 16,000 tokens, up from 4,000 before. OpenAI has not previously made a model with such a large window generally available, although a GPT-4 version with a 32,000-token window already exists internally.
Concretely, this opens up more applications for OpenAI’s technology. With 4,000 tokens, a chatbot can handle a press release of a few pages quite well, but longer pieces of text can only be transformed by feeding them in in parts. With 16,000 tokens, which easily covers 8-10 pages of a document, far more text can be summarized or transformed in one go. As with other generative AI applications, it remains to be seen what innovations this will enable.
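In practice, developers select the larger window by model name. Below is a minimal sketch of summarizing a multi-page document in a single request, assuming the openai Python package’s older ChatCompletion interface and the gpt-3.5-turbo-16k model name from the announcement; the file path is just a placeholder.

```python
import openai

# Placeholder for a document of several pages; with the 16k model the whole
# text fits in one request instead of having to be split into chunks.
long_document = open("press_release.txt").read()

response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo-16k",  # variant with the enlarged 16,000-token context window
    messages=[
        {"role": "user", "content": f"Summarize the following document:\n\n{long_document}"},
    ],
)
print(response["choices"][0]["message"]["content"])
```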
From text to tools
A genuinely new feature is function calling, which lets developers have a chatbot invoke external tools. A command in natural language can now be turned into a structured call to, say, an e-mail service or a database. This can be very useful for many everyday purposes, although it needs a bit more control for the best results. That control happens to be another of OpenAI’s additions.
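Roughly, the developer describes the available tools as JSON Schema and the model answers with the name and arguments of the function it wants called. A minimal sketch under the same assumption as above (the older openai ChatCompletion interface); the send_email function and its parameters are hypothetical, purely for illustration.

```python
import json
import openai

# Hypothetical tool description; function calling expects a JSON Schema like this.
functions = [
    {
        "name": "send_email",  # example name, not a built-in OpenAI function
        "description": "Send an e-mail to a recipient",
        "parameters": {
            "type": "object",
            "properties": {
                "to": {"type": "string", "description": "Recipient address"},
                "subject": {"type": "string"},
                "body": {"type": "string"},
            },
            "required": ["to", "subject", "body"],
        },
    }
]

response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo-0613",
    messages=[{"role": "user", "content": "Mail Sam that the demo moves to 3 pm."}],
    functions=functions,
)

message = response["choices"][0]["message"]
if message.get("function_call"):
    # The model only returns the name and JSON arguments of the call it proposes;
    # actually sending the e-mail remains the application's job.
    arguments = json.loads(message["function_call"]["arguments"])
    print(message["function_call"]["name"], arguments)
```

The application executes the call itself and can pass the result back to the model in a follow-up message, so the conversation continues with the tool’s output.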
OpenAI’s LLMs now offer more “steerability”: the models should follow instructions such as “you are a sales assistant and you only talk about our products” more faithfully.
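Such instructions are typically passed as a system message at the start of the conversation; the wording below is just an example.

```python
# The system message pins down the assistant's role and boundaries;
# the user message is an arbitrary test prompt.
messages = [
    {"role": "system", "content": "You are a sales assistant and you only talk about our products."},
    {"role": "user", "content": "What do you think of the weather today?"},
]
```

This messages list slots into the same ChatCompletion call as in the earlier sketches.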
Cheaper
Finally, OpenAI is lowering its prices. Input tokens for GPT-3.5 Turbo now cost 25 percent less than before, which works out to 0.0015 dollars per 1,000 input tokens and 0.002 dollars per 1,000 output tokens. Ars Technica calculated that this amounts to roughly 700 pages per dollar.
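The back-of-the-envelope math behind such a figure is simple; the sketch below assumes roughly 1,000 tokens per page, which is our own assumption rather than Ars Technica’s stated method.

```python
# Rough estimate of how many pages of input text one dollar buys at the new rate.
price_per_1k_input_tokens = 0.0015  # dollars, GPT-3.5 Turbo input
tokens_per_page = 1_000             # assumption: roughly 750 words of prose per page

tokens_per_dollar = 1_000 / price_per_1k_input_tokens   # about 667,000 tokens
pages_per_dollar = tokens_per_dollar / tokens_per_page  # about 667 pages
print(round(pages_per_dollar))
```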
Also read: OpenAI is being sued for ChatGPT’s defamatory “hallucinations”