Google is making it more attractive to switch to Gemini. Developers currently using OpenAI’s API to access that company’s LLMs can migrate to Google by changing three lines of code.
Previously, developers had to call the Gemini API directly, but the Gemini models are now compatible with the OpenAI libraries and the REST API. The required Python code is already quite compact, and even the equivalent REST call comes in at fewer than 300 characters. Developers not yet using OpenAI are still advised to connect to Google’s API directly.
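As an illustration, here is a minimal sketch of what the switch could look like with the OpenAI Python library. The only lines that differ from an existing OpenAI integration are the API key, the base URL and the model name; the endpoint URL and model name below reflect Google's announced OpenAI-compatible endpoint and should be treated as indicative rather than definitive.

```python
from openai import OpenAI

# Point the standard OpenAI client at Google's OpenAI-compatible endpoint.
# The api_key, base_url and model are the three lines that change.
client = OpenAI(
    api_key="YOUR_GEMINI_API_KEY",  # a Gemini API key instead of an OpenAI key
    base_url="https://generativelanguage.googleapis.com/v1beta/openai/",
)

response = client.chat.completions.create(
    model="gemini-1.5-flash",  # a Gemini model name instead of a GPT model
    messages=[
        {"role": "user", "content": "Explain how AI works"},
    ],
)

print(response.choices[0].message.content)
```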
The Gemini models thereby join the broader Google Vertex AI portfolio, where switching from OpenAI is similarly straightforward for all hosted LLMs. Vertex AI’s offering also goes well beyond API access to models, with tools such as Vertex AI Agent Builder.
Also read: Google’s Vertex AI has generative AI features
Standardization
Google’s move reflects a gradual standardization of AI deployment. Because OpenAI is considered the market standard for GenAI deployment via APIs, Elon Musk’s xAI likewise chose to make its service compatible with the OpenAI SDKs (as well as Anthropic’s).
Incidentally, the OpenAI API dates back to June 2020, when GPT-3 had only just appeared. ChatGPT (and with it the GenAI hype) did not emerge until November 2022. As the first mover in the GenAI market, OpenAI set the standard, but by offering this compatibility, competitors hope to help users avoid lock-in.
Price war?
Recently, API prices for GenAI use have fallen sharply. OpenAI, for example, cut the price of its GPT-4o API by 50 percent for input and 33 percent for output in the middle of this year. Google went even further with a 75 percent price reduction for Gemini 1.5 Flash. With this further standardization, users can simply compare prices and judge whether the cheapest model already meets their requirements.
Also read: Problems with Orion model forces OpenAI to adopt new strategies