
Google has turned the usual integration of generative AI on its head. Where developers are used to calling a model from their application code to apply genAI, Google has now eliminated that step. It has made Gemini 1.0 Pro, its latest LLM, directly available in the BigQuery and AlloyDB databases. Through an SQL query, you can unleash Gemini directly on the data.

Google has eliminated the application logic normally needed to talk to an LLM. You can specify directly in an SQL query in BigQuery or AlloyDB that a particular text should be summarized, so the query’s output is immediately the summary rather than the full text.
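In BigQuery, this works through BigQuery ML’s remote models. A minimal sketch of what such a query could look like, assuming a Vertex AI connection has already been set up; the dataset, table and connection names below are placeholders:

```sql
-- Register Gemini as a remote model over an existing Vertex AI connection
-- (`summaries`, `us.vertex_ai_conn` and `support_tickets` are placeholder names).
CREATE OR REPLACE MODEL `summaries.gemini_pro`
  REMOTE WITH CONNECTION `us.vertex_ai_conn`
  OPTIONS (ENDPOINT = 'gemini-pro');

-- Summarize a text column directly in SQL: the query returns the summary,
-- not the original ticket text.
SELECT
  ticket_id,
  ml_generate_text_llm_result AS summary
FROM ML.GENERATE_TEXT(
  MODEL `summaries.gemini_pro`,
  (
    SELECT
      ticket_id,
      CONCAT('Summarize this support ticket in two sentences: ', ticket_text) AS prompt
    FROM `summaries.support_tickets`
  ),
  STRUCT(TRUE AS flatten_json_output)
);
```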

This saves developers a considerable amount of code, because they can work directly with the data the model generates. Calling the model will undoubtedly slow queries down slightly, but given that the integration is done at a high level, it is unlikely to result in a poor user experience.

This direct integration is good news for data analysts in particular. They can now connect the ML models they have built to analyze enterprise data in BigQuery directly to Gemini, which should enable more advanced analyses and new insights.

Google says this can drive big advances for providers in healthcare, as well as for customer experience and engagement in retail, financial services and telcos. Additionally, the insights Gemini can offer on data in BigQuery go beyond structured data alone.

Also read: Gemini 1.5 is much more than a new foundation model

Sound recordings, images and documents can also be insightful. With Gemini you can create structured data about these unstructured sources, and you can also search for similar unstructured data using vector search. For example, a recording of a conversation with a customer service representative can easily be transcribed and analyzed. What was discussed and what agreements were made with the customer are questions Gemini can answer quickly. If questions remain open, possibly involving images, similar cases can also be found through vector search.
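Finding those similar cases requires embeddings of the unstructured data. A rough sketch of how call transcripts could be embedded in BigQuery, assuming a remote text-embedding model registered over a Vertex AI connection; all names are placeholders and the exact options may differ per setup:

```sql
-- Register a text-embedding model (placeholder names throughout).
CREATE OR REPLACE MODEL `support.text_embedding`
  REMOTE WITH CONNECTION `us.vertex_ai_conn`
  OPTIONS (ENDPOINT = 'textembedding-gecko');

-- Store one embedding per call transcript; ML.GENERATE_EMBEDDING expects the
-- input text in a column named `content`.
CREATE OR REPLACE TABLE `support.transcript_embeddings` AS
SELECT
  call_id,
  content,
  ml_generate_embedding_result AS embedding
FROM ML.GENERATE_EMBEDDING(
  MODEL `support.text_embedding`,
  (SELECT call_id, transcript AS content FROM `support.call_transcripts`),
  STRUCT(TRUE AS flatten_json_output)
);
```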

Vector search comes to more Google databases

Vector search plays a vital role here because it can find similar unstructured data. For example, all red shirts sold in a store can be found based on an image. Of course, it makes more sense to have metadata for all products, including color, size and availability. If that metadata is not available, however, it can be generated with Gemini, or vector search can be used to find similar products.
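In BigQuery, that lookup maps to the VECTOR_SEARCH function. A hedged sketch, assuming a `retail.product_embeddings` table with precomputed embeddings (for instance generated with ML.GENERATE_EMBEDDING from product descriptions or images); the table, column and ID values are illustrative:

```sql
-- Find the ten products whose embeddings are closest to that of a known red shirt.
SELECT
  base.product_id,
  base.product_name,
  distance
FROM VECTOR_SEARCH(
  TABLE `retail.product_embeddings`,
  'embedding',
  (
    -- Use one existing product's embedding as the query vector.
    SELECT embedding
    FROM `retail.product_embeddings`
    WHERE product_id = 'red-shirt-001'
  ),
  top_k => 10
);
```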

Google seems to be making big strides with vector search: not only are BigQuery and AlloyDB getting the feature, it is also being added to Cloud SQL, Spanner and Memorystore for Redis. Firestore and Bigtable are integrating with Vertex AI Vector Search to make the capability available there.
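In the PostgreSQL-based services (Cloud SQL for PostgreSQL and AlloyDB), vector search surfaces through the pgvector extension. A minimal, purely illustrative sketch with a toy 3-dimensional vector; real embedding models produce hundreds of dimensions:

```sql
-- Enable pgvector and store embeddings next to regular product data.
CREATE EXTENSION IF NOT EXISTS vector;

CREATE TABLE products (
  id        bigserial PRIMARY KEY,
  name      text,
  embedding vector(3)  -- toy dimension; real models use e.g. 768 dimensions
);

INSERT INTO products (name, embedding) VALUES
  ('red shirt',  '[0.9, 0.1, 0.0]'),
  ('blue shirt', '[0.1, 0.9, 0.0]');

-- Nearest-neighbour search: the <-> operator orders rows by L2 distance
-- to the query embedding.
SELECT name
FROM products
ORDER BY embedding <-> '[0.85, 0.15, 0.05]'
LIMIT 5;
```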