To label generative AI a “hot topic” is an understatement. The technology is everywhere, from chatbots to data analytics. However, Douwe Kiela (CEO) and Amanpreet Singh (CTO) of startup Contextual AI see a number of hurdles the enterprise market must clear before it can fully commit to this form of artificial intelligence.
On Wednesday, Contextual AI emerged from stealth mode, and the startup can already count on $20 million (€18.6 million) in investment. Speaking to TechCrunch, Kiela said the company wants to break with the prevailing trend of building generative AI mainly for consumers. It is no coincidence that tools such as Google Bard and ChatGPT are good at inspiring users but cannot yet reliably generate faultless new information. For that, Kiela believes, a “next generation” variant is needed, one that caters to business.
After all, there is already considerable resistance to the unrestrained growth of AI applications. The debate over the use of user data and possible violations of privacy and copyright law is far from settled. The EU, the U.S. and other political entities are working on AI regulation, and compliance with those rules will ultimately shape how businesses can deploy the technology. That is where Contextual AI wants to come in: with responsible use cases, so that everyone can have confidence in AI.
Spurs earned
Kiela and co-founder Singh have certainly earned their spurs. Both men previously worked at AI companies Hugging Face and Meta, where Kiela researched retrieval augmented generation (RAG). This technique is far less prone to “hallucinations,” the confident misinformation AI bots sometimes produce. You can think of RAG as a layer on top of an LLM that keeps track of which sources contain the information the model draws on. It provides the LLM with more context around the question, so the model grounds itself better in reality.
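To make the “layer on top of an LLM” idea concrete, here is a minimal sketch of the RAG pattern. It is purely illustrative and not Contextual AI’s actual implementation: the `retrieve` function uses naive word overlap where real systems use vector embeddings, and the prompt would be sent to an actual LLM API rather than just constructed.

```python
def retrieve(question, documents, k=2):
    """Rank documents by naive word overlap with the question.
    (Real RAG systems use embedding similarity instead.)"""
    q_words = set(question.lower().split())
    scored = sorted(
        documents,
        key=lambda d: len(q_words & set(d.lower().split())),
        reverse=True,
    )
    return scored[:k]

def build_prompt(question, sources):
    """Prepend retrieved, numbered sources so the model can ground
    its answer and the user can check where it came from."""
    context = "\n".join(f"[{i + 1}] {s}" for i, s in enumerate(sources))
    return (
        "Answer using only these sources:\n"
        f"{context}\n\nQuestion: {question}"
    )

# Hypothetical toy corpus for illustration.
docs = [
    "Contextual AI was founded by Douwe Kiela and Amanpreet Singh.",
    "RAG grounds language model answers in retrieved documents.",
    "The weather in Amsterdam is often rainy.",
]
question = "Who founded Contextual AI?"
sources = retrieve(question, docs)
prompt = build_prompt(question, sources)
```

The point of the sketch is the flow: retrieve relevant documents first, then hand them to the model alongside the question, so the answer can cite its sources instead of relying on whatever the model memorized.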
That lack of transparency can lead to very concrete errors. Because LLMs are not “context-aware,” they can serve up misinformation without making clear what it is based on. Even when an answer is correct, it would be desirable to be able to check the source. Bing Chat integrates source citations, but the feature is far from error-free. RAG should eventually fix this problem, among other existing problems with generative AI, Kiela thinks.
The big advantage, apparently, is that new AI models can be much smaller without sacrificing accuracy. That is a long-term necessity, since we cannot keep building supercomputers and data centers endlessly. In any case, running AI models locally could become far more practical with RAG.
Contextual AI currently has only eight employees, but it hopes to grow to 20 by the end of the year. It also hopes large companies will be open to trying the technology on a pilot basis.
Also read: Marketing industry sees generative AI as game changer