Generative AI

GPT-4, Gemini, Claude 3, Devin, Llama 3, Mixtral: the list of AI models is growing by the day. In this Insight, we present analyses, background and current developments on GenAI and its various applications. Existing Large Language Models (LLMs) each have their own strengths and niches, which calls for an open-ended approach: a multi-model strategy, combining several models rather than committing to one, is likely to become common.
Without data, AI is nothing, so it is important to make the right choices about where you house that data (e.g. a private cloud) and what kind of AI you run on it. Where proprietary data is concerned, a smaller model with fewer parameters may be an attractive option. RAG (Retrieval-Augmented Generation) is particularly suitable for grounding an AI in such proprietary data: rather than retraining the model, relevant documents are retrieved from your own data store and supplied to the model as context at query time. This Insight aims to help IT professionals make informed choices in the ever-changing field of AI.
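To make the RAG idea concrete, here is a minimal sketch of the retrieve-then-prompt pattern. The keyword-overlap scoring and in-memory document list are illustrative stand-ins: a real deployment would use an embedding model and a vector database, and would send the assembled prompt to an LLM.

```python
# Sketch of Retrieval-Augmented Generation (RAG): instead of training a
# model on proprietary data, relevant documents are retrieved at query
# time and injected into the prompt as context.
# The scoring function below is a toy stand-in for embedding similarity.

def score(query: str, doc: str) -> int:
    """Toy relevance score: number of words the query and document share."""
    return len(set(query.lower().split()) & set(doc.lower().split()))

def retrieve(query: str, docs: list[str], k: int = 2) -> list[str]:
    """Return the k documents most relevant to the query."""
    return sorted(docs, key=lambda d: score(query, d), reverse=True)[:k]

def build_prompt(query: str, docs: list[str]) -> str:
    """Assemble the augmented prompt: retrieved context plus the question."""
    context = "\n".join(retrieve(query, docs))
    return f"Context:\n{context}\n\nQuestion: {query}"

# Hypothetical proprietary documents, for illustration only.
docs = [
    "Invoices are archived in the private cloud for seven years.",
    "The cafeteria serves lunch from 12:00 to 14:00.",
    "Access to proprietary data requires two-factor authentication.",
]

prompt = build_prompt("How long are invoices archived?", docs)
print(prompt)  # the prompt now contains the invoice-retention document
```

The key design point is that the proprietary data never enters the model's weights: it stays in your own store (e.g. a private cloud), and only the retrieved snippets travel with each query.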