Docker has introduced a new generative AI stack in collaboration with partners Neo4j, LangChain and Ollama. The stack is intended to help developers build generative AI applications quickly and easily.
The collaboration between Docker and its partners has produced the so-called GenAI Stack. Its aim is to let developers create generative AI applications quickly and easily, without having to hunt down and assemble all the necessary technologies and configurations themselves.
To that end, the stack combines Docker's tooling, which developers already know well, with technology from Neo4j, LangChain and Ollama for developing generative AI applications.
GenAI stack functionality
The stack assembles pre-configured components: open-source LLMs served and supported by Ollama, Neo4j's vector and graph database for improved AI/ML performance, and the LangChain framework for orchestrating context-aware reasoning applications.
The joint GenAI stack focuses primarily on popular use cases for generative AI applications. The stack is now available in the Docker Desktop Learning Center or via GitHub.
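For orientation, a Docker Compose file for such a stack might wire the three components together along the following lines. Note that the service names, images, ports and settings below are illustrative assumptions, not the actual contents of the published GenAI Stack.

```yaml
# Illustrative sketch only -- service names, images and settings are
# assumptions, not the compose file shipped with the GenAI Stack.
services:
  llm:
    image: ollama/ollama:latest      # serves open-source LLMs locally
    ports:
      - "11434:11434"                # Ollama's default API port
  database:
    image: neo4j:5                   # vector and graph database
    environment:
      - NEO4J_AUTH=neo4j/password    # placeholder credentials
    ports:
      - "7687:7687"                  # Bolt protocol port
  app:
    build: ./app                     # hypothetical LangChain-based app code
    depends_on:
      - llm
      - database
```

The point of such a layout is the one the article describes: the moving parts arrive pre-configured, so a developer can start the whole stack with a single `docker compose up` instead of installing and wiring each component by hand.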
Also read: Docker launches tools to develop container applications and security