Version 1.3 of Red Hat Enterprise Linux AI (RHEL AI) introduces the Granite 3.0 8b model to the platform and simplifies training data preparation.
Developed by IBM, Red Hat’s parent company, the Granite LLMs are part of Red Hat’s offering to bring AI to enterprise environments. Granite can be used for applications such as summarization, question answering, and classification. Examples include completing a line of code for developers or analyzing large data sets to extract insights.
RHEL AI 1.3 adds support for Granite 3.0 8b, which Red Hat describes as a converged model suitable for natural language processing, code generation, and function calling. The model is fully supported for English-language use cases; non-English use cases, as well as code generation and function calling, are available as a developer preview.
Preparing data
To prepare data for model training, Red Hat again leans on IBM. IBM recently open-sourced Docling, a project that converts documents into formats such as Markdown and JSON. RHEL AI 1.3 integrates Docling as a new feature, allowing, for example, PDF files to be converted easily for data ingestion in InstructLab.
With Docling on board, RHEL AI also gains context-aware chunking. This technique takes the structure and semantic elements of documents into account when splitting them into pieces for training or retrieval, which leads to better responses from AI applications.
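To make the idea concrete: context-aware chunking splits a document along its structural boundaries rather than at fixed character counts. The sketch below is not Docling's actual implementation; it is a minimal, standard-library illustration of the principle, splitting a Markdown document on its headings so that each chunk keeps its section context.

```python
# Minimal illustration of structure-aware chunking (not Docling's actual
# implementation): split a Markdown document on headings so that every
# chunk carries the heading of the section it belongs to.
import re

def chunk_by_heading(markdown: str) -> list[dict]:
    """Split Markdown into chunks, one per heading-delimited section."""
    chunks = []
    current = {"heading": "", "text": ""}
    for line in markdown.splitlines():
        if re.match(r"#{1,6}\s", line):  # a Markdown heading starts a new chunk
            if current["heading"] or current["text"].strip():
                chunks.append(current)
            current = {"heading": line.lstrip("# ").strip(), "text": ""}
        else:
            current["text"] += line + "\n"
    chunks.append(current)
    return chunks

doc = """# Granite 3.0
A converged model for NLP and code.

## Function calling
Supported as a developer preview.
"""
for c in chunk_by_heading(doc):
    print(c["heading"], "->", c["text"].strip())
```

A fixed-size splitter could cut straight through a sentence or separate a paragraph from its heading; keeping the heading with each chunk preserves the context a model needs to answer accurately.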