
JetBrains launches its own AI code assistant

JetBrains is expanding the AI capabilities of its tools. The company is launching an AI assistant it developed in-house to support developers throughout the development process.

Version 2024.3 of the JetBrains tools brings two improvements worth highlighting. They give developers more insight within the IDEs and improve the AI support in projects.

JetBrains IDEs now display the logical structure of code. This gives developers more insight into a project and a more structured way to track down and fix bugs. To support these improvements, JetBrains is taking K2 mode out of beta.

More AI support

Furthermore, the new version of the AI Assistant integrates noteworthy new LLMs, including Google's Gemini 1.5 Pro and Gemini 1.5 Flash. Developers can still choose to have the chatbot powered by OpenAI models or their own models instead.

In addition to a better chat experience, developers get improved support while writing code. This comes from Mellum, a model the company developed itself. The assistance appears directly in the editor for the most natural experience possible.

Mellum has been available for several weeks. The LLM was built to deliver faster, smarter results by taking the surrounding context into account when generating output. It is not possible to invoke a competing LLM for code completion in the JetBrains tools: the AI Assistant automatically selects Mellum for that task.

Also read: GitHub Copilot welcomes OpenAI competitors