JFrog has introduced a new Model Context Protocol (MCP) Server that allows AI assistants and large language models to interact directly with the JFrog Platform. The server aims to boost developer productivity by enabling natural language commands within coding environments.
As JFrog Co-Founder and CTO Yoav Landman noted, “The developer tool stack and product architecture has fundamentally changed in the AI era.” The MCP Server addresses this shift by connecting AI tools directly to software supply chain management.
The solution targets the growing demand for AI-powered development workflows while preserving the security and governance requirements of enterprise software development. By supporting self-service AI across the development lifecycle, JFrog aims to help teams build applications faster without compromising security.
Integration with development tools
The MCP Server integrates with popular agentic coding environments and IDEs through standard MCP clients. This enables developers to access JFrog Platform capabilities without leaving their preferred development environment.
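To illustrate what that integration looks like from the client side, the sketch below uses the official MCP Python SDK to connect a standard MCP client to a locally launched MCP server and list the tools it exposes. The launch command, package name, and environment variables shown here are placeholders, not JFrog's documented values; the actual settings for the JFrog MCP Server come from JFrog's setup guide.

```python
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# Hypothetical launch command and environment for the server process;
# the real command, arguments, and credentials are defined in JFrog's
# implementation guide, not here.
server_params = StdioServerParameters(
    command="npx",
    args=["-y", "example-mcp-server"],  # placeholder package name
    env={
        "JFROG_URL": "https://example.jfrog.io",  # placeholder instance URL
        "JFROG_ACCESS_TOKEN": "<token>",          # placeholder credential
    },
)

async def main() -> None:
    # Open a stdio transport to the server, perform the MCP handshake,
    # then discover which tools the server advertises.
    async with stdio_client(server_params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            for tool in tools.tools:
                print(tool.name, "-", tool.description)

asyncio.run(main())
```

An IDE or agentic coding environment performs essentially the same handshake and tool discovery automatically once the server is registered in its MCP client configuration.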
Key features include repository management, build status monitoring, and detailed package information queries. The server also provides vulnerability scanning results directly within the development workflow.
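In MCP terms, capabilities like these are exposed as tools that a client invokes with structured arguments on behalf of the AI assistant. The fragment below continues the session from the previous sketch; the tool name and argument shape are illustrative assumptions rather than JFrog-documented tool definitions, which a client would instead discover via `list_tools()`.

```python
from mcp import ClientSession

async def check_package(session: ClientSession) -> None:
    # Hypothetical tool name and arguments; the real tools exposed by the
    # JFrog MCP Server are reported by session.list_tools().
    result = await session.call_tool(
        "get_package_info",
        arguments={
            "repository": "libs-release-local",
            "package": "org.example:demo:1.0.0",
        },
    )
    # Tool results arrive as a list of content blocks; print the text ones.
    for block in result.content:
        if block.type == "text":
            print(block.text)
```

The assistant then folds the returned text (package metadata, build status, or scan findings) into its response, which is how these results surface directly in the development workflow.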
Developers can access the preview through the AWS Marketplace or by following the step-by-step implementation guide provided by JFrog. The company seeks feedback during this preview period to refine the offering before general availability.