
Microsoft Azure makes AI adoption easier with OpenAI Data Zones

Microsoft has added several new capabilities to its Azure AI portfolio. Key announcements include the arrival of Azure OpenAI Data Zones, a latency SLA for token generation, new AI models, and the availability of Azure AI models through GitHub Marketplace.

The new capabilities within the Microsoft Azure AI portfolio should further assist users of these managed AI services in building and scaling AI solutions and applications.

Azure OpenAI Data Zones

One important addition is Azure OpenAI Data Zones. These dedicated zones for the managed OpenAI capabilities within Azure should give companies more control over data privacy and residency.

Specifically, this option allows customers in the EU and the United States to process and store their data within those respective regions. Data Zones are currently available for the Standard (PayGo) tier and will soon come to the Provisioned tier.
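
For developers who prefer to script this, the sketch below shows roughly how such a data-zone deployment could be created with the Python azure-mgmt-cognitiveservices SDK. The resource names, the model version and the "DataZoneStandard" SKU name are assumptions for illustration and may differ per region and tier.

```python
# Minimal sketch (assumptions noted above): creating an Azure OpenAI deployment
# pinned to a data zone via the azure-mgmt-cognitiveservices SDK.
from azure.identity import DefaultAzureCredential
from azure.mgmt.cognitiveservices import CognitiveServicesManagementClient
from azure.mgmt.cognitiveservices.models import (
    Deployment,
    DeploymentModel,
    DeploymentProperties,
    Sku,
)

subscription_id = "<subscription-id>"      # placeholder
resource_group = "<resource-group>"        # placeholder
account_name = "<azure-openai-resource>"   # placeholder

client = CognitiveServicesManagementClient(DefaultAzureCredential(), subscription_id)

# The SKU name selects the deployment type; a data-zone SKU is meant to keep
# processing and storage within the chosen geography (EU or US).
deployment = client.deployments.begin_create_or_update(
    resource_group_name=resource_group,
    account_name=account_name,
    deployment_name="gpt-4o-datazone",
    deployment=Deployment(
        sku=Sku(name="DataZoneStandard", capacity=10),  # assumed SKU name
        properties=DeploymentProperties(
            model=DeploymentModel(format="OpenAI", name="gpt-4o", version="2024-08-06"),
        ),
    ),
).result()

print(deployment.name, deployment.sku.name)
```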

In addition, a 99.9 percent latency SLA has been introduced for token generation. This SLA guarantees customers a minimum token-generation speed, which should give developers more confidence in the performance of the service.

Also, additional options are now available for the o1-preview, o1-mini, GPT-4o and GPT-4o-mini LLMs within the Azure OpenAI portfolio.

New LLMs available

Other new features include the availability of new LLMs within the Microsoft Azure AI portfolio. The new LLMs focus primarily on healthcare and include multimodal medical vision models, such as MedImageInsight for analyzing medical images, MedImageParse for image segmentation and CXRReportGen for generating detailed structured reports.

Furthermore, LLMs from Cohere and Mistral AI have now been added, including Cohere Embed 3 and Ministral 3B. Microsoft’s own small LLMs Phi-3.5-mini and Phi-3.5-MoE also gain additional fine-tuning options.

Playground for testing and comparison

Finally, via the Azure AI model inference API, Azure AI LLMs are now also available through GitHub Marketplace. This allows developers to test and compare LLMs for free in a “playground” environment. If they want to use a particular LLM in production, they can log in with their Azure account and switch to a paid tier.
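
As a rough illustration, the sketch below calls one of these models through the Azure AI model inference client library for Python (azure-ai-inference). The GitHub Models endpoint URL and the chosen model name are assumptions for illustration; switching to a paid Azure tier would mainly mean pointing the same client at an Azure endpoint and credential instead of a GitHub token.

```python
# Minimal sketch (assumed endpoint and model name): calling a model from the
# GitHub Marketplace playground via the Azure AI model inference API.
import os

from azure.ai.inference import ChatCompletionsClient
from azure.ai.inference.models import SystemMessage, UserMessage
from azure.core.credentials import AzureKeyCredential

client = ChatCompletionsClient(
    endpoint="https://models.inference.ai.azure.com",  # assumed GitHub Models endpoint
    credential=AzureKeyCredential(os.environ["GITHUB_TOKEN"]),  # GitHub personal access token
)

response = client.complete(
    model="gpt-4o-mini",  # any model listed in the marketplace catalog
    messages=[
        SystemMessage(content="You are a helpful assistant."),
        UserMessage(content="Summarize what Azure OpenAI Data Zones are."),
    ],
)

print(response.choices[0].message.content)
```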

Also read: New tools in Microsoft Azure AI Studio keep LLMs safe