Tag: mixture of experts

Here you will find all articles tagged 'mixture of experts'.

xAI open sources details and architecture of their Grok-1 LLM

Elon Musk's AI startup xAI has finally made the base model, underlying parameters, and architecture of the Grok-1 LLM open source, as previously announced. In a short blog post, the company shares more details on parts of the network architecture of the Grok-...

1 month ago
Microsoft launches Tutel, an open-source library for AI development

Microsoft has announced Tutel, an open-source library available immediately for developing AI models and applications with a 'mixture of experts' architecture. AI models and residential buildings have one thing in common: both start with the architecture. 'Mixture of experts' (MoE) is an architectur... (A minimal sketch of the MoE idea follows this entry.)

2 years ago
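
To make the 'mixture of experts' idea from the Tutel announcement above concrete, here is a minimal, illustrative sketch of top-1 MoE routing in PyTorch. The TinyMoE class, its layer shapes, and the routing loop are assumptions for illustration only; they are not Tutel's API or Grok-1's implementation.

# Minimal, illustrative mixture-of-experts layer (hypothetical, not Tutel's API):
# a gating network scores each token, the highest-scoring expert processes it,
# and the expert's output is weighted by the gate probability.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TinyMoE(nn.Module):
    def __init__(self, dim: int, num_experts: int):
        super().__init__()
        # Each expert is a small feed-forward layer (real MoE experts are MLPs).
        self.experts = nn.ModuleList(nn.Linear(dim, dim) for _ in range(num_experts))
        # The gate maps each token to one score per expert.
        self.gate = nn.Linear(dim, num_experts)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (tokens, dim). Top-1 routing: pick one expert per token.
        probs = F.softmax(self.gate(x), dim=-1)   # (tokens, num_experts)
        top_prob, top_idx = probs.max(dim=-1)     # (tokens,)
        out = torch.zeros_like(x)
        for i, expert in enumerate(self.experts):
            mask = top_idx == i
            if mask.any():
                # Only the tokens routed to this expert run through it.
                out[mask] = top_prob[mask].unsqueeze(-1) * expert(x[mask])
        return out

moe = TinyMoE(dim=16, num_experts=4)
print(moe(torch.randn(8, 16)).shape)  # torch.Size([8, 16])

Because each token activates only one expert, per-token compute stays constant no matter how many experts the model has; that sparsity is the efficiency argument behind MoE architectures.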