#mixtral-8x7b

InfoQ
8 months ago
Artificial intelligence

Mistral AI's Open-Source Mixtral 8x7B Outperforms GPT-3.5

Mistral AI released Mixtral 8x7B, a large language model (LLM) that outperforms comparable models, including GPT-3.5, on several benchmarks.
Mixtral 8x7B is a sparse mixture of experts (SMoE) model with 46.7B total parameters, but because only a fraction of those parameters are active for any given token, it runs at the speed and cost of a much smaller dense model.
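The sparse mixture-of-experts idea behind that speed/cost claim can be illustrated with a minimal sketch: a gate scores every expert for each token, but only the top-k experts (Mixtral reportedly routes each token to 2 of 8) are actually evaluated. The expert functions, gate weights, and dimensions below are toy illustrations, not Mixtral's real architecture.

```python
# Minimal sketch of sparse mixture-of-experts (SMoE) routing.
# Only the top_k highest-scoring experts run per token, so compute
# stays close to that of a much smaller dense model.
import math

def softmax(xs):
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def smoe_forward(token, experts, gate_weights, top_k=2):
    """Route one token vector through the top_k highest-scoring experts."""
    # Gate: one score per expert (here a simple dot product with the token).
    scores = [sum(w * x for w, x in zip(wv, token)) for wv in gate_weights]
    # Select the top_k experts and renormalize their gate probabilities.
    top = sorted(range(len(experts)), key=lambda i: scores[i], reverse=True)[:top_k]
    probs = softmax([scores[i] for i in top])
    # Weighted sum of the selected experts' outputs; the remaining
    # experts are never evaluated -- that is the sparsity.
    out = [0.0] * len(token)
    for p, i in zip(probs, top):
        y = experts[i](token)
        out = [o + p * v for o, v in zip(out, y)]
    return out, top

# 8 toy "experts": each just scales the input by a different factor.
experts = [lambda x, s=s: [s * v for v in x] for s in range(1, 9)]
gate_weights = [[0.1 * i, -0.05 * i] for i in range(8)]

y, chosen = smoe_forward([1.0, 0.5], experts, gate_weights, top_k=2)
print(len(chosen))  # 2 experts evaluated out of 8
```

Here all 8 gate scores are computed (the gate is cheap), but only 2 expert functions run, which is how a 46.7B-parameter SMoE model can cost roughly as much per token as a dense model a fraction of its size.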
Ars Technica
9 months ago
Artificial intelligence

Everybody's talking about Mistral, an upstart French challenger to OpenAI

Mistral AI has developed a new AI language model called Mixtral 8x7B that reportedly matches OpenAI's GPT-3.5 in performance.
Mixtral 8x7B is a "mixture of experts" model with open weights that can process a 32K-token context window and handles multiple languages.