Mixtral 8x7B

Summary: Medium-sized legacy mixture-of-experts model from Mistral for conversational agents and virtual assistants that require higher response quality.

Intelligence: Medium

Speed: Medium

Sovereignty: Moderate

Input: Text

Output: Text

Central parameters

Description: Mixtral 8x7B is the first mixture-of-experts model trained by Mistral. It is especially suited for use cases in which response time is crucial, and it natively supports several European languages, including German, French, Spanish, Italian, and Portuguese. It was one of the first models trained on large European language corpora.

Model identifier: mistralai/Mixtral-8x7B-Instruct-v0.1

IONOS AI Model Hub Lifecycle and Alternatives

IONOS Launch: July 1, 2024

End of Life: N/A

Alternative:

Successor:

Origin

Provider: Mistral AI

Country: France

License: Apache 2.0

Flavor: Instruct

Release: December 11, 2023

Technology

Context window: 32k tokens

Parameters: 46.7B

Quantization: fp8

Multilingual: Yes

Modalities

Text: Input and output

Image: Not supported

Audio: Not supported

Endpoints

Chat Completions: v1/chat/completions

Embeddings: Not supported

Image generation: Not supported
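The chat completions endpoint follows the familiar OpenAI-style request format. The following is a minimal sketch, assuming an OpenAI-compatible IONOS AI Model Hub endpoint; the base URL and the IONOS_API_TOKEN environment variable are placeholders to be replaced with the values from your contract:

```python
import os
import requests

# Placeholder base URL; replace with the endpoint from the AI Model Hub documentation.
BASE_URL = "https://openai.inference.example.ionos.com"
TOKEN = os.environ["IONOS_API_TOKEN"]  # placeholder environment variable name

response = requests.post(
    f"{BASE_URL}/v1/chat/completions",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json={
        "model": "mistralai/Mixtral-8x7B-Instruct-v0.1",
        "messages": [
            # The model natively supports German, French, Spanish, Italian, and Portuguese.
            {"role": "user", "content": "Nenne drei Fakten über Paris."}
        ],
    },
    timeout=60,
)
response.raise_for_status()
print(response.json()["choices"][0]["message"]["content"])
```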

Features

Streaming: Supported

Tool calling: Not supported
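Because streaming is supported, partial tokens can be consumed as they are generated rather than waiting for the full response. A minimal sketch under the same assumptions as above (OpenAI-compatible API; base URL and token variable are placeholders), here using the openai Python client:

```python
import os
from openai import OpenAI

# Base URL and token variable are placeholders; see the AI Model Hub documentation.
client = OpenAI(
    base_url="https://openai.inference.example.ionos.com/v1",
    api_key=os.environ["IONOS_API_TOKEN"],
)

stream = client.chat.completions.create(
    model="mistralai/Mixtral-8x7B-Instruct-v0.1",
    messages=[{"role": "user", "content": "Summarize mixture-of-experts models in two sentences."}],
    stream=True,  # receive partial tokens as they are generated
)

for chunk in stream:
    delta = chunk.choices[0].delta.content
    if delta:
        print(delta, end="", flush=True)
print()
```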

Rate limits

Rate limits ensure fair usage and reliable access to the AI Model Hub. Beyond the contract-wide rate limits, no model-specific limits apply to this model.
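When a contract-wide limit is exceeded, rate-limited APIs conventionally respond with HTTP 429. The following client-side backoff sketch is an illustrative assumption, not part of the hub specification; the helper name is hypothetical:

```python
import time
import requests

def post_with_backoff(url, headers, payload, max_retries=5):
    """Hypothetical helper: retry a POST with exponential backoff
    when the server responds with HTTP 429 (rate limit exceeded)."""
    for attempt in range(max_retries):
        response = requests.post(url, headers=headers, json=payload, timeout=60)
        if response.status_code != 429:
            response.raise_for_status()
            return response
        time.sleep(2 ** attempt)  # wait 1s, 2s, 4s, ... before retrying
    raise RuntimeError("Rate limit still exceeded after retries")
```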
