Mixtral 8x7B
Summary: Medium-sized legacy mixture-of-experts model from Mistral for conversational agents and virtual assistants that require higher response quality.
Intelligence: Medium
Speed: Medium
Sovereignty: Moderate
Input: Text
Output: Text
Central parameters
Description: Mixtral 8x7B is the first mixture-of-experts model trained by Mistral. It is especially suited to use cases in which response time is crucial, and it natively supports several European languages, including German, French, Spanish, Italian, and Portuguese. It was one of the first models trained on large European language corpora.
Model identifier: mistralai/Mixtral-8x7B-Instruct-v0.1
IONOS AI Model Hub Lifecycle and Alternatives
IONOS Launch:
End of Life:
Alternative:
Successor:
Origin
Provider: Mistral AI
Country: France
License: Apache 2.0
Flavor: Instruct
Release: December 2023
Technology
Context window: 32k
Parameters: 46.7B
Quantization: fp8
Multilingual: Yes
Modalities
Text: Input and output
Image: Not supported
Audio: Not supported
Endpoints
Chat Completions: v1/chat/completions (see the example below)
Embeddings: Not supported
Image generation: Not supported
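Because the Chat Completions endpoint follows the familiar v1/chat/completions schema, an OpenAI-compatible client can be pointed at the AI Model Hub. The sketch below is a minimal, non-authoritative example: the base URL and the IONOS_API_TOKEN environment variable are assumptions, not values confirmed on this page; check your contract's endpoint details for the actual configuration.

```python
import os
from openai import OpenAI

# Minimal sketch of a chat completion request with this model.
# Base URL and token variable name are assumptions, not documented values.
client = OpenAI(
    api_key=os.environ["IONOS_API_TOKEN"],                     # assumed env variable
    base_url="https://openai.inference.de-txl.ionos.com/v1",   # assumed base URL
)

response = client.chat.completions.create(
    model="mistralai/Mixtral-8x7B-Instruct-v0.1",
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Summarize the idea behind mixture-of-experts models in two sentences."},
    ],
    max_tokens=256,
)
print(response.choices[0].message.content)
```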
Features
Streaming: Supported (see the example below)
Tool calling: Not supported
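Since streaming is supported, partial responses can be consumed as they are generated. A minimal sketch under the same assumptions as the previous example (OpenAI-compatible endpoint, placeholder base URL and token variable):

```python
import os
from openai import OpenAI

client = OpenAI(
    api_key=os.environ["IONOS_API_TOKEN"],                     # assumed env variable
    base_url="https://openai.inference.de-txl.ionos.com/v1",   # assumed base URL
)

# Request a streamed response and print tokens as they arrive.
stream = client.chat.completions.create(
    model="mistralai/Mixtral-8x7B-Instruct-v0.1",
    messages=[{"role": "user", "content": "Write a short greeting in French."}],
    stream=True,
)
for chunk in stream:
    delta = chunk.choices[0].delta.content
    if delta:
        print(delta, end="", flush=True)
print()
```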
Rate limits
Rate limits ensure fair usage and reliable access to the AI Model Hub. Only the contract-wide rate limits apply; there are no additional model-specific limits for this model.
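When the contract-wide rate limits are exceeded, requests are typically rejected with HTTP 429. The hypothetical helper below retries with exponential backoff; the client setup, the exact error behaviour, and all names are assumptions for illustration, not documented behaviour of the AI Model Hub.

```python
import os
import time
from openai import OpenAI, RateLimitError

client = OpenAI(
    api_key=os.environ["IONOS_API_TOKEN"],                     # assumed env variable
    base_url="https://openai.inference.de-txl.ionos.com/v1",   # assumed base URL
)

def chat_with_retry(messages, max_retries=5):
    """Call the Chat Completions endpoint, backing off when the rate limit is hit."""
    for attempt in range(max_retries):
        try:
            return client.chat.completions.create(
                model="mistralai/Mixtral-8x7B-Instruct-v0.1",
                messages=messages,
            )
        except RateLimitError:
            time.sleep(2 ** attempt)  # wait 1s, 2s, 4s, ... between retries
    raise RuntimeError("Rate limit still exceeded after retries")

reply = chat_with_retry([{"role": "user", "content": "Hello!"}])
print(reply.choices[0].message.content)
```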