Mixtral-8x22B-Instruct
This documentation applies to the following models:
mistralai/Mixtral-8x22B-Instruct-v0.1
A cutting-edge large language model designed for instruction-following tasks. Built on a Mixture of Experts (MoE) architecture, this model is optimized for efficiently processing and generating human-like text based on detailed prompts.
If you don’t have an API key for the AI/ML API yet, feel free to use our Quickstart guide.
Creates a chat completion using a language model, allowing interactive conversation by predicting the next response based on the given chat history. This is useful for AI-driven dialogue systems and virtual assistants.
Endpoint: `/v1/chat/completions`
Model: `mistralai/Mixtral-8x22B-Instruct-v0.1`
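As a minimal sketch of calling this endpoint, the snippet below builds the request URL, headers, and JSON body for a chat completion. The base URL and the Bearer-token header format are assumptions here; confirm both against the Quickstart guide before use.

```python
# Sketch of a chat completion request to the AI/ML API.
# API_BASE is an assumed value -- verify it in the Quickstart guide.
API_BASE = "https://api.aimlapi.com"
API_KEY = "YOUR_API_KEY"  # placeholder; replace with your real key

def build_chat_request(messages):
    """Build the URL, headers, and JSON body for /v1/chat/completions."""
    url = f"{API_BASE}/v1/chat/completions"
    headers = {
        "Authorization": f"Bearer {API_KEY}",  # assumed auth scheme
        "Content-Type": "application/json",
    }
    body = {
        "model": "mistralai/Mixtral-8x22B-Instruct-v0.1",
        "messages": messages,  # chat history the model continues from
    }
    return url, headers, body

url, headers, body = build_chat_request(
    [{"role": "user", "content": "Explain Mixture of Experts in one sentence."}]
)
print(url)
print(body["model"])

# To actually send the request (needs the `requests` package and a valid key):
# import requests
# resp = requests.post(url, headers=headers, json=body, timeout=60)
# print(resp.json()["choices"][0]["message"]["content"])
```

The live call is left commented out so the example runs without network access or a key; the `messages` list follows the familiar chat-completions shape of role/content pairs.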