qwen-max
This documentation is valid for the following models:
qwen-max
qwen-max-2025-01-25
A large-scale Mixture-of-Experts (MoE) language model developed by Alibaba Cloud. It excels in language understanding, generation, and performance across a wide range of tasks.

Mixture-of-Experts (MoE) architecture: uses 64 specialized "expert" networks, activating only the relevant ones for each task for efficient processing.
Extensive multilingual support: covers 29 languages, including Chinese, English, and Arabic.
Long-context optimization: supports a 32K-token context window with up to 8K tokens of generated output.
High stability: reliably follows prompt instructions, with no erroneous replies observed during extensive testing.
If you don’t have an API key for the AI/ML API yet, feel free to use our Quickstart guide.
Creates a chat completion using a language model, allowing interactive conversation by predicting the next response based on the given chat history. This is useful for AI-driven dialogue systems and virtual assistants.
POST /v1/chat/completions
Supported values for the model parameter: qwen-max, qwen-max-2025-01-25.
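A minimal request sketch using the OpenAI-compatible Python SDK. The base URL https://api.aimlapi.com/v1 and the AIML_API_KEY environment-variable name are assumptions for illustration; check the Quickstart guide for the exact values for your account.

```python
# A minimal sketch of calling qwen-max through the OpenAI-compatible SDK.
# The base URL and the environment-variable name below are assumptions;
# see the Quickstart guide for the values that apply to your account.
import os

from openai import OpenAI

client = OpenAI(
    base_url="https://api.aimlapi.com/v1",  # assumed AI/ML API base URL
    api_key=os.environ["AIML_API_KEY"],     # hypothetical env var name
)

response = client.chat.completions.create(
    model="qwen-max",  # or "qwen-max-2025-01-25"
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Summarize MoE models in one sentence."},
    ],
)

print(response.choices[0].message.content)
```

Because the endpoint accepts a standard chat history, multi-turn conversations are built by appending each assistant reply and the next user message to the messages list before the following request.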