Qwen2-72B-Instruct
Model Overview
Compared with the previous generation, Qwen1.5, this model extends its linguistic proficiency to 27 additional languages, delivers state-of-the-art results across a wide range of evaluations, and raises the supported context length to 128K tokens.
You can also view a detailed comparison of this model on our main website.
How to Make a Call
API Schema
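The endpoint accepts a JSON body in the chat completions format used in the code example below. As a minimal sketch only, here is the shape of that request body expressed as Python typed dictionaries; the field names are taken from the example, and any parameters beyond model and messages are omitted.

from typing import List, TypedDict

class Message(TypedDict):
    role: str     # e.g. "user"; "system" and "assistant" are the other common roles
    content: str  # the text of the message

class ChatCompletionRequest(TypedDict):
    model: str               # e.g. "Qwen/Qwen2-72B-Instruct"
    messages: List[Message]  # conversation history, oldest message first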
Code Example
import requests
import json  # for printing the response as nicely indented JSON

response = requests.post(
    "https://api.aimlapi.com/v1/chat/completions",
    headers={
        # Insert your AIML API Key instead of <YOUR_AIMLAPI_KEY>:
        "Authorization": "Bearer <YOUR_AIMLAPI_KEY>",
        "Content-Type": "application/json",
    },
    json={
        "model": "Qwen/Qwen2-72B-Instruct",
        "messages": [
            {
                "role": "user",
                "content": "Hello",  # insert your prompt here, instead of Hello
            }
        ],
    },
)

data = response.json()
print(json.dumps(data, indent=2, ensure_ascii=False))
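If the call succeeds, the generated text can usually be read from the first entry of the response. As a small sketch extending the example above, assuming the response follows the common chat completions layout with a choices list containing a message object:

# Assuming an OpenAI-compatible response shape; inspect the printed JSON
# above if your response structure differs.
reply = data["choices"][0]["message"]["content"]
print(reply)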