magnum-v4
Model Overview
An LLM fine-tuned on top of Qwen2.5, designed to replicate the prose quality of the Claude 3 models, particularly Sonnet and Opus. It excels at generating coherent and contextually rich text.
How to Make a Call
API Schema
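The model is served through the OpenAI-compatible chat completions endpoint at https://api.aimlapi.com/v1/chat/completions. As shown in the example below, a request supplies at minimum the model identifier (anthracite-org/magnum-v4-72b) and a messages array; the example also passes enable_thinking set to false.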
Code Example
import requests
import json  # used only to pretty-print the structured response with indentation

response = requests.post(
    "https://api.aimlapi.com/v1/chat/completions",
    headers={
        # Insert your AIML API Key instead of <YOUR_AIMLAPI_KEY>:
        "Authorization": "Bearer <YOUR_AIMLAPI_KEY>",
        "Content-Type": "application/json",
    },
    json={
        "model": "anthracite-org/magnum-v4-72b",
        "messages": [
            {
                "role": "user",
                "content": "Hello",  # insert your prompt here instead of "Hello"
            }
        ],
        "enable_thinking": False,
    },
)

data = response.json()
print(json.dumps(data, indent=2, ensure_ascii=False))
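To work with the generated text directly rather than the full JSON payload, you can read it from the first choice. This is a minimal sketch, assuming the standard OpenAI-compatible response layout (choices[0].message.content); adjust the keys if your response differs.

# Assumption: the response follows the OpenAI-compatible chat completions shape.
reply = data["choices"][0]["message"]["content"]
print(reply)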