Llama-3-chat-hf
Model Overview
This model is optimized for dialogue use cases and outperforms many existing open-source chat models on common industry benchmarks.
You can also view a detailed comparison of this model on our main website.
How to Make a Call
API Schema
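The model is called through the chat completions endpoint at https://api.aimlapi.com/v1/chat/completions via an HTTP POST request, with your API key sent as a Bearer token in the Authorization header and the request parameters sent as a JSON body. Below is a minimal sketch of that body, inferred from the code example that follows; only model and messages appear on this page, and any other parameters (such as max_tokens or temperature) are assumptions rather than documented fields.

# Minimal request body, inferred from the code example below.
# Only "model" and "messages" appear on this page; other fields
# (e.g. "max_tokens", "temperature") are assumptions.
request_body = {
    "model": "meta-llama/Llama-3-70b-chat-hf",   # model ID used on this page
    "messages": [                                # conversation turns
        {"role": "user", "content": "Hello"}     # each turn has a role and content
    ],
}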
Code Example
import requests
import json  # for getting a structured output with indentation

response = requests.post(
    "https://api.aimlapi.com/v1/chat/completions",
    headers={
        # Insert your AIML API Key instead of <YOUR_AIMLAPI_KEY>:
        "Authorization": "Bearer <YOUR_AIMLAPI_KEY>",
        "Content-Type": "application/json",
    },
    json={
        "model": "meta-llama/Llama-3-70b-chat-hf",
        "messages": [
            {
                "role": "user",
                "content": "Hello"  # insert your prompt here, instead of Hello
            }
        ],
    },
)

data = response.json()
print(json.dumps(data, indent=2, ensure_ascii=False))
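The script above prints the full JSON response. Assuming the endpoint returns an OpenAI-compatible chat completion object (a choices list whose first entry holds the assistant message), the generated text can be pulled out as shown in this sketch; the exact field names are an assumption and should be checked against the payload printed above.

# Assumption: OpenAI-compatible response shape with a "choices" list.
# Verify the field names against the JSON printed above.
reply = data["choices"][0]["message"]["content"]
print(reply)  # the model's answer to the "Hello" prompt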