Claude 3 tool use is priced in the following manner. The API call is priced exactly the same as a standard API call, but additional "system prompt tokens" are added on top:
- Claude 3 Opus: 395 tokens
- Claude 3 Sonnet: 159 tokens
- Claude 3 Haiku: 264 tokens

All API calls consume extra tokens when tools are used. These extra tokens include the tool parameters and the tool content blocks.
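As a quick sanity check on the pricing, the fixed overhead can be folded into a token estimate like this. A minimal sketch: the Opus and Sonnet model-ID strings are my assumptions, and the figure excludes the variable tokens consumed by tool definitions and tool content blocks.

```python
# Fixed system-prompt token overhead per model when tools are enabled,
# taken from the list above.
TOOL_PROMPT_TOKENS = {
    "claude-3-opus-20240229": 395,
    "claude-3-sonnet-20240229": 159,
    "claude-3-haiku-20240307": 264,
}

def tool_input_tokens(model: str, base_input_tokens: int) -> int:
    """Approximate input tokens billed for a tool-enabled call:
    the normal input tokens plus the fixed tool system-prompt overhead."""
    return base_input_tokens + TOOL_PROMPT_TOKENS[model]

print(tool_input_tokens("claude-3-haiku-20240307", 100))  # 364
```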
In non-tool use cases, Claude 3 already offers equal and arguably better performance than the OpenAI equivalents in terms of response quality, speed and cost, for both LLMs and VLMs. Based on today's general release, Claude 3 matches GPT-4 in tool-use performance as well.
So, let's get started.
Let's import the libraries:
#!pip install anthropic  # first time only
import anthropic
import os
import base64
import httpx
Import the API key from an environment variable and initialize the client object.
key = "anthropic_key"  # name of the environment variable holding the API key
client = anthropic.Anthropic(api_key=os.getenv(key))
We are now ready to make API calls.
response = client.beta.tools.messages.create(
    model="claude-3-haiku-20240307",
    max_tokens=1024,
    tools=[
        {
            "name": "get_weather",
            "description": "Get the current weather in a given location",
            "input_schema": {
                "type": "object",
                "properties": {
                    "location": {
                        "type": "string",
                        "description": "The city and state, e.g. San Francisco, CA"
                    },
                    "unit": {
                        "type": "string",
                        "enum": ["celsius", "fahrenheit"],
                        "description": "The unit of temperature, either 'celsius' or 'fahrenheit'"
                    }
                },
                "required": ["location"]
            }
        }
    ],
    messages=[{"role": "user", "content": "What is the weather like in San Francisco?"}]
)
I can now read the API's initial response.
print(response)
The initial response includes "stop_reason": "tool_use", which indicates the API is waiting to receive the tool result back from the client side.
In essence, so far we have received the user request, and the Claude API has converted it into a response that declares a need to use a tool. So, let's define a tool response, the kind we would receive back from a weather-tool API:
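To pull the "tool_use" block (and its id) out of the response programmatically, something like the following works. This is a sketch against a mocked response object: the exact block attributes are assumptions based on the printed output above.

```python
from types import SimpleNamespace

def extract_tool_use(content_blocks):
    """Return the first tool_use content block, or None if the model answered directly."""
    return next((b for b in content_blocks if b.type == "tool_use"), None)

# Mocked content blocks standing in for response.content from the call above.
mock_content = [
    SimpleNamespace(type="text", text="I need to look up the weather."),
    SimpleNamespace(type="tool_use", id="toolu_0123", name="get_weather",
                    input={"location": "San Francisco, CA"}),
]

tool_use = extract_tool_use(mock_content)
print(tool_use.id, tool_use.name, tool_use.input)
```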
{
    "role": "user",
    "content": [
        {
            "type": "tool_result",
            "tool_use_id": "toolu_xxxxxxxxxxxxxxxx",
            "content": "65 degrees"
        }
    ]
}
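Rather than hand-writing that dictionary each time, a small helper can assemble the tool_result message. This helper is hypothetical (not part of the anthropic library), but the dictionary it returns mirrors the structure above.

```python
def make_tool_result(tool_use_id: str, result_text: str) -> dict:
    """Build the user-role message that carries a tool result back to Claude."""
    return {
        "role": "user",
        "content": [
            {
                "type": "tool_result",
                "tool_use_id": tool_use_id,
                "content": result_text,
            }
        ],
    }

print(make_tool_result("toolu_xxxxxxxxxxxxxxxx", "65 degrees"))
```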
So, I can now send the weather tool-API result back to the Claude API, so it can generate a response back to the end user. The API call is the same, except we add two extra "messages" entries:
- the Claude API's previous response, with the "assistant" role
- the weather-tool API result, with the "user" role and the "tool_use_id" from the previous Claude API call.

So, the final API call looks like the following:
response_final = client.beta.tools.messages.create(
    model="claude-3-haiku-20240307",
    max_tokens=1024,
    tools=[
        {
            "name": "get_weather",
            "description": "Get the current weather in a given location",
            "input_schema": {
                "type": "object",
                "properties": {
                    "location": {
                        "type": "string",
                        "description": "The city and state, e.g. San Francisco, CA"
                    },
                    "unit": {
                        "type": "string",
                        "enum": ["celsius", "fahrenheit"],
                        "description": "The unit of temperature, either 'celsius' or 'fahrenheit'"
                    }
                },
                "required": ["location"]
            }
        }
    ],
    messages=[
        {"role": "user", "content": "What is the weather like in San Francisco?"},
        {"role": "assistant", "content": response.content},
        {
            "role": "user",
            "content": [
                {
                    "type": "tool_result",
                    "tool_use_id": "ADD_HERE_TOOL_USE_ID_FROM_PRIOR_API_CALL",
                    "content": [{"type": "text", "text": "65 degrees"}]
                }
            ]
        }
    ]
)
The result is the final response:
print(response_final.content[0].text)
We have now responded to the end user.
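In a real application, the manual step above would be automated with a small dispatcher that maps the tool name Claude announces to a local Python function. A minimal sketch, where get_weather is a hypothetical stand-in for a real weather service:

```python
def get_weather(location: str, unit: str = "fahrenheit") -> str:
    # Hypothetical stand-in for a call to a real weather service.
    return "65 degrees"

# Registry mapping tool names (as declared in the tools= schema) to functions.
TOOL_REGISTRY = {"get_weather": get_weather}

def run_tool(name: str, tool_input: dict) -> str:
    """Execute the tool Claude requested and return its result as a string."""
    return TOOL_REGISTRY[name](**tool_input)

print(run_tool("get_weather", {"location": "San Francisco, CA"}))  # 65 degrees
```

The string this returns is exactly what goes into the "content" field of the tool_result message.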