Tools extend Prompts with access to external data sources and enable them to take action.


Function calling with LLMs

The most capable Large Language Models (LLMs), including models from OpenAI and Anthropic, support function calling. You can provide these models with definitions of the available tools. The model then decides whether to call a tool, and which parameters to use.

Tools and their schemas can be easily managed and version controlled on Humanloop. This is especially valuable when iterating on tool definitions in the Humanloop Editor, as you can make changes to your schema and directly see how the changes impact the model’s output.

{
  "name": "get_current_weather",
  "description": "Get the current weather in a given location",
  "parameters": {
    "type": "object",
    "properties": {
      "location": {
        "type": "string",
        "name": "Location",
        "description": "The city and state, e.g. San Francisco, CA"
      },
      "unit": {
        "type": "string",
        "name": "Unit",
        "enum": ["celsius", "fahrenheit"]
      }
    },
    "required": ["location"]
  }
}
A Tool definition for getting current weather information.
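As a rough illustration of how a definition like this is used, here is a minimal Python sketch of the function-calling loop on the application side: the schema is sent along with the request, and when the model responds with a tool call, the application runs the matching function and returns the result. The `"type": "function"` wrapper shape and the tool-call payload are assumptions based on OpenAI-style chat APIs, and the weather function is a stand-in.

```python
import json

# The schema above, wrapped in the "function" tool format used by
# OpenAI-style chat APIs (the exact wrapper shape is an assumption here).
WEATHER_TOOL = {
    "type": "function",
    "function": {
        "name": "get_current_weather",
        "description": "Get the current weather in a given location",
        "parameters": {
            "type": "object",
            "properties": {
                "location": {
                    "type": "string",
                    "description": "The city and state, e.g. San Francisco, CA",
                },
                "unit": {"type": "string", "enum": ["celsius", "fahrenheit"]},
            },
            "required": ["location"],
        },
    },
}


def get_current_weather(location: str, unit: str = "celsius") -> dict:
    """Stand-in implementation; a real version would call a weather API."""
    return {"location": location, "temperature": 21, "unit": unit}


def dispatch_tool_call(name: str, arguments: str) -> str:
    """Run the function the model asked for and return its result as JSON,
    ready to be sent back to the model as a tool message."""
    handlers = {"get_current_weather": get_current_weather}
    result = handlers[name](**json.loads(arguments))
    return json.dumps(result)


# If the model decides to call the tool, its response carries the function
# name plus JSON-encoded arguments, which the application dispatches:
reply = dispatch_tool_call(
    "get_current_weather", '{"location": "San Francisco, CA"}'
)
```

In a real loop, `reply` would be appended to the conversation as a tool result and the model called again to produce the final answer.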

Integrations

Humanloop also offers pre-built tools with integrations to popular services like Google Search and Pinecone. Tools using these integrations can run automatically on Humanloop, with their results appearing in both the UI and API responses.

Some Tools can be called directly from within prompt templates. In that case, the tool’s output is automatically inserted into the prompt before it’s sent to the model. This makes it easy to include dynamic information, such as search results or database queries.
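Conceptually, that pre-send insertion works like simple template interpolation: the tool runs first, and its output replaces a placeholder in the prompt text. The sketch below uses a generic `{{ ... }}` placeholder syntax purely for illustration; it is not Humanloop's actual template syntax, and the tool name and results are made up.

```python
import re


def render_template(template: str, tool_results: dict) -> str:
    """Replace {{ tool_name }} placeholders with pre-computed tool output.
    The {{ ... }} syntax is illustrative, not Humanloop's real syntax."""
    def substitute(match: re.Match) -> str:
        return tool_results[match.group(1)]

    return re.sub(r"\{\{\s*(\w+)\s*\}\}", substitute, template)


# A hypothetical search tool has already run; its result is spliced into
# the prompt before the prompt is sent to the model.
prompt = render_template(
    "Answer using these search results:\n{{ google_search }}\n\nQuestion: ...",
    {"google_search": "1. Humanloop docs - Tools overview"},
)
```

The model then receives only the rendered prompt, with the dynamic data already in place.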