Tools
Tools on Humanloop extend Prompts with access to external data sources and enable them to take action.
Function calling with LLMs
The most capable Large Language Models (LLMs), including models from OpenAI and Anthropic, support function calling. You provide these models with definitions of the available tools, and the model decides whether to call a tool and which arguments to pass.
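For example, here is a minimal sketch of what a tool definition looks like when calling a model directly with the OpenAI Python SDK. The `get_weather` tool and its schema are purely illustrative assumptions; on Humanloop these definitions are managed for you, as described below.

```python
from openai import OpenAI

client = OpenAI()

# Hypothetical tool definition: a JSON Schema describing the function's parameters.
tools = [
    {
        "type": "function",
        "function": {
            "name": "get_weather",
            "description": "Look up the current weather for a city.",
            "parameters": {
                "type": "object",
                "properties": {
                    "city": {"type": "string", "description": "City name, e.g. 'Paris'"},
                },
                "required": ["city"],
            },
        },
    }
]

response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "What's the weather in Paris?"}],
    tools=tools,
)

# If the model chose to call a tool, it returns the tool name and JSON arguments.
tool_calls = response.choices[0].message.tool_calls
if tool_calls:
    print(tool_calls[0].function.name)       # e.g. "get_weather"
    print(tool_calls[0].function.arguments)  # JSON string, e.g. '{"city": "Paris"}'
```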
Tools and their schemas can be managed and version-controlled on Humanloop. This is especially valuable when iterating on tool definitions in the Humanloop Editor, as you can edit a schema and immediately see how the change affects the model's output.
Integrations
Humanloop also offers pre-built tools with integrations to popular services like Google Search and Pinecone. Tools using these integrations can run automatically on Humanloop, with their results appearing in both the UI and API responses.
Some Tools can be called directly from within prompt templates. In that case, the tool’s output is automatically inserted into the prompt before it’s sent to the model. This makes it easy to include dynamic information, such as search results or database queries.
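To make the idea concrete, the sketch below shows what this interpolation amounts to conceptually: a tool's output is fetched first and substituted into the prompt text before the model is called. The `search_web` helper and the template are hypothetical placeholders, not Humanloop's actual template syntax; Humanloop performs this step for you automatically.

```python
def search_web(query: str) -> str:
    """Hypothetical stand-in for an integration such as Google Search."""
    return "(example search results would appear here)"

template = (
    "Answer the question using the search results below.\n\n"
    "Search results:\n{search_results}\n\n"
    "Question: {question}"
)

question = "What is retrieval-augmented generation?"

# The tool runs first and its output is inserted into the prompt
# before the prompt is sent to the model.
prompt = template.format(
    search_results=search_web(question),
    question=question,
)
```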