Tool Calling with the SDK
In this guide, we will demonstrate how to take advantage of OpenAI function calling using our Python SDK.
The Humanloop SDK provides an easy way for you to integrate the functionality of OpenAI function calling, which we refer to as JSON Schema tools, into your existing projects. Tools follow the same universal JSON Schema syntax as OpenAI function calling. In this guide, we’ll walk you through the process of using tools with the Humanloop SDK via the chat endpoint.
Creating a Tool
Prerequisites
- A Humanloop account - you can create one by going to our sign up page.
- Python installed - you can download and install Python by following the steps on the Python download page.
Using other model providers
This guide assumes you’re using OpenAI with the gpt-4 model. Only specific models from OpenAI are supported for function calling.
Install and initialize the SDK
Import the Humanloop SDK: If you haven’t done so already, you’ll need to install and import the Humanloop SDK into your Python environment. You can do this using pip:
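For example, assuming pip points at the Python environment you’re working in:

```shell
pip install "humanloop==0.5.18"
```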
Note: this guide was built with Humanloop==0.5.18.
Then import the SDK in your script:
Define the tool: Define a tool using the universal JSON Schema syntax. Let’s assume we’ve defined a get_current_weather tool, which returns the current weather for a specified location. We pass it to the chat endpoint via a "tools": tools field. We’ve also defined a dummy get_current_weather method at the top; this can be replaced by your own function to fetch real values. For now, we’re hardcoding it to return a random temperature and cloudy for this example.
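A sketch of this step. The project name "Weather Bot" is hypothetical, and the chat call’s parameter names (project, model_config) are assumptions based on the 0.5.x SDK; the network call is guarded so it only runs when an API key is configured.

```python
import os
import random


def get_current_weather(location, unit="celsius"):
    # Dummy implementation: returns a random temperature and "cloudy".
    # Replace this with your own function that fetches real values.
    return {"location": location, "temperature": random.randint(10, 30),
            "condition": "cloudy", "unit": unit}


# Tool definition in the universal JSON Schema syntax.
tools = [
    {
        "name": "get_current_weather",
        "description": "Get the current weather in a given location",
        "parameters": {
            "type": "object",
            "properties": {
                "location": {
                    "type": "string",
                    "description": "The city and state, e.g. Boston, MA",
                },
                "unit": {"type": "string", "enum": ["celsius", "fahrenheit"]},
            },
            "required": ["location"],
        },
    }
]

messages = [{"role": "user", "content": "What is the weather in Boston?"}]

if os.environ.get("HUMANLOOP_API_KEY"):
    from humanloop import Humanloop

    humanloop = Humanloop(api_key=os.environ["HUMANLOOP_API_KEY"])
    response = humanloop.chat(
        project="Weather Bot",  # hypothetical project name
        model_config={"model": "gpt-4", "tools": tools},
        messages=messages,
    )
    # TODO - Add assistant handling logic
```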
Check assistant response
The code above makes the call to OpenAI with the tool, but it does nothing to handle the assistant response. When the model decides to call a tool, the response will contain a tool_calls field. Fetch that value and pass it to your own function; an example of this can be seen below. Replace the TODO - Add assistant handling logic in your code from above with the following. The latest OpenAI models gpt-4-1106-preview and gpt-3.5-turbo-1106 can return multiple tool calls, so below we loop through the tool_calls and populate the response accordingly.
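A sketch of that handling logic. The nested shape of each tool call (a function name plus JSON-encoded arguments) follows the OpenAI chat format; exactly how your SDK version surfaces the assistant message is an assumption here.

```python
import json
import random


def get_current_weather(location, unit="celsius"):
    # Dummy implementation; replace with a real weather lookup.
    return {"location": location, "temperature": random.randint(10, 30),
            "condition": "cloudy", "unit": unit}


# Map tool names returned by the model onto local functions.
AVAILABLE_TOOLS = {"get_current_weather": get_current_weather}


def handle_tool_calls(assistant_message):
    """Execute every tool call in the assistant message and collect the results."""
    results = []
    for tool_call in assistant_message.get("tool_calls", []):
        name = tool_call["function"]["name"]
        arguments = json.loads(tool_call["function"]["arguments"])
        output = AVAILABLE_TOOLS[name](**arguments)
        results.append(
            {"id": tool_call["id"], "name": name, "content": json.dumps(output)}
        )
    return results
```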
Return the tool response
We can then return the tool response to OpenAI. This is done by formatting the tool call into an assistant message, followed by a tool message containing the function name and the function response, as seen below.
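A sketch of that formatting step, assuming the OpenAI chat message shapes: an assistant message carrying the tool_calls, then one tool message per result with tool_call_id, name, and content fields.

```python
def build_followup_messages(messages, assistant_message, tool_results):
    """Append the assistant tool-call message plus one `tool` message per result.

    `tool_results` is a list of dicts with `id`, `name` and `content` keys,
    matching the output of the handling step described above.
    """
    followup = list(messages)  # copy so the original list is untouched
    followup.append(
        {
            "role": "assistant",
            "content": None,
            "tool_calls": assistant_message["tool_calls"],
        }
    )
    for result in tool_results:
        followup.append(
            {
                "role": "tool",
                "tool_call_id": result["id"],
                "name": result["name"],
                "content": result["content"],
            }
        )
    return followup
```

The follow-up list can then be sent back through the chat endpoint so the model can produce its final answer.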
Review assistant response
The assistant should respond with a message that incorporates the parameters you provided, for example: The current weather in Boston is 22 degrees and cloudy.
The above can be run by adding the Python handling logic at the bottom of your file:
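For example (main here is hypothetical: the chat-plus-tool-handling flow you assembled in the steps above; the snippet only invokes it when an API key is configured):

```python
import os


def configured() -> bool:
    """True when the credentials this example needs are present."""
    return bool(os.environ.get("HUMANLOOP_API_KEY"))


if __name__ == "__main__":
    if configured():
        main()  # hypothetical: the chat + tool-handling flow assembled above
    else:
        print("Set HUMANLOOP_API_KEY to run this example.")
```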
The full code from this example can be seen below:
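A consolidated sketch of the full example. The Humanloop chat parameters and response shape are assumptions based on the 0.5.x SDK (the assistant message is treated here as a dict found under response.data[0]), and the project name "Weather Bot" is hypothetical; the network calls only run when HUMANLOOP_API_KEY is set.

```python
import json
import os
import random


def get_current_weather(location, unit="celsius"):
    # Dummy implementation; replace with a real weather lookup.
    return {"location": location, "temperature": random.randint(10, 30),
            "condition": "cloudy", "unit": unit}


AVAILABLE_TOOLS = {"get_current_weather": get_current_weather}

tools = [
    {
        "name": "get_current_weather",
        "description": "Get the current weather in a given location",
        "parameters": {
            "type": "object",
            "properties": {
                "location": {
                    "type": "string",
                    "description": "The city and state, e.g. Boston, MA",
                },
                "unit": {"type": "string", "enum": ["celsius", "fahrenheit"]},
            },
            "required": ["location"],
        },
    }
]


def handle_tool_calls(assistant_message):
    """Execute every tool call in the assistant message and collect the results."""
    results = []
    for tool_call in assistant_message.get("tool_calls", []):
        name = tool_call["function"]["name"]
        arguments = json.loads(tool_call["function"]["arguments"])
        results.append({"id": tool_call["id"], "name": name,
                        "content": json.dumps(AVAILABLE_TOOLS[name](**arguments))})
    return results


def main():
    from humanloop import Humanloop  # chat parameter names below are assumptions

    humanloop = Humanloop(api_key=os.environ["HUMANLOOP_API_KEY"])
    messages = [{"role": "user", "content": "What is the weather in Boston?"}]
    response = humanloop.chat(
        project="Weather Bot",  # hypothetical project name
        model_config={"model": "gpt-4", "tools": tools},
        messages=messages,
    )
    assistant_message = response.data[0]  # response shape is an assumption
    tool_results = handle_tool_calls(assistant_message)
    if tool_results:
        messages.append({"role": "assistant", "content": None,
                         "tool_calls": assistant_message["tool_calls"]})
        for result in tool_results:
            messages.append({"role": "tool", "tool_call_id": result["id"],
                             "name": result["name"], "content": result["content"]})
        response = humanloop.chat(
            project="Weather Bot",
            model_config={"model": "gpt-4", "tools": tools},
            messages=messages,
        )
    print(response)


if __name__ == "__main__" and os.environ.get("HUMANLOOP_API_KEY"):
    main()
```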