Call a Prompt
A guide on how to call your Prompts that are managed on Humanloop.
This guide will show you how to call your Prompts as an API, enabling you to generate responses from the large language model using the versioned template and parameters. If you want to call an LLM with a prompt that you're defining in code, follow the guide on Calling an LLM through the Humanloop Proxy instead.
Call an existing Prompt
Prerequisites
Before you can use the `prompt.call()` method, you need to have a Prompt. If you don't have one, please follow our Prompt creation guide first.
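As a minimal sketch, calling an existing Prompt with the Humanloop Python SDK might look like the following. The Prompt path `"my-project/support-agent"`, the example question, and the `HUMANLOOP_API_KEY` environment variable are all assumptions for illustration; substitute your own Prompt's path and credentials.

```python
import os

# The chat messages to send -- a placeholder user question for illustration.
messages = [{"role": "user", "content": "How do I reset my password?"}]

# Only call the API when credentials are available; requires `pip install humanloop`.
if os.environ.get("HUMANLOOP_API_KEY"):
    from humanloop import Humanloop

    client = Humanloop(api_key=os.environ["HUMANLOOP_API_KEY"])

    # Call an existing Prompt by its path (the path below is a placeholder).
    # Humanloop applies the Prompt's stored template and model parameters.
    response = client.prompts.call(
        path="my-project/support-agent",
        messages=messages,
    )
    print(response)
```

Because the Prompt is versioned on Humanloop, the template and model configuration can be updated in the UI without changing this calling code.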
Call the LLM with a prompt that you’re defining in code
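A sketch of this pattern is shown below, again assuming the Humanloop Python SDK: the prompt definition (model name, template) is supplied inline at call time rather than read from a stored version. The path, model, and template contents are placeholders.

```python
import os

# Inline prompt definition -- model and template are illustrative placeholders.
prompt_definition = {
    "model": "gpt-4o",
    "template": [
        {"role": "system", "content": "You are a helpful support assistant."}
    ],
}

messages = [{"role": "user", "content": "How do I reset my password?"}]

# Only call the API when credentials are available; requires `pip install humanloop`.
if os.environ.get("HUMANLOOP_API_KEY"):
    from humanloop import Humanloop

    client = Humanloop(api_key=os.environ["HUMANLOOP_API_KEY"])

    # Passing a prompt definition in code (hypothetical path shown);
    # Humanloop will version this definition under the given path.
    response = client.prompts.call(
        path="my-project/inline-example",
        prompt=prompt_definition,
        messages=messages,
    )
    print(response)
```

Defining the prompt in code is useful when the template lives in your repository and you still want Humanloop to log and version each call.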
Response
🎉 Now that you have chat messages flowing through your Prompt, you can start logging your end users' feedback to evaluate and improve your models.