Prompts define a task for a Large Language Model.

A Prompt on Humanloop defines the instructions and configuration for guiding a Large Language Model (LLM) to perform a specific task.

Each change in any of the following properties creates a new Version of the Prompt:

  • the template, such as Write a song about {{topic}}.
    For chat models, the template contains an array of messages.
  • the model, e.g. gpt-4o
  • the parameters to the model, such as temperature, max_tokens, top_p
  • any tools available to the model

A Prompt is callable: supply the required inputs and it returns a response from the model.

Inputs are defined in the template using double-curly-bracket syntax, e.g. {{topic}}. You supply the values for these variables when you call the Prompt to create a generation.
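As an illustration, interpolating the double-curly-bracket syntax can be sketched in a few lines of Python. The `render` helper below is hypothetical, for illustration only, not part of Humanloop:

```python
import re

def render(template: str, inputs: dict) -> str:
    """Substitute {{variable}} placeholders with values from `inputs`.

    Raises KeyError if the template references a variable that was
    not supplied, mirroring a required-input check.
    """
    def replace(match: re.Match) -> str:
        name = match.group(1).strip()
        return str(inputs[name])

    return re.sub(r"\{\{(.*?)\}\}", replace, template)

print(render("Write a song about {{topic}}", {"topic": "the sea"}))
# → Write a song about the sea
```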

This separation of concerns, keeping configuration separate from query-time data, is crucial: it lets you experiment with different configurations and evaluate any changes. The Prompt stores the configuration, and each call's query-time data is captured in Logs, which can then be used to create Datasets for evaluation.

Note that we use a capitalized “Prompt” to refer to the entity in Humanloop, and a lowercase “prompt” to refer to the general concept of input to the model.

```
---
model: gpt-4o
temperature: 1.0
max_tokens: -1
provider: openai
endpoint: chat
---
<system>
  Write a song about {{topic}}
</system>
```
An example Prompt, serialized in the .prompt file format

Versioning

Versioning your Prompts enables you to track how adjustments to the template or parameters influence the model’s responses. This is crucial for iterative development, as you can pinpoint which configuration produces the most relevant or accurate outputs for your use cases.

A Prompt File will have multiple Versions as you iterate on different models, templates, or parameters, but each version should perform the same task and generally be interchangeable with one another.

When to create a new Prompt File

You should create a new Prompt File for each different ‘task to be done’ with an LLM. Each of these tasks can have its own separate Prompt File: Writing Copilot, Personal Assistant, Summarizer, etc.

Many users find value in creating a ‘playground’ Prompt where they can freely experiment without risking damage to their other Prompts or creating disorder.

Using Prompts

Prompts are callable as an API, allowing you to provide query-time data such as input values or user messages, and receive the model’s text output in response.

Prompts can also be used without proxying through Humanloop to the model provider. Instead, you can call the model directly and explicitly log the results to your Prompt.
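For instance, when calling the provider directly, you might assemble a log record like the one below before sending it to Humanloop's logging API. The field names here are illustrative assumptions, not the exact API schema; consult the Humanloop API reference for the real one:

```python
import datetime

def build_prompt_log(path: str, inputs: dict, output: str, model: str) -> dict:
    """Assemble an illustrative log record for a direct model call.

    The keys below are assumptions for this sketch, not the
    actual Humanloop logging schema.
    """
    return {
        "path": path,        # which Prompt this log belongs to
        "inputs": inputs,    # query-time template variables
        "output": output,    # the model's raw text response
        "model": model,      # model used for the direct call
        "created_at": datetime.datetime.now(datetime.timezone.utc).isoformat(),
    }

log = build_prompt_log(
    path="Personal Assistant",
    inputs={"topic": "the sea"},
    output="The sea is vast and blue...",
    model="gpt-4o",
)
```

Keeping the record structured this way means the explicitly logged call carries the same inputs-plus-output shape as a proxied one, so both can feed the same Datasets.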

Serialization

The .prompt file format is a serialized representation of a Prompt Version, designed to be human-readable and suitable for integration into version control systems alongside code.

The format is heavily inspired by MDX, with model and parameters specified in a YAML header alongside a JSX-inspired syntax for chat templates.

```
---
model: gpt-4o
temperature: 1.0
max_tokens: -1
provider: openai
endpoint: chat
---
<system>
  You are a friendly assistant.
</system>
```
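To make the structure concrete, a minimal parser for this layout might look as follows. This is a simplified sketch that handles only flat `key: value` header pairs between the `---` delimiters; real .prompt files may use richer YAML and chat-template syntax:

```python
def parse_prompt_file(text: str) -> tuple[dict, str]:
    """Split a .prompt file into (header, template).

    Simplified sketch: assumes a flat `key: value` header between
    `---` delimiters, followed by the chat template.
    """
    _, header_block, template = text.split("---", 2)
    header = {}
    for line in header_block.strip().splitlines():
        key, _, value = line.partition(":")
        header[key.strip()] = value.strip()
    return header, template.strip()

example = """---
model: gpt-4o
temperature: 1.0
provider: openai
---
<system>
  You are a friendly assistant.
</system>"""

header, template = parse_prompt_file(example)
print(header["model"])  # → gpt-4o
```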