Prompt Files

By Conor Kelly, Growth

Managing prompts effectively is key to building reliable and scalable applications. As language models become more widely used, the need for a structured, version-controlled approach to storing and managing prompts has never been greater.


This is where prompt files come in - a serialized format designed to make prompt management easier, more consistent, and fully integrated into your development workflow.

What is a Prompt File?

A prompt file is a structured, serialized format for defining and managing prompts used in AI systems, particularly those powered by large language models (LLMs). Designed to be human-readable and easy to integrate into version control systems like Git, prompt files allow technical teams to treat prompts as first-class artifacts in their development workflow.

The format is heavily inspired by MDX, combining YAML headers for configuration with JSX-like syntax for defining chat templates, multi-modal interactions, and tool integrations. This approach makes it simple to manage everything from basic text-based prompts to complex workflows involving images and external tools.

Why Do You Need a Prompt File?

Prompt files let developers store prompts alongside their source code in a human-readable format that integrates seamlessly with version control systems like Git. This means prompts remain consistent across environments and are easily accessible for collaboration, debugging, and iteration.

How Does a Prompt File Work?

A prompt file works by combining structured configuration with dynamic templating to create reproducible, version-controlled interactions with large language models (LLMs). At its core, the format uses a YAML header for model settings and a JSX-inspired syntax to define:

  • Chat templates
  • Multi-modal inputs
  • Tool integrations

Prompt File Format

The prompt file format is a structured approach to managing AI prompts, combining YAML configurations with JSX-inspired templating.

A prompt file has two main sections:

  1. YAML header: Configuration for model parameters and tools
  2. Body: Chat templates using XML-like syntax
---  
model: gpt-4o  
temperature: 0.7  
max_tokens: -1  
provider: openai  
endpoint: chat  
---  
<system>  
  You are a friendly assistant.  
</system>  
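
As a rough sketch of how these two sections can be consumed programmatically, the following Python snippet splits a prompt file into its YAML-style header and chat-template body. This is an illustrative parser written for this article, not an official library, and it only handles the flat key-value header and simple `<role>` tags shown above:

```python
import re

PROMPT_FILE = """\
---
model: gpt-4o
temperature: 0.7
max_tokens: -1
provider: openai
endpoint: chat
---
<system>
  You are a friendly assistant.
</system>
"""

def parse_prompt_file(text):
    # Split the YAML-style header from the template body on the "---" fences.
    _, header, body = re.split(r"^---\s*$", text, maxsplit=2, flags=re.MULTILINE)
    config = {}
    for line in header.strip().splitlines():
        key, _, value = line.partition(":")
        config[key.strip()] = value.strip()
    # Pull out each <role>...</role> block as a chat message.
    messages = [
        {"role": role, "content": content.strip()}
        for role, content in re.findall(r"<(\w+)>(.*?)</\1>", body, flags=re.DOTALL)
    ]
    return config, messages

config, messages = parse_prompt_file(PROMPT_FILE)
print(config["model"])      # gpt-4o
print(messages[0]["role"])  # system
```

A real implementation would use a proper YAML parser for the header (to handle nested values such as tool definitions), but the two-part structure is exactly what the format above shows.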

Multi-Modality and Images

Prompt files natively support multi-modal interactions through structured XML-like tags. Developers can combine text and images in a single prompt while maintaining strict schema validation:

---  
model: gpt-4o  
temperature: 0.7  
max_tokens: -1  
provider: openai  
endpoint: chat  
tools: []  
---  
<system>  
  You are a friendly assistant.  
</system>

<user>  
  <text>  
    What is in this image?  
  </text>  
  <image url="https://upload.wikimedia.org/wikipedia/commons/8/89/Antidorcas_marsupialis%2C_male_%28Etosha%2C_2012%29.jpg" />
</user>  
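
To make the multi-modal structure concrete, here is one plausible translation of the `<text>` and `<image>` tags inside a `<user>` block into the content-parts shape used by OpenAI's chat API. The mapping below is an assumption for illustration; the actual serialization a given runtime uses may differ:

```python
import re

USER_BLOCK = """\
<text>
  What is in this image?
</text>
<image url="https://upload.wikimedia.org/wikipedia/commons/8/89/Antidorcas_marsupialis%2C_male_%28Etosha%2C_2012%29.jpg" />
"""

def to_content_parts(body):
    """Map <text> and <image> tags to OpenAI-style content parts."""
    parts = []
    # Match either a paired <text>...</text> block or a self-closing <image ... />.
    pattern = r'<text>(.*?)</text>|<image\s+url="([^"]+)"\s*/>'
    for match in re.finditer(pattern, body, flags=re.DOTALL):
        text, url = match.groups()
        if text is not None:
            parts.append({"type": "text", "text": text.strip()})
        else:
            parts.append({"type": "image_url", "image_url": {"url": url}})
    return parts

print(to_content_parts(USER_BLOCK))
```

Because the tags preserve ordering, text and images can be interleaved freely within a single user turn.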

Tools, Tool Calls, and Tool Responses

Prompt files let developers specify tools that an AI model can use during its interactions. These tools are defined in the YAML header as a JSON list, including attributes such as:

  • Tool name
  • Description
  • Parameters
  • Required fields

The assistant can invoke these tools using <tool> tags within its message, passing arguments in JSON format. The tool's response is then returned in a corresponding <tool> tag.

1: Tool Definition in YAML Header

These definitions declare available tools and their specifications:

tools: [  
  {  
    "name": "get_current_weather",  
    "description": "Get the current weather in a given location",  
    "parameters": {  
      "type": "object",  
      "properties": {  
        "location": {"type": "string", "name": "Location", "description": "The city and state"},  
        "unit": {"type": "string", "name": "Unit", "enum": ["celsius", "fahrenheit"]}  
      },  
      "required": ["location"]  
    }  
  }  
]  
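
For context, OpenAI's chat API expects each function definition wrapped in a `{"type": "function", "function": ...}` envelope, so a runtime consuming the header above would likely apply a small transformation before calling the API. The sketch below assumes that target shape and omits the non-standard display-name fields:

```python
# The tool definition from the YAML header above, as a Python structure.
TOOL_DEFS = [
    {
        "name": "get_current_weather",
        "description": "Get the current weather in a given location",
        "parameters": {
            "type": "object",
            "properties": {
                "location": {"type": "string", "description": "The city and state"},
                "unit": {"type": "string", "enum": ["celsius", "fahrenheit"]},
            },
            "required": ["location"],
        },
    }
]

# Wrap each bare definition in the envelope OpenAI's chat API expects.
openai_tools = [{"type": "function", "function": d} for d in TOOL_DEFS]

print(openai_tools[0]["function"]["name"])  # get_current_weather
```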

2: Tool Call

With the tool defined, the assistant can invoke it from within its message:

<assistant>  
  <tool name="get_current_weather" id="call_1ZUCTfyeDnpqiZbIwpF6fLGt">  
    {"location": "San Francisco, CA"}  
  </tool>  
</assistant>  

3: Tool Response

The response then provides tool execution results back to the LLM:

<tool name="get_current_weather" id="call_1ZUCTfyeDnpqiZbIwpF6fLGt">
  Cloudy with a chance of meatballs.  
</tool>  
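
The call/response cycle above can be sketched as a small dispatch step on the application side: parse the arguments JSON from the assistant's `<tool>` tag, run the matching function, and wrap the result in a `<tool>` tag with the same `id`. The `get_current_weather` implementation here is a hypothetical stand-in for a real weather lookup:

```python
import json

# Hypothetical local implementation of the tool declared in the YAML header.
def get_current_weather(location, unit="celsius"):
    return f"Cloudy with a chance of meatballs in {location}."

TOOLS = {"get_current_weather": get_current_weather}

def handle_tool_call(name, call_id, arguments_json):
    """Execute the named tool and wrap its result in a <tool> response tag."""
    args = json.loads(arguments_json)
    result = TOOLS[name](**args)
    return f'<tool name="{name}" id="{call_id}">\n  {result}\n</tool>'

response = handle_tool_call(
    "get_current_weather",
    "call_1ZUCTfyeDnpqiZbIwpF6fLGt",
    '{"location": "San Francisco, CA"}',
)
print(response)
```

Matching the `id` between the call and the response is what lets the model associate each result with the invocation that produced it.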

Learn More About Prompt Files

Prompt files let teams unlock new levels of consistency, collaboration, and control in their AI development workflows.

At Humanloop, we empower enterprises to implement production-grade prompt management solutions, including support for prompt files, multi-modal interactions, and tool-augmented workflows. Our platform provides the tools enterprises need to simplify development and optimize LLM performance across complex applications.

To find out more about how prompt files and Humanloop’s enterprise-grade AI development platform can accelerate your team’s workflow, book a demo today.

About the author

Conor Kelly
Growth
Conor Kelly is the Growth lead at Humanloop. He is an expert in generative AI application development and did graduate research on retrieval augmented generation (RAG) at UCL. Conor regularly appears on high-profile media outlets to discuss generative AI and was nominated as Ireland's Young AI Role Model of the Year in 2023.
