Flow Decorator

Automatic tracing for AI features

Overview

The Flow decorator creates and manages traces for your AI feature. When applied to a function, it:

  • Creates a new trace on function invocation.
  • Adds all Humanloop logging calls made inside the function to the trace.
  • Completes the trace when the function exits.

On Humanloop, a trace is the collection of Logs associated with a Flow Log.

Usage

The flow decorator traces all downstream Humanloop Logs, whether they are created by other decorators or by direct SDK calls.

Tracing Decorators

```python
@hl_client.prompt(path="MyFeature/Call LLM")
def call_llm(messages: list[ChatMessage]) -> str:
    return openai.chat.completions.create(
        model="gpt-4o-mini",
        messages=messages
    ).choices[0].message.content

@hl_client.flow(path="MyFeature/Process")
def process_input(inputs: list[str]) -> list[str]:
    # Logs created by the Prompt decorator are added to the trace
    return [
        call_llm([{"role": "user", "content": text}])
        for text in inputs
    ]
```

Tracing SDK Calls

Logs created through the Humanloop SDK are added to the trace.

```python
@hl_client.flow(path="MyFeature/Process")
def process_input(text: str) -> str:
    # Created Log is added to the trace
    llm_output = hl_client.prompts.call(
        path="MyFeature/Transform",
        messages=[{"role": "user", "content": text}]
    ).logs[0].output_message.content

    transformed_output = transform(llm_output)
    # Created Log is added to the trace
    hl_client.tools.log(
        path="MyFeature/Transform",
        tool={"function": TRANSFORM_JSON_SCHEMA},
        inputs={"text": text},
        output=transformed_output
    )

    return transformed_output
```

Behavior

The decorated function creates a Flow Log when called. All Logs created inside the decorated function are added to its trace.

The Flow Log’s fields are populated as follows:

| Field          | Type        | Description                                                        |
| -------------- | ----------- | ------------------------------------------------------------------ |
| inputs         | object      | Function arguments that aren't ChatMessage arrays                  |
| messages       | array       | ChatMessage arrays passed as arguments                             |
| output_message | ChatMessage | Return value if it's a ChatMessage-like object                     |
| output         | string      | Stringified return value if not a ChatMessage-like object          |
| error          | string      | Error message if the function throws or the return value can't be serialized |

If the decorated function returns a ChatMessage object, the output_message field is populated. Otherwise, the output field is populated with the stringified return value.
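To make this mapping concrete, here is an illustrative sketch of how a return value could be routed into these fields. This is a hypothetical helper for explanation only, not the actual SDK internals; the `populate_output_fields` name and the treatment of a dict with `role`/`content` keys as "ChatMessage-like" are assumptions.

```python
import json
from typing import Any

def populate_output_fields(return_value: Any) -> dict:
    """Hypothetical sketch: map a return value to Flow Log output fields."""
    log: dict = {"output": None, "output_message": None, "error": None}
    # Assume a ChatMessage-like object is a dict with "role" and "content" keys
    if isinstance(return_value, dict) and {"role", "content"} <= return_value.keys():
        log["output_message"] = return_value
        return log
    try:
        # Non-ChatMessage return values are stringified into `output`
        log["output"] = return_value if isinstance(return_value, str) else json.dumps(return_value)
    except TypeError as e:
        # A return value that can't be serialized populates `error`
        log["error"] = str(e)
    return log
```

For example, returning `{"role": "assistant", "content": "hi"}` would populate `output_message`, while returning a list of strings would populate `output` with its JSON string.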

Definition

```python
@hl_client.flow(
    # Required: path on Humanloop workspace for the Flow
    path: str,
    # Optional: metadata for versioning the Flow
    attributes: dict[str, Any] | None = None
)
def function(*args, **kwargs): ...
```

The decorator will preserve the function’s signature.

The decorator accepts the following parameters:

| Parameter  | Type   | Required | Description                              |
| ---------- | ------ | -------- | ---------------------------------------- |
| path       | string | Yes      | Path on Humanloop workspace for the Flow |
| attributes | object | No       | Key-value object for versioning the Flow |

SDK Interactions

  • Calling flows.log() inside a decorated function is not supported and raises a HumanloopRuntimeError.
  • To create nested traces, call another flow-decorated function.
  • Passing the trace_parent_id argument to an SDK logging call inside the decorated function is ignored and emits a warning; the Log is added to the trace of the decorated function instead.
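The nesting behavior can be sketched with a toy decorator that tracks the current trace in a contextvar. This is an illustrative model of how nested flow-decorated calls produce nested traces, not the Humanloop SDK's implementation; all names here (`flow`, `created_logs`, the integer trace ids) are invented for the example.

```python
import contextvars
import functools
import itertools

_trace_id = contextvars.ContextVar("trace_id", default=None)
_counter = itertools.count(1)
created_logs: list[dict] = []

def flow(path: str):
    """Toy flow decorator: opens a trace on entry, closes it on exit."""
    def decorator(fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            parent = _trace_id.get()            # None at the top level
            trace = next(_counter)
            created_logs.append({"path": path, "trace": trace, "parent": parent})
            token = _trace_id.set(trace)        # downstream Logs attach here
            try:
                return fn(*args, **kwargs)
            finally:
                _trace_id.reset(token)          # trace completes on exit
        return wrapper
    return decorator

@flow(path="MyFeature/Inner")
def inner() -> str:
    return "done"

@flow(path="MyFeature/Outer")
def outer() -> str:
    # Calling another flow-decorated function creates a nested trace
    return inner()

outer()
```

After calling `outer()`, the inner Log's parent is the outer trace, mirroring how a nested Flow Log sits inside the enclosing trace.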

Error Handling

  • If user-written code (e.g. in code Evaluators) raises an exception, the relevant Log’s error field is populated with the exception message and the decorated function returns None.
  • HumanloopRuntimeError exceptions indicate incorrect decorator or SDK usage and are re-raised instead of being logged under error.
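The two error paths can be modeled with a minimal sketch, again assuming a toy decorator rather than the real SDK: user exceptions are captured into the Log's error field and the call returns None, while a HumanloopRuntimeError propagates to the caller.

```python
import functools

class HumanloopRuntimeError(RuntimeError):
    """Stand-in for the SDK's usage-error exception."""

logs: list[dict] = []

def flow(path: str):
    """Toy decorator illustrating the documented error handling."""
    def decorator(fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            log = {"path": path, "error": None}
            try:
                return fn(*args, **kwargs)
            except HumanloopRuntimeError:
                raise                       # usage errors are re-raised
            except Exception as e:
                log["error"] = str(e)       # user-code errors are logged
                return None
            finally:
                logs.append(log)
        return wrapper
    return decorator

@flow(path="MyFeature/Process")
def fails() -> str:
    raise ValueError("bad input")

result = fails()  # returns None; the Log's error field records "bad input"
```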

An explanation of Flows and their role in the Humanloop platform can be found in our Flows documentation.