Update Prompt Log

PATCH

Update a Log.

Update the details of a Log with the given ID.

Path parameters

id (string, Required)

Unique identifier for the Prompt.

log_id (string, Required)

Unique identifier for the Log.
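As a quick sketch of how these two path parameters compose into the request URL (the base URL and path layout below are assumptions for illustration, not taken from this reference):

    # Hypothetical base URL and path layout; substitute your actual endpoint.
    prompt_id = "pr_1234567890"   # id: the Prompt being logged against
    log_id = "log_0987654321"     # log_id: the Log to update
    url = f"https://api.example.com/v1/prompts/{prompt_id}/log/{log_id}"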

Request

This endpoint expects an object.
output_message (object, Optional)

The message returned by the provider.

prompt_tokens (integer, Optional)

Number of tokens in the prompt used to generate the output.

output_tokens (integer, Optional)

Number of tokens in the output generated by the model.

prompt_cost (double, Optional)

Cost in dollars associated with the tokens in the prompt.

output_cost (double, Optional)

Cost in dollars associated with the tokens in the output.

finish_reason (string, Optional)

Reason the generation finished.

messages (list of objects, Optional)

The messages passed to the provider chat endpoint.

tool_choice"none" or "auto" or "required" or objectOptional

Controls how the model uses tools. The following options are supported:

  • 'none' means the model will not call any tool and instead generates a message; this is the default when no tools are provided as part of the Prompt.
  • 'auto' means the model can decide to call one or more of the provided tools; this is the default when tools are provided as part of the Prompt.
  • 'required' means the model must call one or more of the provided tools.
  • {'type': 'function', 'function': {'name': <TOOL_NAME>}} forces the model to use the named function (see the sketch below).
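
As an example of the object form, forcing a hypothetical tool named get_weather would be shaped like this (the tool name is illustrative only):

    # Force the model to call one specific (hypothetical) tool by name.
    tool_choice = {"type": "function", "function": {"name": "get_weather"}}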
output (string, Optional)

Generated output from your model for the provided inputs. Can be None if logging an error, or if creating a parent Log with the intention to populate it later.

created_at (datetime, Optional)

User-defined timestamp for when the log was created.

error (string, Optional)

Error message if the log is an error.

provider_latency (double, Optional)

Duration of the logged event in seconds.

stdout (string, Optional)

Captured log and debug statements.

provider_request (map from strings to any, Optional)

Raw request sent to the provider.

provider_response (map from strings to any, Optional)

Raw response received from the provider.

inputs (map from strings to any, Optional)

The inputs passed to the prompt template.

source (string, Optional)

Identifies where the model was called from.

metadata (map from strings to any, Optional)

Any additional metadata to record.

start_time (datetime, Optional)

When the logged event started.

end_time (datetime, Optional)

When the logged event ended.
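
Putting it together, a minimal sketch of the PATCH call using Python's requests library might look like the following. The base URL, path layout, and X-API-KEY header are assumptions for illustration; the field names and types follow the reference above, and only the fields you want to update need to be sent.

    import os
    from datetime import datetime, timezone

    import requests

    # Assumed base URL and path layout; adjust to your actual API endpoint.
    prompt_id = "pr_1234567890"
    log_id = "log_0987654321"
    url = f"https://api.example.com/v1/prompts/{prompt_id}/log/{log_id}"

    # All request fields are optional; include only what you want to update.
    payload = {
        "output": "The capital of France is Paris.",
        "prompt_tokens": 12,
        "output_tokens": 8,
        "finish_reason": "stop",
        "provider_latency": 0.42,
        "metadata": {"experiment": "run-7"},
        "end_time": datetime.now(timezone.utc).isoformat(),
    }

    response = requests.patch(
        url,
        json=payload,
        headers={"X-API-KEY": os.environ["API_KEY"]},  # assumed auth header name
        timeout=30,
    )
    response.raise_for_status()
    print(response.json())  # one of the Log Response objects described below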

Response

Successful Response

Prompt Log Response (object)
OR
Tool Log Response (object)
OR
Evaluator Log Response (object)
OR
Flow Log Response (object)

Errors