Update Prompt Log
Update a Log.
Update the details of a Log with the given ID.
Path parameters
id
Unique identifier for the Prompt.
log_id
Unique identifier for the Log.
Headers
X-API-KEY
Request
This endpoint expects an object.
output_message
The message returned by the provider.
prompt_tokens
Number of tokens in the prompt used to generate the output.
reasoning_tokens
Number of reasoning tokens used to generate the output.
output_tokens
Number of tokens in the output generated by the model.
prompt_cost
Cost in dollars associated with the tokens in the prompt.
output_cost
Cost in dollars associated with the tokens in the output.
finish_reason
Reason the generation finished.
messages
The messages passed to the provider chat endpoint.
tool_choice
Controls how the model uses tools. The following options are supported:
- 'none' means the model will not call any tool and instead generates a message; this is the default when no tools are provided as part of the Prompt.
- 'auto' means the model can decide to call one or more of the provided tools; this is the default when tools are provided as part of the Prompt.
- 'required' means the model must call one or more of the provided tools.
- {'type': 'function', 'function': {'name': <TOOL_NAME>}} forces the model to use the named function.
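As a sketch of the shapes described above, a client-side check for a tool_choice value might look like the following (the function and the tool name "get_weather" are hypothetical, for illustration only):

```python
def is_valid_tool_choice(value):
    """Sketch: accept the tool_choice shapes described above."""
    # The three string options: 'none', 'auto', 'required'.
    if value in ("none", "auto", "required"):
        return True
    # The forced-function shape: {'type': 'function', 'function': {'name': <TOOL_NAME>}}.
    return (
        isinstance(value, dict)
        and value.get("type") == "function"
        and isinstance(value.get("function"), dict)
        and isinstance(value["function"].get("name"), str)
    )


print(is_valid_tool_choice({"type": "function", "function": {"name": "get_weather"}}))
```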
output
Generated output from your model for the provided inputs. Can be None if logging an error, or if creating a parent Log with the intention to populate it later.
created_at
User defined timestamp for when the log was created.
error
Error message if the log is an error.
provider_latency
Duration of the logged event in seconds.
stdout
Captured log and debug statements.
provider_request
Raw request sent to provider.
provider_response
Raw response received from the provider.
inputs
The inputs passed to the prompt template.
source
Identifies where the model was called from.
metadata
Any additional metadata to record.
start_time
When the logged event started.
end_time
When the logged event ended.
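Putting the pieces together, an update request built from the path parameters, header, and body fields above might be constructed as follows. This is a minimal sketch: the base URL, the path layout, the HTTP method, and the placeholder IDs are assumptions, so check them against the actual API before use. Only the fields you want to update need to appear in the body.

```python
import json
import urllib.request

BASE_URL = "https://api.example.com/v5"  # hypothetical host; substitute the real one
prompt_id = "pr_123"   # placeholder Prompt ID (path parameter `id`)
log_id = "log_456"     # placeholder Log ID (path parameter `log_id`)

# Partial update body: include only the fields being changed.
payload = {
    "output_message": {"role": "assistant", "content": "Hello!"},
    "prompt_tokens": 12,
    "output_tokens": 3,
    "finish_reason": "stop",
    "provider_latency": 0.42,
}

req = urllib.request.Request(
    f"{BASE_URL}/prompts/{prompt_id}/logs/{log_id}",
    data=json.dumps(payload).encode("utf-8"),
    headers={"X-API-KEY": "<YOUR_API_KEY>", "Content-Type": "application/json"},
    method="PATCH",  # assumed method for an update endpoint
)
# Sending is omitted here; urllib.request.urlopen(req) would perform the call.
```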
Response
Successful Response