Logs

Update

PATCH
Update a logged datapoint in your Humanloop project.

Path parameters

id
stringRequired

String ID of the logged datapoint to update. Starts with data_.

Request

This endpoint expects an object.
output
stringOptional
Generated output from your model for the provided inputs.
error
stringOptional
Error message if the log is an error.
duration
doubleOptional
Duration of the logged event in seconds.
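
For orientation, here is a minimal sketch of sending this PATCH request with Python's requests library. The base URL, the /logs/{id} path, the X-API-KEY header, and the example values are assumptions made for illustration only; substitute the endpoint details and API key from your own Humanloop account.

```python
import requests

BASE_URL = "https://api.humanloop.com/v4"  # assumed base URL; confirm against your API settings
log_id = "data_abc123"  # hypothetical ID of the logged datapoint to update

response = requests.patch(
    f"{BASE_URL}/logs/{log_id}",  # assumed path for this endpoint
    headers={"X-API-KEY": "<YOUR_HUMANLOOP_API_KEY>"},  # assumed auth header
    json={
        # Only the fields you want to change need to be included.
        "output": "The capital of France is Paris.",
        "duration": 1.42,  # seconds
    },
)
response.raise_for_status()
updated_log = response.json()
print(updated_log["id"])
```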

Response

This endpoint returns an object.
id
string

String ID of logged datapoint. Starts with data_.

config
union
evaluation_results
list of objects
observability_status
enum
Status of a project datum for observability.
Allowed values: pending, running, completed, failed
updated_at
datetime
project
stringOptional
The name of the project associated with this log.
project_id
stringOptional
The unique ID of the project associated with this log.
session_id
stringOptional
ID of the session to associate the datapoint with.
session_reference_id
stringOptional

A unique string identifying the session to associate the datapoint to. Allows you to log multiple datapoints to a session (using an ID kept by your internal systems) by passing the same session_reference_id in subsequent log requests. Specify at most one of this or session_id.

parent_id
stringOptional
ID associated to the parent datapoint in a session.
parent_reference_id
stringOptional

A unique string identifying the previously-logged parent datapoint in a session. Allows you to log nested datapoints with your internal system IDs by passing the same reference ID as parent_id in a prior log request. Specify at most one of this or parent_id. Note that this cannot refer to a datapoint being logged in the same request.

inputs
map from strings to anyOptional
The inputs passed to the prompt template.
source
stringOptional
Identifies where the model was called from.
metadata
map from strings to anyOptional
Any additional metadata to record.
save
booleanOptional
Whether the request/response payloads will be stored on Humanloop.
source_datapoint_id
stringOptional
ID of the source datapoint if this is a log derived from a datapoint in a dataset.
reference_id
stringOptional
Unique user-provided string identifying the datapoint.
trial_id
stringOptional
Unique ID of an experiment trial to associate with the log.
messages
list of objectsOptional
The messages passed to the provider chat endpoint.
output
stringOptional

Generated output from your model for the provided inputs. Can be None if logging an error, or if logging a parent datapoint with the intention to populate it later.

config_id
stringOptional
Unique ID of a config to associate with the log.
environment
stringOptional
The environment name used to create the log.
feedback
list of objectsOptional
created_at
datetimeOptional
User defined timestamp for when the log was created.
error
stringOptional
Error message if the log is an error.
duration
doubleOptional
Duration of the logged event in seconds.
output_message
objectOptional
The message returned by the provider.
prompt_tokens
integerOptional
Number of tokens in the prompt used to generate the output.
output_tokens
integerOptional
Number of tokens in the output generated by the model.
prompt_cost
doubleOptional
Cost in dollars associated with the tokens in the prompt.
output_cost
doubleOptional
Cost in dollars associated with the tokens in the output.
provider_request
map from strings to anyOptional
Raw request sent to provider.
provider_response
map from strings to anyOptional
Raw response received from the provider.
user
stringOptional
User email address provided when creating the datapoint.
provider_latency
doubleOptional
Latency of provider response.
tokens
integerOptional
Total number of tokens in the prompt and output.
raw_output
stringOptional
Raw output from the provider.
finish_reason
stringOptional
Reason the generation finished.
metric_values
list of objectsOptional
tools
list of objectsOptional
A result from a tool used to populate the prompt template.
tool_choice
unionOptional
Controls how the model uses tools. The following options are supported: 'none' forces the model not to call a tool; this is the default when no tools are provided as part of the model config. 'auto' lets the model decide whether to call one of the provided tools; this is the default when tools are provided as part of the model config. Providing {'type': 'function', 'function': {'name': <TOOL_NAME>}} forces the model to use the named function.
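
As a hedged sketch of working with the returned object, the snippet below continues from the request example above and reads a few of the fields described on this page. The field names follow the response schema above; the branching and printed values are illustrative only and every field marked Optional may be absent.

```python
# `updated_log` is the parsed JSON body from the PATCH request above.
if updated_log.get("error"):
    print(f"Log {updated_log['id']} recorded an error: {updated_log['error']}")
else:
    print(f"Output: {updated_log.get('output')}")
    print(f"Duration: {updated_log.get('duration')} s")

# observability_status is one of: pending, running, completed, failed.
status = updated_log.get("observability_status")
if status in ("pending", "running"):
    print("Observability processing is still in progress for this datapoint.")

# Token and cost fields are optional and may be missing.
tokens = updated_log.get("tokens")
if tokens is not None:
    print(f"Total tokens (prompt + output): {tokens}")
```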

Errors