Create Experiment Stream

POST

Response

This endpoint returns a stream of objects.
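A minimal sketch of consuming the stream with Python's requests library. The base URL, route, authentication header, request body shape, and newline-delimited JSON framing are assumptions for illustration only; the response fields read from each chunk are the ones documented below.

```python
import json
import requests

# Hypothetical values: substitute the real base URL, route, and API key
# shown in the endpoint definition above.
BASE_URL = "https://api.example.com"
API_KEY = "YOUR_API_KEY"

# Assumed request body; this section documents the response, not the request.
payload = {
    "messages": [{"role": "user", "content": "Hello"}],
    "num_samples": 1,
}

with requests.post(
    f"{BASE_URL}/experiments/stream",   # placeholder path
    headers={"X-API-KEY": API_KEY},     # placeholder auth header
    json=payload,
    stream=True,
) as resp:
    resp.raise_for_status()
    for line in resp.iter_lines():
        if not line:
            continue
        chunk = json.loads(line)            # one streamed object per line (assumed framing)
        for item in chunk.get("data", []):  # "data" holds the chat responses (documented below)
            print(item)
```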
data (list of objects)

Array containing the chat responses.

provider_responses (list of any)

The raw responses returned by the model provider.

project_id (string, Optional)

Unique identifier of the parent project. Not provided if the request was made without a project name or ID.

num_samples (integer, Optional, defaults to 1)

The number of chat responses.

logprobs (integer, Optional)

Include the log probabilities of the top n tokens in the provider_response.

suffix (string, Optional)

The suffix that comes after a completion of inserted text. Useful for completions that act like inserts.

user (string, Optional)

End-user ID passed through to the provider call.

usage (object, Optional)

Counts of the number of tokens used and related stats.
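For illustration, a usage object is typically a small mapping of token counts. The key names in this sketch are assumptions based on common provider statistics, not taken from this reference.

```python
# Illustrative shape only; the exact keys of the usage object are not
# specified in this reference, so these field names are assumptions.
usage = {
    "prompt_tokens": 12,
    "completion_tokens": 48,
    "total_tokens": 60,
}
```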

metadata (map from strings to any, Optional)

Any additional metadata to record.

provider_request (map from strings to any, Optional)

The raw request sent to the model provider.

session_id (string, Optional)

ID of the session this response belongs to, if any.

tool_choice"none" or "auto" or "required" or objectOptional

Controls how the model uses tools. The following options are supported: 'none' forces the model not to call a tool; this is the default when no tools are provided as part of the model config. 'auto' lets the model decide whether to call one of the provided tools; this is the default when tools are provided as part of the model config. Providing {'type': 'function', 'function': {'name': <TOOL_NAME>}} forces the model to use the named function.
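A short sketch of the accepted tool_choice values, written as plain Python values as they might appear in a request body. The tool name get_weather is a hypothetical stand-in for <TOOL_NAME>, and the comment on "required" reflects only what the name implies.

```python
# The three string options and the forced-function object described above.
tool_choice_none = "none"          # model never calls a tool (default when no tools are provided)
tool_choice_auto = "auto"          # model decides whether to call one of the provided tools
tool_choice_required = "required"  # model must call a tool (behaviour implied by the name)
tool_choice_forced = {
    "type": "function",
    "function": {"name": "get_weather"},  # hypothetical tool name standing in for <TOOL_NAME>
}
```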