
Upload historic data

Uploading historic model inputs and generations to an existing Humanloop project.

The Humanloop Python SDK allows you to upload your historic model data to an existing Humanloop project. This can be used to warm-start your project: the historic data can then be reviewed and receive feedback alongside your new user-generated data.

Prerequisites

  • You already have a Prompt — if not, please follow our Prompt creation guide first.

Log historic data

Grab your API key from your Settings page.

  1. Set up your code to first load your historic data and then log it to Humanloop, explicitly passing details of the model config (if available) alongside the inputs and output:

    ```python
    from humanloop import Humanloop

    # Initialize Humanloop with your API key
    humanloop = Humanloop(api_key="<YOUR Humanloop API KEY>")

    # NB: Add code here to load your existing model data (the inputs and output)
    # before logging it to Humanloop

    # Log the inputs, output and model config to your project - this log call
    # can take batches of data.
    log_response = humanloop.log(
        project="<YOUR UNIQUE PROJECT NAME>",
        inputs={"question": "How should I think about competition for my startup?"},
        output=output,
        config={
            "model": "gpt-4",
            "prompt_template": "Answer the following question like Paul Graham from YCombinator: {{question}}",
            "temperature": 0.2,
        },
        source="sdk",
    )

    # Use the datapoint ID to associate feedback received later to this datapoint.
    data_id = log_response.id
    ```
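Since the log call can take batches of data, you will typically loop over your historic records rather than log a single datapoint. The sketch below assumes a hypothetical `historic_data` list standing in for your own data store; the mapping helper is illustrative, not part of the SDK:

```python
# Hypothetical historic records; replace with data loaded from your own store.
historic_data = [
    {
        "question": "How should I think about competition for my startup?",
        "answer": "Focus on users, not competitors...",
    },
    {
        "question": "When should I raise money?",
        "answer": "When you have evidence of growth...",
    },
]

# Shared model config for all historic generations (if available).
config = {
    "model": "gpt-4",
    "prompt_template": "Answer the following question like Paul Graham from YCombinator: {{question}}",
    "temperature": 0.2,
}

def build_log_kwargs(record):
    """Map one historic record onto the keyword arguments passed to humanloop.log."""
    return {
        "project": "<YOUR UNIQUE PROJECT NAME>",
        "inputs": {"question": record["question"]},
        "output": record["answer"],
        "config": config,
        "source": "sdk",
    }

# Keep the returned IDs so later feedback can be associated with each datapoint:
# data_ids = [humanloop.log(**build_log_kwargs(r)).id for r in historic_data]
```

Storing the returned IDs alongside your historic records lets you attach feedback to the right datapoint later.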
  2. The process of capturing feedback then uses the returned log ID (`data_id` above) as before.

    See our guide on capturing user feedback.
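As a rough sketch, feedback received later is keyed back to the datapoint by its ID. The `humanloop.feedback` call and the `data_id` value below are assumptions for illustration; check the feedback guide and your SDK version's reference for the exact call:

```python
# Hypothetical datapoint ID returned by the earlier humanloop.log call.
data_id = "<DATA ID FROM log_response.id>"

# Feedback captured later from your user, keyed back to the datapoint.
feedback = {
    "type": "rating",
    "value": "good",
    "data_id": data_id,
}

# humanloop.feedback(**feedback)  # assumed call; see the feedback guide
```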

  3. You can also log immediate feedback alongside the input and outputs:

    ```python
    # Log the inputs, output and model config to your project.
    log_response = humanloop.log(
        project="<YOUR UNIQUE PROJECT NAME>",
        inputs={"question": "How should I think about competition for my startup?"},
        output=output,
        config={
            "model": "gpt-4",
            "prompt_template": "Answer the following question like Paul Graham from YCombinator: {{question}}",
            "temperature": 0.2,
        },
        source="sdk",
        feedback={"type": "rating", "value": "good"},
    )
    ```