May

Cohere

_May 23rd, 2023_

We’ve just added support for Cohere to Humanloop!

This update adds Cohere models to the playground and your projects: just add your Cohere API key in your organization’s settings. As with other providers, each user in your organization can also set a personal override API key, stored locally in the browser, for use in Cohere requests from the playground.

Enabling Cohere for your organization

Add your Cohere API key to your organization settings to start using Cohere models with Humanloop.

Working with Cohere models

Once you’ve enabled Cohere for your organization, you can access its models through the playground and in your projects, in exactly the same way as your existing OpenAI and/or Anthropic models.

REST API and Python / TypeScript support

As with other model providers, once you’ve set up a Cohere-backed model config, you can call it with the Humanloop REST API or our SDKs.

import { Humanloop } from "humanloop";

const humanloop = new Humanloop({
  apiKey: "API_KEY",
});

const chatResponse = await humanloop.chat({
  project: "project_example",
  messages: [
    {
      role: "user",
      content: "Write me a song",
    },
  ],
  provider_api_keys: {
    cohere: COHERE_API_KEY,
  },
  model_config: {
    model: "command",
    temperature: 1,
  },
});

console.log(chatResponse);

If you don’t provide a Cohere API key under the provider_api_keys field, the request falls back to the organization-level key you configured above.
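The fallback order can be pictured as a small helper. This is an illustrative sketch of the behavior described above, not Humanloop’s actual implementation; `provider_api_keys` mirrors the request field, and `org_settings` stands in for the organization-level configuration:

```python
# Illustrative sketch of the key fallback — not Humanloop's actual code.
def resolve_cohere_key(provider_api_keys, org_settings):
    """Prefer a per-request Cohere key; otherwise use the org-level key."""
    if provider_api_keys and provider_api_keys.get("cohere"):
        return provider_api_keys["cohere"]
    return org_settings.get("cohere_api_key")
```

A request that supplies a `cohere` entry in `provider_api_keys` uses that key; a request that omits it falls through to the stored organization key.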


Improved Python SDK

May 17th, 2023

We’ve just released a new version of our Python SDK supporting our v4 API!

This brings support for:

  • 💬 Chat mode humanloop.chat(...)
  • 📥 Streaming support humanloop.chat_stream(...)
  • 🕟 Async methods humanloop.acomplete(...)

https://pypi.org/project/humanloop/

Installation

pip install --upgrade humanloop

Example usage

from pprint import pprint

from humanloop import Humanloop

humanloop = Humanloop(api_key="YOUR_API_KEY")

complete_response = humanloop.complete(
    project="sdk-example",
    inputs={
        "text": "Llamas that are well-socialized and trained to halter and lead after weaning and are very friendly and pleasant to be around. They are extremely curious and most will approach people easily. However, llamas that are bottle-fed or over-socialized and over-handled as youth will become extremely difficult to handle when mature, when they will begin to treat humans as they treat each other, which is characterized by bouts of spitting, kicking and neck wrestling.[33]",
    },
    model_config={
        "model": "gpt-3.5-turbo",
        "max_tokens": -1,
        "temperature": 0.7,
        "prompt_template": "Summarize this for a second-grade student:\n\nText:\n{{text}}\n\nSummary:\n",
    },
    stream=False,
)
pprint(complete_response)
pprint(complete_response.project_id)
pprint(complete_response.data[0])
pprint(complete_response.provider_responses)
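The prompt_template above uses double-brace placeholders that are filled from inputs before the request is sent to the model. As a rough illustration of that templating convention (not the SDK’s internal code):

```python
import re

def render_template(template: str, inputs: dict) -> str:
    """Replace {{name}} placeholders with values from `inputs`."""
    return re.sub(r"\{\{(\w+)\}\}", lambda m: str(inputs[m.group(1)]), template)

prompt = render_template(
    "Text:\n{{text}}\n\nSummary:\n",
    {"text": "Llamas are friendly."},
)
# → "Text:\nLlamas are friendly.\n\nSummary:\n"
```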

Migration from 0.3.x

For those coming from an older SDK version, this introduces some breaking changes. A brief highlight of the changes:

  • The client initialization step of hl.init(...) is now humanloop = Humanloop(...).
    • Previously, provider_api_keys could be provided in hl.init(...). They should now be provided when constructing the Humanloop(...) client.
    • humanloop = Humanloop(
          api_key="YOUR_API_KEY",
          openai_api_key="YOUR_OPENAI_API_KEY",
          anthropic_api_key="YOUR_ANTHROPIC_API_KEY",
      )
  • hl.generate(...)’s various call signatures have been split into individual methods for clarity. The main ones are:
    • humanloop.complete(project, model_config={...}, ...) for a completion with the specified model config parameters.
    • humanloop.complete_deployed(project, ...) for a completion with the project’s active deployment.