How to Use the OpenAI Assistants API
Since OpenAI unveiled its new features at OpenAI Dev Day, a lot of people have been fascinated by the possibilities opened up by these new toolsets. The new custom "GPTs" make it possible to have your own version of GPT help out with various problems, but you will quickly run into a limitation: these GPTs are only available in the browser to users with paid ChatGPT Plus accounts, making it hard to hook them up to automations or make them available to a broader audience. Luckily, OpenAI also baked much of this functionality into its new Assistants API, which can be called from anywhere.

Prerequisites

To get started, make sure that you have set up an OpenAI API account at https://platform.openai.com

This guide uses the Python library, so to follow along, make sure you have the OpenAI Python package installed by running:

pip install --upgrade openai

Get an API key generated at https://platform.openai.com/api-keys and export it in your shell to use it in the library:

export OPENAI_API_KEY="<your_api_key>"

The Basics

For any usage of the Assistants API, you'll want to use the OpenAI client to create an assistant, then create a thread to store messages and finally create a run to feed the messages from your thread into your assistant. The most basic version of this looks like:

from openai import OpenAI
import time

client = OpenAI()

# Create your assistant
assistant = client.beta.assistants.create(
  name="Helpful Assistant",
  instructions="You are a helpful assistant",
  model="gpt-4-1106-preview",
)

# Create a thread
thread = client.beta.threads.create()

# Place your first message into your thread
client.beta.threads.messages.create(
  thread_id=thread.id,
  role="user",
  content="Tell me a joke",
)

# Create a run
run = client.beta.threads.runs.create(
  thread_id=thread.id,
  assistant_id=assistant.id,
)

# Wait for your run to finish
# If you have tools defined, be sure to check for the 'requires_action' status
while run.status != 'completed':
  time.sleep(5) # Sleep for 5 seconds between checks to limit calls
  run = client.beta.threads.runs.retrieve(
    thread_id=thread.id,
    run_id=run.id,
  )

# Print out the response (messages are returned newest first)
messages = client.beta.threads.messages.list(thread_id=thread.id)
print(messages.data[0].content[0].text.value)

While this is a bit more verbose than using the traditional completion endpoints, it's important to note a few key benefits:

  • The assistant can be used for multiple threads, encapsulating some of the boilerplate for "system" messages
  • Using threads means that you don't need to find a way to maintain the chat log on your end
  • The ability to use built-in "tools" like Retrieval and Code Interpreter in addition to function calling

The drawback compared to the traditional completion endpoints is that you have fewer options for fine-tuning your output, such as specifying the temperature or max_tokens of the response, so be sure to weigh your options when choosing which API you want to use for your project.
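To illustrate the first benefit, here is a minimal sketch of reusing a single assistant across independent threads. The `ask` helper is a hypothetical name, not part of the OpenAI library; it just wraps the thread/message/run pattern shown above, with a fresh thread per question.

```python
import time


def ask(client, assistant_id, question, poll_seconds=5):
    """Ask a one-off question by spinning up a fresh thread for an
    existing assistant, then polling the run until it finishes."""
    thread = client.beta.threads.create()
    client.beta.threads.messages.create(
        thread_id=thread.id, role="user", content=question
    )
    run = client.beta.threads.runs.create(
        thread_id=thread.id, assistant_id=assistant_id
    )
    # Poll until the run reaches a terminal status. If your assistant
    # defines tools, you would also need to handle 'requires_action' here.
    while run.status not in ("completed", "failed", "cancelled", "expired"):
        time.sleep(poll_seconds)
        run = client.beta.threads.runs.retrieve(
            thread_id=thread.id, run_id=run.id
        )
    messages = client.beta.threads.messages.list(thread_id=thread.id)
    return messages.data[0].content[0].text.value


# Usage (assumes `client` and `assistant` from the example above):
# print(ask(client, assistant.id, "Tell me a joke"))
# print(ask(client, assistant.id, "Explain recursion in one sentence"))
```

Because the assistant holds the instructions and model choice, each call only has to supply the question itself.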

Files

The Retrieval and Code Interpreter tools benefit from having files passed into them. In order to use a file, you will need to upload the file to OpenAI with the purpose set to "assistants" and then either give it to an assistant, where it can be used for every run of that assistant, or pass it in along with a message, where it can only be used in the thread that message is a part of.

Here is an example of a text file given to an assistant:

from openai import OpenAI

client = OpenAI()

# Be sure to open your file with 'rb' when creating assistant files
file = client.files.create(file=open("data.txt", "rb"), purpose="assistants")

assistant = client.beta.assistants.create(
  name="Content Assistant",
  instructions="You are a helpful assistant. When asked a question, you will answer using the contents of your knowledge base.",
  model="gpt-4-1106-preview",
  tools=[{"type": "retrieval"}],
  file_ids=[file.id] # Here is where the file is included for the assistant
)

Here is an example of a CSV file passed with a message:

from openai import OpenAI

client = OpenAI()

# Be sure to open your file with 'rb' when creating assistant files
file = client.files.create(file=open("data.csv", "rb"), purpose="assistants")

assistant = client.beta.assistants.create(
  name="Content Assistant",
  instructions="You are a helpful assistant. When asked a question, you will answer using the contents of your knowledge base.",
  model="gpt-4-1106-preview",
  tools=[{"type": "code_interpreter"}],
)

thread = client.beta.threads.create()

client.beta.threads.messages.create(
  thread_id=thread.id,
  role="user",
  content="Summarize the data in this file",
  file_ids=[file.id] # Here is where the file is included for the message
)
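To finish the example above, you still need to create a run on the thread and wait for it. A reusable polling helper makes this less repetitive; `wait_for_run` and `TERMINAL_STATUSES` are hypothetical names, not part of the OpenAI library, and the status set is based on the run statuses the API documents.

```python
import time

# Statuses at which polling can stop. 'requires_action' means the model
# is waiting for you to submit tool outputs before it can continue.
TERMINAL_STATUSES = {"completed", "requires_action", "failed", "cancelled", "expired"}


def wait_for_run(client, thread_id, run_id, poll_seconds=5):
    """Poll a run until it reaches a terminal status and return it."""
    run = client.beta.threads.runs.retrieve(thread_id=thread_id, run_id=run_id)
    while run.status not in TERMINAL_STATUSES:
        time.sleep(poll_seconds)
        run = client.beta.threads.runs.retrieve(thread_id=thread_id, run_id=run_id)
    return run


# Usage (continuing the example above):
# run = client.beta.threads.runs.create(thread_id=thread.id, assistant_id=assistant.id)
# run = wait_for_run(client, thread.id, run.id)
# if run.status == "completed":
#     messages = client.beta.threads.messages.list(thread_id=thread.id)
#     print(messages.data[0].content[0].text.value)
```

Checking the final status before reading messages matters here: a Code Interpreter run can fail (for instance, on a malformed CSV), and in that case the latest message will not contain the answer you expect.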

Cleanup

OpenAI charges for file storage per GB per assistant per day, so when you try out these APIs, make sure that you clean up any unused assistants and files.

Here's how you clean up an assistant and a file in your current script:

client.beta.assistants.delete(assistant.id)
client.files.delete(file.id)
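If your script can raise an exception partway through, a context manager ensures the deletions above still run. This is a sketch using the standard library's `contextlib`; `assistant_scope` is a hypothetical helper name, not part of the OpenAI library.

```python
from contextlib import contextmanager


@contextmanager
def assistant_scope(client, assistant, files=()):
    """Yield an assistant, deleting it and any associated files when
    the block exits, even if an exception was raised inside it."""
    try:
        yield assistant
    finally:
        client.beta.assistants.delete(assistant.id)
        for file in files:
            client.files.delete(file.id)


# Usage (assumes `client`, `assistant`, and `file` from the examples above):
# with assistant_scope(client, assistant, files=[file]):
#     ...  # create threads and runs against the assistant
```

When the `with` block ends, for any reason, you stop accruing storage charges for that assistant and its files.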

Here's how to clean up every assistant and file on your account at once. Only run this if you're absolutely sure you don't need any of the assistants or files on your OpenAI account/organization:

from openai import OpenAI
client = OpenAI()

my_assistants = client.beta.assistants.list(
  order="desc",
  limit=100,
)
for assistant in my_assistants:
  client.beta.assistants.delete(assistant.id)

my_files = client.files.list(
  purpose="assistants",
)
for file in my_files:
  client.files.delete(file.id)

Tools

The real power of the Assistants API is in its ability to use tools as it deems necessary. I have guides covering the three tool types that initially launched with the Assistants API, so you can get the full set of benefits the Assistants API has to offer: