For detailed information, please check the official API reference.
OpenAI recently released something similar to plugin functionality, but for the API. You can now describe functions to the gpt-3.5-turbo-0613 and gpt-4-0613 models. In this brief article, I will outline the key points and present an illustrative code snippet demonstrating how to make an API call to Wikipedia based on user input.
Important clarifications
— Models don't execute your functions. The GPT model returns arguments based on the user input; you run the function yourself and return its result for the model to summarize.
— Functions are part of the model's context limit and are billed as input tokens.
— The model can ask the user additional questions based on the required arguments.
— At the moment, you can use the gpt-3.5-turbo-0613 and gpt-4-0613 models.
Make sure you have installed the required libraries:
pip install openai
pip install requests
Introduction to Chat Completion Functions

<a href='https://cdn.abstractkitchen.com/images/posts/openai-chat-completion-api-functions/openai-chat-completion-api-functions.jpeg' title='OpenAI Chat Completion Functions API Birdview' target='_blank'>Open this image in a new tab</a>
| 1 | The function that we'll execute based on the model response. |
| 2 | The OpenAI Python library is a straightforward and convenient way to interact with the API. |
| 3 | The user prompt. In a real-world application it will be dynamic. |
| 4 | The function description. Here we describe the available and required function arguments. Based on this information, the model decides whether this particular function can be used and how. Take a look at the name key: its value is the same as our function name from above. |
| 5 | With the function_call parameter, you can instruct the model to prioritize a specific function or to disregard functions altogether. The default value is auto if you have a functions key and none if you don't. |
Based on the provided functions, the model decides which function to use and when. It will also ask the user follow-up questions if a function has required arguments and the user hasn't provided values for them.
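The three accepted forms of the function_call parameter from the table above can be wrapped in a tiny helper. This is just an illustrative sketch (the helper name build_function_call_param is my own, not part of the OpenAI library):

```python
def build_function_call_param(mode):
    """Return a value for the function_call parameter of
    openai.ChatCompletion.create.

    mode: "auto" (let the model decide), "none" (never call a function),
    or the name of a function to force.
    """
    if mode in ("auto", "none"):
        return mode
    # Forcing a specific function uses the {"name": ...} object form
    return {"name": mode}
```

For example, build_function_call_param("search_wikipedia") produces {"name": "search_wikipedia"}, which tells the model it must call that function.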
Chat Completion Functions Flow

<a href='https://cdn.abstractkitchen.com/images/posts/openai-chat-completion-api-functions/open-ai-gpt-functions-flow.jpeg' title='OpenAI Completion Functions API Flow' target='_blank'>Open this image in a new tab</a>
Take a look at the JSON below. It's a response from openai.ChatCompletion.create when the model decided to use one of your functions.
User prompt: Please find articles about China on Wikipedia
{
    "id": "chatcmpl-someid",
    "object": "chat.completion",
    "created": 1687151109,
    "model": "gpt-3.5-turbo-0613",
    "choices": [
        {
            "index": 0,
            "message": {
                "role": "assistant",
                "content": null,
                "function_call": {
                    "name": "search_wikipedia",
                    "arguments": "{\n  \"query\": \"China\"\n}"
                }
            },
            "finish_reason": "function_call"
        }
    ],
    "usage": {
        "prompt_tokens": 62,
        "completion_tokens": 16,
        "total_tokens": 78
    }
}
As you can see, the first choice (["choices"][0]) has a finish_reason key equal to function_call. This is how the model tells you that, to proceed, you have to execute a function called search_wikipedia. The arguments were generated based on the user input. Arguments are stringified, so you have to convert them with json.loads.
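Parsing those stringified arguments is a one-liner with the standard library:

```python
import json

# The "arguments" value in the response is a JSON string, not a dict,
# so parse it before use (string copied from the response above):
raw_arguments = "{\n  \"query\": \"China\"\n}"
arguments = json.loads(raw_arguments)
print(arguments["query"])  # -> China
```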
Search Wikipedia Example
First, let's start with our function. We'll use this function to search Wikipedia based on the user query.
def search_wikipedia(search_query):
    # just some Wikipedia API query params
    payload = {
        "action": "opensearch",
        "search": search_query,
        "limit": 5,
        "namespace": 0,
        "format": "json"
    }
    response = requests.get("https://en.wikipedia.org/w/api.php", params=payload)
    # it's important to convert the data to a string for the GPT API
    return json.dumps(response.json())
As you can see, it's just a function that sends a request to the Wikipedia API and returns a raw stringified response. Ideally, you would add some logic here and return filtered data. But, hey, I am not going the full road here, because I don't want to burden you with unnecessary details. You get the idea.
Since this data will be used by the GPT model, the function must return a stringified object.
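If you do want to filter the data before handing it to the model, here is one possible sketch. It assumes the documented opensearch response shape, a list of [search term, titles, descriptions, urls], and uses a hand-made payload instead of a live request; the helper name format_opensearch_results is my own:

```python
import json

def format_opensearch_results(data):
    """Turn a raw opensearch payload ([term, titles, descriptions, urls])
    into a compact JSON string of title/url pairs for the model."""
    _, titles, _, urls = data
    results = [{"title": t, "url": u} for t, u in zip(titles, urls)]
    return json.dumps(results)

# Example with a hand-made payload in the documented opensearch shape:
sample = [
    "China",
    ["China", "China Airlines"],
    ["", ""],
    ["https://en.wikipedia.org/wiki/China",
     "https://en.wikipedia.org/wiki/China_Airlines"],
]
print(format_opensearch_results(sample))
```

Passing only titles and URLs also keeps the function response short, which matters because it counts toward your token bill.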
Chat Completion API Utility
def chat_request(model, messages, **kwargs):
    functions = kwargs.get("functions")
    api_params = {
        "model": model,
        "messages": messages,
    }
    if functions:
        api_params["functions"] = functions
    try:
        return openai.ChatCompletion.create(**api_params)
    except Exception as e:
        print("Whoops! Something happened:\n")
        print(f"Error: {e}")
It's just a thin wrapper around the openai.ChatCompletion.create function to stay DRY. As you can see, we use the basic parameters: model, messages, and the optional functions. messages holds the user input. As the model, you can use gpt-3.5-turbo-0613 or gpt-4-0613.
Functions Junction
Take a look at this code. Remember, at the beginning I mentioned that it's your job to execute the function? If you have more than one function, this is the easiest way to select which one to run. It could be done differently, but again, I don't want to bombard you with complicated code. In the end, it's your job to make your code unreadable.
def execute_function_call(message):
    function_args = json.loads(message["function_call"]["arguments"])
    results = None
    if message["function_call"]["name"] == "search_wikipedia":
        results = search_wikipedia(function_args["query"])
    return results
In this snippet, I do a couple of things. First, I convert the arguments string into an object. Second, I run the required function with these arguments.
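As an alternative to the if chain, a dispatch table scales better once you register several functions. This sketch uses a stub in place of the real search_wikipedia (it just echoes the query), and it assumes the Python parameter names match the JSON schema property names so the parsed arguments can be unpacked directly:

```python
import json

def search_wikipedia(query):
    # stub standing in for the real request, so the dispatch
    # logic can be shown without network access
    return json.dumps({"query": query})

# Keys must match the "name" values you send to the API
AVAILABLE_FUNCTIONS = {
    "search_wikipedia": search_wikipedia,
}

def execute_function_call(message):
    name = message["function_call"]["name"]
    function = AVAILABLE_FUNCTIONS.get(name)
    if function is None:
        return None
    arguments = json.loads(message["function_call"]["arguments"])
    # works because the parameter names match the schema properties
    return function(**arguments)
```

Adding a new function then means writing it and adding one entry to the table, with no change to the dispatch logic.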
Final Result
import openai
import requests
import json

GPT_MODEL = "gpt-3.5-turbo-0613"
openai.api_key = "YOUR_OPENAI_API_KEY"

# function to search Wikipedia, based on user input
def search_wikipedia(search_query):
    payload = {
        "action": "opensearch",
        "search": search_query,
        "limit": 5,
        "namespace": 0,
        "format": "json"
    }
    response = requests.get("https://en.wikipedia.org/w/api.php", params=payload)
    # it's important to convert the data to a string for the GPT API
    return json.dumps(response.json())

# execute a function based on the OpenAI response
def execute_function_call(message):
    function_args = json.loads(message["function_call"]["arguments"])
    results = None
    if message["function_call"]["name"] == "search_wikipedia":
        results = search_wikipedia(function_args["query"])
    return results

def chat_request(model, messages, **kwargs):
    functions = kwargs.get("functions")
    api_params = {
        "model": model,
        "messages": messages,
    }
    if functions:
        api_params["functions"] = functions
    try:
        return openai.ChatCompletion.create(**api_params)
    except Exception as e:
        print("Whoops! Something happened:\n")
        print(f"Error: {e}")

def run_dialog():
    messages = [
        {
            "role": "user",
            "content": "Please find articles about China on Wikipedia"
        }
    ]
    functions = [
        {
            "name": "search_wikipedia",
            "description": "Use this function to search wikipedia.",
            "parameters": {
                "type": "object",
                "properties": {
                    "query": {
                        "type": "string",
                        "description": "The term to search on wikipedia.",
                    }
                },
                "required": ["query"]
            }
        }
    ]
    # Send the request to OpenAI
    chat_response = chat_request(GPT_MODEL, messages, functions=functions)
    message = chat_response["choices"][0]["message"]
    # Looks like we need to run the function
    if message.get("function_call"):
        function_name = message["function_call"]["name"]
        # Execute the function
        function_response = execute_function_call(message)
        # Append the new data to our previous messages and ...
        messages.append(message)
        messages.append({
            "role": "function",
            "name": function_name,
            "content": function_response,
        })
        # ... send everything back to OpenAI
        second_response = chat_request(GPT_MODEL, messages)
        return second_response["choices"][0]["message"]["content"]
    else:
        return message["content"]

print(run_dialog())
As a result, you'll get a response from the GPT model:
Here are some articles about China on Wikipedia:
1. [China](https://en.wikipedia.org/wiki/China)
2. [China Airlines](https://en.wikipedia.org/wiki/China_Airlines)
3. [China–India relations](https://en.wikipedia.org/wiki/China%E2%80%93India_relations)
4. [China Central Television](https://en.wikipedia.org/wiki/China_Central_Television)
5. [China–United States relations](https://en.wikipedia.org/wiki/China%E2%80%93United_States_relations)
You can click on the links to access the respective articles.
You can play around with this code. For example, you can customize the Wikipedia API limit. Consider the prompt «Please find ten articles about China on Wikipedia» instead of «Please find articles about China on Wikipedia». In this case, you'll have to deal with an optional argument, which you'll have to check in execute_function_call.
To do this, you'll have to slightly modify the "parameters" → "properties" object of your function description passed to openai.ChatCompletion.create and add a limit key.
"parameters": {
    "type": "object",
    "properties": {
        "query": {
            "type": "string",
            "description": "The term to search on wikipedia.",
        },
        "limit": {
            "type": "number",
            "description": "Number of Wikipedia links to show"
        }
    },
    "required": ["query"]
}
If you want this argument to be mandatory, just add it to the required arguments: "required": ["query", "limit"]. Now it's up to you to modify execute_function_call.
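One way to handle the optional limit in execute_function_call is a .get with a default. This sketch uses a stub in place of the real search_wikipedia (it just echoes its inputs) so the dispatch logic can be shown without network access; in the real version you would thread limit through to the payload:

```python
import json

def search_wikipedia(search_query, limit=5):
    # stub standing in for the real request; it echoes its inputs
    # instead of calling the Wikipedia API
    return json.dumps({"search": search_query, "limit": limit})

def execute_function_call(message):
    function_args = json.loads(message["function_call"]["arguments"])
    results = None
    if message["function_call"]["name"] == "search_wikipedia":
        # "limit" is optional, so fall back to a default when the model omits it
        limit = function_args.get("limit", 5)
        results = search_wikipedia(function_args["query"], limit)
    return results
```

With «Please find ten articles…» the model fills in limit itself; with the original prompt the .get default keeps the old behavior.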