13 Comments
Jul 3, 2023 · edited Jul 3, 2023 · Liked by Abhinav Upadhyay

Very detailed write-up. Thanks for sharing this.

The plugins integration part is missing from the blog post. I copied this part from the GitHub source code (chat.py) to get the plugin features to work:

    if self.plugins:
        json_data.update({"functions": self._get_functions()})
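For context, here is a minimal sketch of what the _get_functions() helper might look like, assuming each plugin exposes get_name(), get_description(), and get_parameters() methods. Only get_description() is mentioned elsewhere in the thread; the other method names are hypothetical placeholders, not necessarily what chat.py actually uses:

    def _get_functions(self):
        # Build the "functions" payload for the chat completion request
        # from the registered plugins. get_name() and get_parameters()
        # are hypothetical placeholder names for illustration.
        return [
            {
                "name": plugin.get_name(),
                "description": plugin.get_description(),
                "parameters": plugin.get_parameters(),
            }
            for plugin in self.plugins
        ]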

author

Thank you, YJ! Both for liking the article and for reading it deeply :)


Amazing. I am always inspired by the amount of detail you put into your articles. They are so comprehensive.

author

Thanks Mir :)

Aug 16, 2023 · edited Aug 16, 2023

Hi Abhinav! Thanks for the amazing tutorial. I ran into the error "no result written to stdout" when using PythonInterpreterPlugin. So I tested the exec() function and found that it won't print anything when executing statements. Did I miss something?

author

Hi Maugham, thank you for reading it.

So, it isn't a problem with the Python code we wrote; rather, sometimes the GPT API does not follow the instructions closely. In the get_description() method of the code interpreter plugin implementation, we gave the function description as "Execute the given python code return the result from stdout.".

Think of the function description as yet another prompt instruction to the GPT model. It generates the input for the function based on these instructions. Here, we are asking it to generate code that prints the result on stdout, because, for simplicity, we call exec() and read its output. However, if the code generated by the model does not print anything, then we will not be able to read the result.
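To see why printing matters, here is a minimal sketch of how exec() output can be captured by redirecting stdout (run_code is an illustrative name; the plugin in the article may structure this differently):

    import contextlib
    import io

    def run_code(code: str) -> str:
        # exec() runs statements but does not return expression values,
        # so we capture whatever the code print()s by redirecting stdout
        # to an in-memory buffer.
        buffer = io.StringIO()
        with contextlib.redirect_stdout(buffer):
            exec(code)
        return buffer.getvalue()

    print(run_code("import math; print(math.pi)"))   # 3.141592653589793
    print(repr(run_code("import math; math.pi")))    # '' -- nothing was printed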

This is a limitation of the GPT model. Function descriptions need to be written very carefully. You can try changing the get_description() function of the code interpreter plugin to return this:

"""

Read the value printed by the given python code.

The code SHOULD explicitly call print so that

this function returns the output.

For example: "import math; print(math.pi)"

is correct

But "import math; math.pi" is incorrect because

it doesn't print the value

"""

I discussed this and a couple of other issues with the function calling API in my follow-up article: https://codeconfessions.substack.com/p/navigating-sharp-edges-in-openais. The fix I provided above is taken from there.

Aug 17, 2023 · Liked by Abhinav Upadhyay

I just tried your solution and it worked! Thanks for your prompt reply; it really taught me how to write proper prompt instructions. Learned a lot! Can't wait to read the follow-up articles.

author

Thanks Maugham. I am happy that my article was useful for you. I hope future articles will be too. Do share this Substack within your circle :)


I'm setting up function calling to execute additional functionality in my app. Do you know if it's possible for the response to include both assistant message content (i.e., not None/null) and a function call with arguments?

author
Aug 4, 2023 · edited Aug 4, 2023

Yes, you can append the complete conversation history along with the function response. See this example from the OpenAI docs (https://platform.openai.com/docs/guides/gpt/function-calling):

    # Step 4: send the info on the function call and function response to GPT
    messages.append(response_message)  # extend conversation with assistant's reply
    messages.append(
        {
            "role": "function",
            "name": function_name,
            "content": function_response,
        }
    )  # extend conversation with function response
    second_response = openai.ChatCompletion.create(
        model="gpt-3.5-turbo-0613",
        messages=messages,
    )  # get a new response from GPT where it can see the function response
    return second_response
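To answer the question about the response shape directly: the assistant message can carry text content, a function call, or occasionally both, so it is safest to check each field independently. A minimal sketch, assuming the pre-v1 openai library used in the example above:

    response_message = response["choices"][0]["message"]

    # content is often None when the model requests a function call,
    # but the API does not guarantee it, so check both fields.
    if response_message.get("content"):
        print("assistant text:", response_message["content"])
    if response_message.get("function_call"):
        call = response_message["function_call"]
        print("function call:", call["name"], call["arguments"])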


Hi Abhinav! Great write-up. Thanks! What do you envision as the future of the ChatGPT plugin store, now that functions exist?

author

Thanks Jakob. The plugins in ChatGPT are just one of the possibilities with function calls. They definitely enable you and me to build our own ChatGPT app, powered by our own plugins, either for personal use or commercially. And for non-developers, OpenAI's ChatGPT Plus subscription remains an option unless an alternative is presented to them.

However, functions open up opportunities in other areas as well. For one, they streamline how we write apps using the ChatGPT APIs. Instead of building state-machine logic into the app, you carefully craft a set of functions, and the ChatGPT model calls those functions one by one until it has an answer (although this may increase the latency of operations). I hope other LLMs come up with a similar feature, allowing us to leverage a local or embedded model to cut costs and latency. For example, imagine a Q&A app that generates answers based on data in a vector database. If you want to introduce fact checking, you can add it as a function that the LLM calls after it generates the answer; see the sketch below. Without functions, that would have been a headache to do.
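As an illustration of that Q&A example, the retrieval and fact-checking steps could be declared as function specs in the request. This is only a sketch; the function names and parameter schemas below are made up for illustration:

    functions = [
        {
            # Hypothetical retrieval function backed by the vector database.
            "name": "search_vector_db",
            "description": "Retrieve passages relevant to the question "
                           "from the vector database.",
            "parameters": {
                "type": "object",
                "properties": {"query": {"type": "string"}},
                "required": ["query"],
            },
        },
        {
            # Hypothetical post-generation fact-checking function.
            "name": "fact_check",
            "description": "Verify a drafted answer against the retrieved "
                           "passages and report any unsupported claims.",
            "parameters": {
                "type": "object",
                "properties": {
                    "answer": {"type": "string"},
                    "passages": {"type": "array", "items": {"type": "string"}},
                },
                "required": ["answer", "passages"],
            },
        },
    ]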

There are other applications of functions as well, for example, agents like AutoGPT. I think we will see some cool applications built using this in the coming days.

author

I wrote a follow-up on this; it covers some of the sharp edges we may run into as we use function calls more extensively: https://codeconfessions.substack.com/p/navigating-sharp-edges-in-openais
