ChatGPT Plugin Functionality #1417
Conversation
ChatGPT Plugins
What's this going to take to get this reviewed? We need it...
Sorry, but I don't think it's me who should be reviewing this.
Taking a look @Icemaster-Eric. I'm really excited about supporting plugin calling against OpenAI's plugin spec, but I need to think about edge cases.
I'll be giving this a deeper review this week and floating it around in the Discord for discussion.
Thanks for the suggestions!
response = requests.request(method, url, json=parameters)

# Check if the response is successful
if response.ok:
Maybe we can reverse this condition and clean up the nesting: if the response is an error, raise a ValueError; then, if response.text.strip(), continue.
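The suggested refactor could be sketched as follows (a hypothetical helper, not code from this PR; it works on any response-like object with `ok`, `status_code`, and `text` attributes, such as the `requests` response in the diff above):

```python
def extract_plugin_response(response) -> str:
    """Validate a plugin HTTP response, failing fast instead of nesting."""
    # Guard clause: raise on error status so the happy path stays flat.
    if not response.ok:
        raise ValueError(f"Plugin request failed with status {response.status_code}")
    # Only return non-empty bodies; whitespace-only responses yield "".
    if response.text.strip():
        return response.text
    return ""
```

With the guard clause first, the success path no longer sits inside an `if response.ok:` block, which is the nesting cleanup the comment asks for.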
Any updates here? Lack of browsing and plugin support is the main thing keeping me on ChatGPT for now. Adding this would be huge.
This needs to be updated to resolve merge conflicts, but we certainly would love this functionality. Currently out of date :(
It seems like this PR will never be finished. I prefer the approach in ggml-org/llama.cpp#5695, which relies on a model-specific template to include function calls in the middle of a conversation rather than as the model's sole response.
Describe your changes

Added ChatGPT-style plugin functionality to the Python bindings for GPT4All. The existing codebase has not been modified much. The only change to gpt4all.py is the addition of a `plugins` parameter in the GPT4All class, which takes an iterable of strings, registers each plugin URL, and generates the final plugin instructions. I've added the plugin functions in a new `plugins.py` file, along with some tests in the tests folder.

Issue ticket number and link
#1391
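The registration step described above (turn each plugin's manifest into prompt instructions) could be sketched roughly like this. This is a hypothetical helper, not the PR's actual `plugins.py`; the field names follow OpenAI's standard ai-plugin.json manifest format:

```python
def build_plugin_instructions(manifests):
    """Turn a list of parsed ai-plugin.json manifests into a prompt preamble.

    Each manifest is a dict as fetched from a plugin's
    /.well-known/ai-plugin.json endpoint.
    """
    lines = ["You can call the following plugins:"]
    for m in manifests:
        # `name_for_model` and `description_for_model` are standard fields
        # in OpenAI's plugin manifest format.
        lines.append(f"- {m['name_for_model']}: {m['description_for_model']}")
    return "\n".join(lines)
```

The resulting string is the kind of text that `model.plugin_instructions` would expose for prepending to prompts.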
Checklist before requesting a review
Demo
Installation / Usage
Build this repo from source locally using the build instructions.
Usage is very simple: add the `model.plugin_instructions` string to the prompt. Give the model output to the `model.get_plugin_response` function, which will return a string containing the plugin response details. Finally, you can give the plugin response back to the model in a prompt to get the final output.

Plugin URLs should usually end with `/.well-known/ai-plugin.json`, as that's the path which contains the information for the plugin.

Example Code
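The PR's original example snippet was not captured in this page; the following is a minimal sketch reconstructed from the usage description above. The model name and plugin URL are placeholders, and the `generate` call and constructor signature are assumptions:

```python
from gpt4all import GPT4All

# Hypothetical end-to-end flow; the PR's real example may differ.
model = GPT4All(
    "ggml-gpt4all-l13b-snoozy",  # placeholder model name
    plugins=["https://example.com/.well-known/ai-plugin.json"],  # placeholder URL
)

# 1. Prepend the generated plugin instructions to the prompt.
prompt = model.plugin_instructions + "\nWhat's on my todo list?"
output = model.generate(prompt)

# 2. Parse the model output for a plugin call and execute it.
plugin_response = model.get_plugin_response(output)

# 3. Feed the plugin response back to the model for the final answer.
final = model.generate(f"Plugin response: {plugin_response}\nAnswer the user.")
print(final)
```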
Notes
This does not work for ChatGPT-restricted plugins. Currently, the 13B GPT4All-snoozy model can generate plugin calls correctly, and most similarly sized models should be able to do the same. However, the complexity of the returned data has a big impact on the final response, so it is recommended to trim the information returned by plugins to the minimum required where possible.