support for Open-Assistant.io/gpt4all #15

Open
vasnt opened this issue Jul 8, 2023 · 6 comments

Comments

vasnt commented Jul 8, 2023

Instead of using the OpenAI API, can it work with other open-source or locally hosted models, like LLaMA?

unconv (Owner) commented Jul 9, 2023

Currently not, because it uses the new ChatGPT API function-calling feature. I will try to "backport" it so it can work with older models that lack function calling, and then you should be able to switch to a different model. In theory.
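
For context, this is roughly the flow the project depends on: a minimal sketch using the current openai Python SDK's `tools` parameter (at the time of this thread the same feature was exposed as `functions`, with the same idea). The `write_file` tool below is hypothetical, just to illustrate the request/response shape; it is not gpt-autopilot's actual schema.

```python
import json
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Hypothetical tool definition, only to show the shape of the request.
tools = [{
    "type": "function",
    "function": {
        "name": "write_file",
        "description": "Write content to a file in the project",
        "parameters": {
            "type": "object",
            "properties": {
                "path": {"type": "string"},
                "content": {"type": "string"},
            },
            "required": ["path", "content"],
        },
    },
}]

response = client.chat.completions.create(
    model="gpt-4",
    messages=[{"role": "user", "content": "Create a hello-world script"}],
    tools=tools,
)

# With function calling, the model answers with a structured tool call
# instead of plain text; a backend without the feature won't fill this in.
message = response.choices[0].message
if message.tool_calls:
    call = message.tool_calls[0]
    print(call.function.name, json.loads(call.function.arguments))
```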

sarfraznawaz2005 commented

It would be great if you added OPENAI_API_BASE to the config file so we can use this with local models. Thanks

unconv (Owner) commented Jan 31, 2024

@sarfraznawaz2005 Do any local models support function calling?

sarfraznawaz2005 commented

Not sure what you mean, but there are local models that support the OpenAI API format, such as https://github.com/josStorer/RWKV-Runner or https://lmstudio.ai. They expose a local URL, and many similar projects now support them by offering an option for the base URL, in which case the API key is not taken into account.

gpt-autopilot strictly checks for the API key and otherwise gives an error. If you add a base URL option and it points to a local server, you can modify the code to skip the API key check.
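
A minimal sketch of what such an option would enable, assuming the openai Python SDK and an OpenAI-compatible local server; the address, model name, and key below are placeholders, not values from this project:

```python
from openai import OpenAI

# Point the client at a local OpenAI-compatible server instead of api.openai.com.
client = OpenAI(
    base_url="http://localhost:1234/v1",  # placeholder; e.g. LM Studio's local server
    api_key="not-needed",                 # any non-empty string; local servers ignore it
)

response = client.chat.completions.create(
    model="local-model",  # placeholder; depends on what the server has loaded
    messages=[{"role": "user", "content": "Hello!"}],
)
print(response.choices[0].message.content)
```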

unconv (Owner) commented Jan 31, 2024

GPT-AutoPilot uses the OpenAI API's function-calling feature, which, as far as I know, is not implemented in LLMs other than ChatGPT and Gemini.

You can set the OPENAI_API_KEY environment variable to anything and set OPENAI_BASE_URL to the URL of the local LLM to try it, but the endpoints probably do not support function calling.
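
In practice that would look something like the sketch below; the URL is a placeholder for whatever your local server exposes, and the openai SDK reads both variables automatically:

```python
import os
from openai import OpenAI

# Set before the client is created; the SDK picks both up from the environment.
os.environ["OPENAI_API_KEY"] = "anything"                   # local servers ignore the value
os.environ["OPENAI_BASE_URL"] = "http://localhost:1234/v1"  # placeholder local LLM URL

client = OpenAI()
response = client.chat.completions.create(
    model="local-model",  # placeholder; depends on the server
    messages=[{"role": "user", "content": "ping"}],
)

# If the endpoint lacks function calling, requests that include tools will
# come back as plain text (message.tool_calls is None) or fail outright.
print(response.choices[0].message.content)
```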

I'm working on another project in TypeScript that does not use the function calling feature, and I might integrate it into this project as well, if I get it to work.

sarfraznawaz2005 commented

Yes, I tried it that way and it does not work. Looking forward to the other project :)
