Copilot Chat lmTools API does not reliably use tool responses
#225737
Comments
I've seen similar behavior and have been experimenting with this. I think there are two things I need to do.
I've been looking at the API part, but I just realized this morning that the second point seems pretty impactful, so I will include the list of functions that have been used in the conversation thread. And yes, internally we're also still on OpenAI's deprecated "functions" API; we will move to the "tools" API soon, and maybe that will help too.
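As a rough illustration of the migration mentioned above: in OpenAI's Chat Completions API, a tool result sent under the deprecated "functions" style uses `role: "function"`, while the newer "tools" style uses `role: "tool"` plus a `tool_call_id` that ties the result back to the model's tool call. This sketch shows only the public message shapes, not this extension's internal code; the tool name and id values are made up.

```typescript
// Deprecated "functions" style: the result goes back as role "function",
// correlated to the call only by the function name.
const legacyFunctionResult = {
  role: "function",
  name: "get_weather", // hypothetical tool name
  content: JSON.stringify({ tempC: 21 }),
};

// Newer "tools" style: the result goes back as role "tool", correlated
// to the assistant's tool_calls entry by its id.
const toolResult = {
  role: "tool",
  tool_call_id: "call_abc123", // id copied from the assistant's tool call
  content: JSON.stringify({ tempC: 21 }),
};

console.log(legacyFunctionResult.role, toolResult.role);
```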
I tried using a user message instead of a role=function message.
Anyway, we already label context messages for the old variables API. I had hoped I could get away from that, but something like this is still needed.
And another reason I won't go the
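The workaround discussed above (a user message instead of a role=function message) can be sketched as follows. This is a hypothetical illustration of the labeling idea, not the extension's actual code; the tool name and label text are assumptions.

```typescript
// Hypothetical workaround sketch: rather than sending the tool output as a
// role=function message the model may ignore, embed it in a clearly
// labeled user message so the model treats it as ordinary context.
const toolName = "get_weather"; // hypothetical tool name
const toolOutput = JSON.stringify({ tempC: 21 });

const userMessage = {
  role: "user",
  content: `Result of calling ${toolName}:\n${toolOutput}`,
};

console.log(userMessage.content);
```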
Does this issue occur when all extensions are disabled?: No; it depends on Copilot Chat and another extension that uses the lmTools API proposal.
Steps to Reproduce:
1. Run `npx --package yo --package generator-code -- yo code` and accept all prompts to create a new TypeScript extension.
2. Run `npx vscode-dts dev`.
3. Note that with verbose logging enabled, it's clear that GitHub Copilot did run the registered tool and include the tool response in the request.
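The reproduction above relies on a second extension registering a tool through the lmTools API proposal. The proposal's exact surface shifted between VS Code builds, so rather than quote it, here is a self-contained sketch of the general shape such a tool takes (the interface and member names like `invoke` are my assumptions, not the literal proposal, and there is deliberately no `vscode` import):

```typescript
// Self-contained sketch of a language-model tool's shape under the
// lmTools proposal. All names here are illustrative assumptions.
interface ToolInvocationOptions {
  parameters: Record<string, unknown>; // arguments the model supplies
}

interface ToolResult {
  content: string; // text handed back to the model
}

interface LanguageModelTool {
  invoke(options: ToolInvocationOptions): Promise<ToolResult>;
}

// A trivial tool of the kind the repro extension might register:
// it just echoes back the parameters it was called with.
const echoTool: LanguageModelTool = {
  async invoke(options) {
    return { content: `echo: ${JSON.stringify(options.parameters)}` };
  },
};

echoTool.invoke({ parameters: { msg: "hi" } }).then((r) => console.log(r.content));
```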
cc @isidorn @lukka @spebl @sinemakinci1 @esweet431