Replies: 4 comments
-
@ppipada I have some of the important parameters as optional settings, which you can override in your settings.json. Can you check the existing params and share if you think we should add more?
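For example, something like this in settings.json (key names here are illustrative; check the extension's settings list for the exact ones):

```jsonc
// settings.json — illustrative key names, not necessarily the real ones
{
  "chatgpt.gpt3.temperature": 0.2,
  "chatgpt.gpt3.maxTokens": 1024
}
```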
-
@gencay The use case I am talking about is slightly different: it is about prompt-string-specific settings. E.g., I want to save the prompt "write max and min values test case" with temperature=0 and best_of=1, but the prompt "create a workflow using following apis" needs temperature=0.5 and best_of=2 for better results. We could keep multiple such saved prompts and their associated settings in a file somewhere, in a predefined format. The plugin can then read the file, load the prompts and settings into a struct, and use them when invoked (I am not sure what the UX for invoking should be, but even the Command Palette or a side panel is fine). Hope that clarifies the use case. The params I prominently need to tweak are specific to my use case, but in general we may want to keep things generic and support the options in the OpenAI completions reference: https://beta.openai.com/docs/api-reference/completions/create
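To make this concrete, here is a minimal sketch of the file format and loader I have in mind (the file shape and field names are just a proposal, mirroring the OpenAI completions parameters — nothing existing in the plugin):

```ts
import * as fs from "fs";

// One saved prompt plus its completion settings; field names simply
// mirror the OpenAI completions parameters.
interface SavedPrompt {
  name: string;
  prompt: string;
  temperature?: number;
  best_of?: number;
  stop?: string[];
}

// Example file contents:
// [
//   { "name": "minmax-tests",
//     "prompt": "write max and min values test case",
//     "temperature": 0, "best_of": 1 },
//   { "name": "api-workflow",
//     "prompt": "create a workflow using following apis",
//     "temperature": 0.5, "best_of": 2 }
// ]
export function loadSavedPrompts(filePath: string): SavedPrompt[] {
  return JSON.parse(fs.readFileSync(filePath, "utf8")) as SavedPrompt[];
}
```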
-
Got it. The VS Code context menu items are not extensible for the predefined prompts (they are declared in package.json at deployment time). That is why I added another option for overriding a prompt with an ad-hoc one. What you're proposing is having multiple ad-hoc commands with more options available. There may not be a straightforward solution to this; I am unsure of the capability that VS Code provides to extensions here. I will look into it, and if there are upvotes on your suggestion in the meantime, I will try to prioritize it.
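One possibility, if it turns out to be feasible, is a single generic command that lists the saved prompts in a QuickPick instead of separate context-menu items. A rough sketch (the command id and loader here are hypothetical):

```ts
import * as vscode from "vscode";

// Hypothetical loader for the saved-prompts file (shape sketched above).
declare function loadSavedPrompts(path: string): Array<{ name: string; prompt: string }>;

export function activate(context: vscode.ExtensionContext) {
  context.subscriptions.push(
    vscode.commands.registerCommand("chatgpt.runSavedPrompt", async () => {
      const prompts = loadSavedPrompts("/path/from/settings/prompts.json");
      // Let the user pick one saved prompt; extra fields ride along on the item.
      const picked = await vscode.window.showQuickPick(
        prompts.map((p) => ({ label: p.name, detail: p.prompt, ...p }))
      );
      if (picked) {
        // Forward picked.prompt and its associated settings to the completions request.
      }
    })
  );
}
```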
-
Hey, would you guys mind providing me with some resources so I can learn all this stuff you're talking about? It's pretty interesting to me, but I'm still new to ChatGPT.
-
I am using the OpenAI GPT-3 API key method to get code suggestions (instead of ChatGPT).
I am doing this so that I can control the input params to the completions endpoint. Having this control over input params (like temperature, stop indicators, etc.) gives better results for non-generic use cases like workflow test generation, edge-case unit tests, security-related fuzz inputs, etc.
This generally results in a bunch of prompts before the code, with every prompt having specific settings.
Is there a way we could save these prompts + settings in a file (JSON, YAML, etc.), set the file path in settings, and invoke them from the top-level Command Palette (Cmd+Shift+P), or in some other way?
I can create a PR for this myself if you can guide me on how to achieve it.
If you feel this doesn't align with this repo's goals, can you please help me understand where I would change things to make it happen (even non-elegantly is fine)? I cannot seem to locate where in the repo I would control such settings for the OpenAI key method (please pardon my ignorance of the TS / VS Code plugin ecosystem).
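For reference, the kind of call I want per-prompt control over is roughly this (a sketch using Node 18+ built-in fetch; the model name, helper, and settings shape are placeholders, not the extension's actual code):

```ts
// Sketch of a raw call to the OpenAI completions endpoint, passing
// the per-prompt settings discussed above.
async function complete(
  prompt: string,
  settings: { temperature?: number; best_of?: number; stop?: string[] }
): Promise<string | undefined> {
  const res = await fetch("https://api.openai.com/v1/completions", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${process.env.OPENAI_API_KEY}`,
    },
    body: JSON.stringify({
      model: "text-davinci-003",
      prompt,
      max_tokens: 256,
      ...settings, // temperature, best_of, stop, etc. per saved prompt
    }),
  });
  const data: any = await res.json();
  return data.choices?.[0]?.text;
}
```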