Connecting to ollama #38

Open
govindkailas opened this issue Nov 27, 2024 · 7 comments

@govindkailas

How do I connect to ollama running locally? I tried the following from the settings:

[screenshot]

But it doesn't seem to be working:

[screenshot]

I see the spinning wheel for a while, but I don't receive any response.

Also, when I go back to the settings, everything is reset; it doesn't keep the ollama details I entered earlier.

[screenshot]

It would be helpful if you could include additional documentation on this topic.
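
A quick way to rule out basic connectivity problems is to query Ollama's /api/tags endpoint, which lists the locally pulled models. A minimal sketch in TypeScript (Node 18+ for the built-in fetch), assuming Ollama's default port 11434:

```ts
// Check that a local Ollama server is reachable before pointing the extension at it.
async function checkOllama(baseUrl = "http://localhost:11434"): Promise<void> {
  const res = await fetch(`${baseUrl}/api/tags`); // lists locally pulled models
  if (!res.ok) {
    throw new Error(`Ollama responded with HTTP ${res.status}`);
  }
  const data = (await res.json()) as { models?: { name: string }[] };
  console.log("Reachable. Models:", data.models?.map((m) => m.name).join(", "));
}

checkOllama().catch((err) => console.error("Cannot reach Ollama:", err));
```

If this fails, the problem is with the Ollama server or the URL rather than with the extension.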

@lee88688
Owner

Did you click save?

@govindkailas
Author

govindkailas commented Nov 28, 2024

> Did you click save?

Yes, I did. I also tried with OpenAI Compatible, but it's the same.
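
For reference, Ollama's OpenAI-compatible API is served under /v1 on the same port, and the API key can be any placeholder. A minimal sketch using the openai npm package (the model name is just an example; use one you have pulled):

```ts
import OpenAI from "openai";

// Point an OpenAI-style client at a local Ollama instance.
const client = new OpenAI({
  baseURL: "http://localhost:11434/v1", // Ollama's OpenAI-compatible endpoint
  apiKey: "ollama",                     // ignored by Ollama, but the client requires a value
});

async function main() {
  const completion = await client.chat.completions.create({
    model: "llama3", // example model name; replace with one you have pulled
    messages: [{ role: "user", content: "Hello" }],
  });
  console.log(completion.choices[0].message.content);
}

main().catch(console.error);
```

If a call like this works from the same machine the extension runs on, the issue is more likely in how the extension stores or uses the settings than in Ollama itself.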

@govindkailas
Author

I am connected to a remote host via SSH (VS Code remote development), if that helps. But there are no logs or warnings.
It showed a warning while I was setting the Python path, so I expected a similar message if it couldn't connect to Ollama.

@lee88688
Owner

> I am connected to a remote host via SSH (VS Code remote development), if that helps. But there are no logs or warnings. It showed a warning while I was setting the Python path, so I expected a similar message if it couldn't connect to Ollama.

The extension currently does not support VS Code remote development. On startup, the extension launches an aider service on the host (local or remote), but when running remotely, the webview in VS Code (the chat and settings pages) cannot connect to the aider server on the host.
Maybe there is a solution for automatically forwarding the remote port to local.

@govindkailas
Author

govindkailas commented Nov 28, 2024

It would be nice to at least show some warning if the connectivity is failing, just like the Python path setting does.

While trying OpenAI Compatible, I got the following in the VS Code console logs:

[screenshot]

@lee88688
Owner

lee88688 commented Dec 5, 2024

I am not sure whether VS Code provides an official API for forwarding the host's port that I could use to automatically redirect the aider server to the local machine. I may need to look into it.
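
One candidate is vscode.env.asExternalUri, which, in a remote session, sets up port forwarding and returns a URI that the local client (and its webviews) can reach. A rough, untested sketch of how that might look in the extension:

```ts
import * as vscode from "vscode";

// Resolve a URL for a server listening on the (possibly remote) extension host
// so that a local webview can reach it. In a remote SSH session this forwards
// the port; in a local session it is effectively a no-op.
async function getWebviewReachableUrl(port: number): Promise<string> {
  const external = await vscode.env.asExternalUri(
    vscode.Uri.parse(`http://localhost:${port}`)
  );
  return external.toString();
}
```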

@lee88688
Owner

lee88688 commented Dec 5, 2024

If you use SSH remote, the Python path setting needs to point to the Python path on the remote server.
