[Bug]: Trying to run litellm proxy for ollama/llama2 results in FileNotFoundError #858
Comments
hmmm @kumaranvpl it looks like it's working fine for me, the error is caused when running
this is what happens on
@ishaan-jaff I have tried the command on two different Ubuntu machines. Both returned the same error. I used a python3 venv; the Python version is 3.10.12 and the pip version is 22.0.2.
@kumaranvpl it's just running
@kumaranvpl fix here: 2a35ff8
Closing issue as fix is in prod. @kumaranvpl please re-open if the issue persists.
@ishaan-jaff @krrishdholakia Nope. It is still not fixed. I have created a PR here - #902. Please review and merge.
What happened?
I have started ollama using the Docker image and pulled the llama2 model into it.
When I try to start the litellm proxy for ollama/llama2 using the following command
it results in a FileNotFoundError.
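The exact command and traceback are elided above, but a FileNotFoundError raised at startup by a Python CLI typically means a bundled file is being opened with a path relative to the user's current working directory instead of the package's own location. A minimal sketch of that failure mode and the usual fix (the file name below is hypothetical, not taken from litellm):

```python
import os

# Hypothetical bundled file name -- the actual missing file from the
# litellm traceback is not preserved in this thread.
TEMPLATE = "secrets_template.toml"

def load_template_fragile():
    # Fragile: the relative path resolves against the current working
    # directory, so this raises FileNotFoundError whenever the CLI is
    # launched from outside the package's own directory.
    with open(TEMPLATE) as f:
        return f.read()

def load_template_robust():
    # Robust: resolve the path relative to this module's location on disk,
    # so it works no matter where the user invokes the CLI from.
    path = os.path.join(os.path.dirname(os.path.abspath(__file__)), TEMPLATE)
    with open(path) as f:
        return f.read()
```

The robust variant is the usual one-line fix for this class of bug: anchor data-file paths to `__file__` (or use `importlib.resources` for installed packages) rather than the working directory.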
Relevant log output
Twitter / LinkedIn details
@kumaranvpl