
[Bug]: Trying to run litellm proxy for ollama/llama2 results in FileNotFoundError #858

Closed
kumaranvpl opened this issue Nov 21, 2023 · 7 comments · Fixed by #902
Labels: bug (Something isn't working)

@kumaranvpl (Contributor)

What happened?

I have started ollama using the Docker image and pulled the llama2 model into it.
When I try to start the litellm proxy for ollama/llama2 using the following command

litellm --model ollama/llama2 --api_base http://localhost:11434

it results in

FileNotFoundError: [Errno 2] No such file or directory: 'ollama'
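
For reference, the ollama setup described above would typically look something like this with the official image (the exact commands used aren't shown in the report):

docker run -d -v ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama
docker exec -it ollama ollama pull llama2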

Relevant log output

$ litellm --model ollama/llama2 --api_base http://localhost:11434  
Traceback (most recent call last):
  File "/home/azureuser/venv/bin/litellm", line 8, in <module>
    sys.exit(run_server())
  File "/home/azureuser/venv/lib/python3.10/site-packages/click/core.py", line 1157, in __call__
    return self.main(*args, **kwargs)
  File "/home/azureuser/venv/lib/python3.10/site-packages/click/core.py", line 1078, in main
    rv = self.invoke(ctx)
  File "/home/azureuser/venv/lib/python3.10/site-packages/click/core.py", line 1434, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/home/azureuser/venv/lib/python3.10/site-packages/click/core.py", line 783, in invoke
    return __callback(*args, **kwargs)
  File "/home/azureuser/venv/lib/python3.10/site-packages/litellm/proxy/proxy_cli.py", line 116, in run_server
    run_ollama_serve()
  File "/home/azureuser/venv/lib/python3.10/site-packages/litellm/proxy/proxy_cli.py", line 24, in run_ollama_serve
    process = subprocess.Popen(command, stdout=devnull, stderr=devnull)
  File "/usr/lib/python3.10/subprocess.py", line 971, in __init__
    self._execute_child(args, executable, preexec_fn, close_fds,
  File "/usr/lib/python3.10/subprocess.py", line 1863, in _execute_child
    raise child_exception_type(errno_num, err_msg, err_filename)
FileNotFoundError: [Errno 2] No such file or directory: 'ollama'

Twitter / LinkedIn details

@kumaranvpl

kumaranvpl added the bug label on Nov 21, 2023
@ishaan-jaff (Contributor)

Hmm @kumaranvpl, it looks like it's working fine for me; the error is raised when running run_ollama_serve().

@ishaan-jaff (Contributor)

This is what happens in run_ollama_serve:

import os
import subprocess

def run_ollama_serve():
    command = ['ollama', 'serve']

    # Start `ollama serve` in the background, discarding its output.
    # Popen raises FileNotFoundError if `ollama` is not on PATH.
    with open(os.devnull, 'w') as devnull:
        process = subprocess.Popen(command, stdout=devnull, stderr=devnull)
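
For context, subprocess.Popen raises FileNotFoundError when the requested executable cannot be found on PATH, which matches the traceback above. An illustrative up-front check (not part of litellm):

import shutil

# shutil.which() returns None when a binary is absent from PATH,
# which is exactly the condition that makes Popen(['ollama', ...]) fail.
if shutil.which('ollama') is None:
    print("`ollama` binary not found on PATH; `ollama serve` cannot be spawned")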

@kumaranvpl (Contributor, Author)

@ishaan-jaff I have tried the command on two different Ubuntu machines; both returned the same error. I used a python3 venv with Python 3.10.12 and pip 22.0.2.

@ishaan-jaff (Contributor)

@kumaranvpl it's just running `ollama serve`; we can wrap it in a try/except so a failure does not block the proxy server from starting, as sketched below.
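
A minimal sketch of that guard, assuming we just log a warning and continue (hypothetical; not necessarily the exact patch that shipped):

import os
import subprocess

def run_ollama_serve():
    try:
        command = ['ollama', 'serve']
        # Discard output; only a failure to spawn matters here.
        with open(os.devnull, 'w') as devnull:
            subprocess.Popen(command, stdout=devnull, stderr=devnull)
    except Exception as e:
        # e.g. FileNotFoundError when the `ollama` binary is missing:
        # warn instead of crashing the proxy at startup.
        print(f"LiteLLM warning: `ollama serve` failed ({e}); start ollama manually.")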

@ishaan-jaff (Contributor)

@kumaranvpl fix here: 2a35ff8

@krrishdholakia (Contributor)

Closing the issue as the fix is in prod. @kumaranvpl please re-open if the issue persists.

@kumaranvpl (Contributor, Author)

@ishaan-jaff @krrishdholakia Nope, it is still not fixed. I have created a PR here: #902. Please review and merge.
