Enabling CUDA on Docker #574
Unanswered
franco-giordano asked this question in Q&A
Hi, I'm having trouble enabling CUDA-powered inference on my machine. What I've tried:

Loaded chat model to CPU. utils.py:32

(In the logs, CPU usage spikes.) Any ideas? From previous discussions, issues, and the docs I get the impression that it should work, but I'm not sure. A rough sanity check I can run inside the container is sketched below. Some specs:

nvidia-smi output:
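As a sanity check inside the container, this is roughly the kind of thing I'd run to see whether the runtime can see a GPU at all. It's only a sketch and assumes a PyTorch-based backend, which may not match what the project actually uses:

```python
# Hypothetical check script (not part of the project); assumes the inference
# backend is PyTorch. Run inside the same container used for inference.
import torch

print("torch version:", torch.__version__)
print("CUDA available:", torch.cuda.is_available())

if torch.cuda.is_available():
    # Report the device the backend would pick up.
    print("device:", torch.cuda.get_device_name(0))
else:
    # False here usually means either a CPU-only torch build, or a container
    # started without GPU access (e.g. no --gpus flag / NVIDIA Container
    # Toolkit missing on the host).
    print("no CUDA device visible to this process")
```

If that check fails even though nvidia-smi works on the host, my understanding is the usual suspects are the container being started without `--gpus all` (which requires the NVIDIA Container Toolkit) or a CPU-only build of the backend.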
Replies: 1 comment · 2 replies

Hey @franco-giordano, thanks for the detailed report: