[Bug]: patchcore pytorch error #1798
Comments
I downgraded torch and torchmetrics and it works without errors. I guess some changes will be needed to make it work with newer versions.
* Fix onnx export by rewriting GaussianBlur
* Address codacy complaints. Rename variable to something other than `input`
* Move GaussianBlur2d to anomalib.post_processing
* Move blur to `anomalib.models.components.filters`
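The rewrite in those commits keeps the blur kernel 4-D so it can be expanded per channel and applied as a depthwise convolution. The sketch below is illustrative only, not anomalib's actual implementation: the kernel construction is a plain separable Gaussian, and the class name simply mirrors the `GaussianBlur2d` mentioned above (note the argument is named `tensor`, not `input`, in the spirit of the rename).

```python
import torch
import torch.nn.functional as F


def gaussian_kernel_2d(kernel_size: int, sigma: float) -> torch.Tensor:
    """Build a normalized 2-D Gaussian kernel of shape [k, k]."""
    coords = torch.arange(kernel_size, dtype=torch.float32) - (kernel_size - 1) / 2
    g = torch.exp(-(coords**2) / (2 * sigma**2))
    g = g / g.sum()
    return torch.outer(g, g)


class GaussianBlur2d(torch.nn.Module):
    """Depthwise Gaussian blur. The kernel stays 4-D ([1, 1, k, k]),
    so expand(channels, -1, -1, -1) provides exactly as many sizes
    as the tensor has dimensions."""

    def __init__(self, kernel_size: int, sigma: float) -> None:
        super().__init__()
        kernel = gaussian_kernel_2d(kernel_size, sigma)
        # Shape [1, 1, k, k] -- the 4-D layout conv2d expects.
        self.register_buffer("kernel", kernel[None, None])

    def forward(self, tensor: torch.Tensor) -> torch.Tensor:
        channels = tensor.shape[1]
        kernel = self.kernel.expand(channels, -1, -1, -1)  # [C, 1, k, k]
        padding = self.kernel.shape[-1] // 2
        return F.conv2d(tensor, kernel, padding=padding, groups=channels)
```

With an odd kernel size and `padding = k // 2`, the output keeps the input's spatial shape, which is what an anomaly-map smoother needs.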
Which versions are you using? I downgraded to
I am using:
In general, I think the quickest approach (this is what I finally did) to make it work is to create a virtual env and install all the requirements listed in the requirements files.
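That suggestion boils down to something like the following. This is only a sketch: the requirements filename is an assumption, so check the repository's actual requirements layout before running it.

```shell
# Create and activate a fresh virtual environment.
python -m venv .venv
source .venv/bin/activate

# Install the pinned requirements (filename is illustrative;
# the repo may split them across several files).
pip install -r requirements.txt
```

Installing from the pinned requirements files avoids picking up a newer torch/torchmetrics combination than the one the code was tested against.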
I downgraded this package and then it worked. I can't resolve this issue any other way; just downgrade this package.
This is peculiar. I'm having trouble reproducing this issue on my end.
@voidmain443, on a fresh environment, can you try …
Using PatchCore gives the following torch error:
```
RuntimeError: expand(torch.FloatTensor{[1, 1, 1, 33, 33]}, size=[1, -1, -1, -1]): the number of sizes provided (4) must be greater or equal to the number of dimensions in the tensor (5)
```
See the log sections for the full error.
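The error can be reproduced in plain PyTorch, independent of anomalib: `expand()` requires at least as many sizes as the tensor has dimensions, so a 5-D blur kernel like the one in the traceback cannot be expanded with only four sizes. The shapes below mirror the traceback; the squeeze at the end is just one way to make the expand valid and is not necessarily anomalib's fix.

```python
import torch

# A 5-D kernel like the one in the traceback: [1, 1, 1, 33, 33].
kernel = torch.ones(1, 1, 1, 33, 33)

try:
    # Only 4 sizes for a 5-D tensor -> the RuntimeError from the report.
    kernel.expand(1, -1, -1, -1)
except RuntimeError as err:
    print(err)

# Dropping the extra leading dimension makes the expand valid again.
kernel4d = kernel.squeeze(0)               # [1, 1, 33, 33]
expanded = kernel4d.expand(3, -1, -1, -1)  # [3, 1, 33, 33]
print(expanded.shape)
```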
Dataset
Folder
Model
PatchCore
Steps to reproduce the behavior
I am trying to use PatchCore as in the tutorial. I am using a fresh install of anomalib 1.0.0.dev.
Running the following produces an error.
OS information
Expected behavior
To run without errors.
Screenshots
No response
Pip/GitHub
GitHub
What version/branch did you use?
No response
Configuration YAML
?
Logs
Code of Conduct