Support ML-Danbooru #6
Conversation
```python
        ),
        'wd14-convnextv2.v1': WaifuDiffusionInterrogator(
            'WD14 ConvNeXTV2 v1',
            repo_id='SmilingWolf/wd-v1-4-convnextv2-tagger-v2'
```
This is odd: the key says v1 but the repo id says v2. Was the repository named incorrectly? If so, I'm adding a comment.
```python
            repo_id='SmilingWolf/wd-v1-4-convnextv2-tagger-v2'
        ),
        'wd14-swinv2.v2': WaifuDiffusionInterrogator(
            'WD14 SwinV2 v1',
```
Same question here, but then I'd say it should be:

```python
'wd14-swinv2.v1': WaifuDiffusionInterrogator(
#           ^^
    'WD14 SwinV2 v1',
    repo_id='SmilingWolf/wd-v1-4-swinv2-tagger-v2'
),
```
```python
        target_size = (
            (target_size[0] // 4) * 4,
            (target_size[1] // 4) * 4,
        )
```
this is the same as

```python
target_size = (target_size[0] & ~3, target_size[1] & ~3)
```

right?
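For what it's worth, the two forms do agree for every Python int, because `//` floors toward negative infinity and masking with `~3` clears the two low bits, which floors the same way in two's complement. A quick self-contained check:

```python
# Verify (x // 4) * 4 == x & ~3 for Python ints: both round down to
# the nearest multiple of 4 (floor division floors toward -inf, which
# matches two's-complement bit masking).
for x in range(-4096, 4097):
    assert (x // 4) * 4 == (x & ~3)
print("equivalent for all tested sizes")
```

Note that in a language with truncating integer division (e.g. C), the two would differ for negative values, but image dimensions are non-negative anyway.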
```python
    'onnxruntime-gpu'
)

run_pip(f'install {package}', 'onnxruntime')
```
I can just add onnxruntime to the requirements.txt, right?

```
onnxruntime-silicon; sys_platform == 'darwin'
onnxruntime-gpu; sys_platform != 'darwin'
```

or people can adapt this
```python
class WaifuDiffusionInterrogator(Interrogator):
    def __init__(
        self,
        name: str,
        repo_id: str,
```
I have some without a repo_id, so I've changed this to repo_id=None; that also requires some changes elsewhere.
```
@@ -48,11 +48,10 @@ def on_interrogate(

    unload_model_after_running: bool
):
    if interrogator not in utils.interrogators:
    interrogator: Interrogator = next((i for i in utils.interrogators.values() if interrogator_name == i.name), None)
```
this is a bit silly
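What I have in mind, roughly: if the registry were keyed by the same name the UI passes around, the linear `next()` scan collapses to a plain dict lookup. A sketch with stand-in data (these keys and objects are hypothetical, not the extension's real registry):

```python
# Stand-ins for Interrogator instances, keyed by display name.
interrogators = {
    'WD14 ConvNeXTV2 v1': object(),
    'WD14 SwinV2 v1': object(),
}

interrogator_name = 'WD14 SwinV2 v1'

# The linear scan from the diff above:
scanned = next(
    (i for name, i in interrogators.items() if interrogator_name == name),
    None,
)

# Equivalent direct lookup when the dict is keyed by that name:
looked_up = interrogators.get(interrogator_name)

assert scanned is looked_up and scanned is not None
```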
```
@@ -78,7 +77,7 @@ def on_interrogate(
    return [
        ', '.join(processed_tags),
        ratings,
        tags,
        dict(list(tags.items())[:200]),
```
In my branch there is already a solution for this (the slider in Settings -> Tagger), so I can omit this.
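Incidentally, if the truncation does stay, `dict(list(tags.items())[:200])` can be written without materializing the full intermediate list. A minor sketch with dummy data:

```python
from itertools import islice

# Dummy tag/confidence pairs standing in for the real tags dict.
tags = {f'tag_{i}': i / 1000 for i in range(500)}

# Keep only the first 200 entries; islice stops early instead of
# building the whole list just to slice it.
truncated = dict(islice(tags.items(), 200))
print(len(truncated))  # 200
```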
```python
interrogators[path.name] = DeepDanbooruInterrogator(path.name, path)

return sorted(interrogators.keys())
interrogators["deepdanbooru"] = DeepDanbooruInterrogator(path.name, path)
```
path.name seems to work as well; why change it?
```python
unloaded = True
gc.collect()
```
what exactly does gr.collect() do?
You may have already noticed: this line is gc.collect(), not gr.collect(). I guess the PR author's original purpose was to call the garbage collector (gc) to run a full collection. But as we already know, they abandoned this approach due to bugs in the underlying library.
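For reference, gc.collect() forces a full run of CPython's cyclic garbage collector: it reclaims reference cycles that plain reference counting cannot free, and returns the number of unreachable objects it found. A minimal illustration:

```python
import gc

class Node:
    def __init__(self):
        self.ref = None

# Build a reference cycle: refcounting alone can't free these two,
# since each keeps the other's count above zero.
a, b = Node(), Node()
a.ref, b.ref = b, a
del a, b

# A full collection finds and frees the cycle; the return value is
# the number of unreachable objects collected.
collected = gc.collect()
print(collected > 0)  # True
```

It only helps with Python-level cycles, though; memory held by native libraries (ONNX runtime, CUDA allocations) is outside its reach, which fits the bug mentioned above.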
Ah yes, I noticed and then forgot. I also found this, so numba is a possible solution for CUDA users.
The new solution looks cool, please give it a try.
```python
except ImportError:
    # only one of these packages should be installed at a time in any one environment
    # https://onnxruntime.ai/docs/get-started/with-python.html#install-onnx-runtime
    # TODO: remove old package when the environment changes?
```
In what scenario do you expect the environment to change? Installed on an external drive?
I don't think we have to support this.
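The pattern that except ImportError branch implements can be sketched generically. Note the onnxruntime, onnxruntime-gpu, and onnxruntime-silicon packages all expose the same `onnxruntime` module, so a single import name normally suffices; try_import below is a hypothetical helper, and a stdlib module stands in for onnxruntime so the sketch runs anywhere:

```python
import importlib

def try_import(*names):
    """Return the first importable module from names, or None.

    Hypothetical helper illustrating the ImportError-fallback pattern;
    not part of the extension's actual code.
    """
    for name in names:
        try:
            return importlib.import_module(name)
        except ImportError:
            continue
    return None

# 'json' stands in for 'onnxruntime' so this runs without extra packages.
mod = try_import('no_such_module_xyz', 'json')
print(mod.__name__)  # json
```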
```python
class WaifuDiffusionInterrogator(Interrogator):
    def __init__(
        self,
        name: str,
        repo_id: str,
        model_path='model.onnx',
        tags_path='selected_tags.csv',
        **kwargs
```
the kwargs are now obsolete
```python
tags_path = Path(hf_hub_download(
    **self.kwargs, filename=self.tags_path))
model_path = hf_hub_download(self.repo_id, self.model_path)
tags_path = hf_hub_download(self.repo_id, self.model_path)
```
```
tags_path = hf_hub_download(self.repo_id, self.model_path)
^^^^^^^^^--------------------------------------^^^^^^^^^^
```

This is a bug; I've fixed it.
I could no longer merge it cleanly, so I manually applied the changes and fixed the issues mentioned here. Thanks for the work; I'll mention you in the changelog and commit message. I've posed a lot of questions, sorry about that. Only the ones where I've added the eyes reaction are still open questions for me.
see the discussion here