Currently, is there any clipping/clamping done to the latents after using Noise Offset?
In my experience, I was training a LoRA for a character that wears a white uniform.
Everything works fine except... sometimes the uniform comes out as black when using the LoRA in generations.
Previously, I was also training a LoRA for a character that wears a blue dress.
Again, everything works fine except the dress often comes out red instead.
I highly suspect that, when applying the noise offsets, the resulting latents have values outside the range the model can handle, causing some sort of overflow that results in white becoming black, as I experienced.
Therefore, I experimented by manually adding a `torch.clamp` before the return of the `apply_noise_offset` function.
And as a result, the white uniform no longer becomes black during generation!
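Roughly, the change I tried looks like the sketch below. This assumes the usual noise-offset pattern of adding a random per-sample, per-channel constant to the noise; the clamp bounds shown are illustrative, not the exact values I used:

```python
import torch

def apply_noise_offset(latents, noise, noise_offset):
    # Usual noise-offset pattern: add a random per-sample, per-channel
    # constant so the model learns to shift overall image brightness.
    offset = torch.randn(
        (latents.shape[0], latents.shape[1], 1, 1), device=latents.device
    )
    noise = noise + noise_offset * offset
    # Experimental change: clamp before returning so the offset noise stays
    # in a bounded range. The bounds below are illustrative placeholders.
    noise = torch.clamp(noise, min=-3.0, max=3.0)
    return noise
```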
Is it just a coincidence? Or can someone verify this interaction? And perhaps implement a fix?