Dear EasyEdit team,
I may have found a seeming contradiction at line 307 of `EasyEdit/easyeditor/editors/editor.py`:

```python
with torch.no_grad():
    for k, v in weights_copy.items():
        nethook.get_parameter(self.model, k)[...] = v.to(f"cuda:{self.hparams.device}")
```
This seems to be restoring the weights of the original model, yet the final return value at line 361 is named `edited_model`.
This confuses me, and I'm not sure whether I'm missing some other detail that would resolve it.
😆 Yes. But my point is that the weights of the edited layer in `edited_model` are restored from `weights_copy`, which records the original model's values. So `edited_model` ends up with the same weights as the original model, and I'm confused about whether the return value is what we expect. 🤔
👍 Yes, your understanding is correct. In prior editing literature, the reported measurements are the results of a single edit, so the weights had to be rolled back after each evaluation. In EasyEdit, however, you can enable continuous (sequential) editing by setting `keep_original_weights = False`; see issue #220 for more details.
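To make the snapshot-and-restore pattern concrete, here is a minimal standalone sketch (not EasyEdit's actual code; the function name `apply_edit_and_maybe_restore` and the toy `nn.Linear` model are illustrative assumptions). It mimics the loop at line 307: `weights_copy` records the original parameter values, and when `keep_original_weights` is true they are written back after the edit, which is why the returned model can carry the original weights.

```python
import torch
import torch.nn as nn

def apply_edit_and_maybe_restore(model, edit_fn, keep_original_weights=True):
    # Snapshot the original parameter values (EasyEdit snapshots only the
    # edited layers; we copy everything here for simplicity).
    weights_copy = {k: v.detach().clone() for k, v in model.named_parameters()}

    edit_fn(model)  # mutates some parameters in place

    if keep_original_weights:
        # Roll the edit back, mirroring the restore loop in editor.py.
        with torch.no_grad():
            for k, v in weights_copy.items():
                model.get_parameter(k)[...] = v
    return model

model = nn.Linear(4, 4, bias=False)
original = model.weight.detach().clone()

# With the default keep_original_weights=True, the "edited" model comes
# back with its original weights, matching the behavior the issue describes.
edited = apply_edit_and_maybe_restore(
    model, lambda m: m.weight.data.add_(1.0), keep_original_weights=True
)
assert torch.allclose(edited.weight, original)
```

Passing `keep_original_weights=False` skips the restore loop, so successive edits accumulate in the model, which is the continuous-editing mode mentioned above.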