📚 Documentation

Sean Carter asked in Slack:

I'm running into a `RuntimeError: Expected to mark a variable ready only once.` A quick Google search found that setting the argument `find_unused_parameters` to `False` might fix this. Where do I put it in PyTorch Lightning?
Updated 09/03/2022:
Will add the following as a note, based on Adrian's comment in the thread:
"To clarify, the concept of 'looking for unused parameters' is only relevant for DDP. You wouldn't need this for anything running on a single device."
I ended up passing `strategy='ddp_notebook_find_unused_parameters_false'` to my `pl.Trainer.__init__()`. However, when I do this, I get `RuntimeError: The server socket has failed to listen on any local network address`.
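That socket error often comes from the rendezvous port already being taken, for example by a previous run that is still alive in the same notebook kernel. A hedged sketch of one common workaround, assuming the error is a port collision: pick a free port before the `Trainer` is created (`MASTER_ADDR`/`MASTER_PORT` are standard `torch.distributed` environment variables, not Lightning-specific):

```python
import os
import socket

import pytorch_lightning as pl

def _free_port() -> int:
    # Ask the OS for an unused TCP port (helper written for this sketch).
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.bind(("localhost", 0))
        return s.getsockname()[1]

# Point torch.distributed at an address/port that is actually free
# before the DDP strategy tries to set up its server socket.
os.environ.setdefault("MASTER_ADDR", "localhost")
os.environ["MASTER_PORT"] = str(_free_port())

trainer = pl.Trainer(
    accelerator="gpu",
    devices=2,
    strategy="ddp_notebook_find_unused_parameters_false",
)
```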
@carmocca
I'll be creating tickets for these questions, adding the content to the docs, closing the tickets, and then closing the loop by adding the URL from the docs to the original Slack post.
While the issue was resolved in Slack, the original question about where to set `find_unused_parameters=False` in PL wasn't answered.