@caic99 L1/L2 regularization is common practice in ML training but not in MLIP training. Should we document how (and why) to use regularization in the loss function in the deepmd-kit docs?
@QuantumMisaka I would like to correct my reply above: what I meant was the L2 norm used for gradient clipping, which is not the same thing the original poster asked about (see the sketch after this reply).
how (and why) to use regularization in loss function
Unfortunately, I have no experience with using regularization for MLIPs.
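For reference, here is a minimal sketch of what clipping by the global L2 norm of the gradients looks like in a plain PyTorch training step. The `model`, `optimizer`, and `training_step` names are illustrative placeholders, not deepmd-kit API; the only library call assumed is the standard `torch.nn.utils.clip_grad_norm_` utility.

```python
import torch

# Illustrative stand-ins; not deepmd-kit objects.
model = torch.nn.Linear(10, 1)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

def training_step(batch_x, batch_y):
    optimizer.zero_grad()
    loss = torch.nn.functional.mse_loss(model(batch_x), batch_y)
    loss.backward()
    # Rescale all gradients so their global L2 norm does not exceed max_norm.
    torch.nn.utils.clip_grad_norm_(model.parameters(), max_norm=1.0)
    optimizer.step()
    return loss.item()
```

This clips gradients for training stability; it does not add a regularization term to the loss, which is why it differs from what the original poster asked for.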
Summary
Hi, I'm using deepmd-kit, but I encountered an over-fitting problem during training. How can I add L1/L2 regularization to deepmd-kit? (A loss-penalty sketch follows this issue body.)
Detailed Description
Same as the summary.
Further Information, Files, and Links
No response
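As a starting point for the question above, here is a minimal sketch of adding L1/L2 penalties to a training loss in plain PyTorch. This is not deepmd-kit code: `model`, `loss_with_l1`, and the coefficient values are illustrative assumptions, and the only library features relied on are the standard optimizer `weight_decay` argument and elementary tensor operations.

```python
import torch

# Illustrative stand-in for an MLIP fitting network; not deepmd-kit code.
model = torch.nn.Linear(10, 1)

# L2 regularization can be applied via the optimizer's weight_decay argument
# (Adam adds the penalty to the gradient; use AdamW for decoupled weight decay).
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3, weight_decay=1e-4)

def loss_with_l1(pred, target, l1_coeff=1e-5):
    """Data loss plus an explicit L1 penalty on the model weights."""
    data_loss = torch.nn.functional.mse_loss(pred, target)
    l1_penalty = sum(p.abs().sum() for p in model.parameters())
    return data_loss + l1_coeff * l1_penalty
```

Whether this helps with over-fitting in an MLIP setting, and what coefficients are sensible, would need to be checked against deepmd-kit's actual loss and backend.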