
Not picking up changes in gradient.py #7

Closed
arosale4 opened this issue Jun 10, 2016 · 5 comments
@arosale4 (Collaborator)

I saw commit 5126475, where the code checks to see whether there are any changes. Have we figured out why there are no changes? I have seen instances where a parameter does change the penalty function, and it produces changes in the do_method function, but those changes are not picked up again in the run function.

@arosale4 arosale4 added the bug label Jun 10, 2016
@ericchansen

I'm really confused. Commit 5126475 only removed options that generated MacroModel conformational search .com files.

@arosale4 (Collaborator, Author)

0cd6a70
That's weird; try this one.
But I think I know what's wrong:
The check function goes through the radii and cutoffs, but we currently have them defined as None.

@ericchansen

I'm still not sure what bug you're describing.

Here's an example of how things work. Maybe this will help you.

changes = do_lstsq(ma, vb, radii=radii, cutoffs=cutoffs)

This first goes through do_method, since do_lstsq is wrapped by that function. If opt.OptError is raised, a warning is issued and no changes are returned. This might be clearer if I added a return None after line 352, but that would just be an unnecessary extra line.
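
A minimal sketch of how a wrapper like do_method could swallow opt.OptError, based only on the behavior described above (the names, signatures, and placeholder logic here are assumptions, not the real q2mm code):

```python
import functools
import logging

logger = logging.getLogger(__name__)

class OptError(Exception):
    """Stand-in for opt.OptError."""

def do_method(func):
    """Hypothetical wrapper: run an optimization step, warn on OptError."""
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        try:
            changes = func(*args, **kwargs)
        except OptError as e:
            logger.warning("Optimization step failed: %s", e)
            return None  # no changes come back to the caller
        return changes
    return wrapper

@do_method
def do_lstsq(ma, vb, radii=None, cutoffs=None):
    # Trivial placeholder for the real least-squares step.
    if ma is None:
        raise OptError("missing matrix")
    return [x * 0.5 for x in vb]
```

The point of the sketch: when the wrapped function raises, the caller sees None rather than a list of changes, which is one way "no changes" can happen without any explicit return None in the body.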

If opt.OptError isn't raised, then it goes through check, which calculates radius, the radius of parameter change, and then checks either the maximum radii or the cutoffs.

The maximum-radii method essentially scales all parameter changes to be within whatever radii the user gives. The user can specify as many maximum radii as they want. If the calculated radius is larger than those specified, the parameter changes keep getting scaled to meet each user-specified radius, generating another trial FF each time. The radii are checked in order from smallest to largest. If at some point during this process the calculated radius is less than the current user-specified maximum radius, the unscaled parameter changes are returned. If the calculated radius starts out smaller than the smallest requested maximum radius, the original, unscaled trial parameter changes are simply returned. This check doesn't return None or 0; it will always give you something, assuming opt.OptError wasn't raised earlier.
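
The max-radii logic described above could be sketched roughly like this (a simplification under assumed names; the real gradient.check is more involved):

```python
import math

def check_radii(changes, max_radii):
    """Sketch of the max-radii branch described in this thread.

    Walks the user-specified maximum radii from smallest to largest:
    each radius smaller than the unscaled radius of parameter change
    yields a scaled trial; once a radius is large enough, the unscaled
    changes are kept and the walk stops. Always returns something.
    """
    radius = math.sqrt(sum(c * c for c in changes))
    trials = []
    for max_radius in sorted(max_radii):  # smallest first
        if radius > max_radius:
            scale = max_radius / radius
            trials.append([c * scale for c in changes])  # another trial FF
        else:
            trials.append(list(changes))  # already fits: keep unscaled
            break
    return trials
```

For example, with changes whose radius is 5.0 and max radii of 1.0 and 10.0, this yields one trial scaled down to radius 1.0 plus the unscaled changes.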

If you instead use check_cutoffs (which happens when the user doesn't specify any maximum radii but does specify cutoffs) and it returns False, then check and do_method return [] (None, False, and [] are all falsy in Python).

So opt.OptError (troublesome if it happens) and check_cutoffs (not troublesome; it's doing what the user asked for) can both end up producing no changes.
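
For illustration, a check_cutoffs like the one described might boil down to a single membership test; the (low, high) window structure here is an assumption, not the real signature:

```python
def check_cutoffs(radius, cutoffs):
    """Sketch: accept the step only when the radius of parameter change
    lands inside at least one user-supplied (low, high) cutoff window.
    The pair structure of `cutoffs` is assumed for illustration.
    """
    return any(low <= radius <= high for low, high in cutoffs)
```

When this returns False, the caller can legitimately report no changes, which is the "doing what the user asked for" case above.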

ericchansen added a commit that referenced this issue Jun 10, 2016
When any method in gradient generates parameter changes, these
are then run through gradient.check, which checks whether the radius
of unscaled parameter changes meets certain user specified conditions
(currently, those conditions are given in the __init__ statement of
gradient.Gradient).

It would first check for max_radii and then for cutoffs. However,
if neither were used, it would incorrectly return None rather than
returning the unscaled parameter changes.

This should fix issue #7.
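
The bug and fix described in that commit message might be sketched as follows (names, signatures, and the simplified branch logic are all assumptions; only the final return is the point):

```python
def check(changes, max_radii=None, cutoffs=None):
    """Sketch of gradient.check after the fix: all names assumed."""
    radius = sum(c * c for c in changes) ** 0.5
    if max_radii:
        # Simplified stand-in for the max-radii scaling discussed above.
        smallest = min(max_radii)
        if radius > smallest:
            return [[c * smallest / radius for c in changes]]
        return [list(changes)]
    if cutoffs:
        # Keep changes only if the radius falls inside some cutoff window.
        inside = any(low <= radius <= high for low, high in cutoffs)
        return [list(changes)] if inside else []
    # The fix: with neither max_radii nor cutoffs supplied, return the
    # unscaled parameter changes instead of falling through to None.
    return [list(changes)]
```

Before the fix, the no-options path fell off the end of the function (implicitly returning None), so every run without max radii or cutoffs looked like it produced no changes.
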
@ericchansen

Can anyone confirm that commit d646465 addresses this and then close the issue?

@arosale4 (Collaborator, Author)

Things work well now.
