
Add optimization direction to invdes.Optimizer #1961

Merged
merged 1 commit into develop on Sep 17, 2024

Conversation

@yaugenst-flex (Collaborator)

@yaugenst-flex yaugenst-flex self-assigned this Sep 16, 2024

@tylerflex tylerflex left a comment


Looks good. Just wondering about naming: direction could be confused with, e.g., ModeSource.direction. Is there a better option? Perhaps "max_or_min", sign (-1, 1), or maximize: bool?

@yaugenst-flex (Collaborator, Author)

Yeah, not sure. An alternative would be to do it the PyTorch way and make it a maximize: bool = True argument. The explicit "maximize" vs. "minimize" is a bit unnecessary because those values are not actually used.
nlopt has set_{min,max}_objective, but I think that is not feasible for us.
I would like to keep it explicit, though, by having either "maximize" or "minimize" in the variable name; otherwise people might get confused about which is the default direction for a gradient 😄

@yaugenst-flex (Collaborator, Author)

How about we go the boolean route?

@tylerflex (Collaborator)

Agree with everything. And let's just do boolean. The only thing I can think of is: are we ever going to want to support something other than minimizing or maximizing? Can't think of anything now, so let's just go with bool.

@yaugenst-flex force-pushed the yaugenst-flex/optimizer-direction branch from d331d0e to 67fe9ca on September 16, 2024 10:39
@tylerflex (Collaborator)

There's one caveat to discuss: penalties should always be minimized, right? If so, then we want to apply the sign to the postprocess function value only (in the objective function definition).

@yaugenst-flex force-pushed the yaugenst-flex/optimizer-direction branch from 67fe9ca to 2d2e7b6 on September 16, 2024 11:02
@yaugenst-flex (Collaborator, Author)

Updated it by passing maximize to make_objective_fn(). I agree that, in terms of the code, it would be cleaner if it were just defined in InverseDesign, but conceptually I don't see how a maximize argument makes sense in InverseDesign; it really belongs with the optimizer. And doing it via the function value instead of the gradient is also... not optimal 😄
But yeah, I guess this would work for now?
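[Editorial aside: the sign handling being discussed can be sketched as follows. This is a hypothetical Python sketch, not the actual tidy3d/invdes implementation; the names make_objective_fn, post_process, and penalty are borrowed from the discussion, and the bodies are illustrative assumptions. The maximize flag flips only the sign of the post-processing value, while the penalty is always subtracted (i.e., minimized), and the optimizer then ascends the combined objective.]

```python
def make_objective_fn(post_process, penalty, maximize: bool = True):
    """Build an objective that the optimizer always maximizes."""
    direction_multiplier = 1 if maximize else -1

    def objective(params):
        # the sign applies to the figure of merit only; the penalty
        # is always subtracted, so it is always being minimized
        return direction_multiplier * post_process(params) - penalty(params)

    return objective


def numerical_grad(f, x, eps=1e-6):
    # central finite difference, enough for a 1D toy example
    return (f(x + eps) - f(x - eps)) / (2 * eps)


# toy usage: minimize (x - 2)^2 with zero penalty by ascending
# the sign-flipped objective
obj = make_objective_fn(
    post_process=lambda x: (x - 2) ** 2,
    penalty=lambda x: 0.0,
    maximize=False,
)

x = 0.0
for _ in range(200):
    x += 0.1 * numerical_grad(obj, x)  # gradient ascent step

print(round(x, 3))  # converges near 2.0, the minimizer of (x - 2)^2
```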

@@ -130,7 +136,7 @@ def continue_run(
         post_process_val = aux_data["post_process_val"]

         # update optimizer and parameters
-        params, opt_state = self.update(parameters=params, state=opt_state, gradient=-grad)
+        params, opt_state = self.update(parameters=params, state=opt_state, gradient=grad)
@tylerflex (Collaborator)
Hm, if

    direction_multiplier = 1 if maximize else -1
    obj_fn_val = direction_multiplier * post_process_val - penalty_value

then shouldn't we be using -grad here, since we're always trying to maximize this quantity?
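[Editorial aside: a minimal illustration of the -grad point. The update rule below is a plain gradient-descent step, assumed here for illustration only; it is not the library's actual optimizer. If update performs descent, maximizing an objective requires feeding it the negated gradient: descending on -grad is equivalent to ascending the objective.]

```python
def update(params, gradient, lr=0.1):
    # a plain gradient-DESCENT step (assumption for illustration)
    return params - lr * gradient


def objective(x):
    # a quantity we want to MAXIMIZE (peak at x = 3)
    return -(x - 3.0) ** 2


def grad_objective(x):
    return -2.0 * (x - 3.0)


x = 0.0
for _ in range(200):
    # descent on the NEGATED gradient == ascent on the objective
    x = update(x, gradient=-grad_objective(x))

print(round(x, 3))  # approaches the maximizer at 3.0
```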

@yaugenst-flex (Collaborator, Author)

oops

@yaugenst-flex (Collaborator, Author)

fixed

@tylerflex (Collaborator)

Not sure it's pushed? Just FYI.

@tylerflex tylerflex self-requested a review September 17, 2024 11:41
@yaugenst-flex force-pushed the yaugenst-flex/optimizer-direction branch from 34acd6f to 7feee7e on September 17, 2024 12:12
@yaugenst-flex force-pushed the yaugenst-flex/optimizer-direction branch from 7feee7e to 0d6bb41 on September 17, 2024 12:13
@yaugenst-flex yaugenst-flex merged commit 2d98a0e into develop Sep 17, 2024
15 checks passed
@yaugenst-flex yaugenst-flex deleted the yaugenst-flex/optimizer-direction branch September 17, 2024 12:50