Updates to wave fracture, including a bug fix #299
Conversation
Clarifying wave_spec_type options
Clarifying wave options
I don't have a good sense of what you've done here. Especially if this is different from what's already been published, it would be good to add a couple of sentences to the documentation regarding exactly what's happening now. On the other hand, if you @lettie-roach view these changes as minor, then I'm fine with it as-is.
In columnphysics/icepack_wavefracspec.F90:

@@ -688,29 +710,44 @@ subroutine get_fraclengths(X, eta, fraclengths, hbar, e_stop)
! calculate strain
if (is_triplet(j)) then
Does this guarantee that delta* in the denominator are all nonzero?
1. There was a typo in the strain equation in HT2015. I have corrected this
- this changes the answers, but the fact that the QC test passed (comparing
with FSD and waves on) suggests the impact is minor. Should I add a sentence
to the documentation saying 'Note that the equation for strain in HT2015
contains typos - we use the correct version here. Tests suggest the impact
is minor.' or similar?
2. I made the 'random' option for wave fracture mean that the fracture
code is run to convergence: we generate realizations of sea surface height
with random phase and add the fractures to a histogram, until successive
histograms agree to within a small error tolerance. I should have put this
in before (in CICE5 you could only change the number of SSH realizations by
hardcoding the `loopcts` variable). This does not affect anything when you
run with wave_spec_type='constant'. The answers will differ slightly
between the master branch and this code for wave_spec_type='random', but I
think the logic makes more sense now.
3. I did a general tidy up and removed some unnecessary computations.
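The convergence loop described in item 2 can be sketched roughly in Python. This is an illustrative analogue only, not the Icepack Fortran: `one_realization`, `errtol`, and `maxiter` are made-up names standing in for a single wave_frac pass, the error tolerance, and the iteration cap.

```python
import numpy as np

def run_to_convergence(one_realization, nbins, errtol=1e-4, maxiter=100):
    """Accumulate fracture-length histograms from random-phase SSH
    realizations until successive normalized histograms agree to
    within errtol, then return the converged histogram."""
    frac = np.zeros(nbins)   # running (unnormalized) histogram
    prev = np.zeros(nbins)   # previous normalized histogram
    for it in range(1, maxiter + 1):
        frac += one_realization()          # one random-phase SSH pass
        total = frac.sum()
        cur = frac / total if total > 0 else frac.copy()
        if it > 1 and np.abs(cur - prev).max() < errtol:
            return cur, it                 # successive histograms agree
        prev = cur
    # did not converge within maxiter: warn rather than stop
    print(f"warning: not converged after {maxiter} realizations")
    return prev, maxiter
```

With a deterministic stand-in realization the loop converges on the second pass, since adding an identical histogram leaves the normalized distribution unchanged.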
Good point - I'll add a check to the denominator
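For illustration, a minimal Python sketch of such a guard (all names hypothetical; the curvature expression is a placeholder, not the corrected HT2015 strain equation, and `puny` stands in for Icepack's small-number tolerance):

```python
def guarded_strain(eta, dx1, dx2, hbar, puny=1.0e-11):
    """Only evaluate the finite-difference strain estimate when both
    extrema spacings are nonzero, to avoid division by zero for a
    degenerate triplet."""
    if abs(dx1) < puny or abs(dx2) < puny:
        return 0.0  # degenerate triplet: no strain contribution
    curvature = 2.0 * eta / (dx1 * dx2)  # illustrative form only
    return 0.5 * hbar * curvature
```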
thanks!
Also noted in #298. I ran full test suites on conrad with 4 compilers using master + dtlatm ( #298 ) + ifsd3 ( #299 ) and everything looks good. Ran both icepack and cice suites. fsd tests fail regression but that's expected. Think both #298 and #299 are fine. Test results are here: https://github.com/CICE-Consortium/Test-Results/wiki/icepack_by_hash_forks#122fb3d35defebafb202cb576309ba34b23b01d9
This PR is almost ready, but I just noticed a 'stop' command which needs to be fixed. Sorry I didn't notice that earlier.
Oh sorry about that. I can just remove it and replace it with a print
statement warning that this is taking a long time to converge. I think it's
pretty unlikely to happen, but wanted to have something in there just in
case. Does that seem reasonable?
…On Fri, 14 Feb 2020 at 08:49, Elizabeth Hunke wrote, regarding this change in columnphysics/icepack_wavefracspec.F90:

- ! normalize
- if (SUM(frac_local) /= c0) frac_local(:) = frac_local(:) / SUM(frac_local(:))
+ if (iter.gt.100) then
+   print *, 'fracerror ',fracerror
+   print *, 'before ',prev_frac_local
+   print *, 'after ',frac_local
+   stop 'wave_frac did not converge'

We need to use the warning software to do this abort (and check for others in the FSD code, like this). @apcraig could you help @lettie-roach with this?
That’s reasonable. If the code doesn’t need to abort, then let’s not. Thanks,
e
I like the warning message. But there's still a print, which needs to be fed up through the icepack interface via the warning system - Icepack column physics isn't supposed to do any I/O itself. I can help with this, but it might be some days; maybe @apcraig can help with it first.
Ah right. I tried to add in the icepack warning calls instead of the print statements, just by following the other examples in that routine.
This looks okay to me, but I'll ask @apcraig to take a look at the warning stuff to make sure it'll work as expected, before merging.
The icepack_warnings_add calls look fine. If you ever want/need to abort at some point, you would add the following before or after the warnings_add call:

call icepack_warnings_setabort(.true.,FILE,LINE)

But I have one other request. After every call from every subroutine in icepack, there should be a

if (icepack_warnings_aborted(subname)) return

The idea here is that icepack cannot write any I/O, nor can it abort/stop, because it doesn't know how to do it - it doesn't know anything about parallelization, among many other things. So we have this warning module. You add output to the warning module by calling icepack_warnings_add(string). You turn on the abort flag by calling icepack_warnings_setabort(.true.,FILE,LINE). And finally, within icepack, if a subroutine has set the abort flag, you want icepack to return up the calling tree to whatever called it (icepack driver, CICE, some other model, etc.), and then that model can check the abort flag and write out the icepack warning messages to the log file. So within icepack, you want to add the "if (icepack_warnings_aborted(subname)) return" statement after every call inside icepack (except calls to the icepack_warnings module). You can also see how that's set up in other subroutines.

A quick look suggests this is needed on the calls to wave_frac and icepack_fsd_cleanup, but it may be needed in other places in the icepack fsd implementation. I think we forgot to check that before. If you want some help going through your code to find these situations, let me know.
OK. I'm headed out now for a couple of days but will look into it next
week.
I tried to add those icepack warnings checks now. I'm not sure about the ones I added in icepack_init_fsd_bounds.
Thanks @lettie-roach, I'll have a look and run some tests later today.
@lettie-roach, we do NOT want the calls to

if (icepack_warnings_aborted(subname)) return

implemented after calls to the icepack_warnings module routines (like icepack_warnings_add). It doesn't hurt to do that, but it doesn't add anything. Could you remove those again? I see some in icepack_fsd and icepack_wavefracspec. Sorry about that; I probably wasn't clear. The other additions look fine.
OK, I removed those now
@lettie-roach, I just PR'ed a few changes to your ifsd3 icepack branch. These include removing a few final icepack abort checks where they are not needed and converting tabs to spaces. None of this is critical and it could be fixed later as well. If you are interested, please consider having a quick look, and then you can merge the PR to your branch. Or if you prefer, I can push directly to your branch. You will also get an icepack master update that happens to be in my working branch; I think that should be fine. I am running full test suites on conrad with 4 compilers for icepack and cice. They already passed without my proposed changes, but just to be sure, I'm going to run them again.
Minor clean up
The test suites are looking good and I think this is ready to merge. If nobody objects, I'll merge this in the next day or two. @eclare108213, feel free to merge if you agree it's ready. Thanks @lettie-roach
* implement bfb for mpi comm global sums, fix compile issue with cpps, remove unused cpps
* add set_nml.reprosum
* update global reductions for serial mode to leverage bfbflag in the same way as mpi mode
* update and fix bfb compare feature
* add comparelog.csh script and logbfb test
* update travis gcc, fix tdir feature with tests
* update Makefile for c compiles, update Macros files to support c compile and serial/parallel compiles better and cleaner
* modify report results script to address grep for bfbcomp cases, add first_suite.ts
* fix Macros.onyx_intel, accidentally removed debug flags in earlier commit
* update travis c build
* fix script logic for more threads than tasks per node
* update scripts to address CICE-Consortium#299, poll_queue script and CICE-Consortium#301, single test bgen default use
* add QSTAT variables to fram and cesium
For detailed information about submitting Pull Requests (PRs) to the CICE-Consortium,
please refer to: https://github.com/CICE-Consortium/About-Us/wiki/Resource-Index#information-for-developers
PR checklist
Updates to wave fracture, including a bug fix
Lettie Roach
I ran the base_suite. All tests passed except cheyenne_intel_smoke_gx3_8x2_diag24_fsd12ww3_medium_run1day.
I then ran the QC test with these namelist options. QC tests pass.
Changes to the wave fracture: fixed a bug in the equation for strain that arose from a typo in Horvat & Tziperman (2015); removed the unnecessary conversion of the wave spectrum to be defined in terms of wavelengths; allowed the possibility of running the wave fracture to convergence if wave_spec_type='random'; and added more description of this to the documentation.