Great Lakes Data Assimilation #808
Conversation
Seems like there is a conflict with the current branch.
Resolved.
@@ -1249,6 +1249,96 @@ def get_obs_from_timeslices(
    return observation_df_new


def get_GL_obs_from_timeslices(
This function is very similar to the get_obs_from_timeslices function. Is it possible to merge these two?
Yeah, it's actually identical just without the interpolation step. We could merge them, but I do want to avoid interpolating for the Great Lakes, so we'd have to alter the original function to only interpolate under certain conditions. I chose this method for now for simplicity, but I'm open to re-working this function.
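The merge discussed above could be done by putting both behaviors behind a flag. A minimal sketch, assuming a simplified gages-by-timesteps DataFrame and illustrative parameter names (`interpolate`, `limit`) that are not t-route's actual signature:

```python
import numpy as np
import pandas as pd

def get_obs_from_timeslices(observation_df, interpolate=True, limit=3):
    """Sketch: one function serving both the standard reservoir path and
    the Great Lakes path. `observation_df` is gages x timesteps."""
    if interpolate:
        # standard path: fill short gaps along the time axis
        return observation_df.interpolate(axis="columns", limit=limit)
    # Great Lakes path: return raw observations, gaps intact
    return observation_df

obs = pd.DataFrame(
    [[10.0, np.nan, 12.0]], index=["gage_A"], columns=[0, 1, 2]
)
filled = get_obs_from_timeslices(obs, interpolate=True)   # gap filled to 11.0
raw = get_obs_from_timeslices(obs, interpolate=False)     # gap left as NaN
```

With this shape, the existing callers would pass `interpolate=True` (or nothing) and the Great Lakes path would pass `interpolate=False`, so only one function needs to be maintained.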
This PR adds a new data assimilation method specific to the Great Lakes. The Great Lakes are not currently routed in t-route due to complexities with their size. Previously, the level pool module was used for them, but because some parameters are intentionally set to 0, they produce 0 outflow. This new method uses the best available observations (the lakes don't have their own outflow gages, so it uses gages that are nearby). Lakes Superior and Michigan/Huron use nearby USGS gages, Lake Erie uses a Canadian gage, and Lake Ontario uses forecasted flows from the International Lake Ontario-St. Lawrence River Board.
Because the level pool module does not work on these lakes, we default to using climatological values in the event that there are no good observations. We've implemented an 11-day persistence method similar to the one used for USGS/USACE reservoirs, meaning the DA module will allow an observation to persist for up to 11 days, after which it will default to climatology if no new observations are available.
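The persistence/fallback decision described above can be sketched as follows. This is a minimal illustration of the 11-day rule, not t-route's actual implementation; the function and argument names are hypothetical:

```python
from datetime import datetime, timedelta

PERSISTENCE_WINDOW = timedelta(days=11)  # the 11-day persistence limit

def choose_flow(last_obs_value, last_obs_time, now, climatology_value):
    """Return the persisted observation if it is recent enough,
    otherwise fall back to the climatological value."""
    if last_obs_value is not None and now - last_obs_time <= PERSISTENCE_WINDOW:
        # a good observation within the last 11 days persists
        return last_obs_value
    # no usable observation: default to preprocessed climatology
    return climatology_value

now = datetime(2024, 6, 20)
# observation 5 days old -> persisted
flow_recent = choose_flow(2500.0, datetime(2024, 6, 15), now, 1800.0)
# observation 12 days old -> climatology takes over
flow_stale = choose_flow(2500.0, datetime(2024, 6, 8), now, 1800.0)
```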
Additions
- reservoir_GL_da.py

Removals

Changes
- mc_reach.pyx
- DataAssimilation.py: new great_lakes_DA. This handles the preprocessing of observations and updating the observation and parameter data frames after loops in __main__.py.
- HYFeaturesNetwork.py: connections dictionary.
- nhd_io.py: new get_GL_obs_from_timeslices(). Almost identical to the existing get_obs_from_timeslices(), except it doesn't do any interpolation.
- rfc_lake_gage_crosswalk.py: new get_great_lakes_climatology(), which loads preprocessed climatology values for the Great Lakes.
- compute.py & main.py: … mc_reach.pyx.
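A climatology lookup like get_great_lakes_climatology() could be shaped roughly as below. This is a sketch under stated assumptions: the lake IDs, the one-row-per-lake/one-column-per-month layout, and the values are placeholders, not the real preprocessed data:

```python
import calendar
import pandas as pd

def get_great_lakes_climatology():
    """Sketch of a Great Lakes climatology table: one row per lake,
    one column per month. Values are placeholders (hypothetical mean
    monthly outflows in cms), not the real preprocessed data."""
    data = {
        "superior":       [2100.0] * 12,
        "michigan_huron": [5300.0] * 12,
    }
    return pd.DataFrame.from_dict(
        data, orient="index", columns=list(calendar.month_abbr)[1:]
    )

clim = get_great_lakes_climatology()
# a caller would look up the value for the current lake and month:
june_superior = clim.loc["superior", "Jun"]
```

Keyed this way, the DA fallback path only needs the lake ID and the current month to pull a climatological flow.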
Testing
Screenshots
Notes
Todos
Checklist
Testing checklist
Target Environment support
Accessibility
Other