regridder multi threaded dask workers #88
Comments
I've run into similar errors using xesmf in parallel applications, and it has usually been an issue with the regridder files being stored on disk. You might solve this by keeping the weight file and setting `reuse_weights` to True. Any updates on the new API that makes use of in-memory storage for regridder weights?
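Until an in-memory weights API is available, one interim pattern consistent with the advice above is to build the weights once and cache them in process memory, so concurrent tasks reuse one copy instead of each touching the on-disk weight file. A minimal stdlib sketch of the idea (the `get_weights` and `fake_build` names are hypothetical, and the builder is a placeholder for the real `xe.Regridder(...)` construction):

```python
import threading

_weights_cache = {}           # grid key -> weights object
_cache_lock = threading.Lock()

def get_weights(grid_key, build):
    """Build weights once per grid key; later calls reuse the cached copy.

    `build` is a zero-argument callable that, in real use, would construct
    an xesmf Regridder. Here it is a placeholder."""
    with _cache_lock:
        if grid_key not in _weights_cache:
            _weights_cache[grid_key] = build()
        return _weights_cache[grid_key]

# Placeholder "weight builder" standing in for xe.Regridder(...)
calls = []
def fake_build():
    calls.append(1)
    return {"weights": "sparse-matrix"}

w1 = get_weights("1deg->0.5deg", fake_build)
w2 = get_weights("1deg->0.5deg", fake_build)
assert w1 is w2          # same object handed back
assert len(calls) == 1   # built exactly once
```

The lock matters with multi-threaded dask workers: without it, two threads can race to build (or read) the same weights at once, which is the failure mode described in this thread.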
I'd be interested in in-memory storage for regridder weights too, as I'm running into the 545 error as well. My use case is regridding ~50 CMIP climate model fields, each on a different grid, to a common grid. I was trying to use `dask.delayed` to set up the ~50 calls to a regridding function (which creates an xesmf regridder and regrids) and `dask.compute` to then do all the regridding, but it's a no-go. I tested with:

```python
def my_regrid_func(xarraydataset, dst_grid):
    ...

result1 = dask.delayed(my_regrid_func)(xarraydataset1, dst_grid)
result2 = dask.delayed(my_regrid_func)(xarraydataset2, dst_grid)
```

If there's a solution for embarrassingly parallel xesmf/dask regridding that I haven't found yet, please advise. Thanks!
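The embarrassingly parallel shape described above can be sketched end to end with a stand-in regrid function. `concurrent.futures` is used here purely as an illustration of the pattern (substitute `dask.delayed`/`dask.compute`); the dataset strings and `my_regrid_func` body are placeholders, not the real xesmf calls:

```python
from concurrent.futures import ThreadPoolExecutor

def my_regrid_func(dataset, dst_grid):
    # Placeholder: in real use this would build an xesmf Regridder for
    # `dataset`'s source grid and apply it to reach `dst_grid`.
    return {"data": dataset, "grid": dst_grid}

datasets = [f"model_{i}" for i in range(50)]   # stand-ins for ~50 CMIP fields
dst_grid = "1x1deg"

# One independent task per model field.
with ThreadPoolExecutor(max_workers=8) as pool:
    results = list(pool.map(lambda ds: my_regrid_func(ds, dst_grid), datasets))

assert len(results) == 50
assert all(r["grid"] == "1x1deg" for r in results)
```

The task graph itself is not the problem; as other comments note, the failures come from concurrent ESMF weight-file access, which is why the pattern only becomes safe once weight creation is serialized or moved in memory.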
Check out the new pangeo-data fork of this package, which has a number of updates, including in-memory weights storage.
Same problem here. I am using xarray to read very large files from model output. I created a custom generator to read one time step at a time, regrid it, and feed it to a NN model. The first time around works fine; the second time I get this error:

```
File "/discover/nobackup/dbarahon/bin/conda/envs/tf_keras/lib/python3.8/site-packages/ESMF/interface/cbindings.py", line 528, in ESMP_GridCreate1PeriDim
```

These are the relevant parts of the code:

```python
def train_gen(dts_per_epoch):
    def get_dts():
        ...
```
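A restructuring consistent with the reuse advice earlier in this thread is to construct the regridder once, outside the generator, so every epoch reuses the same object instead of re-entering ESMF grid creation on the second pass. A hedged stdlib sketch of that shape (`make_regridder` and the string payloads are placeholders for the real xesmf construction and data):

```python
built = []

def make_regridder():
    # Stand-in for the expensive xesmf/ESMF construction step.
    built.append(1)
    return lambda field: f"regridded({field})"

regrid = make_regridder()            # construct once, up front

def train_gen(steps_per_epoch):
    def get_steps():
        for t in range(steps_per_epoch):
            yield regrid(f"timestep_{t}")   # reuse, don't rebuild
    return get_steps()

epoch1 = list(train_gen(3))
epoch2 = list(train_gen(3))          # second epoch: no new ESMF objects
assert len(built) == 1               # regridder built exactly once
assert epoch1 == epoch2
```

If the regridder must live inside the generator for some reason, the same effect can be had by caching it in a module-level variable on first use.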
Set … Also I installed … This problem really bothers me, because both of …
I'm trying to interpolate a section from a curvilinear grid. I'm running an acoustic transmission-loss model and need to extract sections from a model at differing bearings from a point. I've written my code as an xarray accessor.
If I have a dask cluster with one thread per worker, all is good. If I have multiple threads, the code executes correctly the first time, then the second or third time I get:
I know this is probably not the usual use case for this package, but it works nicely. Any thoughts on how to just interpolate along a line would also be gratefully received.
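For the interpolate-along-a-line part, the usual shape is to generate the target points along the chosen bearing first, then hand those lat/lon pairs to the interpolation step as destination locations. A small flat-earth sketch of the point generation (all names are hypothetical; real use would pass these points to xarray/xesmf):

```python
import math

def section_points(lat0, lon0, bearing_deg, max_km, n):
    """Sample n points along a bearing from (lat0, lon0).

    Small-angle flat-earth approximation, reasonable for transmission-loss
    ranges of tens of km; ~111.32 km per degree of latitude."""
    b = math.radians(bearing_deg)
    pts = []
    for i in range(n):
        d = max_km * i / (n - 1)                       # range along the bearing
        dlat = (d * math.cos(b)) / 111.32
        dlon = (d * math.sin(b)) / (111.32 * math.cos(math.radians(lat0)))
        pts.append((lat0 + dlat, lon0 + dlon))
    return pts

pts = section_points(45.0, -5.0, 90.0, 50.0, 6)
# Due east: latitude stays ~constant, longitude increases.
assert abs(pts[-1][0] - 45.0) < 1e-6
assert pts[-1][1] > pts[0][1]
```

For longer ranges or high latitudes a proper geodesic (e.g. via pyproj) would replace the flat-earth step, but the point-list-then-interpolate structure stays the same.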