Replies: 1 comment
-
This file holds ~19 GB of uncompressed data:

```python
>>> da.nbytes / 1e9
18.966779696
```

Presumably it's compressed on disk. There is also at least one copy being made during the computation, so you can't do it with 32 GB of memory. I would install dask and open the file in chunks:

```python
ds = rioxarray.open_rasterio(path_in, chunks={"y": 5000, "x": 10000})
```

(Note: `rioxarray.open_rasterio` does not take an `engine` keyword; that argument belongs to `xarray.open_dataset`.)
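A hedged end-to-end sketch of that suggestion (`path_out`, the final `.compute()`, and the 100×100 coarsen factor are assumptions not in the original reply; the factor follows the 1 m → 100 m resampling described below):

```python
import rioxarray

# Open lazily with dask chunks: each 5000 x 10000 chunk is ~200 MB for
# float32, so peak memory stays near a few chunks instead of the full ~19 GB.
da = rioxarray.open_rasterio(path_in, chunks={"y": 5000, "x": 10000})

# 1 m -> 100 m: aggregate 100x100 pixel blocks. The chunk sizes above are
# multiples of 100, so coarsen windows never straddle chunk boundaries.
da_100m = da.coarsen(x=100, y=100, boundary="trim").mean().compute()

# The coarsened result is ~10,000x smaller, so writing it is cheap.
da_100m.rio.to_raster(path_out)
```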
-
Hi,
This is a discussion that started on Twitter; @rabernat suggested bringing it here, including code and data.
Today I tried to use the `.coarsen()` method to resample a 3 GB raster file from 1 m to 100 m resolution. The process consumes all of my RAM (32 GB) before crashing with `Process finished with exit code 137 (interrupted by signal 9: SIGKILL)`. (A screenshot of the crash was attached.)

However, I first tried the resampling operation via QGIS and GRASS GIS's `resample` module. Then I did the same procedure with Python's GDAL via `gdal.Warp()`. Both procedures yield resampled rasters within 1-2 minutes.

The code used is really simple:
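The original snippet was not preserved in this copy; what follows is a minimal sketch of the kind of code described, with hypothetical filenames and a 100×100 aggregation window inferred from the 1 m → 100 m target:

```python
import rioxarray

# Hypothetical reconstruction: without chunks=..., the whole ~19 GB
# decompressed array (plus at least one temporary copy) is materialized
# in RAM, which exhausts the 32 GB and triggers the SIGKILL.
da = rioxarray.open_rasterio("svf_1m.tif")
da_100m = da.coarsen(x=100, y=100, boundary="trim").mean()
da_100m.rio.to_raster("svf_100m.tif")
```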
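For comparison, a hedged sketch of the `gdal.Warp()` call that completes quickly (the filenames and the `"average"` resampling choice are assumptions):

```python
from osgeo import gdal

# gdal.Warp resamples in a streaming, windowed fashion, so it never needs
# to hold the full-resolution raster in memory.
gdal.Warp("svf_100m.tif", "svf_1m.tif", xRes=100, yRes=100, resampleAlg="average")
```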
The data can be downloaded from this link: https://filesender.surf.nl/?s=download&token=5e53ebf0-f031-4043-a318-d41d585f03c5 Note that the link will expire in two weeks.
This file is just one of the tiles of the Sky View Factor (SVF) for the Netherlands, publicly available here: https://dataplatform.knmi.nl/dataset/svf-nl-3
Hope this is helpful for locating the problem.
Thanks for all the good work! :-)
Irene