Thanks for reporting this issue. Indeed, the Sentinel-1 data is very heavy, and I have to use small chunk sizes when working with it on my laptop, for example:
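(The original snippet isn't preserved above; a minimal sketch of the kind of setting meant, assuming the chunk size is controlled through dask's `array.chunk-size` option, with an illustrative value:)

```python
import dask.config

# Keep individual dask chunks small so the Sentinel-1 arrays do not exhaust
# memory; "32MiB" is only an illustrative value, not a recommendation.
dask.config.set({"array.chunk-size": "32MiB"})
```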
Thanks for the hint; it does work for me. I usually set DASK_ARRAY__CHUNK_SIZE to 2200*2200 (about 68 MB), which gives me satisfactory performance with most readers except this one. I wish we had more specific tips on adjusting the value depending on the reader. That would be better...
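(For reference, a sketch of two equivalent ways to apply that setting: `DASK_ARRAY__CHUNK_SIZE` is the environment-variable spelling of dask's `array.chunk-size` option and must be set before dask is first imported; the value below is illustrative, not the one quoted above.)

```python
import os

# Environment-variable form: set it before dask is imported so that the
# configuration is picked up when dask loads.
os.environ["DASK_ARRAY__CHUNK_SIZE"] = "64MiB"  # illustrative value

import dask.config

# Equivalent programmatic form, applied at runtime without an env variable.
dask.config.set({"array.chunk-size": "64MiB"})
```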
Describe the bug
Reading the data consumes nearly all of my 64 GB of memory, and it still asks for more.
To Reproduce
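(The original reproduction snippet isn't preserved here; a sketch of the sort of workflow that triggers the memory growth, assuming satpy's `sar-c_safe` reader and a local Sentinel-1 SAFE product. The path and the `measurement` dataset name are placeholders, not the reporter's actual code.)

```python
from glob import glob

from satpy import Scene

# Placeholder path for a real Sentinel-1 SAFE product directory.
filenames = glob("S1A_IW_GRDH_1SDV_20230101T000000_*.SAFE/**/*", recursive=True)

# Load and compute a dataset with the Sentinel-1 SAR-C reader; this is the
# step where memory usage climbs.
scn = Scene(filenames=filenames, reader="sar-c_safe")
scn.load(["measurement"])
scn["measurement"].compute()
```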
Actual results
Environment Info: