User cannot upload 30GB file #1208
Don't have a ton of detail here, but a user reached out to the support email with an issue uploading a large (30GB) file. Here are the errors. If anyone else encounters something similar, please comment so we can work with you to understand what's going on.

Comments
@dchoi27 judging by the stack trace, something is not working in the Piece computation. I also think the other issue we saw from Ian hits the same code path in the WASM implementation of the Piece hashing.
There seem to be multiple issues at play here. Looking into this, I discovered that memory use keeps growing, and at around 4GB the WASM starts to fail. One issue is that we appear to fail to deallocate memory, tracked in storacha/fr32-sha2-256-trunc254-padded-binary-tree-multihash#28. But even if I remove the Piece hashing entirely here, memory use still keeps climbing, reaching around 15GB when uploading a 17GB file, and it rises quickly when only about 6% of the file has been processed. If I comment the following lines out, memory use stays under 200MB, so something is wrong inside the commented block.
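For context on the deallocation side of this, here is a minimal sketch (not the actual w3up or store.js code) of driving a WASM-backed piece hasher over a large Blob in streaming fashion and explicitly releasing it afterwards. The create/write/digest/free method names are assumptions for illustration and are not confirmed against the real package API; the point is that WASM linear memory is not reclaimed by the JavaScript garbage collector, so an explicit free is needed.

```js
// Hypothetical sketch, not the actual w3up implementation. Assumes the
// WASM-backed hasher exposes create()/write()/digest()/free(); those
// method names are illustrative, not confirmed.
import * as PieceHasher from 'fr32-sha2-256-trunc254-padded-binary-tree-multihash'

async function computePieceDigest (blob) {
  const hasher = PieceHasher.create()
  const reader = blob.stream().getReader()
  try {
    while (true) {
      const { done, value } = await reader.read()
      if (done) break
      // Feed the file chunk by chunk instead of buffering all 30GB at once.
      hasher.write(value)
    }
    return hasher.digest()
  } finally {
    // WASM linear memory is not garbage collected from the JS side, so the
    // hasher must be released explicitly or its allocations accumulate
    // until the ~4GB wasm memory ceiling is hit.
    hasher.free()
  }
}
```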
I'm focusing on storacha/fr32-sha2-256-trunc254-padded-binary-tree-multihash#28. @gobengo, if you have time, maybe you can take a look at the other leak in the store.js file.
…eak fixes (#63) Motivation: * storacha/w3up#1208
I was able to upload a 6GB file just now, even though it used to trigger this same error. I am hopeful 🤞🏻 this might be fixed. @dchoi27, could you ask the end user to retest?
Ah, unfortunately they asked to have their account deleted when this didn't work. Maybe if we get 1-2 other points of validation we can close this.
ok - the original user got back to me and confirmed it worked! Closing this.