Uploading a large >5GB file to S3 errors out #1945
I've gotten this response back from the Express server. I don't really know how, as I can't reproduce it, but I think it gives some clue:
Looks like it hit some sort of limit at 102400 bytes. Could that be Express's default JSON body-size limit? Does Uppy send a JSON request to Companion after the upload has finished? And could that JSON request be too large for Express's default configuration to handle?
Now I get this error midway through uploading a 15 GB file:
This is after I increased the bodyParser.json limit:
I was able to upload a 10 GB file fine with this, but 15 GB is giving me the above error.
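For reference, a change along those lines would look something like the following sketch (the `'100mb'` value is illustrative, not the value the commenter used):

```javascript
const express = require('express')
const bodyParser = require('body-parser')

const app = express()

// body-parser's default JSON limit is '100kb' (102400 bytes); raising it
// lets a large completion payload through instead of being rejected.
app.use(bodyParser.json({ limit: '100mb' }))
```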
Upon further tests, the problem is that I'm just randomly getting 502 Bad Gateway from the AWS endpoint.
Anyone got any ideas?
Thanks for documenting your investigation! I don't really have a clue why it would do that off the top of my head. Does it only happen on the accelerated endpoint, or also without it?
Actually, this might be causing the S3 requests to fail: you've set no concurrency `limit`. Not setting a limit has very bad failure modes like this, and we'll configure one by default in a future version. Most likely, if you set a small `limit`, the uploads will succeed. The JSON issue is interesting too 🤔 we do send a big JSON object to Companion when completing the upload.
Okay, thanks. Let me try setting the limit to 5 and see if that fixes it.
Any update on this? Is it resolved? I am having trouble uploading files larger than 8 GB.
@amangijoe Have you set a `limit`?

```js
uppy.use(AwsS3Multipart, {
  /* ... your existing options ... */
  limit: 5
})
```
Yes, setting `limit` somehow fixed it for me.
Great, thanks for the confirmation! We'll be setting a default limit of 6 or 10 or something in the future so we don't run into these issues anymore.
It's still not working. Here is what I have understood the problem to be; this is the error that I face when my upload completes:
I think the completeMultipartUpload request (documented here) that we send is the reason we are getting the error above. Maybe if we increase the size of the chunks, the parts array will be smaller. Tell me if I am going wrong somewhere, thanks.
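For a sense of scale, a hypothetical completion body listing a few thousand parts already blows past body-parser's 102400-byte default (the shape below is an assumption for illustration, not Uppy's exact wire format):

```javascript
// One { PartNumber, ETag } entry per uploaded part; 3,000 parts is roughly
// what a 15 GB file produces when split into 5 MB chunks.
const parts = Array.from({ length: 3000 }, (_, i) => ({
  PartNumber: i + 1,
  ETag: '"d41d8cd98f00b204e9800998ecf8427e"',
}))
const body = JSON.stringify({ parts })

// body-parser's default JSON limit is '100kb', i.e. 102400 bytes.
console.log(body.length > 102400) // true: the default limit rejects this body
```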
In MultipartUploader.js, the chunk-size calculation will always result in 5 MB as the chunk size until we try to upload a file greater than 50 GB, so the final completion request lists thousands of parts. I changed the value from 10000 to 500 and was able to upload files up to 30 GB with ease. @goto-bus-stop Do you want me to generate a pull request that allows you to configure the chunk size and also decides an optimal chunk size depending on the file size?
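The heuristic being described can be sketched like this (the function name and structure are illustrative, not the exact MultipartUploader.js source):

```javascript
const MB = 1024 * 1024

// Aim for S3's maximum of 10,000 parts per upload, but never go below
// the 5 MB minimum part size S3 enforces for all parts except the last.
function chunkSize (fileSize) {
  return Math.max(Math.ceil(fileSize / 10000), 5 * MB)
}

// For any file under ~50 GB the 5 MB floor wins, so e.g. a 15 GB upload
// is split into ~3,000 parts, all of which must be listed in the final
// completeMultipartUpload request.
const fifteenGB = 15 * 1024 * MB
console.log(chunkSize(fifteenGB) / MB)                   // 5
console.log(Math.ceil(fifteenGB / chunkSize(fifteenGB))) // 3072
```

Dividing by 500 instead, as suggested above, makes chunks grow past 5 MB for any file over ~2.5 GB, which shrinks the parts list and the completion payload.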
@amanstack sorry I didn't see your comment! If you'd still like to create a PR for that, that would be super appreciated!
@goto-bus-stop sure, I'll create a PR right away. Thanks.
Looks like I'm having the exact same issue. We are able to upload multipart files of around 5 GB, but above 8 GB we get the exact same error. I have gone in and updated MultipartUploader.js as @amanstack suggested, but it doesn't seem to make any difference. Was anybody able to fix this or figure out why it's doing this?
We are facing the same problem (uploading files up to 8 GB) and I'm not sure why. I assumed the upload goes directly to S3, so why is the default configuration causing any errors when uploading files? Is my Companion server (Express) still involved in uploading?
This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions. |
Has this been fixed? |
…r s3 (transloadit#4372)

* refine body parsers to make it possible to adjust them individually — also improves security and speed because we don't have to parse all body types (and all sizes) for all endpoints
* increase body size for the s3 complete endpoint

fixes transloadit#1945
Help, I posted this same topic on the Companion forum and got zero response, so I'm trying my luck here (https://community.transloadit.com/t/is-there-a-file-size-limit/15115/4).
I'm using the AWS MultiPart plugin like this:
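A minimal AwsS3Multipart setup of this kind might look like the following sketch (the `companionUrl` is a placeholder, not the poster's actual configuration):

```javascript
import Uppy from '@uppy/core'
import AwsS3Multipart from '@uppy/aws-s3-multipart'

const uppy = new Uppy()

// Placeholder Companion URL; point this at your own Companion server.
uppy.use(AwsS3Multipart, {
  companionUrl: 'https://companion.example.com',
})
```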
This is the error from Uppy after it finished uploading a large file (15 GB):
It works fine when I tested uploading a 5GB file.
Has anyone ever tested uploading something greater than 5GB?
I'm wondering if this is a Node/Express issue or something else.
Please let me know if anyone has tried it, thanks.