Downloading large files gives out of memory errors #31159
Comments
Feel free to send in a PR. Thanks.
PR was merged.
@driesvints how come this change isn't in Laravel 9.x? How do I apply this fix to my vendor files?
Because it was reverted; see the PR: #31163 (comment)
This is still an issue.
Description:
This bug manifested itself when I pointed Laravel Nova's File field to a file of 1.04 GB. Nova is unable to download such files; the log shows an out-of-memory error. I investigated the problem and I think it lies with Illuminate\Filesystem\FilesystemAdapter, which uses an fpassthru() of the stream.
framework/src/Illuminate/Filesystem/FilesystemAdapter.php
Lines 163 to 167 in f3c1f48
This loads the whole file into memory. The following would work:
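A sketch of the chunked while-loop alternative, following the pattern from the fpassthru manual comment credited below. The 8192-byte chunk size and the temporary-file setup are illustrative assumptions, not the framework's actual code:

```php
<?php
// Sketch: stream a file in fixed-size chunks instead of fpassthru(),
// so peak memory stays bounded regardless of file size.
// The 8192-byte chunk size and temp-file setup are illustrative.

$path = tempnam(sys_get_temp_dir(), 'large');
file_put_contents($path, str_repeat('x', 100000)); // stand-in for the large file

$stream = fopen($path, 'rb');

ob_start();
while (! feof($stream)) {
    echo fread($stream, 8192); // emit one chunk at a time
    // flush(); // in a real HTTP response, push each chunk to the client
}
fclose($stream);

$sent = ob_get_length();
ob_end_clean();
unlink($path);

printf("streamed %d bytes\n", $sent);
```

In a real StreamedResponse callback the echoed chunks would go straight to the client, so memory usage stays near the chunk size rather than the file size.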
Steps To Reproduce:
Get yourself a large file and point a tinker session to it:
>>> Storage::download('large-file.csv')->send()
With fpassthru(), you get "Allowed memory size of 134217728 bytes exhausted (tried to allocate 1041674240 bytes)". With the while loop, it just streams the data. Happy to PR.
Credits: https://www.php.net/manual/en/function.fpassthru.php#29162
More examples on that page too.