[6.x] Fix memory usage on downloading large files #31163
Conversation
I have just tested it again to be sure, but no: this really works for me and solves the problem. To be fair, I'm testing on localhost with the local driver, but as I understand it, other drivers shouldn't really matter.
Yes, as you understand it. Have you run the code? The following:

```php
$stream = $this->readStream($path);

while (! feof($stream)) {
    echo fread($stream, 2048);
}

fclose($stream);

\Log::info(memory_get_peak_usage());
```

... with a file download of 1.04 GB gives me 21 MB in the log.

Do you have an example in which it does not work? (I want to learn too here.)
Very interesting, thanks! The way I tested it was through a From the context I get here, however, I think it's Tinker itself that blows up. See the reference to 'php-console-highlighter'.
The crash in the console is also in the log, but there is no mention of crashes in the logs when I go through the browser, so I am still convinced my fix works.
@voyula It's not bad to be wrong. I was wrong too, because I thought you did not run the code, but you did, and you had a good point. Don't delete your comments. Thanks for the research!
@sebsel Yes, it seems to be working now, magically. But output buffering can affect it, as it does fpassthru, so I'm personally not sure about this solution. I did delete my uncertain suggestions; the Laravel core members can give feedback on this issue. Thanks for your explanations. A benchmarking source:
```diff
@@ -162,7 +162,9 @@ public function response($path, $name = null, array $headers = [], $disposition

         $response->setCallback(function () use ($path) {
             $stream = $this->readStream($path);
-            fpassthru($stream);
+            while (! feof($stream)) {
+                echo fread($stream, 2048);
+            }
```
Do we know why this "dark magic hackery" fixes it 🧙♂️ ?
It seems output buffering tries to chunk the data based on PHP's free memory size. If you hand it a payload larger than the free memory, that chunking attempt fails. This `while` loop feeds the output buffer data in small pieces, which helps the buffering keep up within the available memory.
This issue was so interesting to me. :D
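The explanation above can be illustrated with a minimal standalone sketch (a hypothetical helper, not the actual framework code): reading and echoing a fixed-size chunk at a time keeps peak memory near the chunk size rather than the file size.

```php
<?php

// Hypothetical helper illustrating the chunked-read approach discussed
// above: peak memory stays close to the chunk size, not the file size.
function streamFile(string $path, int $chunkSize = 2048): int
{
    $stream = fopen($path, 'rb');
    $bytes = 0;

    while (! feof($stream)) {
        $chunk = fread($stream, $chunkSize);
        echo $chunk;
        $bytes += strlen($chunk);
    }

    fclose($stream);

    return $bytes;
}
```

With a small chunk size, `memory_get_peak_usage()` stays roughly flat no matter how large the file is, which matches the 21 MB measurement reported above for a ~1 GB download.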
Why not stream_copy_to_stream? Eg https://github.com/symfony/http-foundation/blob/0f977dd6a6ab22789f49cefccdb623734da2ba3c/BinaryFileResponse.php#L297-L303
> Why not stream_copy_to_stream? Eg https://github.com/symfony/http-foundation/blob/0f977dd6a6ab22789f49cefccdb623734da2ba3c/BinaryFileResponse.php#L297-L303
Actually, it's functionally the same thing. The only difference is that the `while` loop explicitly chunks the data for output buffering.
This reverts commit f239c07.
Reverting because this broke downloads.

Hello, I had the same problem (memory overflow while handling a large set of data) using "laravel/framework": "5.8". So, does this fix not work because it breaks something else? Or can I apply it? Do I have to apply it manually in the sources?

@LucaColombi 5.8 isn't supported anymore, please upgrade.

Same problem with Laravel 7. No fix yet?

@rtrudel you're free to make another attempt if you like.

Going through the same problem; due to the revert it hasn't been fixed yet. Any update on whether it's going to be fixed in upcoming versions? I'm thinking of overriding the method for now. Has anyone got a workaround?
I'd like to share a workaround I got based on the "fix" @sebsel introduced.
The above chunks the stream of the file instead of loading it all into PHP memory, as mentioned before by @sebsel. I got this fix here: https://www.php.net/manual/en/function.fpassthru.php#29162
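Based on the php.net note linked above, such a workaround can be sketched as a chunked replacement for `fpassthru()` (hypothetical function name; the flush calls follow the common pattern from that note, so buffered output leaves the process promptly):

```php
<?php

// Hypothetical sketch of the workaround referenced above: replace
// fpassthru() with a chunked fread() loop so a large file never has
// to fit in memory at once.
function fpassthruChunked($stream, int $chunkSize = 1024 * 1024): int
{
    $bytes = 0;

    while (! feof($stream)) {
        $chunk = fread($stream, $chunkSize);
        echo $chunk;
        $bytes += strlen($chunk);

        // Push buffered output toward the client after each chunk.
        if (ob_get_level() > 0) {
            ob_flush();
        }
        flush();
    }

    return $bytes;
}
```

In a Laravel app this could be dropped into an overridden download method in place of the `fpassthru($stream)` call shown in the diff above.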
What exactly did this change break, @taylorotwell?
Fixes issue #31159