Laravel Version
10.34.2

PHP Version
8.1.26

Database Driver & Version
No response

Description
A project I'm working on has 300k failed jobs. When I try to run php artisan queue:retry, it results in an error:

PHP Fatal error: Allowed memory size of 536870912 bytes exhausted

After looking into how the RetryCommand works, I realized that rather than querying just the IDs from the failed_jobs table, it loads the entire table and then plucks the ID column. No matter how high I set the memory limit, the server won't be able to handle loading all 300k failed jobs into memory.

This is a very inefficient way to load the failed job IDs. Any thoughts on how it could be improved without breaking anything?
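For context, the expensive path boils down to the pattern below; a simplified sketch, not the framework's exact code:

<?php

// Simplified sketch of the problem described above (not the framework's
// exact code). Retrying everything first hydrates every failed_jobs row,
// including the large payload and exception columns, then keeps only the ids.
use Illuminate\Support\Arr;
use Illuminate\Support\Facades\DB;

// Memory-hungry: materializes all 300k rows just to read one column.
$ids = Arr::pluck(DB::table('failed_jobs')->get()->all(), 'id');

// Memory-friendly: let the database return only the id column.
$ids = DB::table('failed_jobs')->pluck('id')->all();

The second query holds only the IDs in memory, which for 300k integer IDs is a few megabytes at most, while the first drags every payload and exception trace through PHP.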
Steps To Reproduce
1. Populate the failed_jobs table with 300k failed jobs (a seeding sketch follows below)
2. Run php artisan queue:retry
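Populating that many rows by actually failing jobs is slow; a quicker route is to insert synthetic rows directly. A minimal sketch, assuming the default failed_jobs migration, runnable from php artisan tinker or a throwaway seeder (the payload and exception values are placeholders, not real job data):

<?php

// Seeds 300k synthetic rows into failed_jobs in batches of 1,000.
// Column names follow the default failed_jobs migration.
use Illuminate\Support\Facades\DB;
use Illuminate\Support\Str;

foreach (range(1, 300) as $batch) {
    DB::table('failed_jobs')->insert(array_map(fn ($i) => [
        'uuid' => (string) Str::uuid(),
        'connection' => 'database',
        'queue' => 'default',
        'payload' => json_encode(['displayName' => 'App\\Jobs\\ExampleJob']),
        'exception' => str_repeat("#0 fake stack frame\n", 50),
        'failed_at' => now(),
    ], range(1, 1000)));
}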
[10.x] Fixes retrying failed jobs causes PHP memory exhaustion errors when dealing with thousands of failed jobs (#49186)

* [10.x] Fixes retrying failed jobs causes PHP memory exhaustion errors when dealing with thousands of failed jobs

fixes #49185
Signed-off-by: Mior Muhammad Zaki <crynobone@gmail.com>
* Apply fixes from StyleCI
* formatting
---------
Signed-off-by: Mior Muhammad Zaki <crynobone@gmail.com>
Co-authored-by: StyleCI Bot <bot@styleci.io>
Co-authored-by: Taylor Otwell <taylor@laravel.com>
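The merge commit above references the fix (#49186) but not its diff. A general memory-safe shape for this kind of operation is to stream just the IDs in bounded chunks; a minimal sketch, assuming the default database failed-job driver and a hypothetical retryJob() helper, not necessarily what the framework's patch does:

<?php

// Minimal sketch of a memory-safe retry loop. lazy() pages through the
// table behind the scenes, so only one chunk of id rows is in memory at
// a time instead of the whole failed_jobs table.
use Illuminate\Support\Facades\DB;

DB::table('failed_jobs')
    ->select('id')          // never pull the payload/exception columns
    ->orderBy('id')         // lazy() requires a stable ordering
    ->lazy(1000)            // fetch 1,000 rows per underlying query
    ->each(function ($job) {
        retryJob($job->id); // hypothetical: push the job back onto its queue
    });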