[BUG] parquet_test.py::test_buckets failed due to SparkOutOfMemoryError intermittently #10940
Comments
Did not reproduce on subsequent builds.
It appeared one more time in the nightly dev build: rapids_it test, standalone, Spark 3.3.0.
The failure above was found in a nightly multi-threaded run on an EGX machine.
Also encountered in pre-merge, seqID: 9729.
Saw another instance fail in pre-merge in #11246.
This is happening more and more frequently. |
Trying a mitigation first: #11260.
Is this resolved now because of #11260? |
We have not seen the issue since the mitigation PR (increased heap memory), but we still need developers to confirm whether the increased memory usage is expected.
The remaining scope is to add an FAQ entry to the docs about possible settings related to the JDK version; a sketch of what such an entry might cover follows below.
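As a hedged sketch only: the specific flags and values here are assumptions, not confirmed project guidance. JDK11 changed the JVM's default garbage collector from Parallel GC to G1, which can change heap headroom relative to JDK8, so a first experiment is to raise the heap and/or pin the collector through Spark's standard JVM-option configs:

```python
# Illustrative sketch, not the project's confirmed settings.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("jdk11-heap-experiment")  # hypothetical app name
    # Raise the driver/executor heap; 4g is an illustrative figure.
    # Note: spark.driver.memory only takes effect if set before the
    # driver JVM is launched (here, before getOrCreate in a fresh process).
    .config("spark.driver.memory", "4g")
    .config("spark.executor.memory", "4g")
    # JDK11 defaults to G1 GC; optionally pin the JDK8-era collector
    # to compare memory behavior across JDK versions.
    .config("spark.driver.extraJavaOptions", "-XX:+UseParallelGC")
    .config("spark.executor.extraJavaOptions", "-XX:+UseParallelGC")
    .getOrCreate()
)
```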
Describe the bug
parquet_test.py::test_buckets failed due to org.apache.spark.memory.SparkOutOfMemoryError: Unable to acquire 65536 bytes of memory, got 0
This failure occurred in the JDK11 nightly build (JDK11-nightly/625).
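For context on why a request for even 64 KiB can be granted 0 bytes: under Spark's unified memory manager, execution tasks draw from a pool that is only a fraction of the JVM heap. A back-of-the-envelope calculation, using the default spark.memory.fraction of 0.6 and Spark's fixed 300 MB reserved memory (defaults from the Spark documentation; the 1g heap is illustrative):

```python
# Rough unified-memory arithmetic (illustrative values, Spark defaults).
heap_bytes = 1 * 1024**3        # e.g. a 1g heap (illustrative)
reserved = 300 * 1024**2        # Spark's fixed reserved memory (300 MB)
usable = heap_bytes - reserved
unified_pool = usable * 0.6     # spark.memory.fraction default

print(f"unified pool: {unified_pool / 1024**2:.0f} MB shared by storage and execution")
# Once concurrent tasks and cached blocks have consumed this pool, a request
# for 65536 bytes can be granted 0 bytes, surfacing as SparkOutOfMemoryError.
```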