[BUG] Spark application stuck on installing extensions to enable rapids GPU SQL support #5369
-
Describe the bug / Steps/Code to reproduce bug
Here is the stack trace up to the point where it stops proceeding:

Expected behavior

Environment details

Additional context
Replies: 4 comments
-
GPU scheduling is not supported in Spark local mode. Please remove those configs and it should proceed. For an example command, see: https://nvidia.github.io/spark-rapids/docs/get-started/getting-started-on-prem.html#local-mode
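A minimal local-mode launch along the lines of the linked guide might look like the sketch below. The jar path and version are placeholders, not taken from this thread; the key point is that the GPU resource-scheduling configs (`spark.executor.resource.gpu.*`, `spark.task.resource.gpu.amount`) are omitted, since local mode does not support them:

```shell
# Hypothetical local-mode launch (jar path/version are placeholders).
# Note: no spark.executor.resource.gpu.* or spark.task.resource.gpu.amount
# configs -- GPU scheduling is not supported in local mode.
spark-shell \
  --master local[*] \
  --jars /path/to/rapids-4-spark.jar \
  --conf spark.plugins=com.nvidia.spark.SQLPlugin
```

The RAPIDS plugin itself still uses the GPU in local mode; it is only Spark's GPU *resource scheduling* that must be left out.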
-
Thanks @tgravescs for the suggestion, it worked! I just have one naive question: how can one know for sure that GPU acceleration is actually enabled for a Spark application? Is there something to look out for in the physical plan?
-
Yes, the best thing to do is look at the explain output or the SQL UI page (the SQL nodes and plan are shown there as well). You can find more information here: https://nvidia.github.io/spark-rapids/docs/get-started/getting-started-on-prem.html#monitoring. You can also look at the Debugging section right below that, which shows how to print what runs on the GPU, what does not, and why. There is also a simple example join here: Let me know if you have other questions.
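As a concrete illustration of the advice above: when the plugin accelerates an operator, the physical plan shows a `Gpu`-prefixed node (e.g. `GpuFilter` instead of `Filter`). Here is a small, self-contained sketch of a hypothetical helper (not part of spark-rapids) that scans the text of an `explain()` plan for such nodes; the sample plan string is illustrative, not from this thread:

```python
# Hypothetical helper: scan an explain() plan string and report which
# operator nodes carry the Gpu prefix that spark-rapids gives to
# accelerated operators.
def gpu_operators(plan: str) -> list[str]:
    ops = []
    for line in plan.splitlines():
        # Strip tree-drawing decorations like "+- " or "*(1) "
        token = line.lstrip("*+- (0123456789)")
        name = token.split(" ")[0].split("(")[0]
        if name.startswith("Gpu"):
            ops.append(name)
    return ops

# Illustrative plan text in the shape explain() prints.
sample_plan = """== Physical Plan ==
GpuColumnarToRow false
+- GpuProject [value#1]
   +- GpuFilter (value#1 > 10)
      +- FileScan parquet [value#1]"""

print(gpu_operators(sample_plan))
# -> ['GpuColumnarToRow', 'GpuProject', 'GpuFilter']
```

If an operator stays on the CPU (like the `FileScan` above), it keeps its ordinary name; the Debugging docs linked above explain how to make the plugin log why a given operator was not placed on the GPU.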
-
Great, thank you @tgravescs! I am going to close this issue now.