[SPARK-3217] Add Guava to classpath when SPARK_PREPEND_CLASSES is set. #2141
Conversation
When that option is used, the compiled classes from the build directory are prepended to the classpath. Now that we avoid packaging Guava, that means we have classes referencing the original Guava location in the app's classpath, so errors happen. For that case, add Guava manually to the classpath.

Note: if Spark is compiled with "-Phadoop-provided", it's tricky to make things work with SPARK_PREPEND_CLASSES, because you need to add the Hadoop classpath using SPARK_CLASSPATH, and that means the older Hadoop Guava overrides the newer one Spark needs. So someone using SPARK_PREPEND_CLASSES needs to remember not to use that profile.
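The classpath logic described above can be sketched roughly as follows. This is an illustrative sketch, not Spark's actual `bin/compute-classpath.sh`; the directory layout, jar names, and Guava version used here are assumptions.

```shell
#!/usr/bin/env bash
# Sketch of the idea behind this change (paths and versions are assumed,
# not taken from the real compute-classpath.sh).

compute_classpath() {
  # Normally the app runs against the assembly jar, where Guava's
  # classes have been relocated (shaded) to a different package.
  local cp="$SPARK_HOME/assembly/target/spark-assembly.jar"

  if [ -n "$SPARK_PREPEND_CLASSES" ]; then
    # Developer mode: prepend freshly compiled classes so changes are
    # picked up without rebuilding the assembly.
    cp="$SPARK_HOME/core/target/classes:$cp"
    # Those compiled classes still reference the original
    # com.google.common.* names, so a real Guava jar must be added
    # explicitly (version here is a placeholder).
    cp="$SPARK_HOME/lib_managed/jars/guava-14.0.1.jar:$cp"
  fi

  printf '%s\n' "$cp"
}

SPARK_HOME=/opt/spark
unset SPARK_PREPEND_CLASSES
default_cp="$(compute_classpath)"
export SPARK_PREPEND_CLASSES=1
dev_cp="$(compute_classpath)"
echo "default: $default_cp"
echo "dev:     $dev_cp"
```

With `SPARK_PREPEND_CLASSES` set, Guava and the build-output classes end up ahead of the assembly jar; without it, only the assembly (with its relocated Guava) is on the classpath.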
QA tests have started for PR 2141 at commit
QA tests have finished for PR 2141 at commit
Jenkins, test this please.
It works for me. Our dear friend
test this please. LGTM
QA tests have started for PR 2141 at commit
QA tests have finished for PR 2141 at commit
Friendly ping.
/cc @pwendell
Thanks, this looks good. It's really too bad we keep having to add complexity to the build to support this. For instance, I'm not sure whether it's safe in all cases to add this extra classpath entry. Maybe this weekend I'll see if I can just publish a shaded Guava and we can remove the build magic. In the meantime, I can merge this, thanks.
+1