[hotfix] [build] Make sure JAVA_HOME is set for tests. #5441
Conversation
LGTM if this works... and I think it will. (edit: actually it doesn't cover SBT tests right?)
For SBT, I think we should be able to make a corresponding change in spark/project/SparkBuild.scala (line 429 at 470d745).
Ah, good point, jenkins runs sbt not maven... :-/
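For context, `javaHome` is sbt's standard `Option[File]` setting: when it is defined, sbt uses that JDK's `bin/java` and `bin/javac` for forked compilation and tests instead of the JVM running sbt itself. Below is a minimal sketch of wiring it to the environment variable; the object and value names are made up for illustration, and this is not the exact patch under review.

```scala
import sbt._
import sbt.Keys._
import scala.util.Properties

// Hypothetical settings fragment, for illustration only.
object JavaHomeSettings {
  // Pick up $JAVA_HOME when it is set; otherwise leave javaHome as None,
  // which tells sbt to keep using the JVM it is already running on.
  lazy val settings: Seq[Def.Setting[_]] = Seq(
    javaHome := Properties.envOrNone("JAVA_HOME").map(file)
  )
}
```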
    @@ -119,7 +119,7 @@ object SparkBuild extends PomBuild {
       lazy val publishLocalBoth = TaskKey[Unit]("publish-local", "publish local for m2 and ivy")

       lazy val sharedSettings = graphSettings ++ genjavadocSettings ++ Seq (
    -    javaHome := Properties.envOrNone("JAVA_HOME").map(file),
    +    javaHome := sys.props.get("java.home").map(file),
I'm testing out this patch on one of the Jenkins boxes and it looks like this ended up breaking compilation, because java.home points to a JRE, which will cause us to fail with an error when SBT tries to shell out to javac:
    java.io.IOException: Cannot run program "/usr/java/jdk1.7.0_71/jre/bin/javac": error=2, No such file or directory
        at java.lang.ProcessBuilder.start(ProcessBuilder.java:1047)
        at sbt.SimpleProcessBuilder.run(ProcessImpl.scala:349)
        at sbt.AbstractProcessBuilder.run(ProcessImpl.scala:128)
        at sbt.AbstractProcessBuilder$$anonfun$runBuffered$1.apply(ProcessImpl.scala:159)
        at sbt.AbstractProcessBuilder$$anonfun$runBuffered$1.apply(ProcessImpl.scala:159)
        at sbt.compiler.JavacLogger.buffer(AggressiveCompile.scala:235)
        at sbt.AbstractProcessBuilder.runBuffered(ProcessImpl.scala:159)
        at sbt.AbstractProcessBuilder.$bang(ProcessImpl.scala:156)
    [...]
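One way to keep a fallback while avoiding the JRE path problem is to prefer $JAVA_HOME and, when falling back to java.home, step up one directory, since on JDK 7 java.home typically points at the jre/ directory inside the JDK. The fragment below is only a sketch of that idea, with illustrative names, and is not necessarily the change that was eventually merged.

```scala
import sbt._
import sbt.Keys._

// Hypothetical settings fragment, for illustration only.
object JdkHomeSettings {
  lazy val settings: Seq[Def.Setting[_]] = Seq(
    // Prefer $JAVA_HOME; otherwise derive a JDK root from java.home by taking its
    // parent directory (e.g. /usr/java/jdk1.7.0_71/jre -> /usr/java/jdk1.7.0_71),
    // assuming java.home points at the jre/ directory inside a JDK.
    javaHome := sys.env.get("JAVA_HOME")
      .orElse(sys.props.get("java.home").map(p => file(p).getParentFile.getAbsolutePath))
      .map(file)
  )
}
```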
I think the right fix may be to either modify our Spark code to handle the case where JAVA_HOME is not set, or to configure JAVA_HOME in the Jenkins builds.
I'm modifying all of the builds to set JAVA_HOME.
Test build #29968 has finished for PR 5441 at commit
@JoshRosen hopefully with the latest patch things should work without JAVA_HOME. Did you make the change already?
@vanzin, I pushed that change already but what I can do is configure "new" SparkPullRequestBuilder to not include that change and have it test your PR. One sec...
Test build #29964 has finished for PR 5441 at commit
Test build #657 has finished for PR 5441 at commit
BTW I tested this change locally and the YARN tests pass. Command line:
I also checked the maven side using the
@JoshRosen if I retest this will it pick up the custom builder (without JAVA_HOME)?
Jenkins, retest this please.
@vanzin, "NewSparkPullRequestBuilder" is triggered via the http://spark-prs.appspot.com platform. I've granted you the required permissions on that site, so if you "Sign In with GitHub" on that site you should now see a "test with Jenkins" button next to PRs, which triggers the new PRB. I'll trigger a new run now.
Test build #29973 has finished for PR 5441 at commit
Test build #29984 has finished for PR 5441 at commit
Test build #659 has finished for PR 5441 at commit
Ah, success.
Let me know if you'd prefer to have a bug for this instead of [hotfix].
Just to see if I have this right: this sets JAVA_HOME for the forked test JVMs?
yes, that's it; as I said earlier, maven makes this unnecessarily complicated. :-/
This is needed at least for YARN integration tests, since $JAVA_HOME is used to launch the executors.
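To make the dependency concrete: when executors are launched in YARN containers, the java binary is resolved relative to $JAVA_HOME in the container's environment, so the tests cannot start executors if it is missing. The sketch below is purely illustrative; the object, method, and parameter names are hypothetical and this is not Spark's actual launcher code.

```scala
// Illustrative sketch only; not Spark's actual YARN launcher.
object ExecutorLaunchSketch {
  // Build a container launch command for a hypothetical executor main class.
  def buildExecutorCommand(executorMainClass: String, memoryMb: Int): Seq[String] = {
    val javaBin = sys.env.get("JAVA_HOME") match {
      case Some(home) => s"$home/bin/java" // works only if JAVA_HOME is set in the environment
      case None       => "java"            // otherwise fall back to whatever is on the PATH
    }
    Seq(javaBin, s"-Xmx${memoryMb}m", executorMainClass)
  }
}
```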