[SPARK-30626][K8S] Add SPARK_APPLICATION_ID into driver pod env #27347
Conversation
ok to test
@Jeffwan . The use case looks reasonable.
BTW, as a side-note, please participate in the VOTE if you can. You don't need to run the full test suite; you can test only what you are interested in (maybe EKS?) and vote.
Test build #117322 has finished for PR 27347 at commit
Kubernetes integration test starting
Kubernetes integration test status failure
Interesting. There are two failures. Could you run the K8s Integration Tests locally and share the result here?
KubernetesSuite:
- Run SparkPi with no resources *** FAILED ***
The code passed to eventually never returned normally. Attempted 70 times over 2.0010790142166663 minutes. Last failure message: false was not true. (KubernetesSuite.scala:315)
- Run SparkPi with a very long application name. *** FAILED ***
The code passed to eventually never returned normally. Attempted 70 times over 2.0007419548333334 minutes. Last failure message: false was not true. (KubernetesSuite.scala:315)
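For reference, the K8s integration tests can be run locally along these lines (an illustrative sketch; the exact flags and prerequisites are described in `resource-managers/kubernetes/integration-tests/README.md`, and a local minikube cluster is assumed):

```sh
# Build Spark with Kubernetes support, then run the integration tests
# against a local minikube cluster (illustrative invocation; consult the
# integration-tests README for the full set of options).
./build/mvn -Pkubernetes -DskipTests clean package
cd resource-managers/kubernetes/integration-tests
./dev/dev-run-integration-tests.sh --deploy-mode minikube
```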
@dongjoon-hyun Thanks for the review. Sure, I'd love to. You mean 2.4.5 and 3.0.0-preview2? I manually built Spark and tested SparkPi; let me run the integration tests and see if anything is different.
I mean
Ran the integration tests locally and noticed the problem. Seems
Noticed there's a pod failure in my cluster.
Here's the pod spec. It's pretty clear: the pod's image wasn't pushed to the registry, so the pod cannot fetch the image correctly. But I am not sure whether this has the exact same cause as the 2 failures in CI. Because of the missing container image, all my tests failed; if there are only 2 failures, it's probably due to some other problem.
@dongjoon-hyun BTW, when I run
Some paths may not be available and need to be updated; I added
Test build #117348 has finished for PR 27347 at commit
It's weird. I'll take a look tomorrow~
Kubernetes integration test starting
Kubernetes integration test status success
Oh, the second run looks good. All tests passed.
+1, LGTM. Thank you, @Jeffwan . This is merged for 3.0.0.
I added your new JIRA ID to the Apache Spark contributor group, too. I'd recommend using a single ID; that will make it easier to find your contributions.
I appreciate that! Thanks! I will definitely get more involved in the community.
What changes were proposed in this pull request?
Add the SPARK_APPLICATION_ID environment variable when Spark configures the driver pod.
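Conceptually, after this change the driver pod spec would contain an entry along these lines (a hedged sketch; the application ID value is a made-up example, and the surrounding fields and ordering depend on the other feature steps):

```yaml
# Illustrative fragment of a driver pod spec after this change;
# the value shown for SPARK_APPLICATION_ID is a fabricated example.
spec:
  containers:
    - name: spark-kubernetes-driver
      env:
        - name: SPARK_APPLICATION_ID
          value: spark-a1b2c3d4e5f6
```

One could then verify it from inside the cluster with something like `kubectl exec <driver-pod> -- printenv SPARK_APPLICATION_ID` (the pod name here is a placeholder).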
Why are the changes needed?
Currently the driver pod does not have this in its environment, and it is not convenient to retrieve the Spark application ID from inside the container.
The use case is that we want to look up the Spark application ID, create a per-application folder, and redirect driver logs into that folder.
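As a minimal sketch of that use case (the `/tmp` path and the fallback application ID below are illustrative, not part of the PR), a driver-side wrapper could redirect logs like this:

```shell
#!/bin/sh
# SPARK_APPLICATION_ID is assumed to be injected into the driver pod's
# environment by this change; the fallback value is only for a local demo.
SPARK_APPLICATION_ID="${SPARK_APPLICATION_ID:-spark-demo-0001}"

# Create a per-application log folder and append a driver log line to it.
LOG_DIR="/tmp/spark-logs/${SPARK_APPLICATION_ID}"
mkdir -p "$LOG_DIR"
echo "driver started" >> "$LOG_DIR/driver.log"
echo "$LOG_DIR"
```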
Does this PR introduce any user-facing change?
No.
How was this patch tested?
Unit tested. I also built a new distribution and container image to kick off a job in Kubernetes, and I do see SPARK_APPLICATION_ID added there.