SPARK-2641: Passing num executors to spark arguments from properties file #1657
Conversation
SPARK-2641: Fixing how spark arguments are loaded from properties file for num executors
Can one of the admins verify this patch?
I think currently …
This makes sense to me. However, we should also document it and mention that it only currently works for YARN.
ok to test
QA tests have started for PR 1657 at commit …
QA tests have finished for PR 1657 at commit …
Can one of the admins verify this patch?
ok to test
QA tests have started for PR 1657 at commit …
QA tests have finished for PR 1657 at commit …
nit: could you make the PR title refer to what's actually changing? The current title sounds very generic, while the change is specific to a single config option.
@@ -105,6 +105,8 @@ private[spark] class SparkSubmitArguments(args: Seq[String]) {
       .getOrElse(defaultProperties.get("spark.cores.max").orNull)
     name = Option(name).getOrElse(defaultProperties.get("spark.app.name").orNull)
     jars = Option(jars).getOrElse(defaultProperties.get("spark.jars").orNull)
+    numExecutors = Option(numExecutors)
+      .getOrElse(defaultProperties.get("spark.executor.instances").orNull)
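For readers skimming the hunk: the change reuses the same precedence pattern the surrounding lines already apply to spark.cores.max, spark.app.name, and spark.jars. Below is a minimal, self-contained sketch of that pattern; the object name and the hard-coded map are illustrative stand-ins, not the actual SparkSubmitArguments internals.

```scala
// Illustrative only: a --num-executors value given on the command line wins,
// otherwise we fall back to spark.executor.instances from the properties file.
object FallbackSketch {
  def main(args: Array[String]): Unit = {
    // Hypothetical defaults, as if parsed from the file passed via --properties-file.
    val defaultProperties: Map[String, String] = Map("spark.executor.instances" -> "4")

    // null unless --num-executors was passed on the command line.
    var numExecutors: String = null

    // Same Option(x).getOrElse(default.orNull) pattern as in the hunk above.
    numExecutors = Option(numExecutors)
      .getOrElse(defaultProperties.get("spark.executor.instances").orNull)

    println(s"resolved numExecutors = $numExecutors") // prints 4, taken from the properties file
  }
}
```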
Do we want to do this for YARN only?
In standalone or local mode, the number of executors is not configurable per job.
Ok, I'm merging this into master, 1.2, 1.1 and 1.0. Thanks.
SPARK-2641: Fixing how spark arguments are loaded from properties file for num executors

Since we can set spark executor memory and executor cores using property file, we must also be allowed to set the executor instances.

Author: Kanwaljit Singh <kanwaljit.singh@guavus.com>

Closes #1657 from kjsingh/branch-1.0 and squashes the following commits:

d8a5a12 [Kanwaljit Singh] SPARK-2641: Fixing how spark arguments are loaded from properties file for num executors
SPARK-2641: Fixing how spark arguments are loaded from properties file for num executors

Since we can set spark executor memory and executor cores using property file, we must also be allowed to set the executor instances.

Author: Kanwaljit Singh <kanwaljit.singh@guavus.com>

Closes #1657 from kjsingh/branch-1.0 and squashes the following commits:

d8a5a12 [Kanwaljit Singh] SPARK-2641: Fixing how spark arguments are loaded from properties file for num executors

Conflicts:
	core/src/main/scala/org/apache/spark/deploy/SparkSubmitArguments.scala
Since we can set Spark executor memory and executor cores through a properties file, we should also be able to set the number of executor instances the same way.
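As a rough usage illustration (the file path and values below are made up for the example): once spark.executor.instances is set in the file passed via --properties-file, a YARN job no longer needs an explicit --num-executors flag. The sketch below only mimics how such a file could be read; the real loading happens inside SparkSubmitArguments.

```scala
import java.io.FileInputStream
import java.util.Properties
import scala.collection.JavaConverters._

// Hypothetical stand-in for reading --properties-file entries; for illustration only.
object PropertiesFileSketch {
  def main(args: Array[String]): Unit = {
    // Assume a file containing, among others, a line like:
    //   spark.executor.instances  4
    val props = new Properties()
    val in = new FileInputStream("conf/spark-defaults.conf") // assumed path
    try props.load(in) finally in.close()

    val defaults = props.asScala

    // With this patch, spark.executor.instances is picked up the same way
    // spark.executor.memory and spark.cores.max already are (YARN only).
    val numExecutors = defaults.get("spark.executor.instances").orNull
    println(s"num executors from properties file: $numExecutors")
  }
}
```

With a value like that in place, an invocation such as `spark-submit --master yarn --properties-file conf/spark-defaults.conf ...` would take its executor count from the file instead of requiring --num-executors on the command line.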