[SPARK-51254][PYTHON][CONNECT] Disallow --master with Spark Connect URL #50000

Closed · wants to merge 1 commit
5 changes: 5 additions & 0 deletions python/pyspark/errors/error-conditions.json
@@ -507,6 +507,11 @@
"Variant binary is malformed. Please check the data source is valid."
]
},
"MASTER_URL_INVALID": {
Contributor:
Shouldn't we already have an error condition for this? How did we validate the master URL before, without Spark Connect?

Member (author):
We actually don't have the error class (and even if there were one, it would have to be listed in the JVM's error class JSON file). The error is thrown from prepareSubmitEnvironment at core/src/main/scala/org/apache/spark/deploy/SparkSubmit.

"message": [
"Master must either be yarn or start with spark, k8s, or local."
]
},
"MASTER_URL_NOT_SET": {
"message": [
"A master URL must be set in your configuration."
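The new error message states the rule that SparkSubmit enforces for master URLs. A minimal standalone sketch of that rule (the function name `is_valid_master_url` is hypothetical, not part of PySpark; the real check lives in SparkSubmit.prepareSubmitEnvironment on the JVM side):

```python
def is_valid_master_url(url: str) -> bool:
    # Mirrors the MASTER_URL_INVALID message: the master must either be
    # "yarn" or start with "spark" (spark://), "k8s" (k8s://), or "local".
    return url == "yarn" or url.startswith(("spark", "k8s", "local"))
```

Under this rule a Spark Connect URL such as sc://localhost is rejected, which is exactly the gap this PR closes on the Python side.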
5 changes: 5 additions & 0 deletions python/pyspark/sql/session.py
@@ -508,6 +508,11 @@ def getOrCreate(self) -> "SparkSession":

if url is None and is_api_mode_connect:
url = opts.get("spark.master", os.environ.get("MASTER", "local"))
if url.startswith("sc://"):
Contributor:
Instead of a band-aid fix, can we refactor the code to have different branches for --remote and --master?

Member (author):
The problem is that this is Python-specific: the Spark Connect server is launched without Spark Submit itself. For the other cases, the exceptions are covered via SparkSubmit.prepareSubmitEnvironment.

raise PySparkRuntimeError(
errorClass="MASTER_URL_INVALID",
messageParameters={},
)

if url is None:
raise PySparkRuntimeError(
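The guard added to session.py can be sketched as a self-contained function. This is an illustrative approximation, not the actual PySpark code: `resolve_master_url` is a hypothetical name, and a plain RuntimeError stands in for PySparkRuntimeError with errorClass="MASTER_URL_INVALID":

```python
def resolve_master_url(opts: dict, env: dict) -> str:
    # Same fallback chain as the diff: explicit spark.master option,
    # then the MASTER environment variable, then "local".
    url = opts.get("spark.master", env.get("MASTER", "local"))
    if url.startswith("sc://"):
        # A Spark Connect URL must be passed via --remote, not --master.
        raise RuntimeError(
            "MASTER_URL_INVALID: Master must either be yarn or "
            "start with spark, k8s, or local."
        )
    return url
```

With this check in place, spark.master set to an sc:// URL fails fast with a clear error instead of being silently treated as a master URL.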