[SPARK-32756][SQL] Fix CaseInsensitiveMap usage for Scala 2.13 #29584
Conversation
ok to test
How about filing a new JIRA ticket as a sub-ticket of https://issues.apache.org/jira/browse/SPARK-25075?
Test build #128046 has finished for PR 29584 at commit
Thank you for your contribution, @karolchmist.
Please use a different JIRA issue for Scala 2.13. For now, we are trying our best, but we don't know whether we can achieve Scala 2.13 support in Apache Spark 3.1. It depends on the Scala community's next release.
+1 for @maropu's advice.
Ah, one more comment; please fill in the PR description with the kind of compiler errors that happen. I think that information is useful to developers when writing code with Scala 2.13. Anyway, thanks for the work, @karolchmist.
@@ -42,7 +42,10 @@ class CaseInsensitiveMap[T] private (val originalMap: Map[String, T]) extends Ma
  override def +[B1 >: T](kv: (String, B1)): CaseInsensitiveMap[B1] = {
    new CaseInsensitiveMap(originalMap.filter(!_._1.equalsIgnoreCase(kv._1)) + kv)
  }

  override def updated[B1 >: T](key: String, value: B1): CaseInsensitiveMap[B1] =
I remember we have two copies of this for Scala 2.13 compatibility. Are you sure this is the right place to fix?
I replaced `updated` with `+`, so I removed this `updated` here. I had to override `+` in the 2.13 version of `CaseInsensitiveMap`, because a standard `Map` would be returned from `+`.
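For reference, a minimal sketch of the idea (simplified, not the actual Spark source; members other than `originalMap`, `+`, and `updated` are illustrative): a Scala 2.13 `Map` wrapper has to override `+` (and implement `updated`) itself, otherwise the inherited versions return a plain `Map`.

```scala
// Minimal sketch, not the Spark implementation: a case-insensitive Map
// wrapper compiled against Scala 2.13. Because the class extends the
// generic immutable Map, the inherited `+`/`updated` would return a plain
// Map[String, T], so both are overridden to preserve the wrapper type.
class CaseInsensitiveMap[T] private (val originalMap: Map[String, T])
  extends Map[String, T] with Serializable {

  private val keyLowerCasedMap = originalMap.map(kv => kv.copy(_1 = kv._1.toLowerCase))

  override def get(k: String): Option[T] = keyLowerCasedMap.get(k.toLowerCase)

  override def iterator: Iterator[(String, T)] = keyLowerCasedMap.iterator

  // Drop any key that differs only in case, then add the new pair, keeping
  // the result typed as CaseInsensitiveMap rather than Map.
  override def +[B1 >: T](kv: (String, B1)): CaseInsensitiveMap[B1] =
    new CaseInsensitiveMap(originalMap.filter(!_._1.equalsIgnoreCase(kv._1)) + kv)

  // `updated` is abstract on scala.collection.immutable.Map in 2.13, so it
  // is implemented here in terms of the overridden `+`.
  override def updated[B1 >: T](key: String, value: B1): CaseInsensitiveMap[B1] =
    this + (key -> value)

  // `removed` is the other abstract member 2.13 requires.
  override def removed(key: String): Map[String, T] =
    new CaseInsensitiveMap(originalMap.filter(!_._1.equalsIgnoreCase(key)))
}

object CaseInsensitiveMap {
  def apply[T](params: Map[String, T]): CaseInsensitiveMap[T] = new CaseInsensitiveMap(params)
}
```

With both `+` and `updated` returning the wrapper type, call sites such as `extraOptions + (key -> value)` stay typed as `CaseInsensitiveMap[String]` instead of widening to `Map[String, String]`.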
@@ -118,7 +118,7 @@ class DataFrameReader private[sql](sparkSession: SparkSession) extends Logging {
   * @since 1.4.0
   */
  def option(key: String, value: String): DataFrameReader = {
    this.extraOptions += (key -> value)
    this.extraOptions = this.extraOptions.updated(key, value)
Because we already implement `+`, can you just write `this.extraOptions = this.extraOptions + (key -> value)`? I think that is how Scala 2.12 desugared this already, but 2.13 won't. If so, that's the most straightforward change and doesn't need a new method.
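To spell out the desugaring being referred to, here is a hedged illustration (assuming the real `org.apache.spark.sql.catalyst.util.CaseInsensitiveMap`; `opts` is just a local stand-in for `extraOptions`, not the actual field):

```scala
import org.apache.spark.sql.catalyst.util.CaseInsensitiveMap

// Illustrative stand-in for DataFrameReader's `private var extraOptions`.
var opts: CaseInsensitiveMap[String] = CaseInsensitiveMap(Map.empty[String, String])

// Per the comment above, on Scala 2.12 `opts += ("k" -> "v")` was effectively
// rewritten by the compiler into this explicit assignment, because the class
// defines no `+=` of its own:
opts = opts + ("k" -> "v")

// The assignment only type-checks because `+` is overridden to return a
// CaseInsensitiveMap; the inherited `+` would return a plain
// Map[String, String], which cannot be assigned back to `opts`.
```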
OK, I replaced `updated` with `+`. I had to override it in the 2.13 version of `CaseInsensitiveMap`, because a standard `Map` would be returned from `+`.
Weird, it isn't needed in 2.12? OK fine. It could be OK to add it to 2.12 too for completeness
It was already in 2.12, but not in 2.13
Test build #128107 has finished for PR 29584 at commit
Thanks, @karolchmist.
Ah, it's just complaining about whitespace:
Test build #128161 has finished for PR 29584 at commit
Merged to master
What changes were proposed in this pull request?
This is a follow-up of #29160. It allows the Spark SQL project to compile for Scala 2.13.
Why are the changes needed?
It's needed for #28545
Does this PR introduce any user-facing change?
No
How was this patch tested?
I compiled with Scala 2.13. It fails in the Spark REPL project, which will be fixed by #28545.