Consider stdlib additions for dotty cross-compilation #449
See https://github.com/lampepfl/dotty/tree/master/library/src for everything that could be added. For stuff that does not require compiler support, we could also consider cross-compiling a modified version of …
A very useful addition for cross-compiling would be https://github.com/lampepfl/dotty/blob/master/library/src/scala/implicits/Not.scala for implicit negation; see the documentation inside to understand the rationale.
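For readers less familiar with the pattern, implicit negation means asking the compiler for evidence that no implicit instance of a given type exists. Below is a minimal Scala 2-style sketch of the usual encoding; the names (`Not`, `requireNoOrdering`, `Unordered`) are illustrative, and this is not the actual Dotty source, whose encoding and compiler support may differ.

```scala
// Sketch of implicit negation: Not[T] is summonable only when NO implicit T
// is in scope. (Hypothetical names; not the exact Dotty implementation.)
trait Not[T]

trait NotLowPriority {
  // Fallback instance, used when the higher-priority candidates below don't apply.
  implicit def default[T]: Not[T] = new Not[T] {}
}

object Not extends NotLowPriority {
  // If an implicit T *is* available, these two equally specific candidates make
  // the search for Not[T] ambiguous, so it fails -- which is exactly the negation.
  implicit def amb1[T](implicit ev: T): Not[T] = sys.error("unreachable")
  implicit def amb2[T](implicit ev: T): Not[T] = sys.error("unreachable")
}

object Demo {
  def requireNoOrdering[A](a: A)(implicit ev: Not[Ordering[A]]): A = a

  class Unordered
  requireNoOrdering(new Unordered) // compiles: no Ordering[Unordered] in scope
  // requireNoOrdering(42)         // rejected: Ordering[Int] exists
}
```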
Time is running short for PRs in this area.
But additions could go in post-2.13.0, no?
No, we impose forwards and backwards bin compat for the std lib.
What's the purpose of forward bin compat?
Libraries compiled with 2.N.(x+1) can safely be used on 2.N.x.
Isn't sbt going to select the biggest version anyway?
Not everyone uses sbt -- it's good to re-evaluate this before we release 3.0, but the 2.x cycle probably can't change its bin compat policy.
I'd argue this is a legacy policy that can be changed at a major version bump (e.g. 2.13). scala-library is a transitive dependency, and the latest version in a dependency graph should be picked, whatever tool is used to build the software.
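To make the dependency-graph point concrete, here is a hypothetical build.sbt sketch (the com.example artifacts are invented): two libraries built against different 2.12.x patch releases meet in one dependency graph, and the default "latest revision wins" conflict resolution keeps the newest scala-library.

```scala
// Hypothetical build.sbt sketch -- artifact names are invented for illustration.
scalaVersion := "2.12.4"

libraryDependencies ++= Seq(
  "com.example" %% "lib-a" % "1.0.0", // pulls in scala-library 2.12.4 transitively
  "com.example" %% "lib-b" % "2.3.0"  // pulls in scala-library 2.12.6 transitively
)

// With the default "latest revision wins" conflict resolution, scala-library
// 2.12.4 is evicted in favour of 2.12.6, so the newer, backwards-compatible
// jar ends up on the classpath; `sbt evicted` lists such replacements.
```

This is the "sbt selects the biggest version" behaviour mentioned above; the counterexamples that follow concern runtimes (sbt itself, Spark) whose scala-library is fixed outside this resolution.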
Related discussion in https://contributors.scala-lang.org/t/question-about-changes-in-the-scalac-release-cycle/1093/11?u=jvican, where I argue that forward bin compat is useful to sbt plugin authors and to software with similar plugin systems.
As a possible counterargument: when you have a Spark installation on your cluster, its Scala version might be fixed to 2.11.8 (let's hope 2.12.6 soon...), which means that every Spark job will have Scala 2.11.8 on its classpath. Without forward compatibility, using libraries compiled with a more recent version could break at runtime. Correct me if this is a brainfart.
No, your example is like Jorge's: a managed runtime (sbt/Spark) where custom software (sbt plugins/Spark jobs) that depends on a newer, backwards-compatible scala-library doesn't get to use the newer jar. It's a bit of a chicken-and-egg problem.
Something that seemed to be missing from that discussion is using libraries in an sbt plugin / Spark / ... situation. Theoretically you could ask every Spark developer to compile against 2.11.8 (which already means they might miss out on possible improvements or fixes in the compiler), but you can't ask them to make sure that every library they're (transitively) using was compiled against 2.11.[0-8].
Tentatively reassigned to the 2.14 milestone.
Note that https://github.com/scala/scala-library-next is now open for business.