
[SPARK-11672] [ML] flaky spark.ml read/write tests #9677

Closed
wants to merge 4 commits

Conversation

mengxr (Contributor) commented Nov 12, 2015

We set `sqlContext = null` in `afterAll`. However, this doesn't clear `SQLContext.activeContext`, so `SQLContext.getOrCreate` might use the `SparkContext` from a previous test suite and hence cause the error. This PR calls `clearActive` in `beforeAll` and `afterAll` to avoid picking up an old context from other test suites.

cc: @yhuai
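
The fix described above can be sketched as follows. This is a minimal approximation of the `MLlibTestSparkContext` test trait, not the actual patch; the exact field names and config in the real suite may differ, and it requires a Spark 1.x dependency to compile.

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext
import org.scalatest.{BeforeAndAfterAll, Suite}

trait MLlibTestSparkContext extends BeforeAndAfterAll { self: Suite =>
  @transient var sc: SparkContext = _
  @transient var sqlContext: SQLContext = _

  override def beforeAll(): Unit = {
    super.beforeAll()
    val conf = new SparkConf().setMaster("local[2]").setAppName("MLlibUnitTest")
    sc = new SparkContext(conf)
    // Clear any active context left behind by a previous suite, so that
    // SQLContext.getOrCreate cannot hand back an SQLContext whose
    // SparkContext has already been stopped.
    SQLContext.clearActive()
    sqlContext = new SQLContext(sc)
  }

  override def afterAll(): Unit = {
    sqlContext = null            // not enough on its own...
    SQLContext.clearActive()     // ...also clear the active-context singleton
    if (sc != null) sc.stop()
    sc = null
    super.afterAll()
  }
}
```

The key point is that `sqlContext = null` only drops the suite's own reference; the active context is tracked separately in a singleton inside `SQLContext`, so it must be cleared explicitly in both hooks.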

@mengxr mengxr changed the title [WIP] [SPARK-11672] flaky spark.ml read/write tests [WIP] [SPARK-11672] [ML] flaky spark.ml read/write tests Nov 12, 2015
mengxr (Contributor, Author) commented Nov 12, 2015

test this please

1 similar comment
mengxr (Contributor, Author) commented Nov 12, 2015

test this please

SparkQA commented Nov 12, 2015

Test build #45770 has finished for PR 9677 at commit e130df6.

  • This patch fails Spark unit tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

SparkQA commented Nov 12, 2015

Test build #45773 has finished for PR 9677 at commit e130df6.

  • This patch fails Spark unit tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

mengxr (Contributor, Author) commented Nov 12, 2015

Okay, I think the problem is that `SQLContext.getOrCreate` may return an old `SQLContext`, so we need to clear the active `SQLContext` in `afterAll`.

mengxr (Contributor, Author) commented Nov 12, 2015

[info] - read/write *** FAILED *** (2 milliseconds)
[info]   java.lang.IllegalStateException: Cannot call methods on a stopped SparkContext.
[info] This stopped SparkContext was created at:
[info] 
[info] org.apache.spark.SparkContext.<init>(SparkContext.scala:82)
[info] org.apache.spark.mllib.util.MLlibTestSparkContext$class.beforeAll(MLlibTestSparkContext.scala:34)
[info] org.apache.spark.mllib.classification.LogisticRegressionSuite.beforeAll(LogisticRegressionSuite.scala:172)
[info] org.scalatest.BeforeAndAfterAll$class.beforeAll(BeforeAndAfterAll.scala:187)
[info] org.apache.spark.mllib.classification.LogisticRegressionSuite.beforeAll(LogisticRegressionSuite.scala:172)
[info] org.scalatest.BeforeAndAfterAll$class.run(BeforeAndAfterAll.scala:253)
[info] org.apache.spark.mllib.classification.LogisticRegressionSuite.run(LogisticRegressionSuite.scala:172)
[info] org.scalatest.tools.Framework.org$scalatest$tools$Framework$$runSuite(Framework.scala:462)
[info] org.scalatest.tools.Framework$ScalaTestTask.execute(Framework.scala:671)
[info] sbt.ForkMain$Run$2.call(ForkMain.java:294)
[info] sbt.ForkMain$Run$2.call(ForkMain.java:284)
[info] java.util.concurrent.FutureTask.run(FutureTask.java:262)
[info] java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
[info] java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
[info] java.lang.Thread.run(Thread.java:745)
[info] 
[info] The currently active SparkContext was created at:
[info] 
[info] org.apache.spark.SparkContext.<init>(SparkContext.scala:82)
[info] org.apache.spark.mllib.util.MLlibTestSparkContext$class.beforeAll(MLlibTestSparkContext.scala:34)
[info] org.apache.spark.ml.feature.BinarizerSuite.org$apache$spark$ml$util$TempDirectory$$super$beforeAll(BinarizerSuite.scala:26)
[info] org.apache.spark.ml.util.TempDirectory$class.beforeAll(TempDirectory.scala:37)
[info] org.apache.spark.ml.feature.BinarizerSuite.beforeAll(BinarizerSuite.scala:31)
[info] org.scalatest.BeforeAndAfterAll$class.beforeAll(BeforeAndAfterAll.scala:187)
[info] org.apache.spark.ml.feature.BinarizerSuite.beforeAll(BinarizerSuite.scala:26)
[info] org.scalatest.BeforeAndAfterAll$class.run(BeforeAndAfterAll.scala:253)
[info] org.apache.spark.ml.feature.BinarizerSuite.run(BinarizerSuite.scala:26)
[info] org.scalatest.tools.Framework.org$scalatest$tools$Framework$$runSuite(Framework.scala:462)
[info] org.scalatest.tools.Framework$ScalaTestTask.execute(Framework.scala:671)
[info] sbt.ForkMain$Run$2.call(ForkMain.java:294)
[info] sbt.ForkMain$Run$2.call(ForkMain.java:284)
[info] java.util.concurrent.FutureTask.run(FutureTask.java:262)
[info] java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
[info] java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
[info] java.lang.Thread.run(Thread.java:745)
[info]   at org.apache.spark.SparkContext.org$apache$spark$SparkContext$$assertNotStopped(SparkContext.scala:106)
[info]   at org.apache.spark.SparkContext$$anonfun$parallelize$1.apply(SparkContext.scala:732)
[info]   at org.apache.spark.SparkContext$$anonfun$parallelize$1.apply(SparkContext.scala:731)
[info]   at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:150)
[info]   at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:111)
[info]   at org.apache.spark.SparkContext.withScope(SparkContext.scala:717)
[info]   at org.apache.spark.SparkContext.parallelize(SparkContext.scala:731)
[info]   at org.apache.spark.ml.util.DefaultParamsWriter$.saveMetadata(ReadWrite.scala:206)
[info]   at org.apache.spark.ml.util.DefaultParamsWriter.saveImpl(ReadWrite.scala:178)
[info]   at org.apache.spark.ml.util.Writer.save(ReadWrite.scala:87)
[info]   at org.apache.spark.ml.util.Writable$class.save(ReadWrite.scala:127)
[info]   at org.apache.spark.ml.feature.Binarizer.save(Binarizer.scala:35)
[info]   at org.apache.spark.ml.util.DefaultReadWriteTest$class.testDefaultReadWrite(DefaultReadWriteTest.scala:40)
[info]   at org.apache.spark.ml.feature.BinarizerSuite.testDefaultReadWrite(BinarizerSuite.scala:26)
[info]   at org.apache.spark.ml.feature.BinarizerSuite$$anonfun$6.apply$mcV$sp(BinarizerSuite.scala:76)
[info]   at org.apache.spark.ml.feature.BinarizerSuite$$anonfun$6.apply(BinarizerSuite.scala:71)
[info]   at org.apache.spark.ml.feature.BinarizerSuite$$anonfun$6.apply(BinarizerSuite.scala:71)
[info]   at org.scalatest.Transformer$$anonfun$apply$1.apply$mcV$sp(Transformer.scala:22)
[info]   at org.scalatest.OutcomeOf$class.outcomeOf(OutcomeOf.scala:85)
[info]   at org.scalatest.OutcomeOf$.outcomeOf(OutcomeOf.scala:104)
[info]   at org.scalatest.Transformer.apply(Transformer.scala:22)
[info]   at org.scalatest.Transformer.apply(Transformer.scala:20)
[info]   at org.scalatest.FunSuiteLike$$anon$1.apply(FunSuiteLike.scala:166)
[info]   at org.apache.spark.SparkFunSuite.withFixture(SparkFunSuite.scala:42)
[info]   at org.scalatest.FunSuiteLike$class.invokeWithFixture$1(FunSuiteLike.scala:163)
[info]   at org.scalatest.FunSuiteLike$$anonfun$runTest$1.apply(FunSuiteLike.scala:175)
[info]   at org.scalatest.FunSuiteLike$$anonfun$runTest$1.apply(FunSuiteLike.scala:175)
[info]   at org.scalatest.SuperEngine.runTestImpl(Engine.scala:306)
[info]   at org.scalatest.FunSuiteLike$class.runTest(FunSuiteLike.scala:175)
[info]   at org.scalatest.FunSuite.runTest(FunSuite.scala:1555)
[info]   at org.scalatest.FunSuiteLike$$anonfun$runTests$1.apply(FunSuiteLike.scala:208)
[info]   at org.scalatest.FunSuiteLike$$anonfun$runTests$1.apply(FunSuiteLike.scala:208)
[info]   at org.scalatest.SuperEngine$$anonfun$traverseSubNodes$1$1.apply(Engine.scala:413)
[info]   at org.scalatest.SuperEngine$$anonfun$traverseSubNodes$1$1.apply(Engine.scala:401)
[info]   at scala.collection.immutable.List.foreach(List.scala:318)
[info]   at org.scalatest.SuperEngine.traverseSubNodes$1(Engine.scala:401)
[info]   at org.scalatest.SuperEngine.org$scalatest$SuperEngine$$runTestsInBranch(Engine.scala:396)
[info]   at org.scalatest.SuperEngine.runTestsImpl(Engine.scala:483)
[info]   at org.scalatest.FunSuiteLike$class.runTests(FunSuiteLike.scala:208)
[info]   at org.scalatest.FunSuite.runTests(FunSuite.scala:1555)
[info]   at org.scalatest.Suite$class.run(Suite.scala:1424)
[info]   at org.scalatest.FunSuite.org$scalatest$FunSuiteLike$$super$run(FunSuite.scala:1555)
[info]   at org.scalatest.FunSuiteLike$$anonfun$run$1.apply(FunSuiteLike.scala:212)
[info]   at org.scalatest.FunSuiteLike$$anonfun$run$1.apply(FunSuiteLike.scala:212)
[info]   at org.scalatest.SuperEngine.runImpl(Engine.scala:545)
[info]   at org.scalatest.FunSuiteLike$class.run(FunSuiteLike.scala:212)
[info]   at org.apache.spark.ml.feature.BinarizerSuite.org$scalatest$BeforeAndAfterAll$$super$run(BinarizerSuite.scala:26)
[info]   at org.scalatest.BeforeAndAfterAll$class.liftedTree1$1(BeforeAndAfterAll.scala:257)
[info]   at org.scalatest.BeforeAndAfterAll$class.run(BeforeAndAfterAll.scala:256)
[info]   at org.apache.spark.ml.feature.BinarizerSuite.run(BinarizerSuite.scala:26)
[info]   at org.scalatest.tools.Framework.org$scalatest$tools$Framework$$runSuite(Framework.scala:462)
[info]   at org.scalatest.tools.Framework$ScalaTestTask.execute(Framework.scala:671)
[info]   at sbt.ForkMain$Run$2.call(ForkMain.java:294)
[info]   at sbt.ForkMain$Run$2.call(ForkMain.java:284)
[info]   at java.util.concurrent.FutureTask.run(FutureTask.java:262)
[info]   at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
[info]   at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
[info]   at java.lang.Thread.run(Thread.java:745)

@mengxr mengxr changed the title [WIP] [SPARK-11672] [ML] flaky spark.ml read/write tests [SPARK-11672] [ML] flaky spark.ml read/write tests Nov 13, 2015
SparkQA commented Nov 13, 2015

Test build #45790 has finished for PR 9677 at commit da857a3.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds the following public classes (experimental):
      * case class StddevSamp(child: Expression,
      * case class StddevPop(

SparkQA commented Nov 13, 2015

Test build #45788 has finished for PR 9677 at commit 62b09a3.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

yhuai (Contributor) commented Nov 13, 2015

LGTM

mengxr (Contributor, Author) commented Nov 13, 2015

Merged into master and branch-1.6.

@asfgit asfgit closed this in e71c075 Nov 13, 2015
asfgit pushed a commit that referenced this pull request Nov 13, 2015
We set `sqlContext = null` in `afterAll`. However, this doesn't change `SQLContext.activeContext`  and then `SQLContext.getOrCreate` might use the `SparkContext` from previous test suite and hence causes the error. This PR calls `clearActive` in `beforeAll` and `afterAll` to avoid using an old context from other test suites.

cc: yhuai

Author: Xiangrui Meng <meng@databricks.com>

Closes #9677 from mengxr/SPARK-11672.2.

(cherry picked from commit e71c075)
Signed-off-by: Xiangrui Meng <meng@databricks.com>
dskrvk pushed a commit to dskrvk/spark that referenced this pull request Nov 13, 2015

Closes apache#9677 from mengxr/SPARK-11672.2.
shivaram (Contributor) commented

@mengxr I just saw a failure that seems related to this PR at https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/45914/consoleFull

[info] Test org.apache.spark.ml.util.JavaDefaultReadWriteSuite.testDefaultReadWrite started
[error] Test org.apache.spark.ml.util.JavaDefaultReadWriteSuite.testDefaultReadWrite failed: java.lang.IllegalStateException: Cannot call methods on a stopped SparkContext.
[error] This stopped SparkContext was created at:
[error] 
[error] org.apache.spark.SparkContext.<init>(SparkContext.scala:82)
[error] org.apache.spark.mllib.util.MLlibTestSparkContext$class.beforeAll(MLlibTestSparkContext.scala:34)
[error] org.apache.spark.mllib.optimization.GradientDescentSuite.beforeAll(GradientDescentSuite.scala:65)
[error] org.scalatest.BeforeAndAfterAll$class.beforeAll(BeforeAndAfterAll.scala:187)
[error] org.apache.spark.mllib.optimization.GradientDescentSuite.beforeAll(GradientDescentSuite.scala:65)
[error] org.scalatest.BeforeAndAfterAll$class.run(BeforeAndAfterAll.scala:253)
[error] org.apache.spark.mllib.optimization.GradientDescentSuite.run(GradientDescentSuite.scala:65)
[error] org.scalatest.tools.Framework.org$scalatest$tools$Framework$$runSuite(Framework.scala:462)
[error] org.scalatest.tools.Framework$ScalaTestTask.execute(Framework.scala:671)
[error] sbt.ForkMain$Run$2.call(ForkMain.java:294)
[error] sbt.ForkMain$Run$2.call(ForkMain.java:284)
[error] java.util.concurrent.FutureTask.run(FutureTask.java:262)
[error] java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
[error] java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
[error] java.lang.Thread.run(Thread.java:745)
[error] 
[error] The currently active SparkContext was created at:
[error] 
[error] org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:65)
[error] org.apache.spark.ml.util.JavaDefaultReadWriteSuite.setUp(JavaDefaultReadWriteSuite.java:39)
[error] sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[error] sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
[error] sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
[error] java.lang.reflect.Method.invoke(Method.java:606)
[error] org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
[error] org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
[error] org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
[error] org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:24)
[error] org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:27)
[error] org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
[error] org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
[error] org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
[error] org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
[error] org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
[error] org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
[error] org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
[error] org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
[error] org.junit.runners.ParentRunner.run(ParentRunner.java:309)
[error]          , took 0.212 sec
[error]     at org.apache.spark.SparkContext.org$apache$spark$SparkContext$$assertNotStopped(SparkContext.scala:106)
[error]     at org.apache.spark.SparkContext$$anonfun$parallelize$1.apply(SparkContext.scala:732)
[error]     at org.apache.spark.SparkContext$$anonfun$parallelize$1.apply(SparkContext.scala:731)
[error]     at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:150)
[error]     at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:111)
[error]     at org.apache.spark.SparkContext.withScope(SparkContext.scala:717)
[error]     at org.apache.spark.SparkContext.parallelize(SparkContext.scala:731)
[error]     at org.apache.spark.ml.util.DefaultParamsWriter$.saveMetadata(ReadWrite.scala:209)
[error]     at org.apache.spark.ml.util.DefaultParamsWriter.saveImpl(ReadWrite.scala:181)
[error]     at org.apache.spark.ml.util.Writer.save(ReadWrite.scala:90)
[error]     at org.apache.spark.ml.util.Writable$class.save(ReadWrite.scala:130)
[error]     at org.apache.spark.ml.util.MyParams.save(DefaultReadWriteTest.scala:69)
[error]     at org.apache.spark.ml.util.JavaDefaultReadWriteSuite.testDefaultReadWrite(JavaDefaultReadWriteSuite.java:59)
[error]     ...
