[SPARK-20548][FLAKY-TEST] share one REPL instance among REPL test cases #17844
Conversation
Test build #76423 has finished for PR 17844 at commit
test this please
Test build #76461 has finished for PR 17844 at commit
Test build #76471 has finished for PR 17844 at commit
Looks pretty good. Just some nits.
val currentOffset = out.getBuffer.length()
// append a `val _result = 1` statement to the end of the given code, so that we can know what's
// the final output of this code snippet and rely on it to wait until the output is ready.
in.write((input + s"\nval _result = 1\n").getBytes)
nit: it's better to use a more distinctive string, so that output from a previous snippet can never match the sentinel by accident. E.g.,
val timestamp = System.currentTimeMillis()
in.write((input + s"\nval _result_$timestamp = 1\n").getBytes)
in.flush()
val stopMessage = s"_result_$timestamp: Int = 1"
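The reviewer's suggestion can be sketched as a small helper. This is an illustrative standalone version (the object and method names here are hypothetical, not the ones in the patch): the timestamp makes each sentinel unique within the shared REPL session, so waiting for the echoed `_result_<ts>: Int = 1` line cannot be fooled by output left over from an earlier test.

```scala
// Hypothetical helper modeling the suggested sentinel scheme.
object MarkerExample {
  // Returns the code to feed the REPL plus the line to wait for in its output.
  def buildInput(input: String): (String, String) = {
    val timestamp = System.currentTimeMillis()
    // The REPL will echo "_result_<timestamp>: Int = 1" once the snippet finishes.
    val tagged = input + s"\nval _result_$timestamp = 1\n"
    val stopMessage = s"_result_$timestamp: Int = 1"
    (tagged, stopMessage)
  }

  def main(args: Array[String]): Unit = {
    val (tagged, stop) = buildInput("println(1 + 1)")
    assert(tagged.contains("_result_"))
    assert(stop.endsWith(": Int = 1"))
    println("ok")
  }
}
```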
}
}

def runInterpreter(input: String): String = {
Could you add a comment mentioning that the code in `input` should not throw an exception?
super.afterAll()
}

private def waitUntil(cond: () => Boolean): Unit = {
You can use ScalaTest's `eventually`:
eventually(timeout(50.seconds), interval(500.millis)) {
assert(cond(), "current output: " + out.toString)
}
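For readers without ScalaTest at hand, a hand-rolled poll loop equivalent to `eventually(timeout(50.seconds), interval(500.millis)) { assert(cond()) }` looks roughly like the sketch below. Names and defaults are illustrative, not the patch's actual code:

```scala
import java.util.concurrent.atomic.AtomicBoolean

object WaitUntilExample {
  // Poll `cond` every `intervalMs` until it holds or `timeoutMs` elapses.
  def waitUntil(cond: () => Boolean,
                timeoutMs: Long = 50000,
                intervalMs: Long = 500): Unit = {
    val deadline = System.currentTimeMillis() + timeoutMs
    while (!cond()) {
      if (System.currentTimeMillis() > deadline) {
        throw new IllegalStateException("condition not met within timeout")
      }
      Thread.sleep(intervalMs)
    }
  }

  def main(args: Array[String]): Unit = {
    val flag = new AtomicBoolean(false)
    // Simulate the REPL producing its output a moment later.
    new Thread(() => { Thread.sleep(100); flag.set(true) }).start()
    waitUntil(() => flag.get(), timeoutMs = 5000, intervalMs = 50)
    println("condition met")
  }
}
```

`eventually` is preferable in a test suite because a failed assertion inside the block reports the current output, as the suggestion above shows.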
assertDoesNotContain("Exception", output)
assertContains(": Int = 20", output)
}

test("line wrapper only initialized once when used as encoder outer scope") {
Why don't you move this test?
this test only works in local mode...
val classpath = paths.map(new File(_).getAbsolutePath).mkString(File.pathSeparator)

System.setProperty(CONF_EXECUTOR_CLASSPATH, classpath)
Main.conf.set("spark.master", "local-cluster[2,4,4096]")
nit: I think it's safe to just use `local-cluster[2,1,1024]`.
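For context on the nit: a `local-cluster[N,C,M]` master URL starts N in-process workers, each with C cores and M MiB of memory, so shrinking `[2,4,4096]` to `[2,1,1024]` keeps the shared suite much lighter. The helper below is purely illustrative (it is not a Spark API), showing how the string is composed:

```scala
// Hypothetical helper illustrating the local-cluster master URL format:
// local-cluster[numWorkers, coresPerWorker, memoryPerWorkerMiB]
object LocalClusterMaster {
  def master(workers: Int, coresPerWorker: Int, memPerWorkerMb: Int): String =
    s"local-cluster[$workers,$coresPerWorker,$memPerWorkerMb]"

  def main(args: Array[String]): Unit = {
    // The reviewer's suggested lighter configuration.
    println(master(2, 1, 1024))
  }
}
```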
Main.sparkSession = null

// Starts a new thread to run the REPL interpreter, so that we won't block.
thread = new Thread(new Runnable {
nit: please call `Thread.setDaemon(true)`, in case something goes wrong and the thread prevents the process from exiting.
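The point of the nit: a non-daemon thread keeps the JVM alive even after `main` (or the test runner) returns, so a wedged REPL loop would hang the build. A daemon thread does not. A minimal standalone sketch (the sleeping runnable stands in for the interpreter loop):

```scala
object DaemonThreadExample {
  def main(args: Array[String]): Unit = {
    val thread = new Thread(new Runnable {
      // Stand-in for a REPL loop that might block forever.
      override def run(): Unit = Thread.sleep(60000)
    })
    thread.setDaemon(true) // must be called before start()
    thread.start()
    println(s"daemon = ${thread.isDaemon}")
    // main returns here; the daemon thread does not keep the JVM alive,
    // so the process exits immediately instead of waiting 60 seconds.
  }
}
```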
Test build #76569 has finished for PR 17844 at commit
retest this please
Test build #76577 has finished for PR 17844 at commit
retest this please
Test build #76582 has finished for PR 17844 at commit
retest this please
Test build #76600 has finished for PR 17844 at commit
Test build #76618 has started for PR 17844 at commit
retest this please
Test build #76649 has finished for PR 17844 at commit
Test build #76656 has finished for PR 17844 at commit
retest this please
Test build #76675 has finished for PR 17844 at commit
`ReplSuite.newProductSeqEncoder with REPL defined class` was flaky and threw OOM exceptions frequently. By analyzing the heap dump, we found the reason: in each test case of `ReplSuite`, we create a REPL instance, which creates a classloader and loads a lot of classes related to `SparkContext`. For more details, see #17833 (comment).

In this PR, we create a new test suite, `SingletonReplSuite`, which shares one REPL instance among all the test cases. We then move most of the tests from `ReplSuite` to `SingletonReplSuite`, to avoid creating many REPL instances and reduce the memory footprint.

This is a test-only change.

Author: Wenchen Fan <wenchen@databricks.com>

Closes #17844 from cloud-fan/flaky-test.

(cherry picked from commit f561a76)
Signed-off-by: Wenchen Fan <wenchen@databricks.com>
thanks for the review, merging to master/2.2!
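The essence of the fix can be modeled in a few lines. This toy sketch (not the actual Spark code; the real suite is `SingletonReplSuite`) shows why sharing one expensive "interpreter" per suite, rather than constructing one per test, bounds the number of classloaders created:

```scala
// Toy model of the singleton-suite pattern used by the fix.
object SingletonPattern {
  var creations = 0

  // Stand-in for a REPL: expensive to construct (new classloader, many classes).
  class FakeRepl {
    creations += 1
    def run(input: String): String = s"ran: $input"
  }

  def main(args: Array[String]): Unit = {
    // beforeAll: one instance is created for the whole suite...
    val repl = new FakeRepl
    // ...and every test case reuses it instead of building its own.
    val tests = Seq("test1", "test2", "test3")
    tests.foreach(t => repl.run(t))
    println(s"creations = $creations")
  }
}
```

With one `FakeRepl` per test this would print `creations = 3`; the shared-instance design keeps it at 1, which is exactly the memory-footprint reduction the PR describes.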