Posted to dev@spark.apache.org by Ye Xianjin <ad...@gmail.com> on 2014/04/14 12:14:45 UTC

Tests failed after assembling the latest code from github

Hi, everyone: 
I am new to Spark development. I downloaded Spark's latest code from GitHub. After running sbt/sbt assembly,
I ran sbt/sbt test in the Spark source directory, but it failed while running the repl module tests.

Here are some output details.

command:
sbt/sbt "test-only org.apache.spark.repl.*"
output:

[info] Loading project definition from /Volumes/MacintoshHD/github/spark/project/project
[info] Loading project definition from /Volumes/MacintoshHD/github/spark/project
[info] Set current project to root (in build file:/Volumes/MacintoshHD/github/spark/)
[info] Passed: Total 0, Failed 0, Errors 0, Passed 0
[info] No tests to run for graphx/test:testOnly
[info] Passed: Total 0, Failed 0, Errors 0, Passed 0
[info] No tests to run for bagel/test:testOnly
[info] Passed: Total 0, Failed 0, Errors 0, Passed 0
[info] No tests to run for streaming/test:testOnly
[info] Passed: Total 0, Failed 0, Errors 0, Passed 0
[info] No tests to run for mllib/test:testOnly
[info] Passed: Total 0, Failed 0, Errors 0, Passed 0
[info] No tests to run for catalyst/test:testOnly
[info] Passed: Total 0, Failed 0, Errors 0, Passed 0
[info] No tests to run for core/test:testOnly
[info] Passed: Total 0, Failed 0, Errors 0, Passed 0
[info] No tests to run for assembly/test:testOnly
[info] Passed: Total 0, Failed 0, Errors 0, Passed 0
[info] No tests to run for sql/test:testOnly
[info] ExecutorClassLoaderSuite:
2014-04-14 16:59:31.247 java[8393:1003] Unable to load realm info from SCDynamicStore
[info] - child first *** FAILED *** (440 milliseconds)
[info]   java.lang.ClassNotFoundException: ReplFakeClass2
[info]   at java.lang.ClassLoader.findClass(ClassLoader.java:364)
[info]   at org.apache.spark.util.ParentClassLoader.findClass(ParentClassLoader.scala:26)
[info]   at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
[info]   at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
[info]   at org.apache.spark.util.ParentClassLoader.loadClass(ParentClassLoader.scala:30)
[info]   at org.apache.spark.repl.ExecutorClassLoader$$anonfun$findClass$1.apply(ExecutorClassLoader.scala:57)
[info]   at org.apache.spark.repl.ExecutorClassLoader$$anonfun$findClass$1.apply(ExecutorClassLoader.scala:57)
[info]   at scala.Option.getOrElse(Option.scala:120)
[info]   at org.apache.spark.repl.ExecutorClassLoader.findClass(ExecutorClassLoader.scala:57)
[info]   at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
[info]   at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
[info]   at org.apache.spark.repl.ExecutorClassLoaderSuite$$anonfun$1.apply$mcV$sp(ExecutorClassLoaderSuite.scala:47)
[info]   at org.apache.spark.repl.ExecutorClassLoaderSuite$$anonfun$1.apply(ExecutorClassLoaderSuite.scala:44)
[info]   at org.apache.spark.repl.ExecutorClassLoaderSuite$$anonfun$1.apply(ExecutorClassLoaderSuite.scala:44)
[info]   at org.scalatest.FunSuite$$anon$1.apply(FunSuite.scala:1265)
[info]   at org.scalatest.Suite$class.withFixture(Suite.scala:1974)
[info]   at org.apache.spark.repl.ExecutorClassLoaderSuite.withFixture(ExecutorClassLoaderSuite.scala:30)
[info]   at org.scalatest.FunSuite$class.invokeWithFixture$1(FunSuite.scala:1262)
[info]   at org.scalatest.FunSuite$$anonfun$runTest$1.apply(FunSuite.scala:1271)
[info]   at org.scalatest.FunSuite$$anonfun$runTest$1.apply(FunSuite.scala:1271)
[info]   at org.scalatest.SuperEngine.runTestImpl(Engine.scala:198)
[info]   at org.scalatest.FunSuite$class.runTest(FunSuite.scala:1271)
[info]   at org.apache.spark.repl.ExecutorClassLoaderSuite.runTest(ExecutorClassLoaderSuite.scala:30)
[info]   at org.scalatest.FunSuite$$anonfun$runTests$1.apply(FunSuite.scala:1304)
[info]   at org.scalatest.FunSuite$$anonfun$runTests$1.apply(FunSuite.scala:1304)
[info]   at org.scalatest.SuperEngine$$anonfun$org$scalatest$SuperEngine$$runTestsInBranch$1.apply(Engine.scala:260)
[info]   at org.scalatest.SuperEngine$$anonfun$org$scalatest$SuperEngine$$runTestsInBranch$1.apply(Engine.scala:249)
[info]   at scala.collection.immutable.List.foreach(List.scala:318)
[info]   at org.scalatest.SuperEngine.org$scalatest$SuperEngine$$runTestsInBranch(Engine.scala:249)
[info]   at org.scalatest.SuperEngine.runTestsImpl(Engine.scala:326)
[info]   at org.scalatest.FunSuite$class.runTests(FunSuite.scala:1304)
[info]   at org.apache.spark.repl.ExecutorClassLoaderSuite.runTests(ExecutorClassLoaderSuite.scala:30)
[info]   at org.scalatest.Suite$class.run(Suite.scala:2303)
[info]   at org.apache.spark.repl.ExecutorClassLoaderSuite.org$scalatest$FunSuite$$super$run(ExecutorClassLoaderSuite.scala:30)
[info]   at org.scalatest.FunSuite$$anonfun$run$1.apply(FunSuite.scala:1310)
[info]   at org.scalatest.FunSuite$$anonfun$run$1.apply(FunSuite.scala:1310)
[info]   at org.scalatest.SuperEngine.runImpl(Engine.scala:362)
[info]   at org.scalatest.FunSuite$class.run(FunSuite.scala:1310)
[info]   at org.apache.spark.repl.ExecutorClassLoaderSuite.org$scalatest$BeforeAndAfterAll$$super$run(ExecutorClassLoaderSuite.scala:30)
[info]   at org.scalatest.BeforeAndAfterAll$class.run(BeforeAndAfterAll.scala:213)
[info]   at org.apache.spark.repl.ExecutorClassLoaderSuite.run(ExecutorClassLoaderSuite.scala:30)
[info]   at org.scalatest.tools.ScalaTestFramework$ScalaTestRunner.run(ScalaTestFramework.scala:214)
[info]   at sbt.RunnerWrapper$1.runRunner2(FrameworkWrapper.java:220)
[info]   at sbt.RunnerWrapper$1.execute(FrameworkWrapper.java:233)
[info]   at sbt.ForkMain$Run.runTest(ForkMain.java:243)
[info]   at sbt.ForkMain$Run.runTestSafe(ForkMain.java:214)
[info]   at sbt.ForkMain$Run.runTests(ForkMain.java:190)
[info]   at sbt.ForkMain$Run.run(ForkMain.java:257)
[info]   at sbt.ForkMain.main(ForkMain.java:99)
[info] - parent first *** FAILED *** (59 milliseconds)
[info]   java.lang.ClassNotFoundException: ReplFakeClass1
...
[info]   Cause: java.lang.ClassNotFoundException: ReplFakeClass1
...
[info] - child first can fall back *** FAILED *** (39 milliseconds)
[info]   java.lang.ClassNotFoundException: ReplFakeClass3
...
[info] - child first can fail (46 milliseconds)
[info] ReplSuite:
[info] - propagation of local properties (9 seconds, 353 milliseconds)
[info] - simple foreach with accumulator (7 seconds, 608 milliseconds)
[info] - external vars (5 seconds, 783 milliseconds)
[info] - external classes (4 seconds, 341 milliseconds)
[info] - external functions (4 seconds, 106 milliseconds)
[info] - external functions that access vars (4 seconds, 538 milliseconds)
[info] - broadcast vars (4 seconds, 155 milliseconds)
[info] - interacting with files (3 seconds, 376 milliseconds)
Exception in thread "Connection manager future execution context-0"


Some output is omitted.

Here is some more information:
ReplFakeClass1.class is at {spark_source_dir}/repl/ReplFakeClass1.class, and the same goes for ReplFakeClass2 and 3.
ReplSuite failed while running test("local-cluster mode"). The first run of this test threw an OOM error; the exception shown above is from a second try.
The JVM options for test("local-cluster mode") are '-Xms512M -Xmx512M', which I can see in the corresponding stderr log.
I have a .sbtconfig file in my home dir. Its content is:
export SBT_OPTS="-XX:+CMSClassUnloadingEnabled -XX:PermSize=5120M -XX:MaxPermSize=10240M"


The run hung after the ReplSuite failure; I had to Ctrl-C to stop it.

Thank you for your advice.



-- 
Ye Xianjin
Sent with Sparrow (http://www.sparrowmailapp.com/?sig)


Re: Tests failed after assembling the latest code from github

Posted by Ye Xianjin <ad...@gmail.com>.
@Sean Owen, thanks for your advice.
There are still some failing tests on my laptop. I will work on this issue (the file move) as soon as I figure out the other test-related issues.


-- 
Ye Xianjin
Sent with Sparrow (http://www.sparrowmailapp.com/?sig)


On Tuesday, April 15, 2014 at 2:41 PM, Sean Owen wrote:

> Good call -- indeed that same Files class has a move() method that
> will try to use renameTo() and then fall back to copy() and delete()
> if needed for this very reason.
> 
> 
> On Tue, Apr 15, 2014 at 6:34 AM, Ye Xianjin <advancedxy@gmail.com> wrote:
> > Hi, I think I have found the cause of the failing tests.
> > 
> > I have two disks on my laptop. The Spark project dir is on an HDD, while the temp dir created by Guava's com.google.common.io.Files.createTempDir is under /var/folders/5q/..., which is on the system disk, an SSD.
> > The ExecutorClassLoaderSuite test uses the org.apache.spark.TestUtils.createCompiledClass methods.
> > The createCompiledClass method first generates the compiled class in the pwd (spark/repl), then uses renameTo to move
> > the file. The renameTo call fails because the destination file is on a different filesystem than the source file.
> > 
> > I modified TestUtils.scala to first copy the file to the destination and then delete the original file. The tests now run smoothly.
> > Should I file a JIRA about this problem? Then I can send a PR on GitHub.
> > 
> 
> 
> 



Re: Tests failed after assembling the latest code from github

Posted by Sean Owen <so...@cloudera.com>.
Good call -- indeed that same Files class has a move() method that
will try to use renameTo() and then fall back to copy() and delete()
if needed for this very reason.
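
In code, that change would look roughly like this (just a sketch, not the actual TestUtils code; it assumes Guava's Files class is on the classpath where the rename happens, and the helper name here is made up):

import java.io.File
import com.google.common.io.Files

// Sketch only: replace a bare File.renameTo (which simply returns false when the
// source and destination are on different filesystems) with Guava's Files.move,
// which tries renameTo first and falls back to copy() + delete() when needed.
def moveCompiledClass(compiled: File, destDir: File): File = {
  val out = new File(destDir, compiled.getName)
  Files.move(compiled, out) // throws IOException on failure instead of failing silently
  out
}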


On Tue, Apr 15, 2014 at 6:34 AM, Ye Xianjin <ad...@gmail.com> wrote:
> Hi, I think I have found the cause of the failing tests.
>
> I have two disks on my laptop. The Spark project dir is on an HDD, while the temp dir created by Guava's com.google.common.io.Files.createTempDir is under /var/folders/5q/..., which is on the system disk, an SSD.
> The ExecutorClassLoaderSuite test uses the org.apache.spark.TestUtils.createCompiledClass methods.
> The createCompiledClass method first generates the compiled class in the pwd (spark/repl), then uses renameTo to move
> the file. The renameTo call fails because the destination file is on a different filesystem than the source file.
>
> I modified TestUtils.scala to first copy the file to the destination and then delete the original file. The tests now run smoothly.
> Should I file a JIRA about this problem? Then I can send a PR on GitHub.

Re: Tests failed after assembling the latest code from github

Posted by Aaron Davidson <il...@gmail.com>.
By all means, it would be greatly appreciated!


On Mon, Apr 14, 2014 at 10:34 PM, Ye Xianjin <ad...@gmail.com> wrote:

> Hi, I think I have found the cause of the failing tests.
>
> I have two disks on my laptop. The Spark project dir is on an HDD, while the temp dir created by Guava's com.google.common.io.Files.createTempDir is under /var/folders/5q/..., which is on the system disk, an SSD.
> The ExecutorClassLoaderSuite test uses the org.apache.spark.TestUtils.createCompiledClass methods.
> The createCompiledClass method first generates the compiled class in the pwd (spark/repl), then uses renameTo to move
> the file. The renameTo call fails because the destination file is on a different filesystem than the source file.
>
> I modified TestUtils.scala to first copy the file to the destination and then delete the original file. The tests now run smoothly.
> Should I file a JIRA about this problem? Then I can send a PR on GitHub.
>
> --
> Ye Xianjin
> Sent with Sparrow (http://www.sparrowmailapp.com/?sig)
>
>
> On Tuesday, April 15, 2014 at 3:43 AM, Ye Xianjin wrote:
>
> > well. This is very strange.
> > I looked into ExecutorClassLoaderSuite.scala and ReplSuite.scala and
> made small changes to ExecutorClassLoaderSuite.scala (mostly output some
> internal variables). After that, when running repl test, I noticed the
> ReplSuite
> > was tested first and the test result is ok. But the
> ExecutorClassLoaderSuite test was weird.
> > Here is the output:
> > [info] ExecutorClassLoaderSuite:
> > [error] Uncaught exception when running
> org.apache.spark.repl.ExecutorClassLoaderSuite: java.lang.OutOfMemoryError:
> PermGen space
> > [error] Uncaught exception when running
> org.apache.spark.repl.ExecutorClassLoaderSuite: java.lang.OutOfMemoryError:
> PermGen space
> > Internal error when running tests: java.lang.OutOfMemoryError: PermGen
> space
> > Exception in thread "Thread-3" java.io.EOFException
> > at
> java.io.ObjectInputStream$BlockDataInputStream.peekByte(ObjectInputStream.java:2577)
> > at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1297)
> > at java.io.ObjectInputStream.readArray(ObjectInputStream.java:1685)
> > at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1323)
> > at java.io.ObjectInputStream.readObject(ObjectInputStream.java:349)
> > at sbt.React.react(ForkTests.scala:116)
> > at
> sbt.ForkTests$$anonfun$mainTestTask$1$Acceptor$2$.run(ForkTests.scala:75)
> > at java.lang.Thread.run(Thread.java:695)
> >
> >
> > I revert my changes. The test result is same.
> >
> >  I touched the ReplSuite.scala file (use touch command), the test order
> is reversed, same as the very beginning. And the output is also the
> same.(The result in my first post).
> >
> >
> > --
> > Ye Xianjin
> > Sent with Sparrow (http://www.sparrowmailapp.com/?sig)
> >
> >
> > On Tuesday, April 15, 2014 at 3:14 AM, Aaron Davidson wrote:
> >
> > > This may have something to do with running the tests on a Mac, as
> there is
> > > a lot of File/URI/URL stuff going on in that test which may just have
> > > happened to work if run on a Linux system (like Jenkins). Note that
> this
> > > suite was added relatively recently:
> > > https://github.com/apache/spark/pull/217
> > >
> > >
> > > On Mon, Apr 14, 2014 at 12:04 PM, Ye Xianjin <advancedxy@gmail.com(mailto:
> advancedxy@gmail.com)> wrote:
> > >
> > > > Thank you for your reply.
> > > >
> > > > After building the assembly jar, the repl test still failed. The
> error
> > > > output is same as I post before.
> > > >
> > > > --
> > > > Ye Xianjin
> > > > Sent with Sparrow (http://www.sparrowmailapp.com/?sig)
> > > >
> > > >
> > > > On Tuesday, April 15, 2014 at 1:39 AM, Michael Armbrust wrote:
> > > >
> > > > > I believe you may need an assembly jar to run the ReplSuite.
> "sbt/sbt
> > > > > assembly/assembly".
> > > > >
> > > > > Michael
> > > > >
> > > > >
> > > > > On Mon, Apr 14, 2014 at 3:14 AM, Ye Xianjin <advancedxy@gmail.com(mailto:
> advancedxy@gmail.com)(mailto:
> > > > advancedxy@gmail.com (mailto:advancedxy@gmail.com))> wrote:
> > > > >
> > > > > > Hi, everyone:
> > > > > > I am new to Spark development. I download spark's latest code
> from
> > > > > >
> > > > >
> > > > >
> > > >
> > > > github.
> > > > > > After running sbt/sbt assembly,
> > > > > > I began running sbt/sbt test in the spark source code dir. But it
> > > > > >
> > > > >
> > > >
> > > > failed
> > > > > > running the repl module test.
> > > > > >
> > > > > > Here are some output details.
> > > > > >
> > > > > > command:
> > > > > > sbt/sbt "test-only org.apache.spark.repl.*"
> > > > > > output:
> > > > > >
> > > > > > [info] Loading project definition from
> > > > > > /Volumes/MacintoshHD/github/spark/project/project
> > > > > > [info] Loading project definition from
> > > > > > /Volumes/MacintoshHD/github/spark/project
> > > > > > [info] Set current project to root (in build
> > > > > > file:/Volumes/MacintoshHD/github/spark/)
> > > > > > [info] Passed: Total 0, Failed 0, Errors 0, Passed 0
> > > > > > [info] No tests to run for graphx/test:testOnly
> > > > > > [info] Passed: Total 0, Failed 0, Errors 0, Passed 0
> > > > > > [info] No tests to run for bagel/test:testOnly
> > > > > > [info] Passed: Total 0, Failed 0, Errors 0, Passed 0
> > > > > > [info] No tests to run for streaming/test:testOnly
> > > > > > [info] Passed: Total 0, Failed 0, Errors 0, Passed 0
> > > > > > [info] No tests to run for mllib/test:testOnly
> > > > > > [info] Passed: Total 0, Failed 0, Errors 0, Passed 0
> > > > > > [info] No tests to run for catalyst/test:testOnly
> > > > > > [info] Passed: Total 0, Failed 0, Errors 0, Passed 0
> > > > > > [info] No tests to run for core/test:testOnly
> > > > > > [info] Passed: Total 0, Failed 0, Errors 0, Passed 0
> > > > > > [info] No tests to run for assembly/test:testOnly
> > > > > > [info] Passed: Total 0, Failed 0, Errors 0, Passed 0
> > > > > > [info] No tests to run for sql/test:testOnly
> > > > > > [info] ExecutorClassLoaderSuite:
> > > > > > 2014-04-14 16:59:31.247 java[8393:1003] Unable to load realm
> info from
> > > > > > SCDynamicStore
> > > > > > [info] - child first *** FAILED *** (440 milliseconds)
> > > > > > [info] java.lang.ClassNotFoundException: ReplFakeClass2
> > > > > > [info] at java.lang.ClassLoader.findClass(ClassLoader.java:364)
> > > > > > [info] at
> > > > > >
> > > > >
> > > >
> > > >
> org.apache.spark.util.ParentClassLoader.findClass(ParentClassLoader.scala:26)
> > > > > > [info] at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
> > > > > > [info] at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
> > > > > > [info] at
> > > > > >
> > > > >
> > > >
> > > >
> org.apache.spark.util.ParentClassLoader.loadClass(ParentClassLoader.scala:30)
> > > > > > [info] at
> > > > >
> > > >
> > > >
> org.apache.spark.repl.ExecutorClassLoader$$anonfun$findClass$1.apply(ExecutorClassLoader.scala:57)
> > > > > > [info] at
> > > > >
> > > >
> > > >
> org.apache.spark.repl.ExecutorClassLoader$$anonfun$findClass$1.apply(ExecutorClassLoader.scala:57)
> > > > > > [info] at scala.Option.getOrElse(Option.scala:120)
> > > > > > [info] at
> > > > > >
> > > > >
> > > >
> > > >
> org.apache.spark.repl.ExecutorClassLoader.findClass(ExecutorClassLoader.scala:57)
> > > > > > [info] at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
> > > > > > [info] at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
> > > > > > [info] at
> > > > > >
> > > > >
> > > >
> > > >
> org.apache.spark.repl.ExecutorClassLoaderSuite$$anonfun$1.apply$mcV$sp(ExecutorClassLoaderSuite.scala:47)
> > > > > > [info] at
> > > > >
> > > >
> > > >
> org.apache.spark.repl.ExecutorClassLoaderSuite$$anonfun$1.apply(ExecutorClassLoaderSuite.scala:44)
> > > > > > [info] at
> > > > >
> > > >
> > > >
> org.apache.spark.repl.ExecutorClassLoaderSuite$$anonfun$1.apply(ExecutorClassLoaderSuite.scala:44)
> > > > > > [info] at
> org.scalatest.FunSuite$$anon$1.apply(FunSuite.scala:1265)
> > > > > > [info] at org.scalatest.Suite$class.withFixture(Suite.scala:1974)
> > > > > > [info] at
> > > > > >
> > > > >
> > > >
> > > >
> org.apache.spark.repl.ExecutorClassLoaderSuite.withFixture(ExecutorClassLoaderSuite.scala:30)
> > > > > > [info] at
> > > > > >
> org.scalatest.FunSuite$class.invokeWithFixture$1(FunSuite.scala:1262)
> > > > > > [info] at
> > > > > >
> org.scalatest.FunSuite$$anonfun$runTest$1.apply(FunSuite.scala:1271)
> > > > > > [info] at
> > > > > >
> org.scalatest.FunSuite$$anonfun$runTest$1.apply(FunSuite.scala:1271)
> > > > > > [info] at org.scalatest.SuperEngine.runTestImpl(Engine.scala:198)
> > > > > > [info] at
> org.scalatest.FunSuite$class.runTest(FunSuite.scala:1271)
> > > > > > [info] at
> > > > > >
> > > > >
> > > >
> > > >
> org.apache.spark.repl.ExecutorClassLoaderSuite.runTest(ExecutorClassLoaderSuite.scala:30)
> > > > > > [info] at
> > > > > >
> org.scalatest.FunSuite$$anonfun$runTests$1.apply(FunSuite.scala:1304)
> > > > > > [info] at
> > > > > >
> org.scalatest.FunSuite$$anonfun$runTests$1.apply(FunSuite.scala:1304)
> > > > > > [info] at
> > > > > >
> > > > >
> > > >
> > > >
> org.scalatest.SuperEngine$$anonfun$org$scalatest$SuperEngine$$runTestsInBranch$1.apply(Engine.scala:260)
> > > > > > [info] at
> > > > >
> > > >
> > > >
> org.scalatest.SuperEngine$$anonfun$org$scalatest$SuperEngine$$runTestsInBranch$1.apply(Engine.scala:249)
> > > > > > [info] at scala.collection.immutable.List.foreach(List.scala:318)
> > > > > > [info] at org.scalatest.SuperEngine.org (
> http://org.scalatest.SuperEngine.org) (
> > > > > >
> > > > >
> > > >
> > > > http://org.scalatest.SuperEngine.org)
> > > > > > $scalatest$SuperEngine$$runTestsInBranch(Engine.scala:249)
> > > > > > [info] at
> org.scalatest.SuperEngine.runTestsImpl(Engine.scala:326)
> > > > > > [info] at
> org.scalatest.FunSuite$class.runTests(FunSuite.scala:1304)
> > > > > > [info] at
> > > > > >
> > > > >
> > > >
> > > >
> org.apache.spark.repl.ExecutorClassLoaderSuite.runTests(ExecutorClassLoaderSuite.scala:30)
> > > > > > [info] at org.scalatest.Suite$class.run(Suite.scala:2303)
> > > > > > [info] at org.apache.spark.repl.ExecutorClassLoaderSuite.org (
> http://org.apache.spark.repl.ExecutorClassLoaderSuite.org) (
> > > > > >
> > > > >
> > > >
> > > > http://org.apache.spark.repl.ExecutorClassLoaderSuite.org)
> > > > > > $scalatest$FunSuite$$super$run(ExecutorClassLoaderSuite.scala:30)
> > > > > > [info] at
> > > > > > org.scalatest.FunSuite$$anonfun$run$1.apply(FunSuite.scala:1310)
> > > > > > [info] at
> > > > > > org.scalatest.FunSuite$$anonfun$run$1.apply(FunSuite.scala:1310)
> > > > > > [info] at org.scalatest.SuperEngine.runImpl(Engine.scala:362)
> > > > > > [info] at org.scalatest.FunSuite$class.run(FunSuite.scala:1310)
> > > > > > [info] at org.apache.spark.repl.ExecutorClassLoaderSuite.org (
> http://org.apache.spark.repl.ExecutorClassLoaderSuite.org) (
> > > > > >
> > > > >
> > > >
> > > > http://org.apache.spark.repl.ExecutorClassLoaderSuite.org)
> > > > > >
> > > > >
> > > >
> > > >
> $scalatest$BeforeAndAfterAll$$super$run(ExecutorClassLoaderSuite.scala:30)
> > > > > > [info] at
> > > > > >
> org.scalatest.BeforeAndAfterAll$class.run(BeforeAndAfterAll.scala:213)
> > > > > > [info] at
> > > > > >
> > > > >
> > > >
> > > >
> org.apache.spark.repl.ExecutorClassLoaderSuite.run(ExecutorClassLoaderSuite.scala:30)
> > > > > > [info] at
> > > > >
> > > >
> > > >
> org.scalatest.tools.ScalaTestFramework$ScalaTestRunner.run(ScalaTestFramework.scala:214)
> > > > > > [info] at
> sbt.RunnerWrapper$1.runRunner2(FrameworkWrapper.java:220)
> > > > > > [info] at sbt.RunnerWrapper$1.execute(FrameworkWrapper.java:233)
> > > > > > [info] at sbt.ForkMain$Run.runTest(ForkMain.java:243)
> > > > > > [info] at sbt.ForkMain$Run.runTestSafe(ForkMain.java:214)
> > > > > > [info] at sbt.ForkMain$Run.runTests(ForkMain.java:190)
> > > > > > [info] at sbt.ForkMain$Run.run(ForkMain.java:257)
> > > > > > [info] at sbt.ForkMain.main(ForkMain.java:99)
> > > > > > [info] - parent first *** FAILED *** (59 milliseconds)
> > > > > > [info] java.lang.ClassNotFoundException: ReplFakeClass1
> > > > > > ...
> > > > > > [info] Cause: java.lang.ClassNotFoundException: ReplFakeClass1
> > > > > > ...
> > > > > > [info] - child first can fall back *** FAILED *** (39
> milliseconds)
> > > > > > [info] java.lang.ClassNotFoundException: ReplFakeClass3
> > > > > > ...
> > > > > > [info] - child first can fail (46 milliseconds)
> > > > > > [info] ReplSuite:
> > > > > > [info] - propagation of local properties (9 seconds, 353
> milliseconds)
> > > > > > [info] - simple foreach with accumulator (7 seconds, 608
> milliseconds)
> > > > > > [info] - external vars (5 seconds, 783 milliseconds)
> > > > > > [info] - external classes (4 seconds, 341 milliseconds)
> > > > > > [info] - external functions (4 seconds, 106 milliseconds)
> > > > > > [info] - external functions that access vars (4 seconds, 538
> > > > > >
> > > > >
> > > >
> > > > milliseconds)
> > > > > > [info] - broadcast vars (4 seconds, 155 milliseconds)
> > > > > > [info] - interacting with files (3 seconds, 376 milliseconds)
> > > > > > Exception in thread "Connection manager future execution
> context-0"
> > > > > >
> > > > > >
> > > > > > Some output is omitted.
> > > > > >
> > > > > > Here are some more information:
> > > > > > ReplFakeClass1.class is in the
> > > > > > {spark_source_dir}/repl/ReplFakeClass1.class, same as
> ReplFakeClass2
> > > > > >
> > > > >
> > > >
> > > > and 3.
> > > > > > ReplSuite failed in running test("local-cluster mode"). The
> first time
> > > > > > running this test throws OOM error. The exception shown in above
> is a
> > > > > > second try
> > > > > > The test("local-cluster mode") jvm options are '-Xms512M
> -Xmx512M'
> > > > > >
> > > > >
> > > >
> > > > which I
> > > > > > see from the corresponding stderr log
> > > > > > I have .sbtconfig file in my home dir. The content is
> > > > > > export SBT_OPTS="-XX:+CMSClassUnloadingEnabled -XX:PermSize=5120M
> > > > > > -XX:MaxPermSize=10240M"
> > > > > >
> > > > > >
> > > > > > The test hung after the test failed in the ReplSuite. I have to
> Ctr-c
> > > > to
> > > > > > close the test.
> > > > > >
> > > > > > Thank you for you advice.
> > > > > >
> > > > > >
> > > > > >
> > > > > > --
> > > > > > Ye Xianjin
> > > > > > Sent with Sparrow (http://www.sparrowmailapp.com/?sig)
> > > > > >
> > > > >
> > > >
> > > >
> > >
> > >
> > >
> > >
> >
> >
>
>

Re: Tests failed after assembling the latest code from github

Posted by Ye Xianjin <ad...@gmail.com>.
Hi, I think I have found the cause of the failing tests.

I have two disks on my laptop. The Spark project dir is on an HDD, while the temp dir created by Guava's com.google.common.io.Files.createTempDir is under /var/folders/5q/..., which is on the system disk, an SSD.
The ExecutorClassLoaderSuite test uses the org.apache.spark.TestUtils.createCompiledClass methods.
The createCompiledClass method first generates the compiled class in the pwd (spark/repl), then uses renameTo to move
the file. The renameTo call fails because the destination file is on a different filesystem than the source file.

I modified TestUtils.scala to first copy the file to the destination and then delete the original file. The tests now run smoothly.
Should I file a JIRA about this problem? Then I can send a PR on GitHub.
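
Concretely, my local change has roughly this shape (a minimal sketch with a made-up helper name, assuming Guava is already on the test classpath; the real createCompiledClass also invokes the compiler before moving the file):

import java.io.{File, IOException}
import com.google.common.io.Files

// Sketch of the workaround: copy the freshly compiled class file into the
// destination directory and then delete the original, instead of renameTo,
// so the move still works when the destination is on a different filesystem.
def copyThenDelete(compiled: File, destDir: File): File = {
  val out = new File(destDir, compiled.getName)
  Files.copy(compiled, out) // plain byte copy, works across filesystems
  if (!compiled.delete()) {
    throw new IOException("Could not delete " + compiled.getAbsolutePath)
  }
  out
}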

-- 
Ye Xianjin
Sent with Sparrow (http://www.sparrowmailapp.com/?sig)


On Tuesday, April 15, 2014 at 3:43 AM, Ye Xianjin wrote:

> Well, this is very strange.
> I looked into ExecutorClassLoaderSuite.scala and ReplSuite.scala and made small changes to ExecutorClassLoaderSuite.scala (mostly printing some internal variables). After that, when running the repl tests, I noticed that ReplSuite
> was run first and its result was OK, but the ExecutorClassLoaderSuite test was weird.
> Here is the output:
> [info] ExecutorClassLoaderSuite:
> [error] Uncaught exception when running org.apache.spark.repl.ExecutorClassLoaderSuite: java.lang.OutOfMemoryError: PermGen space
> [error] Uncaught exception when running org.apache.spark.repl.ExecutorClassLoaderSuite: java.lang.OutOfMemoryError: PermGen space
> Internal error when running tests: java.lang.OutOfMemoryError: PermGen space
> Exception in thread "Thread-3" java.io.EOFException
> at java.io.ObjectInputStream$BlockDataInputStream.peekByte(ObjectInputStream.java:2577)
> at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1297)
> at java.io.ObjectInputStream.readArray(ObjectInputStream.java:1685)
> at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1323)
> at java.io.ObjectInputStream.readObject(ObjectInputStream.java:349)
> at sbt.React.react(ForkTests.scala:116)
> at sbt.ForkTests$$anonfun$mainTestTask$1$Acceptor$2$.run(ForkTests.scala:75)
> at java.lang.Thread.run(Thread.java:695)
> 
> 
> I reverted my changes; the test result is the same.
> 
> When I touched ReplSuite.scala (with the touch command), the test order reversed back to the original order, and the output is also the same (the result in my first post).
> 
> 
> -- 
> Ye Xianjin
> Sent with Sparrow (http://www.sparrowmailapp.com/?sig)

Re: Tests failed after assembling the latest code from github

Posted by Ye Xianjin <ad...@gmail.com>.
Well, this is very strange.
I looked into ExecutorClassLoaderSuite.scala and ReplSuite.scala and made small changes to ExecutorClassLoaderSuite.scala (mostly printing some internal variables). After that, when running the repl tests, I noticed that ReplSuite
was run first and its result was OK, but the ExecutorClassLoaderSuite test was weird.
Here is the output:
[info] ExecutorClassLoaderSuite:
[error] Uncaught exception when running org.apache.spark.repl.ExecutorClassLoaderSuite: java.lang.OutOfMemoryError: PermGen space
[error] Uncaught exception when running org.apache.spark.repl.ExecutorClassLoaderSuite: java.lang.OutOfMemoryError: PermGen space
Internal error when running tests: java.lang.OutOfMemoryError: PermGen space
Exception in thread "Thread-3" java.io.EOFException
at java.io.ObjectInputStream$BlockDataInputStream.peekByte(ObjectInputStream.java:2577)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1297)
at java.io.ObjectInputStream.readArray(ObjectInputStream.java:1685)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1323)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:349)
at sbt.React.react(ForkTests.scala:116)
at sbt.ForkTests$$anonfun$mainTestTask$1$Acceptor$2$.run(ForkTests.scala:75)
at java.lang.Thread.run(Thread.java:695)


I reverted my changes; the test result is the same.

When I touched ReplSuite.scala (with the touch command), the test order reversed back to the original order, and the output is also the same (the result in my first post).
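
A side note on the PermGen error above: the SBT_OPTS from my .sbtconfig only affect the sbt launcher JVM, while these tests run in a forked JVM through sbt.ForkMain, so the forked JVM's PermGen would have to be raised in the build settings themselves. A sketch in sbt 0.13 syntax follows; whether and where the Spark build already sets something like this is an assumption on my part:

// build.sbt sketch (sbt 0.13 syntax): raise PermGen for the forked test JVMs,
// since SBT_OPTS only applies to the sbt launcher JVM, not to sbt.ForkMain.
fork in Test := true
javaOptions in Test ++= Seq("-XX:MaxPermSize=512m", "-XX:+CMSClassUnloadingEnabled")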


-- 
Ye Xianjin
Sent with Sparrow (http://www.sparrowmailapp.com/?sig)


On Tuesday, April 15, 2014 at 3:14 AM, Aaron Davidson wrote:

> This may have something to do with running the tests on a Mac, as there is
> a lot of File/URI/URL stuff going on in that test which may just have
> happened to work if run on a Linux system (like Jenkins). Note that this
> suite was added relatively recently:
> https://github.com/apache/spark/pull/217


Re: Tests failed after assembling the latest code from github

Posted by Aaron Davidson <il...@gmail.com>.
This may have something to do with running the tests on a Mac, as there is
a lot of File/URI/URL stuff going on in that test which may just have
happened to work if run on a Linux system (like Jenkins). Note that this
suite was added relatively recently:
https://github.com/apache/spark/pull/217


On Mon, Apr 14, 2014 at 12:04 PM, Ye Xianjin <ad...@gmail.com> wrote:

> Thank you for your reply.
>
> After building the assembly jar, the repl test still failed. The error
> output is the same as I posted before.
>
> --
> Ye Xianjin
> Sent with Sparrow (http://www.sparrowmailapp.com/?sig)
>
>
> On Tuesday, April 15, 2014 at 1:39 AM, Michael Armbrust wrote:
>
> > I believe you may need an assembly jar to run the ReplSuite. "sbt/sbt
> > assembly/assembly".
> >
> > Michael
> >
> >
> > On Mon, Apr 14, 2014 at 3:14 AM, Ye Xianjin <advancedxy@gmail.com(mailto:
> advancedxy@gmail.com)> wrote:
> >
> > > Hi, everyone:
> > > I am new to Spark development. I download spark's latest code from
> github.
> > > After running sbt/sbt assembly,
> > > I began running sbt/sbt test in the spark source code dir. But it
> failed
> > > running the repl module test.
> > >
> > > Here are some output details.
> > >
> > > command:
> > > sbt/sbt "test-only org.apache.spark.repl.*"
> > > output:
> > >
> > > [info] Loading project definition from
> > > /Volumes/MacintoshHD/github/spark/project/project
> > > [info] Loading project definition from
> > > /Volumes/MacintoshHD/github/spark/project
> > > [info] Set current project to root (in build
> > > file:/Volumes/MacintoshHD/github/spark/)
> > > [info] Passed: Total 0, Failed 0, Errors 0, Passed 0
> > > [info] No tests to run for graphx/test:testOnly
> > > [info] Passed: Total 0, Failed 0, Errors 0, Passed 0
> > > [info] No tests to run for bagel/test:testOnly
> > > [info] Passed: Total 0, Failed 0, Errors 0, Passed 0
> > > [info] No tests to run for streaming/test:testOnly
> > > [info] Passed: Total 0, Failed 0, Errors 0, Passed 0
> > > [info] No tests to run for mllib/test:testOnly
> > > [info] Passed: Total 0, Failed 0, Errors 0, Passed 0
> > > [info] No tests to run for catalyst/test:testOnly
> > > [info] Passed: Total 0, Failed 0, Errors 0, Passed 0
> > > [info] No tests to run for core/test:testOnly
> > > [info] Passed: Total 0, Failed 0, Errors 0, Passed 0
> > > [info] No tests to run for assembly/test:testOnly
> > > [info] Passed: Total 0, Failed 0, Errors 0, Passed 0
> > > [info] No tests to run for sql/test:testOnly
> > > [info] ExecutorClassLoaderSuite:
> > > 2014-04-14 16:59:31.247 java[8393:1003] Unable to load realm info from
> > > SCDynamicStore
> > > [info] - child first *** FAILED *** (440 milliseconds)
> > > [info] java.lang.ClassNotFoundException: ReplFakeClass2
> > > [info] at java.lang.ClassLoader.findClass(ClassLoader.java:364)
> > > [info] at org.apache.spark.util.ParentClassLoader.findClass(ParentClassLoader.scala:26)
> > > [info] at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
> > > [info] at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
> > > [info] at org.apache.spark.util.ParentClassLoader.loadClass(ParentClassLoader.scala:30)
> > > [info] at org.apache.spark.repl.ExecutorClassLoader$$anonfun$findClass$1.apply(ExecutorClassLoader.scala:57)
> > > [info] at org.apache.spark.repl.ExecutorClassLoader$$anonfun$findClass$1.apply(ExecutorClassLoader.scala:57)
> > > [info] at scala.Option.getOrElse(Option.scala:120)
> > > [info] at org.apache.spark.repl.ExecutorClassLoader.findClass(ExecutorClassLoader.scala:57)
> > > [info] at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
> > > [info] at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
> > > [info] at org.apache.spark.repl.ExecutorClassLoaderSuite$$anonfun$1.apply$mcV$sp(ExecutorClassLoaderSuite.scala:47)
> > > [info] at org.apache.spark.repl.ExecutorClassLoaderSuite$$anonfun$1.apply(ExecutorClassLoaderSuite.scala:44)
> > > [info] at org.apache.spark.repl.ExecutorClassLoaderSuite$$anonfun$1.apply(ExecutorClassLoaderSuite.scala:44)
> > > [info] at org.scalatest.FunSuite$$anon$1.apply(FunSuite.scala:1265)
> > > [info] at org.scalatest.Suite$class.withFixture(Suite.scala:1974)
> > > [info] at org.apache.spark.repl.ExecutorClassLoaderSuite.withFixture(ExecutorClassLoaderSuite.scala:30)
> > > [info] at org.scalatest.FunSuite$class.invokeWithFixture$1(FunSuite.scala:1262)
> > > [info] at org.scalatest.FunSuite$$anonfun$runTest$1.apply(FunSuite.scala:1271)
> > > [info] at org.scalatest.FunSuite$$anonfun$runTest$1.apply(FunSuite.scala:1271)
> > > [info] at org.scalatest.SuperEngine.runTestImpl(Engine.scala:198)
> > > [info] at org.scalatest.FunSuite$class.runTest(FunSuite.scala:1271)
> > > [info] at org.apache.spark.repl.ExecutorClassLoaderSuite.runTest(ExecutorClassLoaderSuite.scala:30)
> > > [info] at org.scalatest.FunSuite$$anonfun$runTests$1.apply(FunSuite.scala:1304)
> > > [info] at org.scalatest.FunSuite$$anonfun$runTests$1.apply(FunSuite.scala:1304)
> > > [info] at org.scalatest.SuperEngine$$anonfun$org$scalatest$SuperEngine$$runTestsInBranch$1.apply(Engine.scala:260)
> > > [info] at org.scalatest.SuperEngine$$anonfun$org$scalatest$SuperEngine$$runTestsInBranch$1.apply(Engine.scala:249)
> > > [info] at scala.collection.immutable.List.foreach(List.scala:318)
> > > [info] at org.scalatest.SuperEngine.org$scalatest$SuperEngine$$runTestsInBranch(Engine.scala:249)
> > > [info] at org.scalatest.SuperEngine.runTestsImpl(Engine.scala:326)
> > > [info] at org.scalatest.FunSuite$class.runTests(FunSuite.scala:1304)
> > > [info] at org.apache.spark.repl.ExecutorClassLoaderSuite.runTests(ExecutorClassLoaderSuite.scala:30)
> > > [info] at org.scalatest.Suite$class.run(Suite.scala:2303)
> > > [info] at org.apache.spark.repl.ExecutorClassLoaderSuite.org$scalatest$FunSuite$$super$run(ExecutorClassLoaderSuite.scala:30)
> > > [info] at org.scalatest.FunSuite$$anonfun$run$1.apply(FunSuite.scala:1310)
> > > [info] at org.scalatest.FunSuite$$anonfun$run$1.apply(FunSuite.scala:1310)
> > > [info] at org.scalatest.SuperEngine.runImpl(Engine.scala:362)
> > > [info] at org.scalatest.FunSuite$class.run(FunSuite.scala:1310)
> > > [info] at org.apache.spark.repl.ExecutorClassLoaderSuite.org$scalatest$BeforeAndAfterAll$$super$run(ExecutorClassLoaderSuite.scala:30)
> > > [info] at org.scalatest.BeforeAndAfterAll$class.run(BeforeAndAfterAll.scala:213)
> > > [info] at org.apache.spark.repl.ExecutorClassLoaderSuite.run(ExecutorClassLoaderSuite.scala:30)
> > > [info] at org.scalatest.tools.ScalaTestFramework$ScalaTestRunner.run(ScalaTestFramework.scala:214)
> > > [info] at sbt.RunnerWrapper$1.runRunner2(FrameworkWrapper.java:220)
> > > [info] at sbt.RunnerWrapper$1.execute(FrameworkWrapper.java:233)
> > > [info] at sbt.ForkMain$Run.runTest(ForkMain.java:243)
> > > [info] at sbt.ForkMain$Run.runTestSafe(ForkMain.java:214)
> > > [info] at sbt.ForkMain$Run.runTests(ForkMain.java:190)
> > > [info] at sbt.ForkMain$Run.run(ForkMain.java:257)
> > > [info] at sbt.ForkMain.main(ForkMain.java:99)
> > > [info] - parent first *** FAILED *** (59 milliseconds)
> > > [info] java.lang.ClassNotFoundException: ReplFakeClass1
> > > ...
> > > [info] Cause: java.lang.ClassNotFoundException: ReplFakeClass1
> > > ...
> > > [info] - child first can fall back *** FAILED *** (39 milliseconds)
> > > [info] java.lang.ClassNotFoundException: ReplFakeClass3
> > > ...
> > > [info] - child first can fail (46 milliseconds)
> > > [info] ReplSuite:
> > > [info] - propagation of local properties (9 seconds, 353 milliseconds)
> > > [info] - simple foreach with accumulator (7 seconds, 608 milliseconds)
> > > [info] - external vars (5 seconds, 783 milliseconds)
> > > [info] - external classes (4 seconds, 341 milliseconds)
> > > [info] - external functions (4 seconds, 106 milliseconds)
> > > [info] - external functions that access vars (4 seconds, 538 milliseconds)
> > > [info] - broadcast vars (4 seconds, 155 milliseconds)
> > > [info] - interacting with files (3 seconds, 376 milliseconds)
> > > Exception in thread "Connection manager future execution context-0"
> > >
> > >
> > > Some output is omitted.
> > >
> > > Here is some more information:
> > > ReplFakeClass1.class is at {spark_source_dir}/repl/ReplFakeClass1.class,
> > > and the same goes for ReplFakeClass2 and 3.
> > > ReplSuite failed while running test("local-cluster mode"). The first run
> > > of this test threw an OOM error; the exception shown above is from the
> > > second try.
> > > The JVM options for test("local-cluster mode") are '-Xms512M -Xmx512M',
> > > which I can see in the corresponding stderr log.
> > > I have a .sbtconfig file in my home dir. The content is
> > > export SBT_OPTS="-XX:+CMSClassUnloadingEnabled -XX:PermSize=5120M
> > > -XX:MaxPermSize=10240M"
> > >
> > >
> > > The test run hung after the ReplSuite failure; I had to Ctrl-C to stop it.
> > >
> > > Thank you for your advice.
> > >
> > >
> > >
> > > --
> > > Ye Xianjin
> > > Sent with Sparrow (http://www.sparrowmailapp.com/?sig)
> > >
> >
> >
> >
>
>
>

Re: Tests failed after assembling the latest code from github

Posted by Ye Xianjin <ad...@gmail.com>.
Thank you for your reply. 

After building the assembly jar, the repl tests still failed. The error output is the same as what I posted before. 
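
For reference, the assembly jar I am talking about is the one that sbt/sbt assembly/assembly leaves under the assembly module's target directory; the exact file name depends on the Spark, Scala and Hadoop versions used for the build, so it looks roughly like:

assembly/target/scala-2.10/spark-assembly-<spark-version>-hadoop<hadoop-version>.jar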

-- 
Ye Xianjin
Sent with Sparrow (http://www.sparrowmailapp.com/?sig)


On Tuesday, April 15, 2014 at 1:39 AM, Michael Armbrust wrote:

> I believe you may need an assembly jar to run the ReplSuite. "sbt/sbt
> assembly/assembly".
> 
> Michael
> 
> 
> On Mon, Apr 14, 2014 at 3:14 AM, Ye Xianjin <advancedxy@gmail.com> wrote:
> 
> > Hi, everyone:
> > I am new to Spark development. I download spark's latest code from github.
> > After running sbt/sbt assembly,
> > I began running sbt/sbt test in the spark source code dir. But it failed
> > running the repl module test.
> > 
> > Here are some output details.
> > 
> > command:
> > sbt/sbt "test-only org.apache.spark.repl.*"
> > output:
> > 
> > [info] Loading project definition from
> > /Volumes/MacintoshHD/github/spark/project/project
> > [info] Loading project definition from
> > /Volumes/MacintoshHD/github/spark/project
> > [info] Set current project to root (in build
> > file:/Volumes/MacintoshHD/github/spark/)
> > [info] Passed: Total 0, Failed 0, Errors 0, Passed 0
> > [info] No tests to run for graphx/test:testOnly
> > [info] Passed: Total 0, Failed 0, Errors 0, Passed 0
> > [info] No tests to run for bagel/test:testOnly
> > [info] Passed: Total 0, Failed 0, Errors 0, Passed 0
> > [info] No tests to run for streaming/test:testOnly
> > [info] Passed: Total 0, Failed 0, Errors 0, Passed 0
> > [info] No tests to run for mllib/test:testOnly
> > [info] Passed: Total 0, Failed 0, Errors 0, Passed 0
> > [info] No tests to run for catalyst/test:testOnly
> > [info] Passed: Total 0, Failed 0, Errors 0, Passed 0
> > [info] No tests to run for core/test:testOnly
> > [info] Passed: Total 0, Failed 0, Errors 0, Passed 0
> > [info] No tests to run for assembly/test:testOnly
> > [info] Passed: Total 0, Failed 0, Errors 0, Passed 0
> > [info] No tests to run for sql/test:testOnly
> > [info] ExecutorClassLoaderSuite:
> > 2014-04-14 16:59:31.247 java[8393:1003] Unable to load realm info from
> > SCDynamicStore
> > [info] - child first *** FAILED *** (440 milliseconds)
> > [info] java.lang.ClassNotFoundException: ReplFakeClass2
> > [info] at java.lang.ClassLoader.findClass(ClassLoader.java:364)
> > [info] at
> > org.apache.spark.util.ParentClassLoader.findClass(ParentClassLoader.scala:26)
> > [info] at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
> > [info] at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
> > [info] at
> > org.apache.spark.util.ParentClassLoader.loadClass(ParentClassLoader.scala:30)
> > [info] at
> > org.apache.spark.repl.ExecutorClassLoader$$anonfun$findClass$1.apply(ExecutorClassLoader.scala:57)
> > [info] at
> > org.apache.spark.repl.ExecutorClassLoader$$anonfun$findClass$1.apply(ExecutorClassLoader.scala:57)
> > [info] at scala.Option.getOrElse(Option.scala:120)
> > [info] at
> > org.apache.spark.repl.ExecutorClassLoader.findClass(ExecutorClassLoader.scala:57)
> > [info] at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
> > [info] at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
> > [info] at
> > org.apache.spark.repl.ExecutorClassLoaderSuite$$anonfun$1.apply$mcV$sp(ExecutorClassLoaderSuite.scala:47)
> > [info] at
> > org.apache.spark.repl.ExecutorClassLoaderSuite$$anonfun$1.apply(ExecutorClassLoaderSuite.scala:44)
> > [info] at
> > org.apache.spark.repl.ExecutorClassLoaderSuite$$anonfun$1.apply(ExecutorClassLoaderSuite.scala:44)
> > [info] at org.scalatest.FunSuite$$anon$1.apply(FunSuite.scala:1265)
> > [info] at org.scalatest.Suite$class.withFixture(Suite.scala:1974)
> > [info] at
> > org.apache.spark.repl.ExecutorClassLoaderSuite.withFixture(ExecutorClassLoaderSuite.scala:30)
> > [info] at
> > org.scalatest.FunSuite$class.invokeWithFixture$1(FunSuite.scala:1262)
> > [info] at
> > org.scalatest.FunSuite$$anonfun$runTest$1.apply(FunSuite.scala:1271)
> > [info] at
> > org.scalatest.FunSuite$$anonfun$runTest$1.apply(FunSuite.scala:1271)
> > [info] at org.scalatest.SuperEngine.runTestImpl(Engine.scala:198)
> > [info] at org.scalatest.FunSuite$class.runTest(FunSuite.scala:1271)
> > [info] at
> > org.apache.spark.repl.ExecutorClassLoaderSuite.runTest(ExecutorClassLoaderSuite.scala:30)
> > [info] at
> > org.scalatest.FunSuite$$anonfun$runTests$1.apply(FunSuite.scala:1304)
> > [info] at
> > org.scalatest.FunSuite$$anonfun$runTests$1.apply(FunSuite.scala:1304)
> > [info] at
> > org.scalatest.SuperEngine$$anonfun$org$scalatest$SuperEngine$$runTestsInBranch$1.apply(Engine.scala:260)
> > [info] at
> > org.scalatest.SuperEngine$$anonfun$org$scalatest$SuperEngine$$runTestsInBranch$1.apply(Engine.scala:249)
> > [info] at scala.collection.immutable.List.foreach(List.scala:318)
> > [info] at org.scalatest.SuperEngine.org$scalatest$SuperEngine$$runTestsInBranch(Engine.scala:249)
> > [info] at org.scalatest.SuperEngine.runTestsImpl(Engine.scala:326)
> > [info] at org.scalatest.FunSuite$class.runTests(FunSuite.scala:1304)
> > [info] at
> > org.apache.spark.repl.ExecutorClassLoaderSuite.runTests(ExecutorClassLoaderSuite.scala:30)
> > [info] at org.scalatest.Suite$class.run(Suite.scala:2303)
> > [info] at org.apache.spark.repl.ExecutorClassLoaderSuite.org$scalatest$FunSuite$$super$run(ExecutorClassLoaderSuite.scala:30)
> > [info] at
> > org.scalatest.FunSuite$$anonfun$run$1.apply(FunSuite.scala:1310)
> > [info] at
> > org.scalatest.FunSuite$$anonfun$run$1.apply(FunSuite.scala:1310)
> > [info] at org.scalatest.SuperEngine.runImpl(Engine.scala:362)
> > [info] at org.scalatest.FunSuite$class.run(FunSuite.scala:1310)
> > [info] at org.apache.spark.repl.ExecutorClassLoaderSuite.org$scalatest$BeforeAndAfterAll$$super$run(ExecutorClassLoaderSuite.scala:30)
> > [info] at
> > org.scalatest.BeforeAndAfterAll$class.run(BeforeAndAfterAll.scala:213)
> > [info] at
> > org.apache.spark.repl.ExecutorClassLoaderSuite.run(ExecutorClassLoaderSuite.scala:30)
> > [info] at
> > org.scalatest.tools.ScalaTestFramework$ScalaTestRunner.run(ScalaTestFramework.scala:214)
> > [info] at sbt.RunnerWrapper$1.runRunner2(FrameworkWrapper.java:220)
> > [info] at sbt.RunnerWrapper$1.execute(FrameworkWrapper.java:233)
> > [info] at sbt.ForkMain$Run.runTest(ForkMain.java:243)
> > [info] at sbt.ForkMain$Run.runTestSafe(ForkMain.java:214)
> > [info] at sbt.ForkMain$Run.runTests(ForkMain.java:190)
> > [info] at sbt.ForkMain$Run.run(ForkMain.java:257)
> > [info] at sbt.ForkMain.main(ForkMain.java:99)
> > [info] - parent first *** FAILED *** (59 milliseconds)
> > [info] java.lang.ClassNotFoundException: ReplFakeClass1
> > ...
> > [info] Cause: java.lang.ClassNotFoundException: ReplFakeClass1
> > ...
> > [info] - child first can fall back *** FAILED *** (39 milliseconds)
> > [info] java.lang.ClassNotFoundException: ReplFakeClass3
> > ...
> > [info] - child first can fail (46 milliseconds)
> > [info] ReplSuite:
> > [info] - propagation of local properties (9 seconds, 353 milliseconds)
> > [info] - simple foreach with accumulator (7 seconds, 608 milliseconds)
> > [info] - external vars (5 seconds, 783 milliseconds)
> > [info] - external classes (4 seconds, 341 milliseconds)
> > [info] - external functions (4 seconds, 106 milliseconds)
> > [info] - external functions that access vars (4 seconds, 538 milliseconds)
> > [info] - broadcast vars (4 seconds, 155 milliseconds)
> > [info] - interacting with files (3 seconds, 376 milliseconds)
> > Exception in thread "Connection manager future execution context-0"
> > 
> > 
> > Some output is omitted.
> > 
> > Here is some more information:
> > ReplFakeClass1.class is at {spark_source_dir}/repl/ReplFakeClass1.class,
> > and the same goes for ReplFakeClass2 and 3.
> > ReplSuite failed while running test("local-cluster mode"). The first run
> > of this test threw an OOM error; the exception shown above is from the
> > second try.
> > The JVM options for test("local-cluster mode") are '-Xms512M -Xmx512M',
> > which I can see in the corresponding stderr log.
> > I have a .sbtconfig file in my home dir. The content is
> > export SBT_OPTS="-XX:+CMSClassUnloadingEnabled -XX:PermSize=5120M
> > -XX:MaxPermSize=10240M"
> > 
> > 
> > The test run hung after the ReplSuite failure; I had to Ctrl-C to stop it.
> > 
> > Thank you for your advice.
> > 
> > 
> > 
> > --
> > Ye Xianjin
> > Sent with Sparrow (http://www.sparrowmailapp.com/?sig)
> > 
> 
> 
> 



Re: Tests failed after assembling the latest code from github

Posted by Michael Armbrust <mi...@databricks.com>.
I believe you may need an assembly jar to run the ReplSuite. "sbt/sbt
assembly/assembly".

Michael


On Mon, Apr 14, 2014 at 3:14 AM, Ye Xianjin <ad...@gmail.com> wrote:

> Hi, everyone:
> I am new to Spark development. I download spark's latest code from github.
> After running sbt/sbt assembly,
> I began running  sbt/sbt test in the spark source code dir. But it failed
> running the repl module test.
>
> Here are some output details.
>
> command:
> sbt/sbt "test-only org.apache.spark.repl.*"
> output:
>
> [info] Loading project definition from
> /Volumes/MacintoshHD/github/spark/project/project
> [info] Loading project definition from
> /Volumes/MacintoshHD/github/spark/project
> [info] Set current project to root (in build
> file:/Volumes/MacintoshHD/github/spark/)
> [info] Passed: Total 0, Failed 0, Errors 0, Passed 0
> [info] No tests to run for graphx/test:testOnly
> [info] Passed: Total 0, Failed 0, Errors 0, Passed 0
> [info] No tests to run for bagel/test:testOnly
> [info] Passed: Total 0, Failed 0, Errors 0, Passed 0
> [info] No tests to run for streaming/test:testOnly
> [info] Passed: Total 0, Failed 0, Errors 0, Passed 0
> [info] No tests to run for mllib/test:testOnly
> [info] Passed: Total 0, Failed 0, Errors 0, Passed 0
> [info] No tests to run for catalyst/test:testOnly
> [info] Passed: Total 0, Failed 0, Errors 0, Passed 0
> [info] No tests to run for core/test:testOnly
> [info] Passed: Total 0, Failed 0, Errors 0, Passed 0
> [info] No tests to run for assembly/test:testOnly
> [info] Passed: Total 0, Failed 0, Errors 0, Passed 0
> [info] No tests to run for sql/test:testOnly
> [info] ExecutorClassLoaderSuite:
> 2014-04-14 16:59:31.247 java[8393:1003] Unable to load realm info from
> SCDynamicStore
> [info] - child first *** FAILED *** (440 milliseconds)
> [info]   java.lang.ClassNotFoundException: ReplFakeClass2
> [info]   at java.lang.ClassLoader.findClass(ClassLoader.java:364)
> [info]   at
> org.apache.spark.util.ParentClassLoader.findClass(ParentClassLoader.scala:26)
> [info]   at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
> [info]   at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
> [info]   at
> org.apache.spark.util.ParentClassLoader.loadClass(ParentClassLoader.scala:30)
> [info]   at
> org.apache.spark.repl.ExecutorClassLoader$$anonfun$findClass$1.apply(ExecutorClassLoader.scala:57)
> [info]   at
> org.apache.spark.repl.ExecutorClassLoader$$anonfun$findClass$1.apply(ExecutorClassLoader.scala:57)
> [info]   at scala.Option.getOrElse(Option.scala:120)
> [info]   at
> org.apache.spark.repl.ExecutorClassLoader.findClass(ExecutorClassLoader.scala:57)
> [info]   at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
> [info]   at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
> [info]   at
> org.apache.spark.repl.ExecutorClassLoaderSuite$$anonfun$1.apply$mcV$sp(ExecutorClassLoaderSuite.scala:47)
> [info]   at
> org.apache.spark.repl.ExecutorClassLoaderSuite$$anonfun$1.apply(ExecutorClassLoaderSuite.scala:44)
> [info]   at
> org.apache.spark.repl.ExecutorClassLoaderSuite$$anonfun$1.apply(ExecutorClassLoaderSuite.scala:44)
> [info]   at org.scalatest.FunSuite$$anon$1.apply(FunSuite.scala:1265)
> [info]   at org.scalatest.Suite$class.withFixture(Suite.scala:1974)
> [info]   at
> org.apache.spark.repl.ExecutorClassLoaderSuite.withFixture(ExecutorClassLoaderSuite.scala:30)
> [info]   at
> org.scalatest.FunSuite$class.invokeWithFixture$1(FunSuite.scala:1262)
> [info]   at
> org.scalatest.FunSuite$$anonfun$runTest$1.apply(FunSuite.scala:1271)
> [info]   at
> org.scalatest.FunSuite$$anonfun$runTest$1.apply(FunSuite.scala:1271)
> [info]   at org.scalatest.SuperEngine.runTestImpl(Engine.scala:198)
> [info]   at org.scalatest.FunSuite$class.runTest(FunSuite.scala:1271)
> [info]   at
> org.apache.spark.repl.ExecutorClassLoaderSuite.runTest(ExecutorClassLoaderSuite.scala:30)
> [info]   at
> org.scalatest.FunSuite$$anonfun$runTests$1.apply(FunSuite.scala:1304)
> [info]   at
> org.scalatest.FunSuite$$anonfun$runTests$1.apply(FunSuite.scala:1304)
> [info]   at
> org.scalatest.SuperEngine$$anonfun$org$scalatest$SuperEngine$$runTestsInBranch$1.apply(Engine.scala:260)
> [info]   at
> org.scalatest.SuperEngine$$anonfun$org$scalatest$SuperEngine$$runTestsInBranch$1.apply(Engine.scala:249)
> [info]   at scala.collection.immutable.List.foreach(List.scala:318)
> [info]   at org.scalatest.SuperEngine.org
> $scalatest$SuperEngine$$runTestsInBranch(Engine.scala:249)
> [info]   at org.scalatest.SuperEngine.runTestsImpl(Engine.scala:326)
> [info]   at org.scalatest.FunSuite$class.runTests(FunSuite.scala:1304)
> [info]   at
> org.apache.spark.repl.ExecutorClassLoaderSuite.runTests(ExecutorClassLoaderSuite.scala:30)
> [info]   at org.scalatest.Suite$class.run(Suite.scala:2303)
> [info]   at org.apache.spark.repl.ExecutorClassLoaderSuite.org
> $scalatest$FunSuite$$super$run(ExecutorClassLoaderSuite.scala:30)
> [info]   at
> org.scalatest.FunSuite$$anonfun$run$1.apply(FunSuite.scala:1310)
> [info]   at
> org.scalatest.FunSuite$$anonfun$run$1.apply(FunSuite.scala:1310)
> [info]   at org.scalatest.SuperEngine.runImpl(Engine.scala:362)
> [info]   at org.scalatest.FunSuite$class.run(FunSuite.scala:1310)
> [info]   at org.apache.spark.repl.ExecutorClassLoaderSuite.org
> $scalatest$BeforeAndAfterAll$$super$run(ExecutorClassLoaderSuite.scala:30)
> [info]   at
> org.scalatest.BeforeAndAfterAll$class.run(BeforeAndAfterAll.scala:213)
> [info]   at
> org.apache.spark.repl.ExecutorClassLoaderSuite.run(ExecutorClassLoaderSuite.scala:30)
> [info]   at
> org.scalatest.tools.ScalaTestFramework$ScalaTestRunner.run(ScalaTestFramework.scala:214)
> [info]   at sbt.RunnerWrapper$1.runRunner2(FrameworkWrapper.java:220)
> [info]   at sbt.RunnerWrapper$1.execute(FrameworkWrapper.java:233)
> [info]   at sbt.ForkMain$Run.runTest(ForkMain.java:243)
> [info]   at sbt.ForkMain$Run.runTestSafe(ForkMain.java:214)
> [info]   at sbt.ForkMain$Run.runTests(ForkMain.java:190)
> [info]   at sbt.ForkMain$Run.run(ForkMain.java:257)
> [info]   at sbt.ForkMain.main(ForkMain.java:99)
> [info] - parent first *** FAILED *** (59 milliseconds)
> [info]   java.lang.ClassNotFoundException: ReplFakeClass1
> ...
> [info]   Cause: java.lang.ClassNotFoundException: ReplFakeClass1
> ...
> [info] - child first can fall back *** FAILED *** (39 milliseconds)
> [info]   java.lang.ClassNotFoundException: ReplFakeClass3
> ...
> [info] - child first can fail (46 milliseconds)
> [info] ReplSuite:
> [info] - propagation of local properties (9 seconds, 353 milliseconds)
> [info] - simple foreach with accumulator (7 seconds, 608 milliseconds)
> [info] - external vars (5 seconds, 783 milliseconds)
> [info] - external classes (4 seconds, 341 milliseconds)
> [info] - external functions (4 seconds, 106 milliseconds)
> [info] - external functions that access vars (4 seconds, 538 milliseconds)
> [info] - broadcast vars (4 seconds, 155 milliseconds)
> [info] - interacting with files (3 seconds, 376 milliseconds)
> Exception in thread "Connection manager future execution context-0"
>
>
> Some output is omitted.
>
> Here is some more information:
> ReplFakeClass1.class is at {spark_source_dir}/repl/ReplFakeClass1.class,
> and the same goes for ReplFakeClass2 and 3.
> ReplSuite failed while running test("local-cluster mode"). The first run
> of this test threw an OOM error; the exception shown above is from the
> second try.
> The JVM options for test("local-cluster mode") are '-Xms512M -Xmx512M',
> which I can see in the corresponding stderr log.
> I have a .sbtconfig file in my home dir. The content is
> export SBT_OPTS="-XX:+CMSClassUnloadingEnabled -XX:PermSize=5120M
> -XX:MaxPermSize=10240M"
>
>
> The test run hung after the ReplSuite failure; I had to Ctrl-C to stop it.
>
> Thank you for your advice.
>
>
>
> --
> Ye Xianjin
> Sent with Sparrow (http://www.sparrowmailapp.com/?sig)
>
>