Posted to issues@spark.apache.org by "Tathagata Das (JIRA)" <ji...@apache.org> on 2015/06/01 22:34:17 UTC

[jira] [Commented] (SPARK-1603) Flaky test: o.a.s.streaming.StreamingContextSuite

    [ https://issues.apache.org/jira/browse/SPARK-1603?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14567947#comment-14567947 ] 

Tathagata Das commented on SPARK-1603:
--------------------------------------

[~andrewor14] Do you see this error any more?

> Flaky test: o.a.s.streaming.StreamingContextSuite
> -------------------------------------------------
>
>                 Key: SPARK-1603
>                 URL: https://issues.apache.org/jira/browse/SPARK-1603
>             Project: Spark
>          Issue Type: Bug
>          Components: Streaming
>    Affects Versions: 1.3.0
>            Reporter: Nan Zhu
>            Assignee: Tathagata Das
>            Priority: Critical
>              Labels: flaky-test
>
> When Jenkins was testing 5 PRs at the same time, the test results in my PR showed that "stop gracefully" in StreamingContextSuite failed,
> with the following stack trace:
> {quote}
>  stop gracefully *** FAILED *** (8 seconds, 350 milliseconds)
> [info]   akka.actor.InvalidActorNameException: actor name [JobScheduler] is not unique!
> [info]   at akka.actor.dungeon.ChildrenContainer$TerminatingChildrenContainer.reserve(ChildrenContainer.scala:192)
> [info]   at akka.actor.dungeon.Children$class.reserveChild(Children.scala:77)
> [info]   at akka.actor.ActorCell.reserveChild(ActorCell.scala:338)
> [info]   at akka.actor.dungeon.Children$class.makeChild(Children.scala:186)
> [info]   at akka.actor.dungeon.Children$class.attachChild(Children.scala:42)
> [info]   at akka.actor.ActorCell.attachChild(ActorCell.scala:338)
> [info]   at akka.actor.ActorSystemImpl.actorOf(ActorSystem.scala:518)
> [info]   at org.apache.spark.streaming.scheduler.JobScheduler.start(JobScheduler.scala:57)
> [info]   at org.apache.spark.streaming.StreamingContext.start(StreamingContext.scala:434)
> [info]   at org.apache.spark.streaming.StreamingContextSuite$$anonfun$14$$anonfun$apply$mcV$sp$3.apply$mcVI$sp(StreamingContextSuite.scala:174)
> [info]   at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)
> [info]   at org.apache.spark.streaming.StreamingContextSuite$$anonfun$14.apply$mcV$sp(StreamingContextSuite.scala:163)
> [info]   at org.apache.spark.streaming.StreamingContextSuite$$anonfun$14.apply(StreamingContextSuite.scala:159)
> [info]   at org.apache.spark.streaming.StreamingContextSuite$$anonfun$14.apply(StreamingContextSuite.scala:159)
> [info]   at org.scalatest.FunSuite$$anon$1.apply(FunSuite.scala:1265)
> [info]   at org.scalatest.Suite$class.withFixture(Suite.scala:1974)
> [info]   at org.apache.spark.streaming.StreamingContextSuite.withFixture(StreamingContextSuite.scala:34)
> [info]   at org.scalatest.FunSuite$class.invokeWithFixture$1(FunSuite.scala:1262)
> [info]   at org.scalatest.FunSuite$$anonfun$runTest$1.apply(FunSuite.scala:1271)
> [info]   at org.scalatest.FunSuite$$anonfun$runTest$1.apply(FunSuite.scala:1271)
> [info]   at org.scalatest.SuperEngine.runTestImpl(Engine.scala:198)
> [info]   at org.scalatest.FunSuite$class.runTest(FunSuite.scala:1271)
> [info]   at org.apache.spark.streaming.StreamingContextSuite.org$scalatest$BeforeAndAfter$$super$runTest(StreamingContextSuite.scala:34)
> [info]   at org.scalatest.BeforeAndAfter$class.runTest(BeforeAndAfter.scala:171)
> [info]   at org.apache.spark.streaming.StreamingContextSuite.runTest(StreamingContextSuite.scala:34)
> [info]   at org.scalatest.FunSuite$$anonfun$runTests$1.apply(FunSuite.scala:1304)
> [info]   at org.scalatest.FunSuite$$anonfun$runTests$1.apply(FunSuite.scala:1304)
> [info]   at org.scalatest.SuperEngine$$anonfun$org$scalatest$SuperEngine$$runTestsInBranch$1.apply(Engine.scala:260)
> [info]   at org.scalatest.SuperEngine$$anonfun$org$scalatest$SuperEngine$$runTestsInBranch$1.apply(Engine.scala:249)
> [info]   at scala.collection.immutable.List.foreach(List.scala:318)
> [info]   at org.scalatest.SuperEngine.org$scalatest$SuperEngine$$runTestsInBranch(Engine.scala:249)
> [info]   at org.scalatest.SuperEngine.runTestsImpl(Engine.scala:326)
> [info]   at org.scalatest.FunSuite$class.runTests(FunSuite.scala:1304)
> [info]   at org.apache.spark.streaming.StreamingContextSuite.runTests(StreamingContextSuite.scala:34)
> [info]   at org.scalatest.Suite$class.run(Suite.scala:2303)
> [info]   at org.apache.spark.streaming.StreamingContextSuite.org$scalatest$FunSuite$$super$run(StreamingContextSuite.scala:34)
> [info]   at org.scalatest.FunSuite$$anonfun$run$1.apply(FunSuite.scala:1310)
> [info]   at org.scalatest.FunSuite$$anonfun$run$1.apply(FunSuite.scala:1310)
> [info]   at org.scalatest.SuperEngine.runImpl(Engine.scala:362)
> [info]   at org.scalatest.FunSuite$class.run(FunSuite.scala:1310)
> [info]   at org.apache.spark.streaming.StreamingContextSuite.org$scalatest$BeforeAndAfter$$super$run(StreamingContextSuite.scala:34)
> [info]   at org.scalatest.BeforeAndAfter$class.run(BeforeAndAfter.scala:208)
> [info]   at org.apache.spark.streaming.StreamingContextSuite.run(StreamingContextSuite.scala:34)
> [info]   at org.scalatest.tools.ScalaTestFramework$ScalaTestRunner.run(ScalaTestFramework.scala:214)
> [info]   at sbt.RunnerWrapper$1.runRunner2(FrameworkWrapper.java:223)
> [info]   at sbt.RunnerWrapper$1.execute(FrameworkWrapper.java:236)
> [info]   at sbt.ForkMain$Run$2.call(ForkMain.java:294)
> [info]   at sbt.ForkMain$Run$2.call(ForkMain.java:284)
> [info]   at java.util.concurrent.FutureTask.run(FutureTask.java:262)
> [info]   at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
> [info]   at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
> [info]   at java.lang.Thread.run(Thread.java:744)
> {quote}
> I think we don't need to assign a fixed name to the JobScheduler actor; instead, we can just use an auto-generated name in Akka.
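
For illustration, here is a minimal Scala sketch of why the fixed name collides while an auto-generated one does not. This is hypothetical, not the actual Spark patch: Akka reserves a child's name until the old actor has fully terminated, so a quick stop/start cycle that reuses the name "JobScheduler" fails with InvalidActorNameException. The sketch simulates that reservation with a plain registry rather than a real ActorSystem.

```scala
// Hypothetical sketch: a name registry standing in for Akka's child-name
// reservation. Not Spark or Akka code.
object ActorNameDemo {
  private val reserved = scala.collection.mutable.Set[String]()
  private var counter = 0

  // Fixed name: a second "start" while the old name is still reserved
  // throws, mirroring akka.actor.InvalidActorNameException.
  def actorOf(name: String): String = {
    if (!reserved.add(name))
      throw new IllegalStateException(s"actor name [$name] is not unique!")
    name
  }

  // Auto-generated name (the reporter's suggestion): fresh every time,
  // so rapid restarts can never collide.
  def actorOf(): String = {
    counter += 1
    actorOf(s"$$a$counter")
  }
}
```

With a real ActorSystem the equivalent change is simply calling `system.actorOf(props)` instead of `system.actorOf(props, "JobScheduler")`, at the cost of losing the stable, human-readable actor path.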



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org