Posted to issues@spark.apache.org by "Shixiong Zhu (JIRA)" <ji...@apache.org> on 2018/01/27 00:19:00 UTC

[jira] [Created] (SPARK-23245) KafkaContinuousSourceSuite may hang forever

Shixiong Zhu created SPARK-23245:
------------------------------------

             Summary: KafkaContinuousSourceSuite may hang forever
                 Key: SPARK-23245
                 URL: https://issues.apache.org/jira/browse/SPARK-23245
             Project: Spark
          Issue Type: Bug
          Components: Structured Streaming, Tests
    Affects Versions: 2.3.0
            Reporter: Shixiong Zhu


The following stream execution thread is holding the lock on an IncrementalExecution instance (taken while computing the lazy val QueryExecution.toRdd) and is waiting for a Spark job to complete:

{code}

"stream execution thread for [id = 83790664-fd66-4645-b55a-37c17897c691, runId = febf6c2a-1372-4c83-998c-90984a9a02c2]" #2653 daemon prio=5 os_prio=0 tid=0x00007ff511ae2000 nid=0xcde1 waiting on condition [0x00007ff32ebbb000]
 java.lang.Thread.State: WAITING (parking)
 at sun.misc.Unsafe.park(Native Method)
 - parking to wait for <0x000000071a25fa80> (a scala.concurrent.impl.Promise$CompletionLatch)
 at java.util.concurrent.locks.LockSupport.park(LockSupport.java:175)
 at java.util.concurrent.locks.AbstractQueuedSynchronizer.parkAndCheckInterrupt(AbstractQueuedSynchronizer.java:836)
 at java.util.concurrent.locks.AbstractQueuedSynchronizer.doAcquireSharedInterruptibly(AbstractQueuedSynchronizer.java:997)
 at java.util.concurrent.locks.AbstractQueuedSynchronizer.acquireSharedInterruptibly(AbstractQueuedSynchronizer.java:1304)
 at scala.concurrent.impl.Promise$DefaultPromise.tryAwait(Promise.scala:202)
 at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:218)
 at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:153)
 at org.apache.spark.util.ThreadUtils$.awaitReady(ThreadUtils.scala:222)
 at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:731)
 at org.apache.spark.SparkContext.runJob(SparkContext.scala:2109)
 at org.apache.spark.sql.execution.datasources.v2.WriteToDataSourceV2Exec.doExecute(WriteToDataSourceV2.scala:78)
 at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:135)
 at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:127)
 at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$3.apply(SparkPlan.scala:167)
 at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
 at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:164)
 at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:127)
 at org.apache.spark.sql.execution.QueryExecution.toRdd$lzycompute(QueryExecution.scala:112)
 - locked <0x000000071a256e10> (a org.apache.spark.sql.execution.streaming.IncrementalExecution)
 at org.apache.spark.sql.execution.QueryExecution.toRdd(QueryExecution.scala:112)
 at org.apache.spark.sql.execution.streaming.continuous.ContinuousExecution$$anonfun$runContinuous$3$$anonfun$apply$1.apply(ContinuousExecution.scala:273)
 at org.apache.spark.sql.execution.streaming.continuous.ContinuousExecution$$anonfun$runContinuous$3$$anonfun$apply$1.apply(ContinuousExecution.scala:273)
 at org.apache.spark.sql.execution.SQLExecution$.withCustomExecutionEnv(SQLExecution.scala:88)
 at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:124)
 at org.apache.spark.sql.execution.streaming.continuous.ContinuousExecution$$anonfun$runContinuous$3.apply(ContinuousExecution.scala:273)
 at org.apache.spark.sql.execution.streaming.continuous.ContinuousExecution$$anonfun$runContinuous$3.apply(ContinuousExecution.scala:273)
 at org.apache.spark.sql.execution.streaming.ProgressReporter$class.reportTimeTaken(ProgressReporter.scala:271)
 at org.apache.spark.sql.execution.streaming.StreamExecution.reportTimeTaken(StreamExecution.scala:60)
 at org.apache.spark.sql.execution.streaming.continuous.ContinuousExecution.runContinuous(ContinuousExecution.scala:271)
 at org.apache.spark.sql.execution.streaming.continuous.ContinuousExecution.runActivatedStream(ContinuousExecution.scala:94)
 at org.apache.spark.sql.execution.streaming.StreamExecution.org$apache$spark$sql$execution$streaming$StreamExecution$$runStream(StreamExecution.scala:291)
 at org.apache.spark.sql.execution.streaming.StreamExecution$$anon$1.run(StreamExecution.scala:201)

{code}

 

But the test thread is blocked waiting for the same lock:

{code}

"pool-1-thread-1-ScalaTest-running-KafkaContinuousSourceSuite" #20 prio=5 os_prio=0 tid=0x00007ff5b4f1e800 nid=0x5566 waiting for monitor entry [0x00007ff51cffb000]
 java.lang.Thread.State: BLOCKED (on object monitor)
 at org.apache.spark.sql.execution.QueryExecution.executedPlan$lzycompute(QueryExecution.scala:109)
 - waiting to lock <0x000000071a256e10> (a org.apache.spark.sql.execution.streaming.IncrementalExecution)
 at org.apache.spark.sql.execution.QueryExecution.executedPlan(QueryExecution.scala:109)
 at org.apache.spark.sql.streaming.StreamTest$$anonfun$liftedTree1$1$1$$anonfun$apply$25.apply(StreamTest.scala:475)
 at org.apache.spark.sql.streaming.StreamTest$$anonfun$liftedTree1$1$1$$anonfun$apply$25.apply(StreamTest.scala:475)
 at org.scalatest.concurrent.Eventually$class.makeAValiantAttempt$1(Eventually.scala:395)
 at org.scalatest.concurrent.Eventually$class.tryTryAgain$1(Eventually.scala:409)
 at org.scalatest.concurrent.Eventually$class.eventually(Eventually.scala:439)
 at org.scalatest.concurrent.Eventually$.eventually(Eventually.scala:479)
 at org.scalatest.concurrent.Eventually$class.eventually(Eventually.scala:337)
 at org.scalatest.concurrent.Eventually$.eventually(Eventually.scala:479)
 at org.apache.spark.sql.streaming.StreamTest$class.eventually$1(StreamTest.scala:378)
 at org.apache.spark.sql.streaming.StreamTest$$anonfun$liftedTree1$1$1.apply(StreamTest.scala:474)
 at org.apache.spark.sql.streaming.StreamTest$$anonfun$liftedTree1$1$1.apply(StreamTest.scala:432)
 at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
 at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:48)
 at org.apache.spark.sql.streaming.StreamTest$class.liftedTree1$1(StreamTest.scala:432)
 at org.apache.spark.sql.streaming.StreamTest$class.testStream(StreamTest.scala:431)
 - locked <0x00000007199f3438> (a org.apache.spark.sql.kafka010.KafkaContinuousSourceSuite)
 at org.apache.spark.sql.kafka010.KafkaSourceTest.testStream(KafkaSourceSuite.scala:50)
 at org.apache.spark.sql.kafka010.KafkaSourceSuiteBase.org$apache$spark$sql$kafka010$KafkaSourceSuiteBase$$testFromEarliestOffsets(KafkaSourceSuite.scala:1056)
 at org.apache.spark.sql.kafka010.KafkaSourceSuiteBase$$anonfun$38$$anonfun$apply$8.apply$mcV$sp(KafkaSourceSuite.scala:668)
 at org.apache.spark.sql.kafka010.KafkaSourceSuiteBase$$anonfun$38$$anonfun$apply$8.apply(KafkaSourceSuite.scala:665)
 at org.apache.spark.sql.kafka010.KafkaSourceSuiteBase$$anonfun$38$$anonfun$apply$8.apply(KafkaSourceSuite.scala:665)
 at org.scalatest.OutcomeOf$class.outcomeOf(OutcomeOf.scala:85)
 at org.scalatest.OutcomeOf$.outcomeOf(OutcomeOf.scala:104)
 at org.scalatest.Transformer.apply(Transformer.scala:22)
 at org.scalatest.Transformer.apply(Transformer.scala:20)
 at org.scalatest.FunSuiteLike$$anon$1.apply(FunSuiteLike.scala:186)
 at org.apache.spark.SparkFunSuite.withFixture(SparkFunSuite.scala:108)
 at org.scalatest.FunSuiteLike$class.invokeWithFixture$1(FunSuiteLike.scala:183)
 at org.scalatest.FunSuiteLike$$anonfun$runTest$1.apply(FunSuiteLike.scala:196)
 at org.scalatest.FunSuiteLike$$anonfun$runTest$1.apply(FunSuiteLike.scala:196)
 at org.scalatest.SuperEngine.runTestImpl(Engine.scala:289)
 at org.scalatest.FunSuiteLike$class.runTest(FunSuiteLike.scala:196)
 at org.apache.spark.sql.kafka010.KafkaSourceTest.org$scalatest$BeforeAndAfterEach$$super$runTest(KafkaSourceSuite.scala:50)
 at org.scalatest.BeforeAndAfterEach$class.runTest(BeforeAndAfterEach.scala:221)
 at org.apache.spark.sql.kafka010.KafkaSourceTest.runTest(KafkaSourceSuite.scala:50)
 at org.scalatest.FunSuiteLike$$anonfun$runTests$1.apply(FunSuiteLike.scala:229)
 at org.scalatest.FunSuiteLike$$anonfun$runTests$1.apply(FunSuiteLike.scala:229)
 at org.scalatest.SuperEngine$$anonfun$traverseSubNodes$1$1.apply(Engine.scala:396)
 at org.scalatest.SuperEngine$$anonfun$traverseSubNodes$1$1.apply(Engine.scala:384)
 at scala.collection.immutable.List.foreach(List.scala:381)
 at org.scalatest.SuperEngine.traverseSubNodes$1(Engine.scala:384)
 at org.scalatest.SuperEngine.org$scalatest$SuperEngine$$runTestsInBranch(Engine.scala:379)
 at org.scalatest.SuperEngine.runTestsImpl(Engine.scala:461)
 at org.scalatest.FunSuiteLike$class.runTests(FunSuiteLike.scala:229)
 at org.scalatest.FunSuite.runTests(FunSuite.scala:1560)
 at org.scalatest.Suite$class.run(Suite.scala:1147)
 at org.scalatest.FunSuite.org$scalatest$FunSuiteLike$$super$run(FunSuite.scala:1560)
 at org.scalatest.FunSuiteLike$$anonfun$run$1.apply(FunSuiteLike.scala:233)
 at org.scalatest.FunSuiteLike$$anonfun$run$1.apply(FunSuiteLike.scala:233)
 at org.scalatest.SuperEngine.runImpl(Engine.scala:521)
 at org.scalatest.FunSuiteLike$class.run(FunSuiteLike.scala:233)
 at org.apache.spark.SparkFunSuite.org$scalatest$BeforeAndAfterAll$$super$run(SparkFunSuite.scala:56)
 at org.scalatest.BeforeAndAfterAll$class.liftedTree1$1(BeforeAndAfterAll.scala:213)
 at org.scalatest.BeforeAndAfterAll$class.run(BeforeAndAfterAll.scala:210)
 at org.apache.spark.SparkFunSuite.run(SparkFunSuite.scala:56)
 at org.scalatest.tools.Framework.org$scalatest$tools$Framework$$runSuite(Framework.scala:314)
 at org.scalatest.tools.Framework$ScalaTestTask.execute(Framework.scala:480)
 at sbt.ForkMain$Run$2.call(ForkMain.java:296)
 at sbt.ForkMain$Run$2.call(ForkMain.java:286)
 at java.util.concurrent.FutureTask.run(FutureTask.java:266)
 at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
 at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
 at java.lang.Thread.run(Thread.java:748)

{code}

 

The deadlock here is that StreamTest is blocked waiting for the lock held by the stream execution thread, so it can never stop the query, and the test hangs forever.
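
For illustration, here is a minimal, self-contained Scala sketch of the pattern behind the two dumps above. It is not Spark code: LazyValHangDemo, FakeQueryExecution and jobNeverFinishes are made-up names, while toRdd and executedPlan only mirror the real lazy vals on QueryExecution. In Scala 2.x a lazy val initializer runs while holding the enclosing object's monitor, so once the stream execution thread enters toRdd and blocks inside the initializer, any access to another lazy val on the same instance blocks as well.

{code}
import java.util.concurrent.CountDownLatch

object LazyValHangDemo {

  // Hypothetical stand-in for QueryExecution / IncrementalExecution.
  class FakeQueryExecution {
    private val jobNeverFinishes = new CountDownLatch(1)

    // The initializer blocks while holding this object's monitor,
    // analogous to toRdd$lzycompute running the continuous Spark job
    // under the lock in the first dump.
    lazy val toRdd: String = { jobNeverFinishes.await(); "rdd" }

    // A second lazy val on the same instance needs the same monitor,
    // analogous to executedPlan$lzycompute in the test thread's dump.
    lazy val executedPlan: String = "plan"
  }

  def main(args: Array[String]): Unit = {
    val qe = new FakeQueryExecution

    val streamThread = new Thread(new Runnable {
      def run(): Unit = { qe.toRdd; () } // takes the monitor and never releases it
    }, "stream-execution")
    streamThread.setDaemon(true)
    streamThread.start()
    Thread.sleep(500) // let the stream thread enter the lazy val first

    val testThread = new Thread(new Runnable {
      def run(): Unit = { qe.executedPlan; () }
    }, "test-thread")
    testThread.setDaemon(true)
    testThread.start()
    testThread.join(2000) // give it time to block, then inspect

    // Prints BLOCKED: the test thread is stuck on the same object monitor,
    // exactly like the ScalaTest thread above, and would wait forever.
    println(s"test thread state: ${testThread.getState}")
  }
}
{code}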


