Posted to issues@spark.apache.org by "Xiangrui Meng (JIRA)" <ji...@apache.org> on 2015/09/18 00:09:04 UTC

[jira] [Reopened] (SPARK-8567) Flaky test: o.a.s.sql.hive.HiveSparkSubmitSuite --jars

     [ https://issues.apache.org/jira/browse/SPARK-8567?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Xiangrui Meng reopened SPARK-8567:
----------------------------------

Saw more failures recently:

https://amplab.cs.berkeley.edu/jenkins/job/Spark-1.4-SBT/AMPLAB_JENKINS_BUILD_PROFILE=hadoop2.3,label=centos/449/testReport/junit/org.apache.spark.sql.hive/HiveSparkSubmitSuite/SPARK_8368__includes_jars_passed_in_through___jars/

{code}
org.apache.spark.sql.hive.HiveSparkSubmitSuite.SPARK-8368: includes jars passed in through --jars

Failing for the past 1 build (Since Aborted#449 )
Took 3 min 1 sec.
Error Message

The code passed to failAfter did not complete within 180 seconds.
Stacktrace

sbt.ForkMain$ForkError: The code passed to failAfter did not complete within 180 seconds.
	at org.scalatest.concurrent.Timeouts$$anonfun$failAfter$1.apply(Timeouts.scala:249)
	at org.scalatest.concurrent.Timeouts$$anonfun$failAfter$1.apply(Timeouts.scala:249)
	at org.scalatest.concurrent.Timeouts$class.timeoutAfter(Timeouts.scala:345)
	at org.scalatest.concurrent.Timeouts$class.failAfter(Timeouts.scala:245)
	at org.apache.spark.sql.hive.HiveSparkSubmitSuite.failAfter(HiveSparkSubmitSuite.scala:32)
	at org.apache.spark.sql.hive.HiveSparkSubmitSuite.org$apache$spark$sql$hive$HiveSparkSubmitSuite$$runSparkSubmit(HiveSparkSubmitSuite.scala:90)
	at org.apache.spark.sql.hive.HiveSparkSubmitSuite$$anonfun$1.apply$mcV$sp(HiveSparkSubmitSuite.scala:57)
	at org.apache.spark.sql.hive.HiveSparkSubmitSuite$$anonfun$1.apply(HiveSparkSubmitSuite.scala:44)
	at org.apache.spark.sql.hive.HiveSparkSubmitSuite$$anonfun$1.apply(HiveSparkSubmitSuite.scala:44)
	at org.scalatest.Transformer$$anonfun$apply$1.apply$mcV$sp(Transformer.scala:22)
	at org.scalatest.OutcomeOf$class.outcomeOf(OutcomeOf.scala:85)
	at org.scalatest.OutcomeOf$.outcomeOf(OutcomeOf.scala:104)
	at org.scalatest.Transformer.apply(Transformer.scala:22)
	at org.scalatest.Transformer.apply(Transformer.scala:20)
	at org.scalatest.FunSuiteLike$$anon$1.apply(FunSuiteLike.scala:166)
	at org.apache.spark.SparkFunSuite.withFixture(SparkFunSuite.scala:42)
	at org.scalatest.FunSuiteLike$class.invokeWithFixture$1(FunSuiteLike.scala:163)
	at org.scalatest.FunSuiteLike$$anonfun$runTest$1.apply(FunSuiteLike.scala:175)
	at org.scalatest.FunSuiteLike$$anonfun$runTest$1.apply(FunSuiteLike.scala:175)
	at org.scalatest.SuperEngine.runTestImpl(Engine.scala:306)
	at org.scalatest.FunSuiteLike$class.runTest(FunSuiteLike.scala:175)
	at org.apache.spark.sql.hive.HiveSparkSubmitSuite.org$scalatest$BeforeAndAfterEach$$super$runTest(HiveSparkSubmitSuite.scala:32)
	at org.scalatest.BeforeAndAfterEach$class.runTest(BeforeAndAfterEach.scala:255)
	at org.apache.spark.sql.hive.HiveSparkSubmitSuite.runTest(HiveSparkSubmitSuite.scala:32)
	at org.scalatest.FunSuiteLike$$anonfun$runTests$1.apply(FunSuiteLike.scala:208)
	at org.scalatest.FunSuiteLike$$anonfun$runTests$1.apply(FunSuiteLike.scala:208)
	at org.scalatest.SuperEngine$$anonfun$traverseSubNodes$1$1.apply(Engine.scala:413)
	at org.scalatest.SuperEngine$$anonfun$traverseSubNodes$1$1.apply(Engine.scala:401)
	at scala.collection.immutable.List.foreach(List.scala:318)
	at org.scalatest.SuperEngine.traverseSubNodes$1(Engine.scala:401)
	at org.scalatest.SuperEngine.org$scalatest$SuperEngine$$runTestsInBranch(Engine.scala:396)
	at org.scalatest.SuperEngine.runTestsImpl(Engine.scala:483)
	at org.scalatest.FunSuiteLike$class.runTests(FunSuiteLike.scala:208)
	at org.scalatest.FunSuite.runTests(FunSuite.scala:1555)
	at org.scalatest.Suite$class.run(Suite.scala:1424)
	at org.scalatest.FunSuite.org$scalatest$FunSuiteLike$$super$run(FunSuite.scala:1555)
	at org.scalatest.FunSuiteLike$$anonfun$run$1.apply(FunSuiteLike.scala:212)
	at org.scalatest.FunSuiteLike$$anonfun$run$1.apply(FunSuiteLike.scala:212)
	at org.scalatest.SuperEngine.runImpl(Engine.scala:545)
	at org.scalatest.FunSuiteLike$class.run(FunSuiteLike.scala:212)
	at org.scalatest.FunSuite.run(FunSuite.scala:1555)
	at org.scalatest.tools.Framework.org$scalatest$tools$Framework$$runSuite(Framework.scala:462)
	at org.scalatest.tools.Framework$ScalaTestTask.execute(Framework.scala:671)
	at sbt.ForkMain$Run$2.call(ForkMain.java:294)
	at sbt.ForkMain$Run$2.call(ForkMain.java:284)
	at java.util.concurrent.FutureTask.run(FutureTask.java:262)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
	at java.lang.Thread.run(Thread.java:745)
Caused by: sbt.ForkMain$ForkError
	at java.lang.Object.wait(Native Method)
	at java.lang.Object.wait(Object.java:503)
	at java.lang.UNIXProcess.waitFor(UNIXProcess.java:263)
	at org.apache.spark.sql.hive.HiveSparkSubmitSuite$$anonfun$4.apply$mcI$sp(HiveSparkSubmitSuite.scala:90)
	at org.apache.spark.sql.hive.HiveSparkSubmitSuite$$anonfun$4.apply(HiveSparkSubmitSuite.scala:90)
	at org.apache.spark.sql.hive.HiveSparkSubmitSuite$$anonfun$4.apply(HiveSparkSubmitSuite.scala:90)
	at org.scalatest.concurrent.Timeouts$class.timeoutAfter(Timeouts.scala:326)
	... 46 more
{code}
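
For context, the stack trace shows the test wrapping a spark-submit child process in ScalaTest's failAfter and then blocking in Process.waitFor, so a slow or hung spark-submit surfaces as this 180-second timeout rather than an assertion failure. Here is a minimal sketch of that pattern, assuming ScalaTest 2.x; it is not the actual HiveSparkSubmitSuite code, and the object name, sparkHome parameter, and exit-code assertion are illustrative:

{code}
import org.scalatest.concurrent.Timeouts
import org.scalatest.time.SpanSugar._

// Minimal sketch of the pattern visible in the stack trace above, not the
// real suite code: launch spark-submit as a child process and bound the
// wait with ScalaTest's failAfter.
object RunSparkSubmitSketch extends Timeouts {
  def runSparkSubmit(args: Seq[String], sparkHome: String): Unit = {
    val cmd = Seq(s"$sparkHome/bin/spark-submit") ++ args
    failAfter(180.seconds) {
      // waitFor blocks until the child process exits; if that takes longer
      // than 180 seconds, failAfter interrupts the wait and the test fails
      // with the timeout exception shown in the trace.
      val process = new ProcessBuilder(cmd: _*).inheritIO().start()
      val exitCode = process.waitFor()
      assert(exitCode == 0, s"spark-submit exited with code $exitCode")
    }
  }
}
{code}

The failing test ("SPARK-8368: includes jars passed in through --jars") presumably calls such a helper with --jars pointing at the test jars, so the timeout fires while waiting for the child JVM to finish rather than on an incorrect result, which is consistent with flakiness on loaded Jenkins workers.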

> Flaky test: o.a.s.sql.hive.HiveSparkSubmitSuite --jars
> ------------------------------------------------------
>
>                 Key: SPARK-8567
>                 URL: https://issues.apache.org/jira/browse/SPARK-8567
>             Project: Spark
>          Issue Type: Improvement
>          Components: SQL, Tests
>    Affects Versions: 1.4.1
>            Reporter: Yin Huai
>            Assignee: Yin Huai
>            Priority: Critical
>              Labels: flaky-test
>             Fix For: 1.4.1, 1.5.0
>
>
> It seems that tests in HiveSparkSubmitSuite fail with timeouts pretty frequently.



