Posted to user@spark.apache.org by Koert Kuipers <ko...@tresata.com> on 2014/09/14 02:27:17 UTC

spark 1.1.0 unit tests fail

On Ubuntu 12.04 with 2 cores and 8 GB of RAM, I see errors when I run the
tests for Spark 1.1.0. I'm not sure how significant this is, since I used to
see errors with Spark 1.0.0 too.

$ java -version
java version "1.6.0_43"
Java(TM) SE Runtime Environment (build 1.6.0_43-b01)
Java HotSpot(TM) 64-Bit Server VM (build 20.14-b01, mixed mode)

$ mvn -version
Apache Maven 3.0.4
Maven home: /usr/share/maven
Java version: 1.6.0_43, vendor: Sun Microsystems Inc.
Java home: /usr/lib/jvm/jdk1.6.0_43/jre
Default locale: en_US, platform encoding: UTF-8
OS name: "linux", version: "3.5.0-54-generic", arch: "amd64", family: "unix"

$ export MAVEN_OPTS="-Xmx2g -XX:MaxPermSize=512M -XX:ReservedCodeCacheSize=512m"
$ mvn clean package -DskipTests
$ mvn test
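
If you only want to re-check the failing suites rather than re-run the whole
build, the scalatest-maven-plugin can select suites by fully qualified name.
Something along these lines should work (the -pl module and the wildcardSuites
property are assumptions from the plugin's docs; adjust to your checkout):

$ mvn -pl core test -Dtest=none -DwildcardSuites=org.apache.spark.DriverSuite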

The run is still going and is very slow (with curiously low CPU usage, around
5%), but I already see the following errors:

DriverSuite:
- driver should exit after finishing *** FAILED ***
  TestFailedDueToTimeoutException was thrown during property evaluation. (DriverSuite.scala:40)
    Message: The code passed to failAfter did not complete within 60 seconds.
    Location: (DriverSuite.scala:41)
    Occurred at table row 0 (zero based, not counting headings), which had values (
      master = local
    )

SparkSubmitSuite:
- launch simple application with spark-submit *** FAILED ***
  org.apache.spark.SparkException: Process List(./bin/spark-submit, --class, org.apache.spark.deploy.SimpleApplicationTest, --name, testApp, --master, local, file:/tmp/1410653580697-0/testJar-1410653580697.jar) exited with code 1
  at org.apache.spark.util.Utils$.executeAndGetOutput(Utils.scala:872)
  at org.apache.spark.deploy.SparkSubmitSuite.runSparkSubmit(SparkSubmitSuite.scala:311)
  at org.apache.spark.deploy.SparkSubmitSuite$$anonfun$14.apply$mcV$sp(SparkSubmitSuite.scala:291)
  at org.apache.spark.deploy.SparkSubmitSuite$$anonfun$14.apply(SparkSubmitSuite.scala:284)
  at org.apache.spark.deploy.SparkSubmitSuite$$anonfun$14.apply(SparkSubmitSuite.scala:284)
  at org.scalatest.Transformer$$anonfun$apply$1.apply(Transformer.scala:22)
  at org.scalatest.Transformer$$anonfun$apply$1.apply(Transformer.scala:22)
  at org.scalatest.OutcomeOf$class.outcomeOf(OutcomeOf.scala:85)
  at org.scalatest.OutcomeOf$.outcomeOf(OutcomeOf.scala:104)
  at org.scalatest.Transformer.apply(Transformer.scala:22)
  ...
- spark submit includes jars passed in through --jar *** FAILED ***
  org.apache.spark.SparkException: Process List(./bin/spark-submit, --class, org.apache.spark.deploy.JarCreationTest, --name, testApp, --master, local-cluster[2,1,512], --jars, file:/tmp/1410653674739-0/testJar-1410653674790.jar,file:/tmp/1410653674791-0/testJar-1410653674833.jar, file:/tmp/1410653674737-0/testJar-1410653674737.jar) exited with code 1
  at org.apache.spark.util.Utils$.executeAndGetOutput(Utils.scala:872)
  at org.apache.spark.deploy.SparkSubmitSuite.runSparkSubmit(SparkSubmitSuite.scala:311)
  at org.apache.spark.deploy.SparkSubmitSuite$$anonfun$15.apply$mcV$sp(SparkSubmitSuite.scala:305)
  at org.apache.spark.deploy.SparkSubmitSuite$$anonfun$15.apply(SparkSubmitSuite.scala:294)
  at org.apache.spark.deploy.SparkSubmitSuite$$anonfun$15.apply(SparkSubmitSuite.scala:294)
  at org.scalatest.Transformer$$anonfun$apply$1.apply(Transformer.scala:22)
  at org.scalatest.Transformer$$anonfun$apply$1.apply(Transformer.scala:22)
  at org.scalatest.OutcomeOf$class.outcomeOf(OutcomeOf.scala:85)
  at org.scalatest.OutcomeOf$.outcomeOf(OutcomeOf.scala:104)
  at org.scalatest.Transformer.apply(Transformer.scala:22)
  ...

Re: spark 1.1.0 unit tests fail

Posted by Koert Kuipers <ko...@tresata.com>.
OK, sounds good. Those were the only tests that failed, by the way.


On Sun, Sep 14, 2014 at 1:07 AM, Andrew Or <an...@databricks.com> wrote:

> Hi Koert,
>
> Thanks for reporting this. These tests have been flaky even on the master
> branch for a long time. You can safely disregard these test failures, as
> the root cause is port collisions from the many SparkContexts we create
> over the course of the entire test. There is a patch that fixes this but
> not back ported into branch-1.1 yet. I will do that shortly.
>
> -Andrew

Re: spark 1.1.0 unit tests fail

Posted by Andrew Or <an...@databricks.com>.
Hi Koert,

Thanks for reporting this. These tests have been flaky even on the master
branch for a long time. You can safely disregard these failures: the root
cause is port collisions among the many SparkContexts we create over the
course of the test run. There is a patch that fixes this, but it has not been
backported to branch-1.1 yet. I will do that shortly.
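
For readers hitting the same thing: the usual remedy for this class of
flakiness is to bind to port 0 so the OS hands out a free ephemeral port,
instead of racing for a hard-coded one. A minimal sketch of the idea using
plain java.net.ServerSocket (an illustration only, not Spark's actual patch):

```java
import java.net.ServerSocket;

public class EphemeralPortDemo {
    // Returns a port the OS guaranteed was free at bind time.
    static int reserveEphemeralPort() throws Exception {
        // Port 0 means "let the OS pick"; two concurrent test JVMs
        // can never collide the way they do on a fixed port.
        ServerSocket socket = new ServerSocket(0);
        try {
            return socket.getLocalPort();
        } finally {
            socket.close();
        }
    }

    public static void main(String[] args) throws Exception {
        System.out.println("bound to ephemeral port " + reserveEphemeralPort());
    }
}
```

Note the small race this leaves: the port is released before the real service
rebinds it, which is why the robust fix is to have the service itself bind
port 0 and then report the port it actually got.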

-Andrew
