Posted to dev@spark.apache.org by "assaf.mendelson" <as...@rsa.com> on 2016/09/12 05:49:18 UTC

Test fails when compiling spark with tests

Hi,
I am trying to set up a Spark development environment. I forked the Spark git project and cloned the fork. I then checked out the branch-2.0 tag (which I assume is the released source code).
I then compiled Spark twice.
The first time, using:
mvn -Pyarn -Phadoop-2.6 -Dhadoop.version=2.6.0 -DskipTests clean package
This compiled successfully.
The second time, using:
mvn -Pyarn -Phadoop-2.6 -Dhadoop.version=2.6.0 clean package
This got a failure in Spark Project Core with the following test failing:
- caching in memory and disk, replicated
- caching in memory and disk, serialized, replicated *** FAILED ***
  java.util.concurrent.TimeoutException: Can't find 2 executors before 30000 milliseconds elapsed
  at org.apache.spark.ui.jobs.JobProgressListener.waitUntilExecutorsUp(JobProgressListener.scala:573)
  at org.apache.spark.DistributedSuite.org$apache$spark$DistributedSuite$$testCaching(DistributedSuite.scala:154)
  at org.apache.spark.DistributedSuite$$anonfun$32$$anonfun$apply$1.apply$mcV$sp(DistributedSuite.scala:191)
  at org.apache.spark.DistributedSuite$$anonfun$32$$anonfun$apply$1.apply(DistributedSuite.scala:191)
  at org.apache.spark.DistributedSuite$$anonfun$32$$anonfun$apply$1.apply(DistributedSuite.scala:191)
  at org.scalatest.Transformer$$anonfun$apply$1.apply$mcV$sp(Transformer.scala:22)
  at org.scalatest.OutcomeOf$class.outcomeOf(OutcomeOf.scala:85)
  at org.scalatest.OutcomeOf$.outcomeOf(OutcomeOf.scala:104)
  at org.scalatest.Transformer.apply(Transformer.scala:22)
  at org.scalatest.Transformer.apply(Transformer.scala:20)
  ...
- compute without caching when no partitions fit in memory

I made no changes to the code whatsoever. Can anyone help me figure out what is wrong with my environment?
BTW I am using Maven 3.3.9 and Java 1.8.0_101-b13.
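For reference, these are the versions reported by:

mvn -version
java -version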

Thanks,
                Assaf




--
View this message in context: http://apache-spark-developers-list.1001551.n3.nabble.com/Test-fails-when-compiling-spark-with-tests-tp18919.html
Sent from the Apache Spark Developers List mailing list archive at Nabble.com.

Re: Test fails when compiling spark with tests

Posted by Fred Reiss <fr...@gmail.com>.
Also try doing a fresh clone of the git repository. I've seen some of those
rare failure modes corrupt parts of my local copy in the past.
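For example, something along these lines gives a clean starting point (just a sketch, assuming the official GitHub mirror and the branch-2.0 branch you mentioned):

git clone https://github.com/apache/spark.git spark-fresh
cd spark-fresh
git checkout branch-2.0

On an existing clone, git clean -dfx will also scrub stale build output (untracked and ignored files) before rebuilding.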

FWIW the main branch as of yesterday afternoon is building fine in my
environment.

Fred

On Tue, Sep 13, 2016 at 6:29 PM, Jakob Odersky <ja...@odersky.com> wrote:

> There are some flaky tests that occasionally fail; my first
> recommendation would be to re-run the test suite. Another thing to
> check is whether any applications are listening on Spark's default
> ports.
> Btw, what is your environment like? If it is Windows, I don't think
> tests are run regularly against that platform, so they could very
> well be broken.
>
> On Sun, Sep 11, 2016 at 10:49 PM, assaf.mendelson
> <as...@rsa.com> wrote:
> > Hi,
> >
> > I am trying to set up a spark development environment. I forked the spark
> > git project and cloned the fork. I then checked out the branch-2.0 tag (which I assume is the released source code).
> >
> > I then compiled spark twice.
> >
> > The first using:
> >
> > mvn -Pyarn -Phadoop-2.6 -Dhadoop.version=2.6.0 -DskipTests clean package
> >
> > This compiled successfully.
> >
> > The second using mvn -Pyarn -Phadoop-2.6 -Dhadoop.version=2.6.0 clean package
> >
> > This got a failure in Spark Project Core with the following test failing:
> >
> > - caching in memory and disk, replicated
> >
> > - caching in memory and disk, serialized, replicated *** FAILED ***
> >
> >   java.util.concurrent.TimeoutException: Can't find 2 executors before 30000 milliseconds elapsed
> >
> >   at org.apache.spark.ui.jobs.JobProgressListener.waitUntilExecutorsUp(JobProgressListener.scala:573)
> >
> >   at org.apache.spark.DistributedSuite.org$apache$spark$DistributedSuite$$testCaching(DistributedSuite.scala:154)
> >
> >   at org.apache.spark.DistributedSuite$$anonfun$32$$anonfun$apply$1.apply$mcV$sp(DistributedSuite.scala:191)
> >
> >   at org.apache.spark.DistributedSuite$$anonfun$32$$anonfun$apply$1.apply(DistributedSuite.scala:191)
> >
> >   at org.apache.spark.DistributedSuite$$anonfun$32$$anonfun$apply$1.apply(DistributedSuite.scala:191)
> >
> >   at org.scalatest.Transformer$$anonfun$apply$1.apply$mcV$sp(Transformer.scala:22)
> >
> >   at org.scalatest.OutcomeOf$class.outcomeOf(OutcomeOf.scala:85)
> >
> >   at org.scalatest.OutcomeOf$.outcomeOf(OutcomeOf.scala:104)
> >
> >   at org.scalatest.Transformer.apply(Transformer.scala:22)
> >
> >   at org.scalatest.Transformer.apply(Transformer.scala:20)
> >
> >   ...
> >
> > - compute without caching when no partitions fit in memory
> >
> >
> >
> > I made no changes to the code whatsoever. Can anyone help me figure out what is wrong with my environment?
> >
> > BTW I am using maven 3.3.9 and java 1.8.0_101-b13
> >
> >
> >
> > Thanks,
> >
> >                 Assaf
> >
> >

Re: Test fails when compiling spark with tests

Posted by Jakob Odersky <ja...@odersky.com>.
There are some flaky tests that occasionally fail; my first
recommendation would be to re-run the test suite. Another thing to
check is whether any applications are listening on Spark's default
ports.
Btw, what is your environment like? If it is Windows, I don't think
tests are run regularly against that platform, so they could very
well be broken.
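For example, to re-run just the failing suite rather than the whole build (a sketch only: it assumes the scalatest-maven-plugin wiring in Spark's build for -DwildcardSuites, and that the other modules have already been built and installed locally):

./build/mvn -Pyarn -Phadoop-2.6 -Dhadoop.version=2.6.0 -pl core -Dtest=none -DwildcardSuites=org.apache.spark.DistributedSuite test

And on Linux/macOS, a quick way to see whether something is already bound to the usual ports (4040 for the UI, 7077 for a standalone master):

lsof -i :4040 -i :7077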

On Sun, Sep 11, 2016 at 10:49 PM, assaf.mendelson
<as...@rsa.com> wrote:
> Hi,
>
> I am trying to set up a spark development environment. I forked the spark
> git project and cloned the fork. I then checked out branch-2.0 tag (which I
> assume is the released source code).
>
> I then compiled spark twice.
>
> The first using:
>
> mvn -Pyarn -Phadoop-2.6 -Dhadoop.version=2.6.0 -DskipTests clean package
>
> This compiled successfully.
>
> The second using mvn -Pyarn -Phadoop-2.6 -Dhadoop.version=2.6.0 clean
> package
>
> This got a failure in Spark Project Core with the following test failing:
>
> - caching in memory and disk, replicated
>
> - caching in memory and disk, serialized, replicated *** FAILED ***
>
>   java.util.concurrent.TimeoutException: Can't find 2 executors before 30000
> milliseconds elapsed
>
>   at
> org.apache.spark.ui.jobs.JobProgressListener.waitUntilExecutorsUp(JobProgressListener.scala:573)
>
>   at
> org.apache.spark.DistributedSuite.org$apache$spark$DistributedSuite$$testCaching(DistributedSuite.scala:154)
>
>   at
> org.apache.spark.DistributedSuite$$anonfun$32$$anonfun$apply$1.apply$mcV$sp(DistributedSuite.scala:191)
>
>   at
> org.apache.spark.DistributedSuite$$anonfun$32$$anonfun$apply$1.apply(DistributedSuite.scala:191)
>
>   at
> org.apache.spark.DistributedSuite$$anonfun$32$$anonfun$apply$1.apply(DistributedSuite.scala:191)
>
>   at
> org.scalatest.Transformer$$anonfun$apply$1.apply$mcV$sp(Transformer.scala:22)
>
>   at org.scalatest.OutcomeOf$class.outcomeOf(OutcomeOf.scala:85)
>
>   at org.scalatest.OutcomeOf$.outcomeOf(OutcomeOf.scala:104)
>
>   at org.scalatest.Transformer.apply(Transformer.scala:22)
>
>   at org.scalatest.Transformer.apply(Transformer.scala:20)
>
>   ...
>
> - compute without caching when no partitions fit in memory
>
>
>
> I made no changes to the code whatsoever. Can anyone help me figure out what
> is wrong with my environment?
>
> BTW I am using maven 3.3.9 and java 1.8.0_101-b13
>
>
>
> Thanks,
>
>                 Assaf
>

---------------------------------------------------------------------
To unsubscribe e-mail: dev-unsubscribe@spark.apache.org