Posted to dev@spark.apache.org by "Ganelin, Ilya" <Il...@capitalone.com> on 2015/03/11 01:09:47 UTC

Spark tests hang on local machine due to "testGuavaOptional" in JavaAPISuite

Hi all – when building Spark on my local machine with build/mvn clean package, the test run proceeds until it hits the JavaAPISuite, where it hangs indefinitely. Through some experimentation, I’ve narrowed it down to the following test:


// Imports this excerpt relies on (in the actual JavaAPISuite they appear at the top of the file):
import java.util.Arrays;

import com.google.common.base.Optional;
import org.junit.Test;

import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.api.java.function.Function;

/**
 * Test for SPARK-3647. This test needs to use the maven-built assembly to trigger the issue,
 * since that's the only artifact where Guava classes have been relocated.
 */
@Test
public void testGuavaOptional() {
  // Stop the context created in setUp() and start a local-cluster one, to force usage of the
  // assembly.
  sc.stop();
  JavaSparkContext localCluster = new JavaSparkContext("local-cluster[1,1,512]", "JavaAPISuite");
  try {
    JavaRDD<Integer> rdd1 = localCluster.parallelize(Arrays.asList(1, 2, null), 3);
    JavaRDD<Optional<Integer>> rdd2 = rdd1.map(
      new Function<Integer, Optional<Integer>>() {
        @Override
        public Optional<Integer> call(Integer i) {
          return Optional.fromNullable(i);
        }
      });
    rdd2.collect();
  } finally {
    localCluster.stop();
  }
}


If I remove this test, things work smoothly. Has anyone else seen this? Thanks.
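
(As a stopgap, rather than deleting the test outright, it can be skipped locally with JUnit 4's @Ignore annotation. The sketch below is only illustrative and assumes nothing beyond JUnit 4; the class name SkipHangingTestSketch is a placeholder, and in practice the annotation would go on testGuavaOptional() in JavaAPISuite itself.)

import org.junit.Ignore;
import org.junit.Test;

// Illustrative placeholder class: shows the shape of skipping a JUnit 4 test with @Ignore.
public class SkipHangingTestSketch {

  @Ignore("Hangs in local-cluster mode on my machine; skipping while investigating")
  @Test
  public void testGuavaOptional() {
    // The real test body would stay unchanged; @Ignore keeps it compiled but not run.
  }
}

With @Ignore in place the rest of the suite runs, and the skipped test is reported as ignored rather than silently dropped.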

Re: Spark tests hang on local machine due to "testGuavaOptional" in JavaAPISuite

Posted by Sean Owen <so...@cloudera.com>.
Yes, and I remember it was caused by something related to the Guava
shading and the fact that you're running a mini cluster and then
talking to it. I can't remember exactly what resolved it, but try a
clean build. I think it had to do with multiple assembly files or
something like that.
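
(Not from the thread, but one quick way to check for a stale or duplicate assembly is to ask the JVM which jar a Guava class is actually being loaded from; this is plain java.lang.Class reflection, and the class name GuavaClasspathCheck below is just a placeholder.)

import java.security.CodeSource;

import com.google.common.base.Optional;

// Illustrative diagnostic: print which artifact supplies the Guava Optional class at runtime.
public class GuavaClasspathCheck {
  public static void main(String[] args) {
    Class<?> optionalClass = Optional.class;
    CodeSource source = optionalClass.getProtectionDomain().getCodeSource();
    System.out.println("Class:       " + optionalClass.getName());
    System.out.println("Loaded from: " + (source != null ? source.getLocation() : "<unknown>"));
  }
}

Running this against the same classpath the tests use shows which jar is supplying the class, which is the kind of multiple-assembly confusion described above.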

