Posted to issues@spark.apache.org by "Hyukjin Kwon (JIRA)" <ji...@apache.org> on 2019/05/21 04:04:13 UTC

[jira] [Updated] (SPARK-20210) Scala tests aborted in Spark SQL on ppc64le

     [ https://issues.apache.org/jira/browse/SPARK-20210?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Hyukjin Kwon updated SPARK-20210:
---------------------------------
    Labels: bulk-closed ppc64le  (was: ppc64le)

> Scala tests aborted in Spark SQL on ppc64le
> -------------------------------------------
>
>                 Key: SPARK-20210
>                 URL: https://issues.apache.org/jira/browse/SPARK-20210
>             Project: Spark
>          Issue Type: Improvement
>          Components: SQL
>    Affects Versions: 2.2.0
>         Environment: Ubuntu 14.04 ppc64le 
> $ java -version
> openjdk version "1.8.0_111"
> OpenJDK Runtime Environment (build 1.8.0_111-8u111-b14-3~14.04.1-b14)
> OpenJDK 64-Bit Server VM (build 25.111-b14, mixed mode)
>            Reporter: Sonia Garudi
>            Priority: Minor
>              Labels: bulk-closed, ppc64le
>
> The tests get aborted with the following error:
> {code}
> *** RUN ABORTED ***
>   org.apache.spark.rpc.RpcTimeoutException: Futures timed out after [120 seconds]. This timeout is controlled by spark.rpc.askTimeout
>   at org.apache.spark.rpc.RpcTimeout.org$apache$spark$rpc$RpcTimeout$$createRpcTimeoutException(RpcTimeout.scala:47)
>   at org.apache.spark.rpc.RpcTimeout$$anonfun$addMessageIfTimeout$1.applyOrElse(RpcTimeout.scala:62)
>   at org.apache.spark.rpc.RpcTimeout$$anonfun$addMessageIfTimeout$1.applyOrElse(RpcTimeout.scala:58)
>   at scala.runtime.AbstractPartialFunction.apply(AbstractPartialFunction.scala:36)
>   at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:76)
>   at org.apache.spark.storage.BlockManagerMaster.removeRdd(BlockManagerMaster.scala:125)
>   at org.apache.spark.SparkContext.unpersistRDD(SparkContext.scala:1792)
>   at org.apache.spark.rdd.RDD.unpersist(RDD.scala:216)
>   at org.apache.spark.sql.execution.CacheManager$$anonfun$clearCache$1$$anonfun$apply$mcV$sp$1.apply(CacheManager.scala:75)
>   at org.apache.spark.sql.execution.CacheManager$$anonfun$clearCache$1$$anonfun$apply$mcV$sp$1.apply(CacheManager.scala:75)
>   ...
>   Cause: java.util.concurrent.TimeoutException: Futures timed out after [120 seconds]
>   at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:219)
>   at scala.concurrent.impl.Promise$DefaultPromise.result(Promise.scala:223)
>   at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:201)
>   at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75)
>   at org.apache.spark.storage.BlockManagerMaster.removeRdd(BlockManagerMaster.scala:125)
>   at org.apache.spark.SparkContext.unpersistRDD(SparkContext.scala:1792)
>   at org.apache.spark.rdd.RDD.unpersist(RDD.scala:216)
>   at org.apache.spark.sql.execution.CacheManager$$anonfun$clearCache$1$$anonfun$apply$mcV$sp$1.apply(CacheManager.scala:75)
>   at org.apache.spark.sql.execution.CacheManager$$anonfun$clearCache$1$$anonfun$apply$mcV$sp$1.apply(CacheManager.scala:75)
>   at scala.collection.Iterator$class.foreach(Iterator.scala:893)
>   ...
> {code}
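
The abort message itself names the relevant knob: the 120-second limit is controlled by {{spark.rpc.askTimeout}}, which falls back to {{spark.network.timeout}} when unset. As a hedged workaround sketch (not a fix for the underlying ppc64le slowness), the timeout can be raised for the test run; the 600s value below is illustrative only:

{code}
# conf/spark-defaults.conf -- raise RPC timeouts for slow test environments
# (illustrative values; spark.rpc.askTimeout falls back to spark.network.timeout)
spark.rpc.askTimeout   600s
spark.network.timeout  600s
{code}

The same properties can be passed as JVM system properties (e.g. {{-Dspark.rpc.askTimeout=600s}}), since SparkConf picks up {{spark.*}} system properties by default; whether the sbt-forked test JVM forwards them depends on the build setup.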



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org