Posted to issues@spark.apache.org by "Hyukjin Kwon (JIRA)" <ji...@apache.org> on 2019/04/19 11:09:00 UTC
[jira] [Commented] (SPARK-27516) java.util.concurrent.TimeoutException: Futures timed out after [100000 milliseconds]
[ https://issues.apache.org/jira/browse/SPARK-27516?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16821844#comment-16821844 ]
Hyukjin Kwon commented on SPARK-27516:
--------------------------------------
I can't reproduce this. I suspect this is an environment-specific problem, or an issue on YARN's side. Please provide more information showing that this indicates an issue in Spark.
> java.util.concurrent.TimeoutException: Futures timed out after [100000 milliseconds]
> ------------------------------------------------------------------------------------
>
> Key: SPARK-27516
> URL: https://issues.apache.org/jira/browse/SPARK-27516
> Project: Spark
> Issue Type: Bug
> Components: PySpark
> Affects Versions: 2.2.0
> Environment: linux
> YARN cluster mode
> Reporter: lucasysfeng
> Priority: Minor
> Attachments: driver_gc, spark.stderr
>
>
>
> {code:python}
> #! /usr/bin/env python
> # -*- coding: utf-8 -*-
> from pyspark import SparkContext
> from pyspark.sql import SparkSession
>
> if __name__ == '__main__':
>     spark = SparkSession.builder.appName('sparktest').getOrCreate()
>     # Other code is omitted below
> {code}
>
> *The code is simple, but occasionally throws the following exception:*
> 19/04/15 21:30:00 ERROR yarn.ApplicationMaster: Uncaught exception:
> java.util.concurrent.TimeoutException: Futures timed out after [100000 milliseconds]
> at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:219)
> at scala.concurrent.impl.Promise$DefaultPromise.result(Promise.scala:223)
> at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:201)
> at org.apache.spark.deploy.yarn.ApplicationMaster.runDriver(ApplicationMaster.scala:400)
> at org.apache.spark.deploy.yarn.ApplicationMaster.run(ApplicationMaster.scala:253)
> at org.apache.spark.deploy.yarn.ApplicationMaster$$anonfun$main$1.apply$mcV$sp(ApplicationMaster.scala:771)
> at org.apache.spark.deploy.SparkHadoopUtil$$anon$1.run(SparkHadoopUtil.scala:69)
> at org.apache.spark.deploy.SparkHadoopUtil$$anon$1.run(SparkHadoopUtil.scala:68)
> at java.security.AccessController.doPrivileged(Native Method)
> at javax.security.auth.Subject.doAs(Subject.java:422)
> at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1743)
> at org.apache.spark.deploy.SparkHadoopUtil.runAsSparkUser(SparkHadoopUtil.scala:68)
> at org.apache.spark.deploy.yarn.ApplicationMaster$.main(ApplicationMaster.scala:769)
> at org.apache.spark.deploy.yarn.ApplicationMaster.main(ApplicationMaster.scala)
>
> I know spark.yarn.am.waitTime can increase how long the ApplicationMaster waits for SparkContext initialization (a configuration sketch follows after this report).
> Why does SparkContext initialization occasionally take so long?
>
>
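For context: the "100000 milliseconds" in the stack trace matches the default spark.yarn.am.waitTime of 100s, i.e. the YARN ApplicationMaster gave up waiting for the user code to initialize the SparkContext. Below is a minimal sketch of how that timeout is usually raised in YARN cluster mode; the 200s value and the file name sparktest.py are illustrative assumptions, not values taken from this issue.

{code:python}
#! /usr/bin/env python
# -*- coding: utf-8 -*-
from pyspark.sql import SparkSession

if __name__ == '__main__':
    # In YARN cluster mode the ApplicationMaster waits spark.yarn.am.waitTime
    # (default 100s, hence the 100000 ms timeout) for this script to create
    # the SparkContext, so the property has to be set at submit time, e.g.
    # (200s and the script name are illustrative only):
    #   spark-submit --master yarn --deploy-mode cluster \
    #       --conf spark.yarn.am.waitTime=200s sparktest.py
    spark = SparkSession.builder.appName('sparktest').getOrCreate()
    # ... rest of the job ...
    spark.stop()
{code}

Raising the timeout only hides the symptom; if SparkContext initialization regularly takes close to 100s, the driver GC log and spark.stderr attachments are the place to look for what is actually slow.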