Posted to issues@spark.apache.org by "lucasysfeng (JIRA)" <ji...@apache.org> on 2019/04/19 02:59:00 UTC

[jira] [Created] (SPARK-27516) java.util.concurrent.TimeoutException: Futures timed out after [100000 milliseconds]

lucasysfeng created SPARK-27516:
-----------------------------------

             Summary: java.util.concurrent.TimeoutException: Futures timed out after [100000 milliseconds]
                 Key: SPARK-27516
                 URL: https://issues.apache.org/jira/browse/SPARK-27516
             Project: Spark
          Issue Type: Bug
          Components: PySpark
    Affects Versions: 2.2.0
         Environment: linux

YARN  cluster mode
            Reporter: lucasysfeng


{code:python}
#!/usr/bin/env python
# -*- coding: utf-8 -*-

from pyspark.sql import SparkSession

if __name__ == '__main__':
    # In YARN cluster mode this call triggers SparkContext
    # initialization inside the ApplicationMaster.
    spark = SparkSession.builder.appName('sparktest').getOrCreate()
    # Other code is omitted below
{code}
 

*The code is simple, but it occasionally throws the following exception:*
{code}
19/04/15 21:30:00 ERROR yarn.ApplicationMaster: Uncaught exception:
java.util.concurrent.TimeoutException: Futures timed out after [100000 milliseconds]
 at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:219)
 at scala.concurrent.impl.Promise$DefaultPromise.result(Promise.scala:223)
 at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:201)
 at org.apache.spark.deploy.yarn.ApplicationMaster.runDriver(ApplicationMaster.scala:400)
 at org.apache.spark.deploy.yarn.ApplicationMaster.run(ApplicationMaster.scala:253)
 at org.apache.spark.deploy.yarn.ApplicationMaster$$anonfun$main$1.apply$mcV$sp(ApplicationMaster.scala:771)
 at org.apache.spark.deploy.SparkHadoopUtil$$anon$1.run(SparkHadoopUtil.scala:69)
 at org.apache.spark.deploy.SparkHadoopUtil$$anon$1.run(SparkHadoopUtil.scala:68)
 at java.security.AccessController.doPrivileged(Native Method)
 at javax.security.auth.Subject.doAs(Subject.java:422)
 at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1743)
 at org.apache.spark.deploy.SparkHadoopUtil.runAsSparkUser(SparkHadoopUtil.scala:68)
 at org.apache.spark.deploy.yarn.ApplicationMaster$.main(ApplicationMaster.scala:769)
 at org.apache.spark.deploy.yarn.ApplicationMaster.main(ApplicationMaster.scala)
{code}


I know that increasing spark.yarn.am.waitTime gives the ApplicationMaster more time to wait for SparkContext initialization.
But why does SparkContext initialization occasionally take this long in the first place?
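As a workaround while the root cause is unclear, one possible mitigation is to raise the wait at submit time. The sketch below is an assumption based on the standard YARN configuration, not a confirmed fix: spark.yarn.am.waitTime defaults to 100s, which matches the "100000 milliseconds" in the exception, and in cluster mode it must be passed via spark-submit (the script name sparktest.py is hypothetical):

{code}
# Sketch: raise the ApplicationMaster's wait for SparkContext
# initialization from the 100s default to 300s (YARN cluster mode).
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --conf spark.yarn.am.waitTime=300s \
  sparktest.py
{code}

Note this only widens the timeout window; it does not explain why getOrCreate() is sometimes slow.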

--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
