Posted to user@spark.apache.org by SamRoberts <sa...@yahoo.com> on 2015/07/05 08:40:27 UTC

Futures timed out after 10000 milliseconds

I have a very simple application in which I initialize the Spark context
and then use it. The problem happens with both Spark 1.3.1 and 1.4.0, with
Scala 2.10.4 and Java 1.7.0_79.

Full Program
========
import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._

object SimpleApp {
  def main(args: Array[String]) {
    val logFile = "/home/user1/SimpleSpark/src/data/sample.txt"
    val sc = new SparkContext("local[4]", "Simple App",
      "/home/user1/spark-1.3.1-bin-hadoop2.6",
      List("/home/user1/SimpleSpark/target/scala-2.10/simple-project_2.10-1.0.jar"))
    val logData = sc.textFile(logFile, 2).cache()
    val numTHEs = logData.filter(line => line.contains("the")).count()
    println("Lines with the: %s".format(numTHEs))
    sc.stop()
  }
}
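
For reference, the same context can also be built through SparkConf, the
constructor style the 1.x docs recommend. This is only a sketch assuming the
same master and app name as above; the jar and Spark-home arguments are not
needed when running with local[4]:

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Equivalent construction via SparkConf (sketch, not a verified fix).
val conf = new SparkConf()
  .setMaster("local[4]")
  .setAppName("Simple App")
val sc = new SparkContext(conf)
```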


The program runs a couple of times, then I get the error below, after which
it keeps failing with this timeout. Rebooting sometimes helps, but at other
times it doesn't. I can't proceed with my Spark learning until this is
resolved. Please help.
Sam

Exception in thread "main" java.util.concurrent.TimeoutException: Futures timed out after [10000 milliseconds]
        at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:219)
        at scala.concurrent.impl.Promise$DefaultPromise.result(Promise.scala:223)
        at scala.concurrent.Await$$anonfun$result$1.apply(package.scala:107)
        at scala.concurrent.BlockContext$DefaultBlockContext$.blockOn(BlockContext.scala:53)
        at scala.concurrent.Await$.result(package.scala:107)
        at akka.remote.Remoting.start(Remoting.scala:180)
        at akka.remote.RemoteActorRefProvider.init(RemoteActorRefProvider.scala:184)
        at akka.actor.ActorSystemImpl.liftedTree2$1(ActorSystem.scala:618)
        at akka.actor.ActorSystemImpl._start$lzycompute(ActorSystem.scala:615)
        at akka.actor.ActorSystemImpl._start(ActorSystem.scala:615)
        at akka.actor.ActorSystemImpl.start(ActorSystem.scala:632)
        at akka.actor.ActorSystem$.apply(ActorSystem.scala:141)
        at akka.actor.ActorSystem$.apply(ActorSystem.scala:118)
        at org.apache.spark.util.AkkaUtils$.org$apache$spark$util$AkkaUtils$$doCreateActorSystem(AkkaUtils.scala:122)
        at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:55)
        at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:54)
        at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1837)
        at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)
        at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1828)
        at org.apache.spark.util.AkkaUtils$.createActorSystem(AkkaUtils.scala:57)
        at org.apache.spark.SparkEnv$.create(SparkEnv.scala:223)
        at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:163)
        at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:269)
        at org.apache.spark.SparkContext.<init>(SparkContext.scala:272)
        at org.apache.spark.SparkContext.<init>(SparkContext.scala:154)
        at SimpleApp$.main(SimpleApp.scala:7)
        at SimpleApp.main(SimpleApp.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:569)
        at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:166)
        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:189)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:110)



--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Futures-timed-out-after-10000-milliseconds-tp23622.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
For additional commands, e-mail: user-help@spark.apache.org


Re: Futures timed out after 10000 milliseconds

Posted by SamRoberts <sa...@yahoo.com>.
Please note -- I am trying to run this with both sbt run and spark-submit,
and I get the same error in both.

Since I am in standalone mode, I assume I need not start the Spark master.
Am I right?

I realize this is probably a basic setup issue, but I am unable to get past
it. Any help would be appreciated.







Re: Futures timed out after 10000 milliseconds

Posted by Ted Yu <yu...@gmail.com>.
Sam:

> where would one set this timeout?

With the following fix in place, it is easier to see which conf to change:

[SPARK-6980] [CORE] Akka timeout exceptions indicate which conf controls
them (RPC Layer)

FYI
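
For example (a sketch only, not verified against your setup -- the property
names follow the 1.x configuration docs; "spark.rpc.askTimeout" is the
1.4+ name introduced by SPARK-6980, "spark.akka.timeout" the older
Akka-layer equivalent):

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Sketch: raising the ask timeouts that govern these futures.
val conf = new SparkConf()
  .setMaster("local[4]")
  .setAppName("Simple App")
  .set("spark.rpc.askTimeout", "30s")  // RPC layer, Spark 1.4+
  .set("spark.akka.timeout", "30")     // seconds, older Akka-based layer
val sc = new SparkContext(conf)
```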

On Sun, Jul 5, 2015 at 1:46 PM, Sean Owen <so...@cloudera.com> wrote:

> Usually this message means that the test was starting some process,
> like a Spark master, which never actually started; the eventual error
> is the timeout. You have to dig into the test and the logs to catch
> the real reason.
>
> On Sun, Jul 5, 2015 at 9:23 PM, SamRoberts <sa...@yahoo.com>
> wrote:
> > Also, it's not clear where the 10000 millisecond timeout is coming
> > from. Can someone explain -- and if it's a legitimate timeout problem,
> > where would one set this timeout?

Re: Futures timed out after 10000 milliseconds

Posted by Sean Owen <so...@cloudera.com>.
Usually this message means that the test was starting some process,
like a Spark master, which never actually started; the eventual error
is the timeout. You have to dig into the test and the logs to catch
the real reason.

On Sun, Jul 5, 2015 at 9:23 PM, SamRoberts <sa...@yahoo.com> wrote:
> Also, it's not clear where the 10000 millisecond timeout is coming
> from. Can someone explain -- and if it's a legitimate timeout problem,
> where would one set this timeout?


Re: Futures timed out after 10000 milliseconds

Posted by SamRoberts <sa...@yahoo.com>.
Thanks -- I tried local[*], but it didn't help.

I agree that it is something to do with the SparkContext.





Re: Futures timed out after 10000 milliseconds

Posted by SamRoberts <sa...@yahoo.com>.
Also, it's not clear where the 10000 millisecond timeout is coming from.
Can someone explain -- and if it's a legitimate timeout problem, where
would one set this timeout?





Re: Futures timed out after 10000 milliseconds

Posted by SamRoberts <sa...@yahoo.com>.
One more data point: sbt seems to have a bigger problem with this than
spark-submit. With spark-submit, I am able to get the program to run
several times, while sbt fails most of the time (or, more recently, all
of the time).
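
In case it helps others hitting this: one commonly suggested workaround for
sbt-specific failures like this is to run the application in a forked JVM,
so each `sbt run` gets a fresh process rather than reusing sbt's own JVM.
A minimal build.sbt sketch, assuming sbt 0.13-era syntax and the versions
used in the program above (not a verified configuration):

```scala
// build.sbt -- hypothetical sketch mirroring the project above.
name := "Simple Project"

version := "1.0"

scalaVersion := "2.10.4"

libraryDependencies += "org.apache.spark" %% "spark-core" % "1.3.1"

// Run the app in a separate JVM so a lingering actor system or held
// port from a previous `sbt run` cannot block the next startup.
fork in run := true
```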


