Posted to issues@spark.apache.org by "Sunil Prabhakara (JIRA)" <ji...@apache.org> on 2014/10/13 14:57:34 UTC

[jira] [Commented] (SPARK-1138) Spark 0.9.0 does not work with Hadoop / HDFS

    [ https://issues.apache.org/jira/browse/SPARK-1138?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14169254#comment-14169254 ] 

Sunil Prabhakara commented on SPARK-1138:
-----------------------------------------

I am using Cloudera 4.2.1, Spark 1.1.0, and Scala 2.10.4, and I am observing a similar error:

ERROR Remoting: Remoting error: [Startup failed] [
akka.remote.RemoteTransportException: Startup failed
	at akka.remote.Remoting.akka$remote$Remoting$$notifyError(Remoting.scala:129)
	at akka.remote.Remoting.start(Remoting.scala:194)
	at akka.remote.RemoteActorRefProvider.init(RemoteActorRefProvider.scala:184)
	at akka.actor.ActorSystemImpl._start$lzycompute(ActorSystem.scala:579)
	at akka.actor.ActorSystemImpl._start(ActorSystem.scala:577)
	at akka.actor.ActorSystemImpl.start(ActorSystem.scala:588)
	at akka.actor.ActorSystem$.apply(ActorSystem.scala:111)
	at akka.actor.ActorSystem$.apply(ActorSystem.scala:104)
	at org.apache.spark.util.AkkaUtils$.org$apache$spark$util$AkkaUtils$$doCreateActorSystem(AkkaUtils.scala:121)
	at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:54)
	at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:53)
	at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1446)
...
along with:
Exception in thread "main" org.jboss.netty.channel.ChannelException: Failed to bind to: <my-host-name>/10.65.42.145:0
	at org.jboss.netty.bootstrap.ServerBootstrap.bind(ServerBootstrap.java:272)
	at akka.remote.transport.netty.NettyTransport$$anonfun$listen$1.apply(NettyTransport.scala:391)
	at akka.remote.transport.netty.NettyTransport$$anonfun$listen$1.apply(NettyTransport.scala:388)
	at scala.util.Success$$anonfun$map$1.apply(Try.scala:206)
	at scala.util.Try$.apply(Try.scala:161)
	at scala.util.Success.map(Try.scala:206)
	at scala.concurrent.Future$$anonfun$map$1.apply(Future.scala:235)
	at scala.concurrent.Future$$anonfun$map$1.apply(Future.scala:235)
	at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:32)
...

For the second error, I tried adding my hostname's IP address to /etc/hosts and setting the same IP address in spark-env.sh, as suggested in other answers, but I am still stuck with the above issues.
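For reference, a minimal sketch of the changes I made (the hostname is a placeholder, and I am assuming the conventional SPARK_LOCAL_IP variable is the right one to set in spark-env.sh):

```shell
# /etc/hosts -- map the machine's hostname to its real IP
# ("my-host-name" is a placeholder; 10.65.42.145 is the IP from the bind error above)
10.65.42.145    my-host-name

# conf/spark-env.sh -- tell Spark (and hence Akka remoting) to bind to that IP explicitly
export SPARK_LOCAL_IP=10.65.42.145
```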

> Spark 0.9.0 does not work with Hadoop / HDFS
> --------------------------------------------
>
>                 Key: SPARK-1138
>                 URL: https://issues.apache.org/jira/browse/SPARK-1138
>             Project: Spark
>          Issue Type: Bug
>            Reporter: Sam Abeyratne
>
> UPDATE: This problem is certainly related to trying to use Spark 0.9.0 and the latest Cloudera Hadoop / HDFS in the same jar.  It seems no matter how I fiddle with the deps, they do not play nice together.
> I'm getting a java.util.concurrent.TimeoutException when trying to create a SparkContext with 0.9.  I cannot, whatever I do, change the timeout.  I've tried using System.setProperty, the SparkConf mechanism of creating a SparkContext, and the -D flags when executing my jar.  I seem to be able to run simple jobs from the spark-shell OK, but my more complicated jobs require external libraries, so I need to build jars and execute them.
> Some code that causes this:
>     println("Creating config")
>     val conf = new SparkConf()
>       .setMaster(clusterMaster)
>       .setAppName("MyApp")
>       .setSparkHome(sparkHome)
>       .set("spark.akka.askTimeout", parsed.getOrElse(timeouts, "100"))
>       .set("spark.akka.timeout", parsed.getOrElse(timeouts, "100"))
>     println("Creating sc")
>     implicit val sc = new SparkContext(conf)
> The output:
> Creating config
> Creating sc
> log4j:WARN No appenders could be found for logger (akka.event.slf4j.Slf4jLogger).
> log4j:WARN Please initialize the log4j system properly.
> log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
> [ERROR] [02/26/2014 11:05:25.491] [main] [Remoting] Remoting error: [Startup timed out] [
> akka.remote.RemoteTransportException: Startup timed out
> 	at akka.remote.Remoting.akka$remote$Remoting$$notifyError(Remoting.scala:129)
> 	at akka.remote.Remoting.start(Remoting.scala:191)
> 	at akka.remote.RemoteActorRefProvider.init(RemoteActorRefProvider.scala:184)
> 	at akka.actor.ActorSystemImpl._start$lzycompute(ActorSystem.scala:579)
> 	at akka.actor.ActorSystemImpl._start(ActorSystem.scala:577)
> 	at akka.actor.ActorSystemImpl.start(ActorSystem.scala:588)
> 	at akka.actor.ActorSystem$.apply(ActorSystem.scala:111)
> 	at akka.actor.ActorSystem$.apply(ActorSystem.scala:104)
> 	at org.apache.spark.util.AkkaUtils$.createActorSystem(AkkaUtils.scala:96)
> 	at org.apache.spark.SparkEnv$.create(SparkEnv.scala:126)
> 	at org.apache.spark.SparkContext.<init>(SparkContext.scala:139)
> 	at com.adbrain.accuracy.EvaluateAdtruthIDs$.main(EvaluateAdtruthIDs.scala:40)
> 	at com.adbrain.accuracy.EvaluateAdtruthIDs.main(EvaluateAdtruthIDs.scala)
> Caused by: java.util.concurrent.TimeoutException: Futures timed out after [10000 milliseconds]
> 	at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:219)
> 	at scala.concurrent.impl.Promise$DefaultPromise.result(Promise.scala:223)
> 	at scala.concurrent.Await$$anonfun$result$1.apply(package.scala:107)
> 	at scala.concurrent.BlockContext$DefaultBlockContext$.blockOn(BlockContext.scala:53)
> 	at scala.concurrent.Await$.result(package.scala:107)
> 	at akka.remote.Remoting.start(Remoting.scala:173)
> 	... 11 more
> ]
> Exception in thread "main" java.util.concurrent.TimeoutException: Futures timed out after [10000 milliseconds]
> 	at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:219)
> 	at scala.concurrent.impl.Promise$DefaultPromise.result(Promise.scala:223)
> 	at scala.concurrent.Await$$anonfun$result$1.apply(package.scala:107)
> 	at scala.concurrent.BlockContext$DefaultBlockContext$.blockOn(BlockContext.scala:53)
> 	at scala.concurrent.Await$.result(package.scala:107)
> 	at akka.remote.Remoting.start(Remoting.scala:173)
> 	at akka.remote.RemoteActorRefProvider.init(RemoteActorRefProvider.scala:184)
> 	at akka.actor.ActorSystemImpl._start$lzycompute(ActorSystem.scala:579)
> 	at akka.actor.ActorSystemImpl._start(ActorSystem.scala:577)
> 	at akka.actor.ActorSystemImpl.start(ActorSystem.scala:588)
> 	at akka.actor.ActorSystem$.apply(ActorSystem.scala:111)
> 	at akka.actor.ActorSystem$.apply(ActorSystem.scala:104)
> 	at org.apache.spark.util.AkkaUtils$.createActorSystem(AkkaUtils.scala:96)
> 	at org.apache.spark.SparkEnv$.create(SparkEnv.scala:126)
> 	at org.apache.spark.SparkContext.<init>(SparkContext.scala:139)
> 	at com.adbrain.accuracy.EvaluateAdtruthIDs$.main(EvaluateAdtruthIDs.scala:40)
> 	at com.adbrain.accuracy.EvaluateAdtruthIDs.main(EvaluateAdtruthIDs.scala)



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
