Posted to users@zeppelin.apache.org by Alexander Bezzubov <bz...@apache.org> on 2015/08/01 18:07:24 UTC

Re: java.util.concurrent.TimeoutException: Futures timed out after [10000 milliseconds]

Hi Ankit,

did you manage to overcome the difficulties that you described?

I saw a couple of other people struggling with the same issue:

java.lang.VerifyError: (class: org/jboss/netty/channel/socket/nio/NioWorkerPool,
method: createWorker
signature: (Ljava/util/concurrent/Executor;)Lorg/jboss/netty/channel/socket/nio/AbstractNioWorker;)
Wrong return type in function
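
That VerifyError typically means two incompatible Netty builds end up on the
interpreter classpath (the class was compiled against a different Netty than
the one actually loaded). As a rough check (just a sketch, and the exact group
ids your Spark/Hadoop combination pulls in may differ), you could list which
Netty artifacts the build resolves and which jars land next to the Spark
interpreter:

    # run from the incubator-zeppelin checkout
    mvn dependency:tree -Dverbose -Dincludes=io.netty
    mvn dependency:tree -Dverbose -Dincludes=org.jboss.netty
    ls interpreter/spark/ | grep -i netty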

Could you please post the Java and Maven versions you used to compile and run
Zeppelin, and then try building with Java 1.7 and Maven 3.3.x to see if that
helps?
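
For reference, something like this should show the versions and redo the build
(a sketch only, reusing the flags from your original command and assuming it is
run from the incubator-zeppelin checkout):

    java -version      # the JDK on the PATH
    mvn -version       # prints the Maven version plus the JDK Maven runs with
    mvn clean install -DskipTests -Dspark.version=1.4.0 -Dhadoop.version=2.0.0-cdh4.2.0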

--
BR,
Alexander


On Thu, Jul 30, 2015 at 4:01 PM, Ankit Gupta <an...@outlook.com>
wrote:

> I installed the latest pull of Zeppelin using the following command
>
> mvn install -DskipTests -Dspark.version=1.4.0
> -Dhadoop.version=2.0.0-cdh4.2.0
>
> I have not made any changes to the conf folder (master = local[*]) and tried
> to run the following in a notebook
>
> val z = sc.parallelize(List(1,2,3,4,5,6), 2)
> z.first()
>
> and got the following error
>
> java.util.concurrent.TimeoutException: Futures timed out after [10000 milliseconds]
> at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:219)
> at scala.concurrent.impl.Promise$DefaultPromise.result(Promise.scala:223)
> at scala.concurrent.Await$$anonfun$result$1.apply(package.scala:107)
> at scala.concurrent.BlockContext$DefaultBlockContext$.blockOn(BlockContext.scala:53)
> at scala.concurrent.Await$.result(package.scala:107)
> at akka.remote.Remoting.start(Remoting.scala:180)
> at akka.remote.RemoteActorRefProvider.init(RemoteActorRefProvider.scala:184)
> at akka.actor.ActorSystemImpl.liftedTree2$1(ActorSystem.scala:618)
> at akka.actor.ActorSystemImpl._start$lzycompute(ActorSystem.scala:615)
> at akka.actor.ActorSystemImpl._start(ActorSystem.scala:615)
> at akka.actor.ActorSystemImpl.start(ActorSystem.scala:632)
> at akka.actor.ActorSystem$.apply(ActorSystem.scala:141)
> at akka.actor.ActorSystem$.apply(ActorSystem.scala:118)
> at org.apache.spark.util.AkkaUtils$.org$apache$spark$util$AkkaUtils$$doCreateActorSystem(AkkaUtils.scala:122)
> at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:54)
> at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:53)
> at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1991)
> at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)
> at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1982)
> at org.apache.spark.util.AkkaUtils$.createActorSystem(AkkaUtils.scala:56)
> at org.apache.spark.rpc.akka.AkkaRpcEnvFactory.create(AkkaRpcEnv.scala:245)
> at org.apache.spark.rpc.RpcEnv$.create(RpcEnv.scala:52)
> at org.apache.spark.SparkEnv$.create(SparkEnv.scala:247)
> at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:188)
> at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:267)
> at org.apache.spark.SparkContext.<init>(SparkContext.scala:424)
> at org.apache.zeppelin.spark.SparkInterpreter.createSparkContext(SparkInterpreter.java:301)
> at org.apache.zeppelin.spark.SparkInterpreter.getSparkContext(SparkInterpreter.java:146)
> at org.apache.zeppelin.spark.SparkInterpreter.open(SparkInterpreter.java:423)
> at org.apache.zeppelin.interpreter.ClassloaderInterpreter.open(ClassloaderInterpreter.java:74)
> at org.apache.zeppelin.interpreter.LazyOpenInterpreter.open(LazyOpenInterpreter.java:68)
> at org.apache.zeppelin.interpreter.LazyOpenInterpreter.interpret(LazyOpenInterpreter.java:92)
> at org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer$InterpretJob.jobRun(RemoteInterpreterServer.java:276)
> at org.apache.zeppelin.scheduler.Job.run(Job.java:170)
> at org.apache.zeppelin.scheduler.FIFOScheduler$1.run(FIFOScheduler.java:118)
> at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
> at java.util.concurrent.FutureTask.run(FutureTask.java:266)
> at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180)
> at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
> at java.lang.Thread.run(Thread.java:745)
>
> =======================
>
> zeppelin-interpreter-spark-root-<ip>.log contains the following exceptions
>
> Please instead use:
> - ./spark-submit with --driver-class-path to augment the driver classpath
> - spark.executor.extraClassPath to augment the executor classpath
>
> WARN [2015-07-30 06:46:16,609] ({pool-2-thread-2}
> Logging.scala[logWarning]:71) - Setting 'spark.executor.extraClassPath' to
> ':/root/incubator-zeppelin/interpreter/spark/*:/root/incubator-zeppelin/zeppelin-interpreter/target/lib/*:/root/incubator-zeppelin/zeppelin-server/target/lib/*:/root/incubator-zeppelin/zeppelin-zengine/target/lib/*:/root/incubator-zeppelin/zeppelin-interpreter/target/lib/*:/root/incubator-zeppelin/*::/root/incubator-zeppelin/conf:/root/incubator-zeppelin/zeppelin-interpreter/target/classes:/root/incubator-zeppelin/zeppelin-zengine/target/classes:/root/incubator-zeppelin/zeppelin-server/target/classes:/root/incubator-zeppelin/conf:/root/incubator-zeppelin/conf:/root/incubator-zeppelin/zeppelin-interpreter/target/classes'
> as a work-around.
> WARN [2015-07-30 06:46:16,610] ({pool-2-thread-2}
> Logging.scala[logWarning]:71) - Setting 'spark.driver.extraClassPath' to
> ':/root/incubator-zeppelin/interpreter/spark/*:/root/incubator-zeppelin/zeppelin-interpreter/target/lib/*:/root/incubator-zeppelin/zeppelin-server/target/lib/*:/root/incubator-zeppelin/zeppelin-zengine/target/lib/*:/root/incubator-zeppelin/zeppelin-interpreter/target/lib/*:/root/incubator-zeppelin/*::/root/incubator-zeppelin/conf:/root/incubator-zeppelin/zeppelin-interpreter/target/classes:/root/incubator-zeppelin/zeppelin-zengine/target/classes:/root/incubator-zeppelin/zeppelin-server/target/classes:/root/incubator-zeppelin/conf:/root/incubator-zeppelin/conf:/root/incubator-zeppelin/zeppelin-interpreter/target/classes'
> as a work-around.
> INFO [2015-07-30 06:46:16,625] ({pool-2-thread-2}
> Logging.scala[logInfo]:59) - Changing view acls to: root
> INFO [2015-07-30 06:46:16,626] ({pool-2-thread-2}
> Logging.scala[logInfo]:59) - Changing modify acls to: root
> INFO [2015-07-30 06:46:16,626] ({pool-2-thread-2}
> Logging.scala[logInfo]:59) - SecurityManager: authentication disabled; ui
> acls disabled; users with view permissions: Set(root); users with modify
> permissions: Set(root)
> INFO [2015-07-30 06:46:17,203]
> ({sparkDriver-akka.actor.default-dispatcher-4}
> Slf4jLogger.scala[applyOrElse]:80) - Slf4jLogger started
> INFO [2015-07-30 06:46:17,269]
> ({sparkDriver-akka.actor.default-dispatcher-2}
> Slf4jLogger.scala[apply$mcV$sp]:74) - Starting remoting
> ERROR [2015-07-30 06:46:17,333]
> ({sparkDriver-akka.actor.default-dispatcher-4}
> Slf4jLogger.scala[apply$mcV$sp]:66) - Uncaught fatal error from thread
> [sparkDriver-akka.remote.default-remote-dispatcher-7] shutting down
> ActorSystem [sparkDriver]
> java.lang.VerifyError: (class: org/jboss/netty/channel/socket/nio/NioWorkerPool, method: createWorker
> signature: (Ljava/util/concurrent/Executor;)Lorg/jboss/netty/channel/socket/nio/AbstractNioWorker;) Wrong return type in function
> at akka.remote.transport.netty.NettyTransport.<init>(NettyTransport.scala:283)
> at akka.remote.transport.netty.NettyTransport.<init>(NettyTransport.scala:240)
> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
> at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
> at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
> at java.lang.reflect.Constructor.newInstance(Constructor.java:408)
> at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$2.apply(DynamicAccess.scala:78)
> at scala.util.Try$.apply(Try.scala:161)
> at akka.actor.ReflectiveDynamicAccess.createInstanceFor(DynamicAccess.scala:73)
> at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$3.apply(DynamicAccess.scala:84)
> at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$3.apply(DynamicAccess.scala:84)
> at scala.util.Success.flatMap(Try.scala:200)
> at akka.actor.ReflectiveDynamicAccess.createInstanceFor(DynamicAccess.scala:84)
> at akka.remote.EndpointManager$$anonfun$9.apply(Remoting.scala:692)
> at akka.remote.EndpointManager$$anonfun$9.apply(Remoting.scala:684)
> at scala.collection.TraversableLike$WithFilter$$anonfun$map$2.apply(TraversableLike.scala:722)
> at scala.collection.Iterator$class.foreach(Iterator.scala:727)
> at scala.collection.AbstractIterator.foreach(Iterator.scala:1157)
> at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
> at scala.collection.AbstractIterable.foreach(Iterable.scala:54)
> at scala.collection.TraversableLike$WithFilter.map(TraversableLike.scala:721)
> at akka.remote.EndpointManager.akka$remote$EndpointManager$$listens(Remoting.scala:684)
> at akka.remote.EndpointManager$$anonfun$receive$2.applyOrElse(Remoting.scala:492)
> at akka.actor.Actor$class.aroundReceive(Actor.scala:465)
>
>
> INFO [2015-07-30 06:46:17,356]
> ({sparkDriver-akka.actor.default-dispatcher-2}
> Slf4jLogger.scala[apply$mcV$sp]:74) - Shutting down remote daemon.
> INFO [2015-07-30 06:46:17,360]
> ({sparkDriver-akka.actor.default-dispatcher-2}
> Slf4jLogger.scala[apply$mcV$sp]:74) - Remote daemon shut down; proceeding
> with flushing remote transports.
> INFO [2015-07-30 06:46:17,378]
> ({sparkDriver-akka.actor.default-dispatcher-2}
> Slf4jLogger.scala[apply$mcV$sp]:74) - Remoting shut down.
> ERROR [2015-07-30 06:46:27,292] ({pool-2-thread-2}
> Logging.scala[logError]:96) - Error initializing SparkContext.
> java.util.concurrent.TimeoutException: Futures timed out after [10000 milliseconds]
> at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:219)
> at scala.concurrent.impl.Promise$DefaultPromise.result(Promise.scala:223)
> at scala.concurrent.Await$$anonfun$result$1.apply(package.scala:107)
> at scala.concurrent.BlockContext$DefaultBlockContext$.blockOn(BlockContext.scala:53)
> at scala.concurrent.Await$.result(package.scala:107)
> at akka.remote.Remoting.start(Remoting.scala:180)
> at akka.remote.RemoteActorRefProvider.init(RemoteActorRefProvider.scala:184)
> at akka.actor.ActorSystemImpl.liftedTree2$1(ActorSystem.scala:618)
> at akka.actor.ActorSystemImpl._start$lzycompute(ActorSystem.scala:615)
> at akka.actor.ActorSystemImpl._start(ActorSystem.scala:615)
> at akka.actor.ActorSystemImpl.start(ActorSystem.scala:632)
> at akka.actor.ActorSystem$.apply(ActorSystem.scala:141)
> at akka.actor.ActorSystem$.apply(ActorSystem.scala:118)
> at org.apache.spark.util.AkkaUtils$.org$apache$spark$util$AkkaUtils$$doCreateActorSystem(AkkaUtils.scala:122)
> at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:54)
> at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:53)
> at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1991)
> at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)
> at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1982)
> at org.apache.spark.util.AkkaUtils$.createActorSystem(AkkaUtils.scala:56)
> at org.apache.spark.rpc.akka.AkkaRpcEnvFactory.create(AkkaRpcEnv.scala:245)
> at org.apache.spark.rpc.RpcEnv$.create(RpcEnv.scala:52)
> at org.apache.spark.SparkEnv$.create(SparkEnv.scala:247)
> at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:188)
> at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:267)
>
>
>
>
> ERROR [2015-07-30 06:46:27,296] ({pool-2-thread-2} Job.java[run]:183) - Job failed
> org.apache.zeppelin.interpreter.InterpreterException: java.util.concurrent.TimeoutException: Futures timed out after [10000 milliseconds]
> at org.apache.zeppelin.interpreter.ClassloaderInterpreter.open(ClassloaderInterpreter.java:76)
> at org.apache.zeppelin.interpreter.LazyOpenInterpreter.open(LazyOpenInterpreter.java:68)
> at org.apache.zeppelin.interpreter.LazyOpenInterpreter.interpret(LazyOpenInterpreter.java:92)
> at org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer$InterpretJob.jobRun(RemoteInterpreterServer.java:276)
> at org.apache.zeppelin.scheduler.Job.run(Job.java:170)
> at org.apache.zeppelin.scheduler.FIFOScheduler$1.run(FIFOScheduler.java:118)
> at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
> at java.util.concurrent.FutureTask.run(FutureTask.java:266)
> at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180)
> at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
> at java.lang.Thread.run(Thread.java:745)
> Caused by: java.util.concurrent.TimeoutException: Futures timed out after [10000 milliseconds]
> at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:219)
> at scala.concurrent.impl.Promise$DefaultPromise.result(Promise.scala:223)
> at scala.concurrent.Await$$anonfun$result$1.apply(package.scala:107)
> at scala.concurrent.BlockContext$DefaultBlockContext$.blockOn(BlockContext.scala:53)
> at scala.concurrent.Await$.result(package.scala:107)
> at akka.remote.Remoting.start(Remoting.scala:180)
> at akka.remote.RemoteActorRefProvider.init(RemoteActorRefProvider.scala:184)
> at akka.actor.ActorSystemImpl.liftedTree2$1(ActorSystem.scala:618)
> at akka.actor.ActorSystemImpl._start$lzycompute(ActorSystem.scala:615)
> at akka.actor.ActorSystemImpl._start(ActorSystem.scala:615)
> at akka.actor.ActorSystemImpl.start(ActorSystem.scala:632)
> at akka.actor.ActorSystem$.apply(ActorSystem.scala:141)
> at akka.actor.ActorSystem$.apply(ActorSystem.scala:118)
> at org.apache.spark.util.AkkaUtils$.org$apache$spark$util$AkkaUtils$$doCreateActorSystem(AkkaUtils.scala:122)
> at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:54)
> at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:53)
> at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1991)
> at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)
> at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1982)
> at org.apache.spark.util.AkkaUtils$.createActorSystem(AkkaUtils.scala:56)
> at org.apache.spark.rpc.akka.AkkaRpcEnvFactory.create(AkkaRpcEnv.scala:245)
> at org.apache.spark.rpc.RpcEnv$.create(RpcEnv.scala:52)
> at org.apache.spark.SparkEnv$.create(SparkEnv.scala:247)
>
>
>
> Regards,
> Ankit Gupta
>
>