Posted to user@spark.apache.org by Sarath Chandra <sa...@algofusiontech.com> on 2015/03/04 12:38:38 UTC

Unable to submit spark job to mesos cluster

Hi,

I have a cluster running CDH 5.2.1 and a Mesos cluster (version 0.18.1).
Through an Oozie Java action I want to submit a Spark job to the Mesos
cluster. Before configuring it as an Oozie job, I'm testing the Java action
from the command line and getting the exception below. While running, I'm
pointing the classpath to the "<CDH Home>/jars" folder.

What is going wrong? Is there any additional configuration I'm missing?

[ERROR] [03/04/2015 17:00:49.968] [main] [Remoting] Remoting error:
[Startup timed out] [
akka.remote.RemoteTransportException: Startup timed out
at
akka.remote.Remoting.akka$remote$Remoting$$notifyError(Remoting.scala:129)
at akka.remote.Remoting.start(Remoting.scala:191)
at akka.remote.RemoteActorRefProvider.init(RemoteActorRefProvider.scala:184)
at akka.actor.ActorSystemImpl._start$lzycompute(ActorSystem.scala:579)
at akka.actor.ActorSystemImpl._start(ActorSystem.scala:577)
at akka.actor.ActorSystemImpl.start(ActorSystem.scala:588)
at akka.actor.ActorSystem$.apply(ActorSystem.scala:111)
at akka.actor.ActorSystem$.apply(ActorSystem.scala:104)
at
org.apache.spark.util.AkkaUtils$.org$apache$spark$util$AkkaUtils$$doCreateActorSystem(AkkaUtils.scala:121)
at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:54)
at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:53)
at
org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1454)
at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)
at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1450)
at org.apache.spark.util.AkkaUtils$.createActorSystem(AkkaUtils.scala:56)
at org.apache.spark.SparkEnv$.create(SparkEnv.scala:156)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:203)
at
com.algofusion.reconciliation.execution.utils.ExecutionUtils.<clinit>(ExecutionUtils.java:130)
at
com.algofusion.reconciliation.execution.ReconExecutionController.initialize(ReconExecutionController.java:257)
at
com.algofusion.reconciliation.execution.ReconExecutionController.main(ReconExecutionController.java:105)
Caused by: java.util.concurrent.TimeoutException: Futures timed out after
[10000 milliseconds]
at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:219)
at scala.concurrent.impl.Promise$DefaultPromise.result(Promise.scala:223)
at scala.concurrent.Await$$anonfun$result$1.apply(package.scala:107)
at
scala.concurrent.BlockContext$DefaultBlockContext$.blockOn(BlockContext.scala:53)
at scala.concurrent.Await$.result(package.scala:107)
at akka.remote.Remoting.start(Remoting.scala:173)
... 18 more
]
Exception in thread "main" java.lang.ExceptionInInitializerError
at
com.algofusion.reconciliation.execution.ReconExecutionController.initialize(ReconExecutionController.java:257)
at
com.algofusion.reconciliation.execution.ReconExecutionController.main(ReconExecutionController.java:105)
Caused by: java.util.concurrent.TimeoutException: Futures timed out after
[10000 milliseconds]
at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:219)
at scala.concurrent.impl.Promise$DefaultPromise.result(Promise.scala:223)
at scala.concurrent.Await$$anonfun$result$1.apply(package.scala:107)
at
scala.concurrent.BlockContext$DefaultBlockContext$.blockOn(BlockContext.scala:53)
at scala.concurrent.Await$.result(package.scala:107)
at akka.remote.Remoting.start(Remoting.scala:173)
at akka.remote.RemoteActorRefProvider.init(RemoteActorRefProvider.scala:184)
at akka.actor.ActorSystemImpl._start$lzycompute(ActorSystem.scala:579)
at akka.actor.ActorSystemImpl._start(ActorSystem.scala:577)
at akka.actor.ActorSystemImpl.start(ActorSystem.scala:588)
at akka.actor.ActorSystem$.apply(ActorSystem.scala:111)
at akka.actor.ActorSystem$.apply(ActorSystem.scala:104)
at
org.apache.spark.util.AkkaUtils$.org$apache$spark$util$AkkaUtils$$doCreateActorSystem(AkkaUtils.scala:121)
at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:54)
at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:53)
at
org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1454)
at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)
at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1450)
at org.apache.spark.util.AkkaUtils$.createActorSystem(AkkaUtils.scala:56)
at org.apache.spark.SparkEnv$.create(SparkEnv.scala:156)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:203)
at
com.algofusion.reconciliation.execution.utils.ExecutionUtils.<clinit>(ExecutionUtils.java:130)
... 2 more

Regards,
Sarath.

Re: Unable to submit spark job to mesos cluster

Posted by Arush Kharbanda <ar...@sigmoidanalytics.com>.
You can try increasing the Akka timeouts in your Spark configuration; for
example, set the following:

spark.core.connection.ack.wait.timeout: 600
spark.akka.timeout: 1000 (in seconds)
spark.akka.frameSize: 50
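
For example, a minimal sketch assuming you build the SparkConf
programmatically as in your Test class; the same keys can also go into
spark-defaults.conf or be passed with --conf to spark-submit (the class
name below is only illustrative):

    import org.apache.spark.SparkConf;
    import org.apache.spark.SparkContext;

    public class SubmitWithTimeouts {
        public static void main(String[] args) {
            SparkConf conf = new SparkConf()
                    .setMaster("mesos://node2.algofusiontech.com:5050")
                    .setAppName("test")
                    // Wait longer before a connection ack is considered failed (seconds).
                    .set("spark.core.connection.ack.wait.timeout", "600")
                    // Akka remoting/ask timeout (seconds).
                    .set("spark.akka.timeout", "1000")
                    // Maximum Akka message frame size (MB).
                    .set("spark.akka.frameSize", "50");
            SparkContext context = new SparkContext(conf);
            context.stop();
        }
    }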

On Wed, Mar 4, 2015 at 5:14 PM, Sarath Chandra <
sarathchandra.josyam@algofusiontech.com> wrote:

> From the lines pointed out in the exception log, I figured out that my code
> is unable to create the Spark context. To isolate the problem, I've written
> a small test program, shown below:
>
> import org.apache.spark.SparkConf;
> import org.apache.spark.SparkContext;
>
> public class Test {
>     public static void main(String[] args) throws Exception {
>         SparkConf sparkConf = new SparkConf()
>                 .setMaster("mesos://node2.algofusiontech.com:5050")
>                 .setAppName("test");
>         SparkContext context = new SparkContext(sparkConf);
>     }
> }
>
> When I run this code as
>   java -cp ".:/opt/cloudera/parcels/CDH/jars/*" Test
> I'm getting the exception dump below. Please help.
>
> *1    [sparkDriver-akka.actor.default-dispatcher-2] ERROR
> akka.actor.ActorSystemImpl  - Uncaught fatal error from thread
> [sparkDriver-akka.actor.default-dispatcher-4] shutting down ActorSystem
> [sparkDriver]*
> *java.lang.NoSuchMethodError:
> org.jboss.netty.channel.socket.nio.NioWorkerPool.<init>(Ljava/util/concurrent/Executor;I)V*
> * at
> akka.remote.transport.netty.NettyTransport.<init>(NettyTransport.scala:282)*
> * at
> akka.remote.transport.netty.NettyTransport.<init>(NettyTransport.scala:239)*
> * at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)*
> * at
> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)*
> * at
> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)*
> * at java.lang.reflect.Constructor.newInstance(Constructor.java:526)*
> * at
> akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$2.apply(DynamicAccess.scala:78)*
> * at scala.util.Try$.apply(Try.scala:161)*
> * at
> akka.actor.ReflectiveDynamicAccess.createInstanceFor(DynamicAccess.scala:73)*
> * at
> akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$3.apply(DynamicAccess.scala:84)*
> * at
> akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$3.apply(DynamicAccess.scala:84)*
> * at scala.util.Success.flatMap(Try.scala:200)*
> * at
> akka.actor.ReflectiveDynamicAccess.createInstanceFor(DynamicAccess.scala:84)*
> * at akka.remote.EndpointManager$$anonfun$8.apply(Remoting.scala:618)*
> * at akka.remote.EndpointManager$$anonfun$8.apply(Remoting.scala:610)*
> * at
> scala.collection.TraversableLike$WithFilter$$anonfun$map$2.apply(TraversableLike.scala:722)*
> * at scala.collection.Iterator$class.foreach(Iterator.scala:727)*
> * at scala.collection.AbstractIterator.foreach(Iterator.scala:1157)*
> * at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)*
> * at scala.collection.AbstractIterable.foreach(Iterable.scala:54)*
> * at
> scala.collection.TraversableLike$WithFilter.map(TraversableLike.scala:721)*
> * at
> akka.remote.EndpointManager.akka$remote$EndpointManager$$listens(Remoting.scala:610)*
> * at
> akka.remote.EndpointManager$$anonfun$receive$2.applyOrElse(Remoting.scala:450)*
> * at akka.actor.ActorCell.receiveMessage(ActorCell.scala:498)*
> * at akka.actor.ActorCell.invoke(ActorCell.scala:456)*
> * at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:237)*
> * at akka.dispatch.Mailbox.run(Mailbox.scala:219)*
> * at
> akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:386)*
> * at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)*
> * at
> scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)*
> * at
> scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)*
> * at
> scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)*
> *[ERROR] [03/04/2015 17:13:23.745] [main] [Remoting] Remoting error:
> [Startup timed out] [*
> *akka.remote.RemoteTransportException: Startup timed out*
> * at
> akka.remote.Remoting.akka$remote$Remoting$$notifyError(Remoting.scala:129)*
> * at akka.remote.Remoting.start(Remoting.scala:191)*
> * at
> akka.remote.RemoteActorRefProvider.init(RemoteActorRefProvider.scala:184)*
> * at akka.actor.ActorSystemImpl._start$lzycompute(ActorSystem.scala:579)*
> * at akka.actor.ActorSystemImpl._start(ActorSystem.scala:577)*
> * at akka.actor.ActorSystemImpl.start(ActorSystem.scala:588)*
> * at akka.actor.ActorSystem$.apply(ActorSystem.scala:111)*
> * at akka.actor.ActorSystem$.apply(ActorSystem.scala:104)*
> * at
> org.apache.spark.util.AkkaUtils$.org$apache$spark$util$AkkaUtils$$doCreateActorSystem(AkkaUtils.scala:121)*
> * at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:54)*
> * at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:53)*
> * at
> org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1454)*
> * at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)*
> * at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1450)*
> * at
> org.apache.spark.util.AkkaUtils$.createActorSystem(AkkaUtils.scala:56)*
> * at org.apache.spark.SparkEnv$.create(SparkEnv.scala:156)*
> * at org.apache.spark.SparkContext.<init>(SparkContext.scala:203)*
> * at Test.main(Test.java:7)*
> *Caused by: java.util.concurrent.TimeoutException: Futures timed out after
> [10000 milliseconds]*
> * at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:219)*
> * at
> scala.concurrent.impl.Promise$DefaultPromise.result(Promise.scala:223)*
> * at scala.concurrent.Await$$anonfun$result$1.apply(package.scala:107)*
> * at
> scala.concurrent.BlockContext$DefaultBlockContext$.blockOn(BlockContext.scala:53)*
> * at scala.concurrent.Await$.result(package.scala:107)*
> * at akka.remote.Remoting.start(Remoting.scala:173)*
> * ... 16 more*
> *]*
> *Exception in thread "main" java.util.concurrent.TimeoutException: Futures
> timed out after [10000 milliseconds]*
> * at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:219)*
> * at
> scala.concurrent.impl.Promise$DefaultPromise.result(Promise.scala:223)*
> * at scala.concurrent.Await$$anonfun$result$1.apply(package.scala:107)*
> * at
> scala.concurrent.BlockContext$DefaultBlockContext$.blockOn(BlockContext.scala:53)*
> * at scala.concurrent.Await$.result(package.scala:107)*
> * at akka.remote.Remoting.start(Remoting.scala:173)*
> * at
> akka.remote.RemoteActorRefProvider.init(RemoteActorRefProvider.scala:184)*
> * at akka.actor.ActorSystemImpl._start$lzycompute(ActorSystem.scala:579)*
> * at akka.actor.ActorSystemImpl._start(ActorSystem.scala:577)*
> * at akka.actor.ActorSystemImpl.start(ActorSystem.scala:588)*
> * at akka.actor.ActorSystem$.apply(ActorSystem.scala:111)*
> * at akka.actor.ActorSystem$.apply(ActorSystem.scala:104)*
> * at
> org.apache.spark.util.AkkaUtils$.org$apache$spark$util$AkkaUtils$$doCreateActorSystem(AkkaUtils.scala:121)*
> * at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:54)*
> * at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:53)*
> * at
> org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1454)*
> * at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)*
> * at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1450)*
> * at
> org.apache.spark.util.AkkaUtils$.createActorSystem(AkkaUtils.scala:56)*
> * at org.apache.spark.SparkEnv$.create(SparkEnv.scala:156)*
> * at org.apache.spark.SparkContext.<init>(SparkContext.scala:203)*
> * at Test.main(Test.java:7)*
>
> Regards,
> Sarath.
>
> Thanks & Regards,
> *Sarath Chandra Josyam*
> Sr. Technical Architect
> *Algofusion Technologies India Pvt. Ltd.*
> Email: sarathchandra.josyam@algofusiontech.com
> Phone: +91-80-65330112/113
> Mobile: +91 8762491331
>
> On Wed, Mar 4, 2015 at 5:08 PM, Sarath Chandra <
> sarathchandra.josyam@algofusiontech.com> wrote:
>
>> Hi,
>>
>> I have a cluster running CDH 5.2.1 and a Mesos cluster (version 0.18.1).
>> Through an Oozie Java action I want to submit a Spark job to the Mesos
>> cluster. Before configuring it as an Oozie job, I'm testing the Java action
>> from the command line and getting the exception below. While running, I'm
>> pointing the classpath to the "<CDH Home>/jars" folder.
>>
>> What is going wrong? Is there any additional configuration I'm missing?
>>
>> [ERROR] [03/04/2015 17:00:49.968] [main] [Remoting] Remoting error:
>> [Startup timed out] [
>> akka.remote.RemoteTransportException: Startup timed out
>> at
>> akka.remote.Remoting.akka$remote$Remoting$$notifyError(Remoting.scala:129)
>> at akka.remote.Remoting.start(Remoting.scala:191)
>> at
>> akka.remote.RemoteActorRefProvider.init(RemoteActorRefProvider.scala:184)
>> at akka.actor.ActorSystemImpl._start$lzycompute(ActorSystem.scala:579)
>> at akka.actor.ActorSystemImpl._start(ActorSystem.scala:577)
>> at akka.actor.ActorSystemImpl.start(ActorSystem.scala:588)
>> at akka.actor.ActorSystem$.apply(ActorSystem.scala:111)
>> at akka.actor.ActorSystem$.apply(ActorSystem.scala:104)
>> at
>> org.apache.spark.util.AkkaUtils$.org$apache$spark$util$AkkaUtils$$doCreateActorSystem(AkkaUtils.scala:121)
>> at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:54)
>> at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:53)
>> at
>> org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1454)
>> at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)
>> at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1450)
>> at org.apache.spark.util.AkkaUtils$.createActorSystem(AkkaUtils.scala:56)
>> at org.apache.spark.SparkEnv$.create(SparkEnv.scala:156)
>> at org.apache.spark.SparkContext.<init>(SparkContext.scala:203)
>> at
>> com.algofusion.reconciliation.execution.utils.ExecutionUtils.<clinit>(ExecutionUtils.java:130)
>> at
>> com.algofusion.reconciliation.execution.ReconExecutionController.initialize(ReconExecutionController.java:257)
>> at
>> com.algofusion.reconciliation.execution.ReconExecutionController.main(ReconExecutionController.java:105)
>> Caused by: java.util.concurrent.TimeoutException: Futures timed out after
>> [10000 milliseconds]
>> at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:219)
>> at scala.concurrent.impl.Promise$DefaultPromise.result(Promise.scala:223)
>> at scala.concurrent.Await$$anonfun$result$1.apply(package.scala:107)
>> at
>> scala.concurrent.BlockContext$DefaultBlockContext$.blockOn(BlockContext.scala:53)
>> at scala.concurrent.Await$.result(package.scala:107)
>> at akka.remote.Remoting.start(Remoting.scala:173)
>> ... 18 more
>> ]
>> Exception in thread "main" java.lang.ExceptionInInitializerError
>> at
>> com.algofusion.reconciliation.execution.ReconExecutionController.initialize(ReconExecutionController.java:257)
>> at
>> com.algofusion.reconciliation.execution.ReconExecutionController.main(ReconExecutionController.java:105)
>> Caused by: java.util.concurrent.TimeoutException: Futures timed out after
>> [10000 milliseconds]
>> at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:219)
>> at scala.concurrent.impl.Promise$DefaultPromise.result(Promise.scala:223)
>> at scala.concurrent.Await$$anonfun$result$1.apply(package.scala:107)
>> at
>> scala.concurrent.BlockContext$DefaultBlockContext$.blockOn(BlockContext.scala:53)
>> at scala.concurrent.Await$.result(package.scala:107)
>> at akka.remote.Remoting.start(Remoting.scala:173)
>> at
>> akka.remote.RemoteActorRefProvider.init(RemoteActorRefProvider.scala:184)
>> at akka.actor.ActorSystemImpl._start$lzycompute(ActorSystem.scala:579)
>> at akka.actor.ActorSystemImpl._start(ActorSystem.scala:577)
>> at akka.actor.ActorSystemImpl.start(ActorSystem.scala:588)
>> at akka.actor.ActorSystem$.apply(ActorSystem.scala:111)
>> at akka.actor.ActorSystem$.apply(ActorSystem.scala:104)
>> at
>> org.apache.spark.util.AkkaUtils$.org$apache$spark$util$AkkaUtils$$doCreateActorSystem(AkkaUtils.scala:121)
>> at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:54)
>> at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:53)
>> at
>> org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1454)
>> at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)
>> at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1450)
>> at org.apache.spark.util.AkkaUtils$.createActorSystem(AkkaUtils.scala:56)
>> at org.apache.spark.SparkEnv$.create(SparkEnv.scala:156)
>> at org.apache.spark.SparkContext.<init>(SparkContext.scala:203)
>> at
>> com.algofusion.reconciliation.execution.utils.ExecutionUtils.<clinit>(ExecutionUtils.java:130)
>> ... 2 more
>>
>> Regards,
>> Sarath.
>>
>
>


-- 


Arush Kharbanda || Technical Teamlead

arush@sigmoidanalytics.com || www.sigmoidanalytics.com

Re: Unable to submit spark job to mesos cluster

Posted by Akhil Das <ak...@sigmoidanalytics.com>.
It looks like you have two Netty jars in the classpath.
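
One quick way to confirm which jar supplies the conflicting class is to ask
the classloader where it loaded it from. A minimal diagnostic sketch (not
from the original thread; the class name WhichNetty is only illustrative),
using the class named in the NoSuchMethodError above:

    public class WhichNetty {
        public static void main(String[] args) throws Exception {
            // Locate the jar that actually provides the old org.jboss.netty class.
            Class<?> c = Class.forName("org.jboss.netty.channel.socket.nio.NioWorkerPool");
            java.security.CodeSource src = c.getProtectionDomain().getCodeSource();
            System.out.println(src == null ? "loaded from bootstrap classpath" : src.getLocation());
        }
    }

Run it with the same classpath you used for Test, for example
java -cp ".:/opt/cloudera/parcels/CDH/jars/*" WhichNetty, and check whether
the jar it prints matches the Netty version Spark's Akka build expects.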

Thanks
Best Regards

On Wed, Mar 4, 2015 at 5:14 PM, Sarath Chandra <
sarathchandra.josyam@algofusiontech.com> wrote:

> From the lines pointed out in the exception log, I figured out that my code
> is unable to create the Spark context. To isolate the problem, I've written
> a small test program, shown below:
>
> import org.apache.spark.SparkConf;
> import org.apache.spark.SparkContext;
>
> public class Test {
>     public static void main(String[] args) throws Exception {
>         SparkConf sparkConf = new SparkConf()
>                 .setMaster("mesos://node2.algofusiontech.com:5050")
>                 .setAppName("test");
>         SparkContext context = new SparkContext(sparkConf);
>     }
> }
>
> When I run this code as
>   java -cp ".:/opt/cloudera/parcels/CDH/jars/*" Test
> I'm getting the exception dump below. Please help.
>
> *1    [sparkDriver-akka.actor.default-dispatcher-2] ERROR
> akka.actor.ActorSystemImpl  - Uncaught fatal error from thread
> [sparkDriver-akka.actor.default-dispatcher-4] shutting down ActorSystem
> [sparkDriver]*
> *java.lang.NoSuchMethodError:
> org.jboss.netty.channel.socket.nio.NioWorkerPool.<init>(Ljava/util/concurrent/Executor;I)V*
> * at
> akka.remote.transport.netty.NettyTransport.<init>(NettyTransport.scala:282)*
> * at
> akka.remote.transport.netty.NettyTransport.<init>(NettyTransport.scala:239)*
> * at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)*
> * at
> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)*
> * at
> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)*
> * at java.lang.reflect.Constructor.newInstance(Constructor.java:526)*
> * at
> akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$2.apply(DynamicAccess.scala:78)*
> * at scala.util.Try$.apply(Try.scala:161)*
> * at
> akka.actor.ReflectiveDynamicAccess.createInstanceFor(DynamicAccess.scala:73)*
> * at
> akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$3.apply(DynamicAccess.scala:84)*
> * at
> akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$3.apply(DynamicAccess.scala:84)*
> * at scala.util.Success.flatMap(Try.scala:200)*
> * at
> akka.actor.ReflectiveDynamicAccess.createInstanceFor(DynamicAccess.scala:84)*
> * at akka.remote.EndpointManager$$anonfun$8.apply(Remoting.scala:618)*
> * at akka.remote.EndpointManager$$anonfun$8.apply(Remoting.scala:610)*
> * at
> scala.collection.TraversableLike$WithFilter$$anonfun$map$2.apply(TraversableLike.scala:722)*
> * at scala.collection.Iterator$class.foreach(Iterator.scala:727)*
> * at scala.collection.AbstractIterator.foreach(Iterator.scala:1157)*
> * at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)*
> * at scala.collection.AbstractIterable.foreach(Iterable.scala:54)*
> * at
> scala.collection.TraversableLike$WithFilter.map(TraversableLike.scala:721)*
> * at
> akka.remote.EndpointManager.akka$remote$EndpointManager$$listens(Remoting.scala:610)*
> * at
> akka.remote.EndpointManager$$anonfun$receive$2.applyOrElse(Remoting.scala:450)*
> * at akka.actor.ActorCell.receiveMessage(ActorCell.scala:498)*
> * at akka.actor.ActorCell.invoke(ActorCell.scala:456)*
> * at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:237)*
> * at akka.dispatch.Mailbox.run(Mailbox.scala:219)*
> * at
> akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:386)*
> * at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)*
> * at
> scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)*
> * at
> scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)*
> * at
> scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)*
> *[ERROR] [03/04/2015 17:13:23.745] [main] [Remoting] Remoting error:
> [Startup timed out] [*
> *akka.remote.RemoteTransportException: Startup timed out*
> * at
> akka.remote.Remoting.akka$remote$Remoting$$notifyError(Remoting.scala:129)*
> * at akka.remote.Remoting.start(Remoting.scala:191)*
> * at
> akka.remote.RemoteActorRefProvider.init(RemoteActorRefProvider.scala:184)*
> * at akka.actor.ActorSystemImpl._start$lzycompute(ActorSystem.scala:579)*
> * at akka.actor.ActorSystemImpl._start(ActorSystem.scala:577)*
> * at akka.actor.ActorSystemImpl.start(ActorSystem.scala:588)*
> * at akka.actor.ActorSystem$.apply(ActorSystem.scala:111)*
> * at akka.actor.ActorSystem$.apply(ActorSystem.scala:104)*
> * at
> org.apache.spark.util.AkkaUtils$.org$apache$spark$util$AkkaUtils$$doCreateActorSystem(AkkaUtils.scala:121)*
> * at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:54)*
> * at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:53)*
> * at
> org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1454)*
> * at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)*
> * at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1450)*
> * at
> org.apache.spark.util.AkkaUtils$.createActorSystem(AkkaUtils.scala:56)*
> * at org.apache.spark.SparkEnv$.create(SparkEnv.scala:156)*
> * at org.apache.spark.SparkContext.<init>(SparkContext.scala:203)*
> * at Test.main(Test.java:7)*
> *Caused by: java.util.concurrent.TimeoutException: Futures timed out after
> [10000 milliseconds]*
> * at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:219)*
> * at
> scala.concurrent.impl.Promise$DefaultPromise.result(Promise.scala:223)*
> * at scala.concurrent.Await$$anonfun$result$1.apply(package.scala:107)*
> * at
> scala.concurrent.BlockContext$DefaultBlockContext$.blockOn(BlockContext.scala:53)*
> * at scala.concurrent.Await$.result(package.scala:107)*
> * at akka.remote.Remoting.start(Remoting.scala:173)*
> * ... 16 more*
> *]*
> *Exception in thread "main" java.util.concurrent.TimeoutException: Futures
> timed out after [10000 milliseconds]*
> * at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:219)*
> * at
> scala.concurrent.impl.Promise$DefaultPromise.result(Promise.scala:223)*
> * at scala.concurrent.Await$$anonfun$result$1.apply(package.scala:107)*
> * at
> scala.concurrent.BlockContext$DefaultBlockContext$.blockOn(BlockContext.scala:53)*
> * at scala.concurrent.Await$.result(package.scala:107)*
> * at akka.remote.Remoting.start(Remoting.scala:173)*
> * at
> akka.remote.RemoteActorRefProvider.init(RemoteActorRefProvider.scala:184)*
> * at akka.actor.ActorSystemImpl._start$lzycompute(ActorSystem.scala:579)*
> * at akka.actor.ActorSystemImpl._start(ActorSystem.scala:577)*
> * at akka.actor.ActorSystemImpl.start(ActorSystem.scala:588)*
> * at akka.actor.ActorSystem$.apply(ActorSystem.scala:111)*
> * at akka.actor.ActorSystem$.apply(ActorSystem.scala:104)*
> * at
> org.apache.spark.util.AkkaUtils$.org$apache$spark$util$AkkaUtils$$doCreateActorSystem(AkkaUtils.scala:121)*
> * at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:54)*
> * at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:53)*
> * at
> org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1454)*
> * at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)*
> * at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1450)*
> * at
> org.apache.spark.util.AkkaUtils$.createActorSystem(AkkaUtils.scala:56)*
> * at org.apache.spark.SparkEnv$.create(SparkEnv.scala:156)*
> * at org.apache.spark.SparkContext.<init>(SparkContext.scala:203)*
> * at Test.main(Test.java:7)*
>
> Regards,
> Sarath.
>
> Thanks & Regards,
> *Sarath Chandra Josyam*
> Sr. Technical Architect
> *Algofusion Technologies India Pvt. Ltd.*
> Email: sarathchandra.josyam@algofusiontech.com
> Phone: +91-80-65330112/113
> Mobile: +91 8762491331
>
> On Wed, Mar 4, 2015 at 5:08 PM, Sarath Chandra <
> sarathchandra.josyam@algofusiontech.com> wrote:
>
>> Hi,
>>
>> I have a cluster running CDH 5.2.1 and a Mesos cluster (version 0.18.1).
>> Through an Oozie Java action I want to submit a Spark job to the Mesos
>> cluster. Before configuring it as an Oozie job, I'm testing the Java action
>> from the command line and getting the exception below. While running, I'm
>> pointing the classpath to the "<CDH Home>/jars" folder.
>>
>> What is going wrong? Is there any additional configuration I'm missing?
>>
>> [ERROR] [03/04/2015 17:00:49.968] [main] [Remoting] Remoting error:
>> [Startup timed out] [
>> akka.remote.RemoteTransportException: Startup timed out
>> at
>> akka.remote.Remoting.akka$remote$Remoting$$notifyError(Remoting.scala:129)
>> at akka.remote.Remoting.start(Remoting.scala:191)
>> at
>> akka.remote.RemoteActorRefProvider.init(RemoteActorRefProvider.scala:184)
>> at akka.actor.ActorSystemImpl._start$lzycompute(ActorSystem.scala:579)
>> at akka.actor.ActorSystemImpl._start(ActorSystem.scala:577)
>> at akka.actor.ActorSystemImpl.start(ActorSystem.scala:588)
>> at akka.actor.ActorSystem$.apply(ActorSystem.scala:111)
>> at akka.actor.ActorSystem$.apply(ActorSystem.scala:104)
>> at
>> org.apache.spark.util.AkkaUtils$.org$apache$spark$util$AkkaUtils$$doCreateActorSystem(AkkaUtils.scala:121)
>> at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:54)
>> at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:53)
>> at
>> org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1454)
>> at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)
>> at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1450)
>> at org.apache.spark.util.AkkaUtils$.createActorSystem(AkkaUtils.scala:56)
>> at org.apache.spark.SparkEnv$.create(SparkEnv.scala:156)
>> at org.apache.spark.SparkContext.<init>(SparkContext.scala:203)
>> at
>> com.algofusion.reconciliation.execution.utils.ExecutionUtils.<clinit>(ExecutionUtils.java:130)
>> at
>> com.algofusion.reconciliation.execution.ReconExecutionController.initialize(ReconExecutionController.java:257)
>> at
>> com.algofusion.reconciliation.execution.ReconExecutionController.main(ReconExecutionController.java:105)
>> Caused by: java.util.concurrent.TimeoutException: Futures timed out after
>> [10000 milliseconds]
>> at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:219)
>> at scala.concurrent.impl.Promise$DefaultPromise.result(Promise.scala:223)
>> at scala.concurrent.Await$$anonfun$result$1.apply(package.scala:107)
>> at
>> scala.concurrent.BlockContext$DefaultBlockContext$.blockOn(BlockContext.scala:53)
>> at scala.concurrent.Await$.result(package.scala:107)
>> at akka.remote.Remoting.start(Remoting.scala:173)
>> ... 18 more
>> ]
>> Exception in thread "main" java.lang.ExceptionInInitializerError
>> at
>> com.algofusion.reconciliation.execution.ReconExecutionController.initialize(ReconExecutionController.java:257)
>> at
>> com.algofusion.reconciliation.execution.ReconExecutionController.main(ReconExecutionController.java:105)
>> Caused by: java.util.concurrent.TimeoutException: Futures timed out after
>> [10000 milliseconds]
>> at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:219)
>> at scala.concurrent.impl.Promise$DefaultPromise.result(Promise.scala:223)
>> at scala.concurrent.Await$$anonfun$result$1.apply(package.scala:107)
>> at
>> scala.concurrent.BlockContext$DefaultBlockContext$.blockOn(BlockContext.scala:53)
>> at scala.concurrent.Await$.result(package.scala:107)
>> at akka.remote.Remoting.start(Remoting.scala:173)
>> at
>> akka.remote.RemoteActorRefProvider.init(RemoteActorRefProvider.scala:184)
>> at akka.actor.ActorSystemImpl._start$lzycompute(ActorSystem.scala:579)
>> at akka.actor.ActorSystemImpl._start(ActorSystem.scala:577)
>> at akka.actor.ActorSystemImpl.start(ActorSystem.scala:588)
>> at akka.actor.ActorSystem$.apply(ActorSystem.scala:111)
>> at akka.actor.ActorSystem$.apply(ActorSystem.scala:104)
>> at
>> org.apache.spark.util.AkkaUtils$.org$apache$spark$util$AkkaUtils$$doCreateActorSystem(AkkaUtils.scala:121)
>> at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:54)
>> at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:53)
>> at
>> org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1454)
>> at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)
>> at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1450)
>> at org.apache.spark.util.AkkaUtils$.createActorSystem(AkkaUtils.scala:56)
>> at org.apache.spark.SparkEnv$.create(SparkEnv.scala:156)
>> at org.apache.spark.SparkContext.<init>(SparkContext.scala:203)
>> at
>> com.algofusion.reconciliation.execution.utils.ExecutionUtils.<clinit>(ExecutionUtils.java:130)
>> ... 2 more
>>
>> Regards,
>> Sarath.
>>
>
>

Re: Unable to submit spark job to mesos cluster

Posted by Sarath Chandra <sa...@algofusiontech.com>.
From the lines pointed out in the exception log, I figured out that my code
is unable to create the Spark context. To isolate the problem, I've written
a small test program, shown below:

import org.apache.spark.SparkConf;
import org.apache.spark.SparkContext;

public class Test {
    public static void main(String[] args) throws Exception {
        SparkConf sparkConf = new SparkConf()
                .setMaster("mesos://node2.algofusiontech.com:5050")
                .setAppName("test");
        SparkContext context = new SparkContext(sparkConf);
    }
}

When I run this code as
  java -cp ".:/opt/cloudera/parcels/CDH/jars/*" Test
I'm getting the exception dump below. Please help.

*1    [sparkDriver-akka.actor.default-dispatcher-2] ERROR
akka.actor.ActorSystemImpl  - Uncaught fatal error from thread
[sparkDriver-akka.actor.default-dispatcher-4] shutting down ActorSystem
[sparkDriver]*
*java.lang.NoSuchMethodError:
org.jboss.netty.channel.socket.nio.NioWorkerPool.<init>(Ljava/util/concurrent/Executor;I)V*
* at
akka.remote.transport.netty.NettyTransport.<init>(NettyTransport.scala:282)*
* at
akka.remote.transport.netty.NettyTransport.<init>(NettyTransport.scala:239)*
* at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)*
* at
sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)*
* at
sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)*
* at java.lang.reflect.Constructor.newInstance(Constructor.java:526)*
* at
akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$2.apply(DynamicAccess.scala:78)*
* at scala.util.Try$.apply(Try.scala:161)*
* at
akka.actor.ReflectiveDynamicAccess.createInstanceFor(DynamicAccess.scala:73)*
* at
akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$3.apply(DynamicAccess.scala:84)*
* at
akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$3.apply(DynamicAccess.scala:84)*
* at scala.util.Success.flatMap(Try.scala:200)*
* at
akka.actor.ReflectiveDynamicAccess.createInstanceFor(DynamicAccess.scala:84)*
* at akka.remote.EndpointManager$$anonfun$8.apply(Remoting.scala:618)*
* at akka.remote.EndpointManager$$anonfun$8.apply(Remoting.scala:610)*
* at
scala.collection.TraversableLike$WithFilter$$anonfun$map$2.apply(TraversableLike.scala:722)*
* at scala.collection.Iterator$class.foreach(Iterator.scala:727)*
* at scala.collection.AbstractIterator.foreach(Iterator.scala:1157)*
* at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)*
* at scala.collection.AbstractIterable.foreach(Iterable.scala:54)*
* at
scala.collection.TraversableLike$WithFilter.map(TraversableLike.scala:721)*
* at
akka.remote.EndpointManager.akka$remote$EndpointManager$$listens(Remoting.scala:610)*
* at
akka.remote.EndpointManager$$anonfun$receive$2.applyOrElse(Remoting.scala:450)*
* at akka.actor.ActorCell.receiveMessage(ActorCell.scala:498)*
* at akka.actor.ActorCell.invoke(ActorCell.scala:456)*
* at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:237)*
* at akka.dispatch.Mailbox.run(Mailbox.scala:219)*
* at
akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:386)*
* at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)*
* at
scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)*
* at
scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)*
* at
scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)*
*[ERROR] [03/04/2015 17:13:23.745] [main] [Remoting] Remoting error:
[Startup timed out] [*
*akka.remote.RemoteTransportException: Startup timed out*
* at
akka.remote.Remoting.akka$remote$Remoting$$notifyError(Remoting.scala:129)*
* at akka.remote.Remoting.start(Remoting.scala:191)*
* at
akka.remote.RemoteActorRefProvider.init(RemoteActorRefProvider.scala:184)*
* at akka.actor.ActorSystemImpl._start$lzycompute(ActorSystem.scala:579)*
* at akka.actor.ActorSystemImpl._start(ActorSystem.scala:577)*
* at akka.actor.ActorSystemImpl.start(ActorSystem.scala:588)*
* at akka.actor.ActorSystem$.apply(ActorSystem.scala:111)*
* at akka.actor.ActorSystem$.apply(ActorSystem.scala:104)*
* at
org.apache.spark.util.AkkaUtils$.org$apache$spark$util$AkkaUtils$$doCreateActorSystem(AkkaUtils.scala:121)*
* at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:54)*
* at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:53)*
* at
org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1454)*
* at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)*
* at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1450)*
* at org.apache.spark.util.AkkaUtils$.createActorSystem(AkkaUtils.scala:56)*
* at org.apache.spark.SparkEnv$.create(SparkEnv.scala:156)*
* at org.apache.spark.SparkContext.<init>(SparkContext.scala:203)*
* at Test.main(Test.java:7)*
*Caused by: java.util.concurrent.TimeoutException: Futures timed out after
[10000 milliseconds]*
* at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:219)*
* at scala.concurrent.impl.Promise$DefaultPromise.result(Promise.scala:223)*
* at scala.concurrent.Await$$anonfun$result$1.apply(package.scala:107)*
* at
scala.concurrent.BlockContext$DefaultBlockContext$.blockOn(BlockContext.scala:53)*
* at scala.concurrent.Await$.result(package.scala:107)*
* at akka.remote.Remoting.start(Remoting.scala:173)*
* ... 16 more*
*]*
*Exception in thread "main" java.util.concurrent.TimeoutException: Futures
timed out after [10000 milliseconds]*
* at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:219)*
* at scala.concurrent.impl.Promise$DefaultPromise.result(Promise.scala:223)*
* at scala.concurrent.Await$$anonfun$result$1.apply(package.scala:107)*
* at
scala.concurrent.BlockContext$DefaultBlockContext$.blockOn(BlockContext.scala:53)*
* at scala.concurrent.Await$.result(package.scala:107)*
* at akka.remote.Remoting.start(Remoting.scala:173)*
* at
akka.remote.RemoteActorRefProvider.init(RemoteActorRefProvider.scala:184)*
* at akka.actor.ActorSystemImpl._start$lzycompute(ActorSystem.scala:579)*
* at akka.actor.ActorSystemImpl._start(ActorSystem.scala:577)*
* at akka.actor.ActorSystemImpl.start(ActorSystem.scala:588)*
* at akka.actor.ActorSystem$.apply(ActorSystem.scala:111)*
* at akka.actor.ActorSystem$.apply(ActorSystem.scala:104)*
* at
org.apache.spark.util.AkkaUtils$.org$apache$spark$util$AkkaUtils$$doCreateActorSystem(AkkaUtils.scala:121)*
* at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:54)*
* at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:53)*
* at
org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1454)*
* at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)*
* at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1450)*
* at org.apache.spark.util.AkkaUtils$.createActorSystem(AkkaUtils.scala:56)*
* at org.apache.spark.SparkEnv$.create(SparkEnv.scala:156)*
* at org.apache.spark.SparkContext.<init>(SparkContext.scala:203)*
* at Test.main(Test.java:7)*

Regards,
Sarath.

Thanks & Regards,
*Sarath Chandra Josyam*
Sr. Technical Architect
*Algofusion Technologies India Pvt. Ltd.*
Email: sarathchandra.josyam@algofusiontech.com
Phone: +91-80-65330112/113
Mobile: +91 8762491331

On Wed, Mar 4, 2015 at 5:08 PM, Sarath Chandra <
sarathchandra.josyam@algofusiontech.com> wrote:

> Hi,
>
> I have a cluster running CDH 5.2.1 and a Mesos cluster (version 0.18.1).
> Through an Oozie Java action I want to submit a Spark job to the Mesos
> cluster. Before configuring it as an Oozie job, I'm testing the Java action
> from the command line and getting the exception below. While running, I'm
> pointing the classpath to the "<CDH Home>/jars" folder.
>
> What is going wrong? Is there any additional configuration I'm missing?
>
> [ERROR] [03/04/2015 17:00:49.968] [main] [Remoting] Remoting error:
> [Startup timed out] [
> akka.remote.RemoteTransportException: Startup timed out
> at
> akka.remote.Remoting.akka$remote$Remoting$$notifyError(Remoting.scala:129)
> at akka.remote.Remoting.start(Remoting.scala:191)
> at
> akka.remote.RemoteActorRefProvider.init(RemoteActorRefProvider.scala:184)
> at akka.actor.ActorSystemImpl._start$lzycompute(ActorSystem.scala:579)
> at akka.actor.ActorSystemImpl._start(ActorSystem.scala:577)
> at akka.actor.ActorSystemImpl.start(ActorSystem.scala:588)
> at akka.actor.ActorSystem$.apply(ActorSystem.scala:111)
> at akka.actor.ActorSystem$.apply(ActorSystem.scala:104)
> at
> org.apache.spark.util.AkkaUtils$.org$apache$spark$util$AkkaUtils$$doCreateActorSystem(AkkaUtils.scala:121)
> at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:54)
> at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:53)
> at
> org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1454)
> at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)
> at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1450)
> at org.apache.spark.util.AkkaUtils$.createActorSystem(AkkaUtils.scala:56)
> at org.apache.spark.SparkEnv$.create(SparkEnv.scala:156)
> at org.apache.spark.SparkContext.<init>(SparkContext.scala:203)
> at
> com.algofusion.reconciliation.execution.utils.ExecutionUtils.<clinit>(ExecutionUtils.java:130)
> at
> com.algofusion.reconciliation.execution.ReconExecutionController.initialize(ReconExecutionController.java:257)
> at
> com.algofusion.reconciliation.execution.ReconExecutionController.main(ReconExecutionController.java:105)
> Caused by: java.util.concurrent.TimeoutException: Futures timed out after
> [10000 milliseconds]
> at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:219)
> at scala.concurrent.impl.Promise$DefaultPromise.result(Promise.scala:223)
> at scala.concurrent.Await$$anonfun$result$1.apply(package.scala:107)
> at
> scala.concurrent.BlockContext$DefaultBlockContext$.blockOn(BlockContext.scala:53)
> at scala.concurrent.Await$.result(package.scala:107)
> at akka.remote.Remoting.start(Remoting.scala:173)
> ... 18 more
> ]
> Exception in thread "main" java.lang.ExceptionInInitializerError
> at
> com.algofusion.reconciliation.execution.ReconExecutionController.initialize(ReconExecutionController.java:257)
> at
> com.algofusion.reconciliation.execution.ReconExecutionController.main(ReconExecutionController.java:105)
> Caused by: java.util.concurrent.TimeoutException: Futures timed out after
> [10000 milliseconds]
> at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:219)
> at scala.concurrent.impl.Promise$DefaultPromise.result(Promise.scala:223)
> at scala.concurrent.Await$$anonfun$result$1.apply(package.scala:107)
> at
> scala.concurrent.BlockContext$DefaultBlockContext$.blockOn(BlockContext.scala:53)
> at scala.concurrent.Await$.result(package.scala:107)
> at akka.remote.Remoting.start(Remoting.scala:173)
> at
> akka.remote.RemoteActorRefProvider.init(RemoteActorRefProvider.scala:184)
> at akka.actor.ActorSystemImpl._start$lzycompute(ActorSystem.scala:579)
> at akka.actor.ActorSystemImpl._start(ActorSystem.scala:577)
> at akka.actor.ActorSystemImpl.start(ActorSystem.scala:588)
> at akka.actor.ActorSystem$.apply(ActorSystem.scala:111)
> at akka.actor.ActorSystem$.apply(ActorSystem.scala:104)
> at
> org.apache.spark.util.AkkaUtils$.org$apache$spark$util$AkkaUtils$$doCreateActorSystem(AkkaUtils.scala:121)
> at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:54)
> at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:53)
> at
> org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1454)
> at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)
> at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1450)
> at org.apache.spark.util.AkkaUtils$.createActorSystem(AkkaUtils.scala:56)
> at org.apache.spark.SparkEnv$.create(SparkEnv.scala:156)
> at org.apache.spark.SparkContext.<init>(SparkContext.scala:203)
> at
> com.algofusion.reconciliation.execution.utils.ExecutionUtils.<clinit>(ExecutionUtils.java:130)
> ... 2 more
>
> Regards,
> Sarath.
>