Posted to dev@spark.apache.org by JaeSung Jun <ja...@gmail.com> on 2016/04/29 06:38:24 UTC

Unit test error

Hi All,

I'm developing a custom data source and relation provider based on Spark 1.6.1.
Every unit test has its own SparkContext, and each test runs successfully when
run one at a time. But when the tests are run through sbt (sbt test), an error
pops up while initializing the SparkContext, like the following:

org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://HeartbeatReceiver@192.168.123.101:54079
  at org.apache.spark.rpc.netty.NettyRpcEnv$$anonfun$asyncSetupEndpointRefByURI$1.apply(NettyRpcEnv.scala:148)
  at org.apache.spark.rpc.netty.NettyRpcEnv$$anonfun$asyncSetupEndpointRefByURI$1.apply(NettyRpcEnv.scala:144)
  at scala.concurrent.Future$$anonfun$flatMap$1.apply(Future.scala:251)
  at scala.concurrent.Future$$anonfun$flatMap$1.apply(Future.scala:249)
  at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:32)
  at org.spark-project.guava.util.concurrent.MoreExecutors$SameThreadExecutorService.execute(MoreExecutors.java:293)
  at scala.concurrent.impl.ExecutionContextImpl$$anon$1.execute(ExecutionContextImpl.scala:133)
  at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:40)
  at scala.concurrent.impl.Promise$DefaultPromise.scala$concurrent$impl$Promise$DefaultPromise$$dispatchOrAddCallback(Promise.scala:280)
  at scala.concurrent.impl.Promise$DefaultPromise.onComplete(Promise.scala:270)
  at scala.concurrent.Future$class.flatMap(Future.scala:249)
  at scala.concurrent.impl.Promise$DefaultPromise.flatMap(Promise.scala:153)
  at org.apache.spark.rpc.netty.NettyRpcEnv.asyncSetupEndpointRefByURI(NettyRpcEnv.scala:150)
  at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:97)
  at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:106)
  at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36)
  at org.apache.spark.executor.Executor.<init>(Executor.scala:115)
  at org.apache.spark.scheduler.local.LocalEndpoint.<init>(LocalBackend.scala:58)
  at org.apache.spark.scheduler.local.LocalBackend.start(LocalBackend.scala:125)
  at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:144)
  at org.apache.spark.SparkContext.<init>(SparkContext.scala:530)
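
For context, each suite creates and stops its own local SparkContext along the
lines below. This is only a minimal sketch of that setup; the suite name, master
setting, app name, and assertion are assumptions, not the actual test code:

import org.apache.spark.{SparkConf, SparkContext}
import org.scalatest.{BeforeAndAfterAll, FunSuite}

// Minimal sketch of the per-suite SparkContext setup (names and config are assumptions).
class CustomRelationSuite extends FunSuite with BeforeAndAfterAll {

  private var sc: SparkContext = _

  override def beforeAll(): Unit = {
    val conf = new SparkConf()
      .setMaster("local[2]")
      .setAppName("custom-relation-suite")
    sc = new SparkContext(conf)
  }

  override def afterAll(): Unit = {
    // Stop the context so the next suite can create its own.
    if (sc != null) sc.stop()
  }

  test("reads rows through the custom relation provider") {
    // Placeholder assertion; the real tests exercise the custom data source.
    assert(sc.parallelize(1 to 10).count() === 10)
  }
}

The intent is that only one SparkContext is active at a time, with each suite
stopping its context in afterAll before the next suite starts.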


Does anyone have any idea?


Thanks,
Jason