Posted to user@spark.apache.org by amit karmakar <am...@gmail.com> on 2014/04/16 01:09:35 UTC

java.net.SocketException: Network is unreachable while connecting to HBase

I am getting a java.net.SocketException: Network is unreachable whenever I
do a count() on one of my tables.
If I just do a take(1), I see the task status as KILLED on the master UI,
but I still get the results back.
My driver runs on my local machine, which is reachable over the public
internet, and connects to a remote cluster.
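
Since the driver runs outside the cluster, executors must be able to open
connections back to it (the trace below fails while an executor fetches the
task's HttpBroadcast data from the driver over HTTP). A minimal sketch of the
driver-side settings involved, assuming a Spark 0.9-era standalone deployment;
the hostname and port are placeholders, not values from the original post:

```
# spark-defaults-style sketch (placeholder values)
spark.driver.host   <publicly-reachable-driver-address>
spark.driver.port   51000
```

Pinning spark.driver.port to a fixed value makes it possible to open that one
port in a firewall; by default Spark picks it randomly at startup.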

This is the code I am trying out.

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.hbase.HBaseConfiguration;
    import org.apache.hadoop.hbase.client.Result;
    import org.apache.hadoop.hbase.io.ImmutableBytesWritable;
    import org.apache.hadoop.hbase.mapreduce.TableInputFormat;
    import org.apache.spark.api.java.JavaPairRDD;

    // Point the HBase client at the remote ZooKeeper quorum
    Configuration hbaseConf = HBaseConfiguration.create();
    hbaseConf.set("hbase.zookeeper.quorum",
        "xx.xx.xx.xx,xx.xx.xx.xx,xx.xx.xx.xx");
    hbaseConf.set(TableInputFormat.INPUT_TABLE, "table");

    // Read the table as an RDD of (row key, row) pairs
    JavaPairRDD<ImmutableBytesWritable, Result> rdd =
        sparkContext.newAPIHadoopRDD(hbaseConf, TableInputFormat.class,
            ImmutableBytesWritable.class, Result.class);
    System.out.println("Count=" + rdd.count());

Please suggest what I am missing and how to fix this issue.

Thanks a lot.

14/04/15 22:39:22 INFO scheduler.TaskSetManager: Starting task 0.0:0 as TID 0 on executor 2: xxxxx (PROCESS_LOCAL)
14/04/15 22:39:22 INFO scheduler.TaskSetManager: Serialized task 0.0:0 as 1731 bytes in 22 ms
14/04/15 22:39:24 WARN scheduler.TaskSetManager: Lost TID 0 (task 0.0:0)
14/04/15 22:39:24 WARN scheduler.TaskSetManager: Loss was due to java.net.SocketException
java.net.SocketException: Network is unreachable
        at java.net.PlainSocketImpl.socketConnect(Native Method)
        at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:339)
        at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:200)
        at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:182)
        at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:391)
        at java.net.Socket.connect(Socket.java:579)
        at java.net.Socket.connect(Socket.java:528)
        at sun.net.NetworkClient.doConnect(NetworkClient.java:180)
        at sun.net.www.http.HttpClient.openServer(HttpClient.java:378)
        at sun.net.www.http.HttpClient.openServer(HttpClient.java:473)
        at sun.net.www.http.HttpClient.<init>(HttpClient.java:203)
        at sun.net.www.http.HttpClient.New(HttpClient.java:290)
        at sun.net.www.http.HttpClient.New(HttpClient.java:306)
        at sun.net.www.protocol.http.HttpURLConnection.getNewHttpClient(HttpURLConnection.java:995)
        at sun.net.www.protocol.http.HttpURLConnection.plainConnect(HttpURLConnection.java:931)
        at sun.net.www.protocol.http.HttpURLConnection.connect(HttpURLConnection.java:849)
        at sun.net.www.protocol.http.HttpURLConnection.getInputStream(HttpURLConnection.java:1299)
        at org.apache.spark.broadcast.HttpBroadcast$.read(HttpBroadcast.scala:156)
        at org.apache.spark.broadcast.HttpBroadcast.readObject(HttpBroadcast.scala:56)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:601)
        at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1004)
        at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1872)
        at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1777)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1347)
        at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1970)
        at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1894)
        at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1777)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1347)
        at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1970)
        at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1894)
        at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1777)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1347)
        at java.io.ObjectInputStream.readObject(ObjectInputStream.java:369)
        at scala.collection.immutable.$colon$colon.readObject(List.scala:362)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:601)
        at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1004)
        at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1872)
        at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1777)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1347)
        at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1970)
        at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1894)
        at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1777)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1347)
        at java.io.ObjectInputStream.readObject(ObjectInputStream.java:369)
        at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:40)
        at org.apache.spark.scheduler.ResultTask$.deserializeInfo(ResultTask.scala:63)
        at org.apache.spark.scheduler.ResultTask.readExternal(ResultTask.scala:139)
        at java.io.ObjectInputStream.readExternalData(ObjectInputStream.java:1816)
        at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1775)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1347)
        at java.io.ObjectInputStream.readObject(ObjectInputStream.java:369)
        at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:40)
        at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:62)
        at org.apache.spark.executor.Executor$TaskRunner$$anonfun$run$1.apply$mcV$sp(Executor.scala:193)
        at org.apache.spark.deploy.SparkHadoopUtil$$anon$1.run(SparkHadoopUtil.scala:42)
        at org.apache.spark.deploy.SparkHadoopUtil$$anon$1.run(SparkHadoopUtil.scala:41)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
        at org.apache.spark.deploy.SparkHadoopUtil.runAsUser(SparkHadoopUtil.scala:41)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:176)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:722)

Re: java.net.SocketException: Network is unreachable while connecting to HBase

Posted by amit <am...@gmail.com>.
In the worker logs I can see:

14/04/16 01:02:47 ERROR EndpointWriter: AssociationError
[akka.tcp://sparkWorker@xxxxxx:10548] ->
[akka.tcp://sparkExecutor@xxxxxx:16041]: Error [Association failed with
[akka.tcp://sparkExecutor@xxxxxx:16041]] [
akka.remote.EndpointAssociationException: Association failed with
[akka.tcp://sparkExecutor@xxxxxx:16041]
Caused by:
akka.remote.transport.netty.NettyTransport$$anonfun$associate$1$$anon$2:
Connection refused: xxxxxx/xx.xx.xx.xx:16041
]
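
The "Connection refused" in this AssociationError means the worker could not
open a TCP connection to the executor's Akka port at all. A quick way to check
raw reachability, independent of Spark, is a plain socket probe. This is just a
sketch: PortProbe is a hypothetical helper, and the default host and port are
placeholders for the address reported in the log above.

```java
import java.net.InetSocketAddress;
import java.net.Socket;

// Minimal TCP reachability probe (hypothetical helper, not part of Spark).
public class PortProbe {
    // Returns true if a TCP connection to host:port succeeds within timeoutMs.
    static boolean probe(String host, int port, int timeoutMs) {
        try (Socket s = new Socket()) {
            s.connect(new InetSocketAddress(host, port), timeoutMs);
            return true;
        } catch (Exception e) {
            // Connection refused, timeout, or unreachable network all land here
            return false;
        }
    }

    public static void main(String[] args) {
        // Substitute the executor host/port from the AssociationError log
        String host = args.length > 0 ? args[0] : "127.0.0.1";
        int port = args.length > 1 ? Integer.parseInt(args[1]) : 16041;
        System.out.println(host + ":" + port
                + (probe(host, port, 3000) ? " reachable" : " unreachable"));
    }
}
```

Running this from the worker host against the executor's address, and from a
cluster node against the driver's address, helps distinguish a firewall or NAT
problem from a Spark misconfiguration.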




--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/java-net-SocketException-Network-is-unreachable-while-connecting-to-HBase-tp4301p4310.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.