Posted to user@hadoop.apache.org by tombin <to...@gmail.com> on 2016/07/14 19:59:30 UTC

New cluster help

I am setting up a new Hadoop cluster for the first time.  My setup
currently looks as follows:

HDFS cluster:
1 namenode
2 datanodes

HBase:
1 HBase node

ZooKeeper cluster:
3 ZooKeeper nodes

I have enabled SSL on the HDFS cluster (my encryption-related hdfs-site.xml
settings are sketched near the end of this message).  When trying to connect
from HBase I see the following error:

2016-07-14 19:38:58,333 WARN  [Thread-73] hdfs.DFSClient: DataStreamer Exception
java.lang.NullPointerException
        at org.apache.hadoop.hdfs.protocol.datatransfer.DataTransferEncryptor.getEncryptedStreams(DataTransferEncryptor.java:191)
        at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1335)
        at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1281)
        at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:526)

2016-07-14 19:39:04,341 INFO  [hb01:16000.activeMasterManager] hdfs.DFSClient: Could not complete /hbase/.tmp/hbase.version retrying...



This repeats several times and then it throws the following exception:


2016-07-14 19:39:58,772 FATAL [hb01:16000.activeMasterManager] master.HMaster: Unhandled exception. Starting shutdown.
java.io.IOException: Unable to close file because the last block does not have enough number of replicas.
        at org.apache.hadoop.hdfs.DFSOutputStream.completeFile(DFSOutputStream.java:2151)
        at org.apache.hadoop.hdfs.DFSOutputStream.close(DFSOutputStream.java:2119)
        at org.apache.hadoop.fs.FSDataOutputStream$PositionCache.close(FSDataOutputStream.java:72)
        at org.apache.hadoop.fs.FSDataOutputStream.close(FSDataOutputStream.java:106)
        at org.apache.hadoop.hbase.util.FSUtils.setVersion(FSUtils.java:730)
        at org.apache.hadoop.hbase.util.FSUtils.setVersion(FSUtils.java:705)
        at org.apache.hadoop.hbase.util.FSUtils.checkVersion(FSUtils.java:662)
        at org.apache.hadoop.hbase.master.MasterFileSystem.checkRootDir(MasterFileSystem.java:462)
        at org.apache.hadoop.hbase.master.MasterFileSystem.createInitialFileSystemLayout(MasterFileSystem.java:153)
        at org.apache.hadoop.hbase.master.MasterFileSystem.<init>(MasterFileSystem.java:128)
        at org.apache.hadoop.hbase.master.HMaster.finishActiveMasterInitialization(HMaster.java:652)
        at org.apache.hadoop.hbase.master.HMaster.access$600(HMaster.java:185)
        at org.apache.hadoop.hbase.master.HMaster$1.run(HMaster.java:1750)
        at java.lang.Thread.run(Thread.java:745)


HBase shuts down at this point.


On the datanode side I see the following in the logs, which looks like it
may be related:


2016-07-14 19:38:23,132 ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: hd03.domain.com:50010:DataXceiver error processing unknown operation  src: /10.0.0.10:34893 dst: /10.0.1.10:50010
java.io.EOFException
        at java.io.DataInputStream.readInt(DataInputStream.java:392)
        at org.apache.hadoop.hdfs.protocol.datatransfer.sasl.SaslDataTransferServer.doSaslHandshake(SaslDataTransferServer.java:358)
        at org.apache.hadoop.hdfs.protocol.datatransfer.sasl.SaslDataTransferServer.getEncryptedStreams(SaslDataTransferServer.java:178)
        at org.apache.hadoop.hdfs.protocol.datatransfer.sasl.SaslDataTransferServer.receive(SaslDataTransferServer.java:110)
        at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:193)
        at java.lang.Thread.run(Thread.java:745)

2016-07-14 19:39:33,575 ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: hd03.domain.com:50010:DataXceiver error processing unknown operation  src: /10.0.0.10:34898 dst: /10.0.1.10:50010
java.io.EOFException
        at java.io.DataInputStream.readInt(DataInputStream.java:392)
        at org.apache.hadoop.hdfs.protocol.datatransfer.sasl.SaslDataTransferServer.doSaslHandshake(SaslDataTransferServer.java:358)
        at org.apache.hadoop.hdfs.protocol.datatransfer.sasl.SaslDataTransferServer.getEncryptedStreams(SaslDataTransferServer.java:178)
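
For reference, the data transfer encryption settings I have in hdfs-site.xml
look roughly like this (paraphrased from memory, so the exact values may not
match what is actually on my cluster):

  <!-- sketch of my hdfs-site.xml: encrypt the DataNode data transfer protocol -->
  <property>
    <name>dfs.encrypt.data.transfer</name>
    <value>true</value>
  </property>
  <!-- optional cipher choice for data transfer encryption -->
  <property>
    <name>dfs.encrypt.data.transfer.algorithm</name>
    <value>3des</value>
  </property>
  <!-- HTTPS for the web UIs; this is separate from data transfer encryption -->
  <property>
    <name>dfs.http.policy</name>
    <value>HTTPS_ONLY</value>
  </property>

Seeing DataTransferEncryptor and SaslDataTransferServer in the traces above
makes me suspect the dfs.encrypt.data.transfer piece rather than the HTTPS
piece, but I'm not sure.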

Is this related to my SSL configuration?
I'm confused about what's going on here.  Thank you in advance for any help.

Re: New cluster help

Posted by Ravi Prakash <ra...@gmail.com>.
Hi Tombin!

Is this the first cluster you're ever setting up? Are you able to run
"hdfs dfs -ls /" successfully? How about putting files into HDFS? I'd take
it one step at a time if I were you, i.e.:

1. Set up a simple HDFS cluster (without SSL)
2. Turn on SSL
3. Then try to run HBase.

Is step 1 working for you?
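
For step 1, a quick smoke test would look something like this (the paths
here are just examples):

  hdfs dfs -mkdir -p /tmp/smoketest
  hdfs dfs -put /etc/hosts /tmp/smoketest/
  hdfs dfs -ls /tmp/smoketest
  hdfs dfs -cat /tmp/smoketest/hosts
  hdfs dfs -rm -r /tmp/smoketest

If those all succeed both before and after you turn on SSL, you'll know
plain HDFS reads and writes are healthy before HBase enters the picture.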

Ravi
