Posted to mapreduce-user@hadoop.apache.org by Colin Ma <co...@gmail.com> on 2015/07/03 10:37:29 UTC

Problem when configuring security in Hadoop

Hi,

         I have been working on the security configuration for Hadoop these
days. Kerberos works fine, but there may be a problem with the SASL
configuration.

         The following is the related configuration in hdfs-site.xml:

    <property>
      <name>dfs.http.policy</name>
      <value>HTTPS_ONLY</value>
    </property>
    <property>
      <name>dfs.data.transfer.protection</name>
      <value>authentication</value>
    </property>
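
For completeness: the DataNode runs on the default non-privileged ports. My
understanding from the secure-mode documentation is that SASL-protected data
transfer needs a non-privileged dfs.datanode.address together with
dfs.http.policy set to HTTPS_ONLY (and HADOOP_SECURE_DN_USER left unset). A
sketch of those port settings, using the defaults rather than anything copied
from my cluster:

    <!-- Sketch only: non-privileged DataNode ports for SASL data transfer.
         These are the stock default values, shown for illustration. -->
    <property>
      <name>dfs.datanode.address</name>
      <value>0.0.0.0:50010</value>
    </property>
    <property>
      <name>dfs.datanode.http.address</name>
      <value>0.0.0.0:50075</value>
    </property>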



There is no problem executing commands like:   hdfs dfs -ls /

But when I execute the command:   hdfs dfs -copyToLocal /temp/test.txt .  the
following exception is thrown. (Note that -ls only talks to the NameNode over
RPC, while -copyToLocal also streams block data from the DataNodes, so the
failure appears to be specific to the data transfer path.)



2015-07-03 14:02:54,715 INFO org.apache.hadoop.hdfs.server.datanode.DataBlockScanner: Added bpid=BP-271423801-192.168.20.28-1423724265164 to blockPoolScannerMap, new size=1
2015-07-03 14:03:39,963 ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: server-511:50010:DataXceiver error processing unknown operation  src: /192.168.20.28:58422 dst: /192.168.20.28:50010
java.io.EOFException: Premature EOF: no length prefix available
         at org.apache.hadoop.hdfs.protocolPB.PBHelper.vintPrefixed(PBHelper.java:2203)
         at org.apache.hadoop.hdfs.protocol.datatransfer.sasl.DataTransferSaslUtil.readSaslMessageAndNegotiationCipherOptions(DataTransferSaslUtil.java:233)
         at org.apache.hadoop.hdfs.protocol.datatransfer.sasl.SaslDataTransferServer.doSaslHandshake(SaslDataTransferServer.java:369)
         at org.apache.hadoop.hdfs.protocol.datatransfer.sasl.SaslDataTransferServer.getSaslStreams(SaslDataTransferServer.java:297)
         at org.apache.hadoop.hdfs.protocol.datatransfer.sasl.SaslDataTransferServer.receive(SaslDataTransferServer.java:124)
         at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:183)
         at java.lang.Thread.run(Thread.java:745)
2015-07-03 15:34:39,917 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Sent 1 blockreports 145 blocks total. Took 1 msec to generate and 6 msecs for RPC and NN processing.  Got back commands org.apache.hadoop.hdfs.server.protocol.FinalizeCommand@1b3bce82



         Taking a look at the doSaslHandshake() method in
SaslDataTransferClient.java and SaslDataTransferServer.java, it seems the
SaslDataTransferClient may be sending an empty response, which would cause
this exception; I suspect a mistake in my configuration is the root cause.
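
For context on the exception itself, here is my own minimal illustration (not
the actual Hadoop source) of reading a varint length-prefixed message, which
is the framing the SASL negotiation uses on the wire. If the peer closes the
connection before sending a single byte, the first read returns -1 and you
fail exactly as the DataNode log shows:

    import java.io.ByteArrayInputStream;
    import java.io.EOFException;
    import java.io.IOException;
    import java.io.InputStream;

    public final class VarintPrefixedRead {

        static byte[] readPrefixedMessage(InputStream in) throws IOException {
            int length = readVarint32(in);          // decode the length prefix
            if (length < 0) {
                throw new IOException("Negative message length: " + length);
            }
            byte[] payload = new byte[length];
            int off = 0;
            while (off < length) {                  // then read the full payload
                int n = in.read(payload, off, length - off);
                if (n == -1) {
                    throw new EOFException("Premature EOF reading payload");
                }
                off += n;
            }
            return payload;
        }

        static int readVarint32(InputStream in) throws IOException {
            int result = 0;
            for (int shift = 0; shift < 32; shift += 7) {
                int b = in.read();
                if (b == -1) {
                    // The situation in the log: the client hung up without
                    // ever sending a SASL message.
                    throw new EOFException(
                            "Premature EOF: no length prefix available");
                }
                result |= (b & 0x7f) << shift;
                if ((b & 0x80) == 0) {
                    return result;                  // high bit clear: last byte
                }
            }
            throw new IOException("Malformed varint32 length prefix");
        }

        public static void main(String[] args) throws IOException {
            // 0x03 = varint length 3, followed by 3 payload bytes.
            byte[] framed = {0x03, 'f', 'o', 'o'};
            System.out.println(new String(readPrefixedMessage(
                    new ByteArrayInputStream(framed))));   // prints "foo"

            // An empty stream reproduces the "no length prefix" EOFException.
            try {
                readPrefixedMessage(new ByteArrayInputStream(new byte[0]));
            } catch (EOFException e) {
                System.out.println(e.getMessage());
            }
        }
    }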

Can anyone help me check this problem?

Thanks for your help.



Best regards,



Colin Ma

Re: Problem when configuring security in Hadoop

Posted by Zhijie Shen <zs...@hortonworks.com>.
Not sure about HDFS-specific setup, but in general, to use HTTPS, you should have your keystore/truststore generated and configure ssl-client.xml and ssl-server.xml properly.
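
As a rough sketch of what that usually looks like in ssl-server.xml (the
locations and passwords below are placeholders, not values from this thread):

    <!-- ssl-server.xml sketch; locations and passwords are placeholders -->
    <property>
      <name>ssl.server.keystore.location</name>
      <value>/etc/hadoop/conf/keystore.jks</value>
    </property>
    <property>
      <name>ssl.server.keystore.password</name>
      <value>changeit</value>
    </property>
    <property>
      <name>ssl.server.truststore.location</name>
      <value>/etc/hadoop/conf/truststore.jks</value>
    </property>
    <property>
      <name>ssl.server.truststore.password</name>
      <value>changeit</value>
    </property>

ssl-client.xml takes the matching ssl.client.truststore.location and
ssl.client.truststore.password entries so clients can verify the server
certificate.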


- Zhijie

