Posted to common-issues@hadoop.apache.org by GitBox <gi...@apache.org> on 2020/07/12 06:57:12 UTC

[GitHub] [hadoop] touchida opened a new pull request #2135: HDFS-15465. Support WebHDFS accesses to the data stored in secure Dat…

touchida opened a new pull request #2135:
URL: https://github.com/apache/hadoop/pull/2135


   …anode through insecure Namenode.
   
   ## NOTICE
   
   Please create an issue in ASF JIRA before opening a pull request,
   and you need to set the title of the pull request which starts with
   the corresponding JIRA issue number. (e.g. HADOOP-XXXXX. Fix a typo in YYY.)
   For more details, please see https://cwiki.apache.org/confluence/display/HADOOP/How+To+Contribute
   


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



---------------------------------------------------------------------
To unsubscribe, e-mail: common-issues-unsubscribe@hadoop.apache.org
For additional commands, e-mail: common-issues-help@hadoop.apache.org


[GitHub] [hadoop] hadoop-yetus commented on pull request #2135: HDFS-15465. Support WebHDFS accesses to the data stored in secure Dat…

Posted by GitBox <gi...@apache.org>.
hadoop-yetus commented on pull request #2135:
URL: https://github.com/apache/hadoop/pull/2135#issuecomment-657201214


   :broken_heart: **-1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime | Comment |
   |:----:|----------:|--------:|:--------|
   | +0 :ok: |  reexec  |  20m 34s |  Docker mode activated.  |
   ||| _ Prechecks _ |
   | +1 :green_heart: |  dupname  |   0m  0s |  No case conflicting files found.  |
   | +1 :green_heart: |  @author  |   0m  0s |  The patch does not contain any @author tags.  |
   | +1 :green_heart: |  test4tests  |   0m  0s |  The patch appears to include 2 new or modified test files.  |
   ||| _ trunk Compile Tests _ |
   | +1 :green_heart: |  mvninstall  |  21m 59s |  trunk passed  |
   | +1 :green_heart: |  compile  |   1m 16s |  trunk passed with JDK Ubuntu-11.0.7+10-post-Ubuntu-2ubuntu218.04  |
   | +1 :green_heart: |  compile  |   1m  6s |  trunk passed with JDK Private Build-1.8.0_252-8u252-b09-1~18.04-b09  |
   | +1 :green_heart: |  checkstyle  |   0m 48s |  trunk passed  |
   | +1 :green_heart: |  mvnsite  |   1m 15s |  trunk passed  |
   | +1 :green_heart: |  shadedclient  |  17m  9s |  branch has no errors when building and testing our client artifacts.  |
   | -1 :x: |  javadoc  |   0m 32s |  hadoop-hdfs in trunk failed with JDK Ubuntu-11.0.7+10-post-Ubuntu-2ubuntu218.04.  |
   | +1 :green_heart: |  javadoc  |   0m 40s |  trunk passed with JDK Private Build-1.8.0_252-8u252-b09-1~18.04-b09  |
   | +0 :ok: |  spotbugs  |   3m  3s |  Used deprecated FindBugs config; considering switching to SpotBugs.  |
   | +1 :green_heart: |  findbugs  |   3m  1s |  trunk passed  |
   ||| _ Patch Compile Tests _ |
   | +1 :green_heart: |  mvninstall  |   1m  9s |  the patch passed  |
   | +1 :green_heart: |  compile  |   1m 12s |  the patch passed with JDK Ubuntu-11.0.7+10-post-Ubuntu-2ubuntu218.04  |
   | +1 :green_heart: |  javac  |   1m 12s |  the patch passed  |
   | +1 :green_heart: |  compile  |   1m  4s |  the patch passed with JDK Private Build-1.8.0_252-8u252-b09-1~18.04-b09  |
   | +1 :green_heart: |  javac  |   1m  4s |  the patch passed  |
   | +1 :green_heart: |  checkstyle  |   0m 42s |  the patch passed  |
   | +1 :green_heart: |  mvnsite  |   1m  8s |  the patch passed  |
   | +1 :green_heart: |  whitespace  |   0m  0s |  The patch has no whitespace issues.  |
   | +1 :green_heart: |  shadedclient  |  15m 14s |  patch has no errors when building and testing our client artifacts.  |
   | -1 :x: |  javadoc  |   0m 28s |  hadoop-hdfs in the patch failed with JDK Ubuntu-11.0.7+10-post-Ubuntu-2ubuntu218.04.  |
   | +1 :green_heart: |  javadoc  |   0m 38s |  the patch passed with JDK Private Build-1.8.0_252-8u252-b09-1~18.04-b09  |
   | +1 :green_heart: |  findbugs  |   3m  3s |  the patch passed  |
   ||| _ Other Tests _ |
   | -1 :x: |  unit  |  91m 40s |  hadoop-hdfs in the patch failed.  |
   | +1 :green_heart: |  asflicense  |   0m 34s |  The patch does not generate ASF License warnings.  |
   |  |   | 186m 51s |   |
   
   
   | Reason | Tests |
   |-------:|:------|
   | Failed junit tests | hadoop.hdfs.server.blockmanagement.TestBlockTokenWithDFSStriped |
   |   | hadoop.hdfs.server.namenode.TestNameNodeRetryCacheMetrics |
   |   | hadoop.hdfs.server.sps.TestExternalStoragePolicySatisfier |
   |   | hadoop.hdfs.TestGetFileChecksum |
   
   
   | Subsystem | Report/Notes |
   |----------:|:-------------|
   | Docker | ClientAPI=1.40 ServerAPI=1.40 base: https://builds.apache.org/job/hadoop-multibranch/job/PR-2135/1/artifact/out/Dockerfile |
   | GITHUB PR | https://github.com/apache/hadoop/pull/2135 |
   | Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient findbugs checkstyle |
   | uname | Linux ad29aa6754ad 4.15.0-101-generic #102-Ubuntu SMP Mon May 11 10:07:26 UTC 2020 x86_64 x86_64 x86_64 GNU/Linux |
   | Build tool | maven |
   | Personality | personality/hadoop.sh |
   | git revision | trunk / 84b74b335c0 |
   | Default Java | Private Build-1.8.0_252-8u252-b09-1~18.04-b09 |
   | Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.7+10-post-Ubuntu-2ubuntu218.04 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_252-8u252-b09-1~18.04-b09 |
   | javadoc | https://builds.apache.org/job/hadoop-multibranch/job/PR-2135/1/artifact/out/branch-javadoc-hadoop-hdfs-project_hadoop-hdfs-jdkUbuntu-11.0.7+10-post-Ubuntu-2ubuntu218.04.txt |
   | javadoc | https://builds.apache.org/job/hadoop-multibranch/job/PR-2135/1/artifact/out/patch-javadoc-hadoop-hdfs-project_hadoop-hdfs-jdkUbuntu-11.0.7+10-post-Ubuntu-2ubuntu218.04.txt |
   | unit | https://builds.apache.org/job/hadoop-multibranch/job/PR-2135/1/artifact/out/patch-unit-hadoop-hdfs-project_hadoop-hdfs.txt |
   |  Test Results | https://builds.apache.org/job/hadoop-multibranch/job/PR-2135/1/testReport/ |
   | Max. process+thread count | 3460 (vs. ulimit of 5500) |
   | modules | C: hadoop-hdfs-project/hadoop-hdfs U: hadoop-hdfs-project/hadoop-hdfs |
   | Console output | https://builds.apache.org/job/hadoop-multibranch/job/PR-2135/1/console |
   | versions | git=2.17.1 maven=3.6.0 findbugs=3.1.0-RC1 |
   | Powered by | Apache Yetus 0.12.0 https://yetus.apache.org |
   
   
   This message was automatically generated.
   
   



[GitHub] [hadoop] sunchao commented on pull request #2135: HDFS-15465. Support WebHDFS accesses to the data stored in secure Dat…

Posted by GitBox <gi...@apache.org>.
sunchao commented on pull request #2135:
URL: https://github.com/apache/hadoop/pull/2135#issuecomment-659031551


   @touchida thanks for the PR. What is the implication of this when both NN and DN are secure, and an insecure client tries the following:
   ```
   curl -i "http://<SECURE_DATANODE>:<PORT>/webhdfs/v1/<PATH>?op=OPEN&namenoderpcaddress=<SECURE_NAMENODE>&offset=0"
   ```
   Will it still work and cause a security breach, since the secure DN can launch a DFS client that talks to the secure NN?



[GitHub] [hadoop] touchida commented on pull request #2135: HDFS-15465. Support WebHDFS accesses to the data stored in secure Dat…

Posted by GitBox <gi...@apache.org>.
touchida commented on pull request #2135:
URL: https://github.com/apache/hadoop/pull/2135#issuecomment-661825717


   @sunchao Thanks for your comment!
   > curl -i "http://<SECURE_DATANODE>:<PORT>/webhdfs/v1/<PATH>?op=OPEN&namenoderpcaddress=<SECURE_NAMENODE>&offset=0"
   
   No, it won't work.
   It results in an `AccessControlException` with a `403` response code, as follows:
   ```
   $ curl -i "http://<SECURE_DATANODE>:<PORT>/webhdfs/v1/<PATH>?op=OPEN&namenoderpcaddress=<SECURE_NAMENODE>&offset=0"
   HTTP/1.1 403 Forbidden
   (omitted)
   {"RemoteException":{"exception":"IOException","javaClassName":"java.io.IOException","message":"DestHost:destPort <SECURE_NAMENODE>:<PORT> , LocalHost:localPort <SECURE_DATANODE>:0. Failed on local exception: java.io.IOException: org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[TOKEN, KERBEROS]"}}
   ```
   The corresponding Datanode log is as follows:
   ```
   2020-07-21 09:16:02,559 WARN org.apache.hadoop.ipc.Client: Exception encountered while connecting to the server : org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[TOKEN, KERBEROS]
   2020-07-21 09:16:02,577 WARN org.apache.hadoop.ipc.Client: Exception encountered while connecting to the server : org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[TOKEN, KERBEROS]
   2020-07-21 09:16:02,578 INFO org.apache.hadoop.io.retry.RetryInvocationHandler: java.io.IOException: DestHost:destPort <SECURE_NAMENODE>:<PORT> , LocalHost:localPort <SECURE_DATANODE>:0. Failed on local exception: java.io.IOException: org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[TOKEN, KERBEROS], while invoking ClientNamenodeProtocolTranslatorPB.getBlockLocations over <SECURE_NAMENODE>:<PORT> after 1 failover attempts. Trying to failover after sleeping for 1224ms.
   (omitted)
   2020-07-21 09:18:40,881 INFO org.apache.hadoop.io.retry.RetryInvocationHandler: java.io.IOException: DestHost:destPort <SECURE_NAMENODE>:<PORT> , LocalHost:localPort <SECURE_DATANODE>:0. Failed on local exception: java.io.IOException: org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[TOKEN, KERBEROS], while invoking ClientNamenodeProtocolTranslatorPB.getBlockLocations over <SECURE_NAMENODE>:<PORT> after 14 failover attempts. Trying to failover after sleeping for 20346ms.
   (omitted)
   2020-07-21 09:19:01,243 WARN org.apache.hadoop.ipc.Client: Exception encountered while connecting to the server : org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[TOKEN, KERBEROS]
   ```
   This is because, in the absence of a delegation token, `org.apache.hadoop.hdfs.server.datanode.web.webhdfs.WebHdfsHandler#channelRead0` creates an insecure `DFSClient`, which cannot talk to a secure Namenode (a simplified sketch follows below the link).
   - https://github.com/apache/hadoop/blob/da0006f/hadoop-hdfs-project/hadoop-hdfs/src/main/java/org/apache/hadoop/hdfs/server/datanode/web/webhdfs/WebHdfsHandler.java#L261
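
   For reference, here is a simplified, illustrative sketch of that behaviour. This is not the actual Hadoop source: the class, the method, and the `delegationParam`/`userName`/`nnUri`/`conf` names below are hypothetical stand-ins for the values `WebHdfsHandler` derives from the HTTP request and its configuration.
   ```java
   import java.net.URI;
   import java.security.PrivilegedExceptionAction;

   import org.apache.hadoop.conf.Configuration;
   import org.apache.hadoop.hdfs.DFSClient;
   import org.apache.hadoop.hdfs.security.token.delegation.DelegationTokenIdentifier;
   import org.apache.hadoop.security.UserGroupInformation;
   import org.apache.hadoop.security.token.Token;

   public class WebHdfsDfsClientSketch {
     // Hypothetical helper mirroring the decision described above: with a
     // delegation token the resulting DFSClient can authenticate to a
     // Kerberized Namenode via TOKEN; without one it has no credentials.
     static DFSClient newDfsClient(String delegationParam, String userName,
         URI nnUri, Configuration conf) throws Exception {
       UserGroupInformation ugi = UserGroupInformation.createRemoteUser(userName);
       if (delegationParam != null && !delegationParam.isEmpty()) {
         Token<DelegationTokenIdentifier> token = new Token<>();
         token.decodeFromUrlString(delegationParam); // value of the "delegation" query parameter
         ugi.addToken(token);
       }
       // Without a token (and without a Kerberos TGT on this host), RPCs such as
       // getBlockLocations fail with "Client cannot authenticate via:[TOKEN, KERBEROS]".
       return ugi.doAs((PrivilegedExceptionAction<DFSClient>) () ->
           new DFSClient(nnUri, conf));
     }
   }
   ```
   In the scenario above there is no `delegation` query parameter and the Datanode has no credentials for the caller, which is why every retry in the log above ends with the same `AccessControlException`.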

