Posted to common-issues@hadoop.apache.org by GitBox <gi...@apache.org> on 2021/07/15 13:38:07 UTC

[GitHub] [hadoop] tomicooler opened a new pull request #3209: HDFS-16129. Fixing the signature secret file misusage in HttpFS.

tomicooler opened a new pull request #3209:
URL: https://github.com/apache/hadoop/pull/3209


   The branch depends on YARN-10814.
   
   ## NOTICE
   
   Please create an issue in ASF JIRA before opening a pull request,
   and set the title of the pull request so that it starts with
   the corresponding JIRA issue number (e.g. HADOOP-XXXXX. Fix a typo in YYY.).
   For more details, please see https://cwiki.apache.org/confluence/display/HADOOP/How+To+Contribute
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: common-issues-unsubscribe@hadoop.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



---------------------------------------------------------------------
To unsubscribe, e-mail: common-issues-unsubscribe@hadoop.apache.org
For additional commands, e-mail: common-issues-help@hadoop.apache.org


[GitHub] [hadoop] hadoop-yetus commented on pull request #3209: HDFS-16129. Fixing the signature secret file misusage in HttpFS.

Posted by GitBox <gi...@apache.org>.
hadoop-yetus commented on pull request #3209:
URL: https://github.com/apache/hadoop/pull/3209#issuecomment-891708421


   :broken_heart: **-1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime |  Logfile | Comment |
   |:----:|----------:|--------:|:--------:|:-------:|
   | +0 :ok: |  reexec  |   0m 52s |  |  Docker mode activated.  |
   |||| _ Prechecks _ |
   | +1 :green_heart: |  dupname  |   0m  1s |  |  No case conflicting files found.  |
   | +0 :ok: |  codespell  |   0m  0s |  |  codespell was not available.  |
   | +1 :green_heart: |  @author  |   0m  0s |  |  The patch does not contain any @author tags.  |
   | +1 :green_heart: |  test4tests  |   0m  0s |  |  The patch appears to include 8 new or modified test files.  |
   |||| _ trunk Compile Tests _ |
   | +0 :ok: |  mvndep  |  13m 19s |  |  Maven dependency ordering for branch  |
   | +1 :green_heart: |  mvninstall  |  20m 19s |  |  trunk passed  |
   | +1 :green_heart: |  compile  |  21m 35s |  |  trunk passed with JDK Ubuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04  |
   | +1 :green_heart: |  compile  |  18m 25s |  |  trunk passed with JDK Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10  |
   | +1 :green_heart: |  checkstyle  |   3m 37s |  |  trunk passed  |
   | +1 :green_heart: |  mvnsite  |   3m 14s |  |  trunk passed  |
   | +1 :green_heart: |  javadoc  |   2m 33s |  |  trunk passed with JDK Ubuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04  |
   | +1 :green_heart: |  javadoc  |   3m  1s |  |  trunk passed with JDK Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10  |
   | +1 :green_heart: |  spotbugs  |   4m 21s |  |  trunk passed  |
   | +1 :green_heart: |  shadedclient  |  14m 39s |  |  branch has no errors when building and testing our client artifacts.  |
   |||| _ Patch Compile Tests _ |
   | +0 :ok: |  mvndep  |   0m 27s |  |  Maven dependency ordering for patch  |
   | +1 :green_heart: |  mvninstall  |   1m 44s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |  20m 28s |  |  the patch passed with JDK Ubuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04  |
   | +1 :green_heart: |  javac  |  20m 28s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |  19m 13s |  |  the patch passed with JDK Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10  |
   | +1 :green_heart: |  javac  |  19m 13s |  |  the patch passed  |
   | +1 :green_heart: |  blanks  |   0m  0s |  |  The patch has no blanks issues.  |
   | +1 :green_heart: |  checkstyle  |   3m 48s |  |  root: The patch generated 0 new + 97 unchanged - 2 fixed = 97 total (was 99)  |
   | +1 :green_heart: |  mvnsite  |   3m  1s |  |  the patch passed  |
   | +1 :green_heart: |  xml  |   0m  1s |  |  The patch has no ill-formed XML file.  |
   | +1 :green_heart: |  javadoc  |   2m 20s |  |  the patch passed with JDK Ubuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04  |
   | +1 :green_heart: |  javadoc  |   2m 56s |  |  the patch passed with JDK Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10  |
   | -1 :x: |  spotbugs  |   1m 12s | [/new-spotbugs-hadoop-hdfs-project_hadoop-hdfs-httpfs.html](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3209/6/artifact/out/new-spotbugs-hadoop-hdfs-project_hadoop-hdfs-httpfs.html) |  hadoop-hdfs-project/hadoop-hdfs-httpfs generated 1 new + 0 unchanged - 0 fixed = 1 total (was 0)  |
   | +1 :green_heart: |  shadedclient  |  15m 27s |  |  patch has no errors when building and testing our client artifacts.  |
   |||| _ Other Tests _ |
   | +1 :green_heart: |  unit  |  17m 50s |  |  hadoop-common in the patch passed.  |
   | +1 :green_heart: |  unit  |   3m 38s |  |  hadoop-kms in the patch passed.  |
   | +1 :green_heart: |  unit  |   9m 14s |  |  hadoop-hdfs-httpfs in the patch passed.  |
   | +1 :green_heart: |  asflicense  |   1m  2s |  |  The patch does not generate ASF License warnings.  |
   |  |   | 216m 12s |  |  |
   
   
   | Reason | Tests |
   |-------:|:------|
   | SpotBugs | module:hadoop-hdfs-project/hadoop-hdfs-httpfs |
   |  |  org.apache.hadoop.fs.http.server.HttpFSAuthenticationFilter.CONF_PREFIXES should be package protected  At HttpFSAuthenticationFilter.java:[line 54] |
   
   
   | Subsystem | Report/Notes |
   |----------:|:-------------|
   | Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3209/6/artifact/out/Dockerfile |
   | GITHUB PR | https://github.com/apache/hadoop/pull/3209 |
   | Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell xml |
   | uname | Linux 5703e8d2d241 4.15.0-112-generic #113-Ubuntu SMP Thu Jul 9 23:41:39 UTC 2020 x86_64 x86_64 x86_64 GNU/Linux |
   | Build tool | maven |
   | Personality | dev-support/bin/hadoop.sh |
   | git revision | trunk / 56a259a893114a7ae8ef8fce4de45c239f605387 |
   | Default Java | Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10 |
   | Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10 |
   |  Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3209/6/testReport/ |
   | Max. process+thread count | 1282 (vs. ulimit of 5500) |
   | modules | C: hadoop-common-project/hadoop-common hadoop-common-project/hadoop-kms hadoop-hdfs-project/hadoop-hdfs-httpfs U: . |
   | Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3209/6/console |
   | versions | git=2.25.1 maven=3.6.3 spotbugs=4.2.2 |
   | Powered by | Apache Yetus 0.14.0-SNAPSHOT https://yetus.apache.org |
   
   
   This message was automatically generated.
   
   




[GitHub] [hadoop] hadoop-yetus commented on pull request #3209: HDFS-16129. Fixing the signature secret file misusage in HttpFS.

Posted by GitBox <gi...@apache.org>.
hadoop-yetus commented on pull request #3209:
URL: https://github.com/apache/hadoop/pull/3209#issuecomment-892859873


   :confetti_ball: **+1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime |  Logfile | Comment |
   |:----:|----------:|--------:|:--------:|:-------:|
   | +0 :ok: |  reexec  |   1m  3s |  |  Docker mode activated.  |
   |||| _ Prechecks _ |
   | +1 :green_heart: |  dupname  |   0m  0s |  |  No case conflicting files found.  |
   | +0 :ok: |  codespell  |   0m  0s |  |  codespell was not available.  |
   | +1 :green_heart: |  @author  |   0m  0s |  |  The patch does not contain any @author tags.  |
   | +1 :green_heart: |  test4tests  |   0m  0s |  |  The patch appears to include 8 new or modified test files.  |
   |||| _ trunk Compile Tests _ |
   | +0 :ok: |  mvndep  |  13m 59s |  |  Maven dependency ordering for branch  |
   | +1 :green_heart: |  mvninstall  |  20m 51s |  |  trunk passed  |
   | +1 :green_heart: |  compile  |  22m 31s |  |  trunk passed with JDK Ubuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04  |
   | +1 :green_heart: |  compile  |  19m 18s |  |  trunk passed with JDK Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10  |
   | +1 :green_heart: |  checkstyle  |   3m 41s |  |  trunk passed  |
   | +1 :green_heart: |  mvnsite  |   3m 18s |  |  trunk passed  |
   | +1 :green_heart: |  javadoc  |   2m 34s |  |  trunk passed with JDK Ubuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04  |
   | +1 :green_heart: |  javadoc  |   3m  4s |  |  trunk passed with JDK Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10  |
   | +1 :green_heart: |  spotbugs  |   4m 22s |  |  trunk passed  |
   | +1 :green_heart: |  shadedclient  |  14m 33s |  |  branch has no errors when building and testing our client artifacts.  |
   |||| _ Patch Compile Tests _ |
   | +0 :ok: |  mvndep  |   0m 26s |  |  Maven dependency ordering for patch  |
   | +1 :green_heart: |  mvninstall  |   1m 42s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |  20m 26s |  |  the patch passed with JDK Ubuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04  |
   | +1 :green_heart: |  javac  |  20m 26s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |  18m 33s |  |  the patch passed with JDK Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10  |
   | +1 :green_heart: |  javac  |  18m 33s |  |  the patch passed  |
   | +1 :green_heart: |  blanks  |   0m  0s |  |  The patch has no blanks issues.  |
   | -0 :warning: |  checkstyle  |   3m 36s | [/results-checkstyle-root.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3209/10/artifact/out/results-checkstyle-root.txt) |  root: The patch generated 2 new + 99 unchanged - 0 fixed = 101 total (was 99)  |
   | +1 :green_heart: |  mvnsite  |   3m 14s |  |  the patch passed  |
   | +1 :green_heart: |  xml  |   0m  2s |  |  The patch has no ill-formed XML file.  |
   | +1 :green_heart: |  javadoc  |   2m 35s |  |  the patch passed with JDK Ubuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04  |
   | +1 :green_heart: |  javadoc  |   3m  2s |  |  the patch passed with JDK Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10  |
   | +1 :green_heart: |  spotbugs  |   4m 55s |  |  the patch passed  |
   | +1 :green_heart: |  shadedclient  |  14m 47s |  |  patch has no errors when building and testing our client artifacts.  |
   |||| _ Other Tests _ |
   | +1 :green_heart: |  unit  |  17m  9s |  |  hadoop-common in the patch passed.  |
   | +1 :green_heart: |  unit  |   3m 44s |  |  hadoop-kms in the patch passed.  |
   | +1 :green_heart: |  unit  |   8m 56s |  |  hadoop-hdfs-httpfs in the patch passed.  |
   | +1 :green_heart: |  asflicense  |   0m 59s |  |  The patch does not generate ASF License warnings.  |
   |  |   | 217m 52s |  |  |
   
   
   | Subsystem | Report/Notes |
   |----------:|:-------------|
   | Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3209/10/artifact/out/Dockerfile |
   | GITHUB PR | https://github.com/apache/hadoop/pull/3209 |
   | Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell xml |
   | uname | Linux 19e7e9fc61c2 4.15.0-112-generic #113-Ubuntu SMP Thu Jul 9 23:41:39 UTC 2020 x86_64 x86_64 x86_64 GNU/Linux |
   | Build tool | maven |
   | Personality | dev-support/bin/hadoop.sh |
   | git revision | trunk / 8e4b4938bb0ad85386166d2a506cb9e7a8cb8eb4 |
   | Default Java | Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10 |
   | Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10 |
   |  Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3209/10/testReport/ |
   | Max. process+thread count | 1282 (vs. ulimit of 5500) |
   | modules | C: hadoop-common-project/hadoop-common hadoop-common-project/hadoop-kms hadoop-hdfs-project/hadoop-hdfs-httpfs U: . |
   | Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3209/10/console |
   | versions | git=2.25.1 maven=3.6.3 spotbugs=4.2.2 |
   | Powered by | Apache Yetus 0.14.0-SNAPSHOT https://yetus.apache.org |
   
   
   This message was automatically generated.
   
   




[GitHub] [hadoop] hadoop-yetus commented on pull request #3209: HDFS-16129. Fixing the signature secret file misusage in HttpFS.

Posted by GitBox <gi...@apache.org>.
hadoop-yetus commented on pull request #3209:
URL: https://github.com/apache/hadoop/pull/3209#issuecomment-892788124


   :confetti_ball: **+1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime |  Logfile | Comment |
   |:----:|----------:|--------:|:--------:|:-------:|
   | +0 :ok: |  reexec  |   0m 44s |  |  Docker mode activated.  |
   |||| _ Prechecks _ |
   | +1 :green_heart: |  dupname  |   0m  0s |  |  No case conflicting files found.  |
   | +0 :ok: |  codespell  |   0m  0s |  |  codespell was not available.  |
   | +1 :green_heart: |  @author  |   0m  0s |  |  The patch does not contain any @author tags.  |
   | +1 :green_heart: |  test4tests  |   0m  0s |  |  The patch appears to include 8 new or modified test files.  |
   |||| _ trunk Compile Tests _ |
   | +0 :ok: |  mvndep  |  13m 22s |  |  Maven dependency ordering for branch  |
   | +1 :green_heart: |  mvninstall  |  20m 12s |  |  trunk passed  |
   | +1 :green_heart: |  compile  |  21m 17s |  |  trunk passed with JDK Ubuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04  |
   | +1 :green_heart: |  compile  |  18m 30s |  |  trunk passed with JDK Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10  |
   | +1 :green_heart: |  checkstyle  |   3m 40s |  |  trunk passed  |
   | +1 :green_heart: |  mvnsite  |   3m 14s |  |  trunk passed  |
   | +1 :green_heart: |  javadoc  |   2m 34s |  |  trunk passed with JDK Ubuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04  |
   | +1 :green_heart: |  javadoc  |   3m  0s |  |  trunk passed with JDK Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10  |
   | +1 :green_heart: |  spotbugs  |   4m 20s |  |  trunk passed  |
   | +1 :green_heart: |  shadedclient  |  14m 23s |  |  branch has no errors when building and testing our client artifacts.  |
   |||| _ Patch Compile Tests _ |
   | +0 :ok: |  mvndep  |   0m 27s |  |  Maven dependency ordering for patch  |
   | +1 :green_heart: |  mvninstall  |   1m 40s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |  23m  8s |  |  the patch passed with JDK Ubuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04  |
   | +1 :green_heart: |  javac  |  23m  8s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |  19m  5s |  |  the patch passed with JDK Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10  |
   | +1 :green_heart: |  javac  |  19m  5s |  |  the patch passed  |
   | +1 :green_heart: |  blanks  |   0m  0s |  |  The patch has no blanks issues.  |
   | -0 :warning: |  checkstyle  |   3m 33s | [/results-checkstyle-root.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3209/9/artifact/out/results-checkstyle-root.txt) |  root: The patch generated 2 new + 99 unchanged - 0 fixed = 101 total (was 99)  |
   | +1 :green_heart: |  mvnsite  |   3m 16s |  |  the patch passed  |
   | +1 :green_heart: |  xml  |   0m  1s |  |  The patch has no ill-formed XML file.  |
   | +1 :green_heart: |  javadoc  |   2m 32s |  |  the patch passed with JDK Ubuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04  |
   | +1 :green_heart: |  javadoc  |   3m  5s |  |  the patch passed with JDK Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10  |
   | +1 :green_heart: |  spotbugs  |   4m 56s |  |  the patch passed  |
   | +1 :green_heart: |  shadedclient  |  14m 35s |  |  patch has no errors when building and testing our client artifacts.  |
   |||| _ Other Tests _ |
   | +1 :green_heart: |  unit  |  17m 10s |  |  hadoop-common in the patch passed.  |
   | +1 :green_heart: |  unit  |   3m 43s |  |  hadoop-kms in the patch passed.  |
   | +1 :green_heart: |  unit  |   6m 13s |  |  hadoop-hdfs-httpfs in the patch passed.  |
   | +1 :green_heart: |  asflicense  |   1m  0s |  |  The patch does not generate ASF License warnings.  |
   |  |   | 213m 57s |  |  |
   
   
   | Subsystem | Report/Notes |
   |----------:|:-------------|
   | Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3209/9/artifact/out/Dockerfile |
   | GITHUB PR | https://github.com/apache/hadoop/pull/3209 |
   | Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell xml |
   | uname | Linux e0edd9148df7 4.15.0-58-generic #64-Ubuntu SMP Tue Aug 6 11:12:41 UTC 2019 x86_64 x86_64 x86_64 GNU/Linux |
   | Build tool | maven |
   | Personality | dev-support/bin/hadoop.sh |
   | git revision | trunk / e228b412101491ed351253b0670bfe9564dc049e |
   | Default Java | Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10 |
   | Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10 |
   |  Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3209/9/testReport/ |
   | Max. process+thread count | 3158 (vs. ulimit of 5500) |
   | modules | C: hadoop-common-project/hadoop-common hadoop-common-project/hadoop-kms hadoop-hdfs-project/hadoop-hdfs-httpfs U: . |
   | Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3209/9/console |
   | versions | git=2.25.1 maven=3.6.3 spotbugs=4.2.2 |
   | Powered by | Apache Yetus 0.14.0-SNAPSHOT https://yetus.apache.org |
   
   
   This message was automatically generated.
   
   




[GitHub] [hadoop] brumi1024 commented on a change in pull request #3209: HDFS-16129. Fixing the signature secret file misusage in HttpFS.

Posted by GitBox <gi...@apache.org>.
brumi1024 commented on a change in pull request #3209:
URL: https://github.com/apache/hadoop/pull/3209#discussion_r682489108



##########
File path: hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/http/HttpServer2.java
##########
@@ -811,18 +822,26 @@ private static SignerSecretProvider constructSecretProvider(final Builder b,
       throws Exception {
     final Configuration conf = b.conf;
     Properties config = getFilterProperties(conf,
-                                            b.authFilterConfigurationPrefix);
+        b.authFilterConfigurationPrefixes);
     return AuthenticationFilter.constructSecretProvider(
         ctx, config, b.disallowFallbackToRandomSignerSecretProvider);
   }
 
-  private static Properties getFilterProperties(Configuration conf, String
-      prefix) {
-    Properties prop = new Properties();
-    Map<String, String> filterConfig = AuthenticationFilterInitializer
-        .getFilterConfigMap(conf, prefix);
-    prop.putAll(filterConfig);
-    return prop;
+  public static Properties getFilterProperties(Configuration conf,
+                                                ArrayList<String> prefixes) {
+    Properties props = new Properties();
+    prefixes.forEach(prefix -> {
+      Map<String, String> filterConfigMap =
+          AuthenticationFilterInitializer.getFilterConfigMap(conf, prefix);
+      filterConfigMap.forEach((key, value) -> {
+        Object previous = props.setProperty(key, value);
+        if (previous != null && !previous.equals(value)) {
+          LOG.warn("Overwriting configuration for key='{}' with value='{}' " +
+              "previous_value='{}'", key, value, previous);
+        }
+      });
+    });

Review comment:
       I generally don't have anything against using lambdas, but this doesn't seem like an improvement over a simple for or for-each loop. Performance-wise (it doesn't matter much in this case) they are similar, but I think this version is a bit harder to read and to debug (because of the unusual stack trace that comes from the lambda), and if someone later needs to throw a checked exception while parsing the config, they will have to rewrite this as a for loop anyway.
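
       For illustration, here is the same logic as nested for-each loops, a sketch
       built only from names in the quoted diff (not a tested patch):

           public static Properties getFilterProperties(Configuration conf,
                                                        ArrayList<String> prefixes) {
             Properties props = new Properties();
             for (String prefix : prefixes) {
               Map<String, String> filterConfigMap =
                   AuthenticationFilterInitializer.getFilterConfigMap(conf, prefix);
               for (Map.Entry<String, String> entry : filterConfigMap.entrySet()) {
                 // Later prefixes win; warn when an earlier value is overwritten.
                 Object previous = props.setProperty(entry.getKey(), entry.getValue());
                 if (previous != null && !previous.equals(entry.getValue())) {
                   LOG.warn("Overwriting configuration for key='{}' with value='{}' " +
                       "previous_value='{}'", entry.getKey(), entry.getValue(), previous);
                 }
               }
             }
             return props;
           }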






[GitHub] [hadoop] szilard-nemeth commented on pull request #3209: HDFS-16129. Fixing the signature secret file misusage in HttpFS.

Posted by GitBox <gi...@apache.org>.
szilard-nemeth commented on pull request #3209:
URL: https://github.com/apache/hadoop/pull/3209#issuecomment-889038363


   Retriggered build: https://ci-hadoop.apache.org/job/hadoop-multibranch/view/change-requests/job/PR-3209/2/console




[GitHub] [hadoop] hadoop-yetus commented on pull request #3209: HDFS-16129. Fixing the signature secret file misusage in HttpFS.

Posted by GitBox <gi...@apache.org>.
hadoop-yetus commented on pull request #3209:
URL: https://github.com/apache/hadoop/pull/3209#issuecomment-891916492


   :confetti_ball: **+1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime |  Logfile | Comment |
   |:----:|----------:|--------:|:--------:|:-------:|
   | +0 :ok: |  reexec  |   0m 56s |  |  Docker mode activated.  |
   |||| _ Prechecks _ |
   | +1 :green_heart: |  dupname  |   0m  0s |  |  No case conflicting files found.  |
   | +0 :ok: |  codespell  |   0m  1s |  |  codespell was not available.  |
   | +1 :green_heart: |  @author  |   0m  0s |  |  The patch does not contain any @author tags.  |
   | +1 :green_heart: |  test4tests  |   0m  0s |  |  The patch appears to include 8 new or modified test files.  |
   |||| _ trunk Compile Tests _ |
   | +0 :ok: |  mvndep  |  12m 38s |  |  Maven dependency ordering for branch  |
   | +1 :green_heart: |  mvninstall  |  21m 56s |  |  trunk passed  |
   | +1 :green_heart: |  compile  |  22m 20s |  |  trunk passed with JDK Ubuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04  |
   | +1 :green_heart: |  compile  |  18m 27s |  |  trunk passed with JDK Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10  |
   | +1 :green_heart: |  checkstyle  |   3m 40s |  |  trunk passed  |
   | +1 :green_heart: |  mvnsite  |   3m 15s |  |  trunk passed  |
   | +1 :green_heart: |  javadoc  |   2m 34s |  |  trunk passed with JDK Ubuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04  |
   | +1 :green_heart: |  javadoc  |   3m  3s |  |  trunk passed with JDK Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10  |
   | +1 :green_heart: |  spotbugs  |   4m 22s |  |  trunk passed  |
   | +1 :green_heart: |  shadedclient  |  14m 33s |  |  branch has no errors when building and testing our client artifacts.  |
   |||| _ Patch Compile Tests _ |
   | +0 :ok: |  mvndep  |   0m 26s |  |  Maven dependency ordering for patch  |
   | +1 :green_heart: |  mvninstall  |   1m 39s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |  20m 30s |  |  the patch passed with JDK Ubuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04  |
   | +1 :green_heart: |  javac  |  20m 30s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |  18m 27s |  |  the patch passed with JDK Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10  |
   | +1 :green_heart: |  javac  |  18m 27s |  |  the patch passed  |
   | +1 :green_heart: |  blanks  |   0m  0s |  |  The patch has no blanks issues.  |
   | +1 :green_heart: |  checkstyle  |   3m 32s |  |  root: The patch generated 0 new + 97 unchanged - 2 fixed = 97 total (was 99)  |
   | +1 :green_heart: |  mvnsite  |   3m 14s |  |  the patch passed  |
   | +1 :green_heart: |  xml  |   0m  1s |  |  The patch has no ill-formed XML file.  |
   | +1 :green_heart: |  javadoc  |   2m 34s |  |  the patch passed with JDK Ubuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04  |
   | +1 :green_heart: |  javadoc  |   3m  4s |  |  the patch passed with JDK Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10  |
   | +1 :green_heart: |  spotbugs  |   4m 52s |  |  the patch passed  |
   | +1 :green_heart: |  shadedclient  |  14m 44s |  |  patch has no errors when building and testing our client artifacts.  |
   |||| _ Other Tests _ |
   | +1 :green_heart: |  unit  |  17m 11s |  |  hadoop-common in the patch passed.  |
   | +1 :green_heart: |  unit  |   3m 44s |  |  hadoop-kms in the patch passed.  |
   | +1 :green_heart: |  unit  |   8m 51s |  |  hadoop-hdfs-httpfs in the patch passed.  |
   | +1 :green_heart: |  asflicense  |   1m  1s |  |  The patch does not generate ASF License warnings.  |
   |  |   | 216m 11s |  |  |
   
   
   | Subsystem | Report/Notes |
   |----------:|:-------------|
   | Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3209/8/artifact/out/Dockerfile |
   | GITHUB PR | https://github.com/apache/hadoop/pull/3209 |
   | Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell xml |
   | uname | Linux b21da6338d4e 4.15.0-112-generic #113-Ubuntu SMP Thu Jul 9 23:41:39 UTC 2020 x86_64 x86_64 x86_64 GNU/Linux |
   | Build tool | maven |
   | Personality | dev-support/bin/hadoop.sh |
   | git revision | trunk / 2c497078f45b93822a9b7e2e0c28278d1b585370 |
   | Default Java | Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10 |
   | Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10 |
   |  Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3209/8/testReport/ |
   | Max. process+thread count | 3152 (vs. ulimit of 5500) |
   | modules | C: hadoop-common-project/hadoop-common hadoop-common-project/hadoop-kms hadoop-hdfs-project/hadoop-hdfs-httpfs U: . |
   | Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3209/8/console |
   | versions | git=2.25.1 maven=3.6.3 spotbugs=4.2.2 |
   | Powered by | Apache Yetus 0.14.0-SNAPSHOT https://yetus.apache.org |
   
   
   This message was automatically generated.
   
   




[GitHub] [hadoop] hadoop-yetus commented on pull request #3209: HDFS-16129. Fixing the signature secret file misusage in HttpFS.

Posted by GitBox <gi...@apache.org>.
hadoop-yetus commented on pull request #3209:
URL: https://github.com/apache/hadoop/pull/3209#issuecomment-889985548


   :broken_heart: **-1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime |  Logfile | Comment |
   |:----:|----------:|--------:|:--------:|:-------:|
   | +0 :ok: |  reexec  |  27m 51s |  |  Docker mode activated.  |
   |||| _ Prechecks _ |
   | +1 :green_heart: |  dupname  |   0m  0s |  |  No case conflicting files found.  |
   | +0 :ok: |  codespell  |   0m  1s |  |  codespell was not available.  |
   | +1 :green_heart: |  @author  |   0m  0s |  |  The patch does not contain any @author tags.  |
   | +1 :green_heart: |  test4tests  |   0m  0s |  |  The patch appears to include 2 new or modified test files.  |
   |||| _ trunk Compile Tests _ |
   | +0 :ok: |  mvndep  |  12m 51s |  |  Maven dependency ordering for branch  |
   | +1 :green_heart: |  mvninstall  |  24m 29s |  |  trunk passed  |
   | +1 :green_heart: |  compile  |  21m 51s |  |  trunk passed with JDK Ubuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04  |
   | +1 :green_heart: |  compile  |  19m 38s |  |  trunk passed with JDK Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10  |
   | +1 :green_heart: |  checkstyle  |   4m 47s |  |  trunk passed  |
   | +1 :green_heart: |  mvnsite  |   3m 30s |  |  trunk passed  |
   | +1 :green_heart: |  javadoc  |   2m 30s |  |  trunk passed with JDK Ubuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04  |
   | +1 :green_heart: |  javadoc  |   2m 59s |  |  trunk passed with JDK Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10  |
   | +1 :green_heart: |  spotbugs  |   4m 24s |  |  trunk passed  |
   | +1 :green_heart: |  shadedclient  |  18m 43s |  |  branch has no errors when building and testing our client artifacts.  |
   |||| _ Patch Compile Tests _ |
   | +0 :ok: |  mvndep  |   0m 29s |  |  Maven dependency ordering for patch  |
   | +1 :green_heart: |  mvninstall  |   1m 56s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |  27m 43s |  |  the patch passed with JDK Ubuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04  |
   | +1 :green_heart: |  javac  |  27m 43s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |  23m 41s |  |  the patch passed with JDK Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10  |
   | +1 :green_heart: |  javac  |  23m 41s |  |  the patch passed  |
   | +1 :green_heart: |  blanks  |   0m  0s |  |  The patch has no blanks issues.  |
   | +1 :green_heart: |  checkstyle  |   4m 26s |  |  the patch passed  |
   | +1 :green_heart: |  mvnsite  |   3m 23s |  |  the patch passed  |
   | +1 :green_heart: |  xml  |   0m  2s |  |  The patch has no ill-formed XML file.  |
   | +1 :green_heart: |  javadoc  |   2m 31s |  |  the patch passed with JDK Ubuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04  |
   | +1 :green_heart: |  javadoc  |   3m 15s |  |  the patch passed with JDK Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10  |
   | +1 :green_heart: |  spotbugs  |   5m 38s |  |  the patch passed  |
   | +1 :green_heart: |  shadedclient  |  19m 13s |  |  patch has no errors when building and testing our client artifacts.  |
   |||| _ Other Tests _ |
   | +1 :green_heart: |  unit  |  19m 18s |  |  hadoop-common in the patch passed.  |
   | +1 :green_heart: |  unit  |   3m 46s |  |  hadoop-kms in the patch passed.  |
   | -1 :x: |  unit  |  13m 41s | [/patch-unit-hadoop-hdfs-project_hadoop-hdfs-httpfs.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3209/4/artifact/out/patch-unit-hadoop-hdfs-project_hadoop-hdfs-httpfs.txt) |  hadoop-hdfs-httpfs in the patch failed.  |
   | +1 :green_heart: |  asflicense  |   0m 50s |  |  The patch does not generate ASF License warnings.  |
   |  |   | 277m 59s |  |  |
   
   
   | Reason | Tests |
   |-------:|:------|
   | Failed junit tests | hadoop.fs.http.server.TestHttpFSServerNoACLs |
   |   | hadoop.fs.http.server.TestHttpFSAccessControlled |
   |   | hadoop.fs.http.server.TestHttpFSServer |
   |   | hadoop.fs.http.client.TestHttpFSFWithWebhdfsFileSystem |
   |   | hadoop.fs.http.client.TestHttpFSFileSystemLocalFileSystem |
   |   | hadoop.fs.http.client.TestHttpFSFWithSWebhdfsFileSystem |
   |   | hadoop.fs.http.client.TestHttpFSWithHttpFSFileSystem |
   |   | hadoop.fs.http.server.TestHttpFSServerNoXAttrs |
   
   
   | Subsystem | Report/Notes |
   |----------:|:-------------|
   | Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3209/4/artifact/out/Dockerfile |
   | GITHUB PR | https://github.com/apache/hadoop/pull/3209 |
   | Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell xml |
   | uname | Linux c05dcdf3ca77 4.15.0-112-generic #113-Ubuntu SMP Thu Jul 9 23:41:39 UTC 2020 x86_64 x86_64 x86_64 GNU/Linux |
   | Build tool | maven |
   | Personality | dev-support/bin/hadoop.sh |
   | git revision | trunk / 63a847c98d1829c8786efbd7e6553c427e87d93b |
   | Default Java | Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10 |
   | Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10 |
   |  Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3209/4/testReport/ |
   | Max. process+thread count | 2065 (vs. ulimit of 5500) |
   | modules | C: hadoop-common-project/hadoop-common hadoop-common-project/hadoop-kms hadoop-hdfs-project/hadoop-hdfs-httpfs U: . |
   | Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3209/4/console |
   | versions | git=2.25.1 maven=3.6.3 spotbugs=4.2.2 |
   | Powered by | Apache Yetus 0.14.0-SNAPSHOT https://yetus.apache.org |
   
   
   This message was automatically generated.
   
   




[GitHub] [hadoop] hadoop-yetus commented on pull request #3209: HDFS-16129. Fixing the signature secret file misusage in HttpFS.

Posted by GitBox <gi...@apache.org>.
hadoop-yetus commented on pull request #3209:
URL: https://github.com/apache/hadoop/pull/3209#issuecomment-889439594


   :broken_heart: **-1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime |  Logfile | Comment |
   |:----:|----------:|--------:|:--------:|:-------:|
   | +0 :ok: |  reexec  |   1m  3s |  |  Docker mode activated.  |
   |||| _ Prechecks _ |
   | +1 :green_heart: |  dupname  |   0m  0s |  |  No case conflicting files found.  |
   | +0 :ok: |  codespell  |   0m  2s |  |  codespell was not available.  |
   | +1 :green_heart: |  @author  |   0m  0s |  |  The patch does not contain any @author tags.  |
   | +1 :green_heart: |  test4tests  |   0m  0s |  |  The patch appears to include 4 new or modified test files.  |
   |||| _ trunk Compile Tests _ |
   | +0 :ok: |  mvndep  |  13m  7s |  |  Maven dependency ordering for branch  |
   | +1 :green_heart: |  mvninstall  |  23m  9s |  |  trunk passed  |
   | +1 :green_heart: |  compile  |  29m 25s |  |  trunk passed with JDK Ubuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04  |
   | -1 :x: |  compile  |  22m 56s | [/branch-compile-root-jdkPrivateBuild-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3209/3/artifact/out/branch-compile-root-jdkPrivateBuild-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10.txt) |  root in trunk failed with JDK Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10.  |
   | -0 :warning: |  checkstyle  |   0m 38s | [/buildtool-branch-checkstyle-root.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3209/3/artifact/out/buildtool-branch-checkstyle-root.txt) |  The patch fails to run checkstyle in root  |
   | -1 :x: |  mvnsite  |   0m 39s | [/branch-mvnsite-hadoop-common-project_hadoop-auth.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3209/3/artifact/out/branch-mvnsite-hadoop-common-project_hadoop-auth.txt) |  hadoop-auth in trunk failed.  |
   | -1 :x: |  mvnsite  |   0m 38s | [/branch-mvnsite-hadoop-common-project_hadoop-common.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3209/3/artifact/out/branch-mvnsite-hadoop-common-project_hadoop-common.txt) |  hadoop-common in trunk failed.  |
   | -1 :x: |  mvnsite  |   0m 38s | [/branch-mvnsite-hadoop-common-project_hadoop-kms.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3209/3/artifact/out/branch-mvnsite-hadoop-common-project_hadoop-kms.txt) |  hadoop-kms in trunk failed.  |
   | -1 :x: |  mvnsite  |   0m 38s | [/branch-mvnsite-hadoop-hdfs-project_hadoop-hdfs-httpfs.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3209/3/artifact/out/branch-mvnsite-hadoop-hdfs-project_hadoop-hdfs-httpfs.txt) |  hadoop-hdfs-httpfs in trunk failed.  |
   | -1 :x: |  javadoc  |   0m 39s | [/branch-javadoc-hadoop-common-project_hadoop-auth-jdkUbuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3209/3/artifact/out/branch-javadoc-hadoop-common-project_hadoop-auth-jdkUbuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04.txt) |  hadoop-auth in trunk failed with JDK Ubuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04.  |
   | -1 :x: |  javadoc  |   0m 38s | [/branch-javadoc-hadoop-common-project_hadoop-common-jdkUbuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3209/3/artifact/out/branch-javadoc-hadoop-common-project_hadoop-common-jdkUbuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04.txt) |  hadoop-common in trunk failed with JDK Ubuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04.  |
   | -1 :x: |  javadoc  |   0m 39s | [/branch-javadoc-hadoop-common-project_hadoop-kms-jdkUbuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3209/3/artifact/out/branch-javadoc-hadoop-common-project_hadoop-kms-jdkUbuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04.txt) |  hadoop-kms in trunk failed with JDK Ubuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04.  |
   | -1 :x: |  javadoc  |   0m 39s | [/branch-javadoc-hadoop-hdfs-project_hadoop-hdfs-httpfs-jdkUbuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3209/3/artifact/out/branch-javadoc-hadoop-hdfs-project_hadoop-hdfs-httpfs-jdkUbuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04.txt) |  hadoop-hdfs-httpfs in trunk failed with JDK Ubuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04.  |
   | -1 :x: |  javadoc  |   0m 39s | [/branch-javadoc-hadoop-common-project_hadoop-auth-jdkPrivateBuild-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3209/3/artifact/out/branch-javadoc-hadoop-common-project_hadoop-auth-jdkPrivateBuild-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10.txt) |  hadoop-auth in trunk failed with JDK Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10.  |
   | -1 :x: |  javadoc  |   0m 39s | [/branch-javadoc-hadoop-common-project_hadoop-common-jdkPrivateBuild-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3209/3/artifact/out/branch-javadoc-hadoop-common-project_hadoop-common-jdkPrivateBuild-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10.txt) |  hadoop-common in trunk failed with JDK Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10.  |
   | -1 :x: |  javadoc  |   0m 41s | [/branch-javadoc-hadoop-common-project_hadoop-kms-jdkPrivateBuild-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3209/3/artifact/out/branch-javadoc-hadoop-common-project_hadoop-kms-jdkPrivateBuild-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10.txt) |  hadoop-kms in trunk failed with JDK Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10.  |
   | -1 :x: |  javadoc  |   0m 39s | [/branch-javadoc-hadoop-hdfs-project_hadoop-hdfs-httpfs-jdkPrivateBuild-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3209/3/artifact/out/branch-javadoc-hadoop-hdfs-project_hadoop-hdfs-httpfs-jdkPrivateBuild-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10.txt) |  hadoop-hdfs-httpfs in trunk failed with JDK Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10.  |
   | -1 :x: |  spotbugs  |   0m 40s | [/branch-spotbugs-hadoop-common-project_hadoop-auth.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3209/3/artifact/out/branch-spotbugs-hadoop-common-project_hadoop-auth.txt) |  hadoop-auth in trunk failed.  |
   | -1 :x: |  spotbugs  |   0m 39s | [/branch-spotbugs-hadoop-common-project_hadoop-common.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3209/3/artifact/out/branch-spotbugs-hadoop-common-project_hadoop-common.txt) |  hadoop-common in trunk failed.  |
   | -1 :x: |  spotbugs  |   0m 39s | [/branch-spotbugs-hadoop-common-project_hadoop-kms.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3209/3/artifact/out/branch-spotbugs-hadoop-common-project_hadoop-kms.txt) |  hadoop-kms in trunk failed.  |
   | -1 :x: |  spotbugs  |   0m 39s | [/branch-spotbugs-hadoop-hdfs-project_hadoop-hdfs-httpfs.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3209/3/artifact/out/branch-spotbugs-hadoop-hdfs-project_hadoop-hdfs-httpfs.txt) |  hadoop-hdfs-httpfs in trunk failed.  |
   | +1 :green_heart: |  shadedclient  |  29m 22s |  |  branch has no errors when building and testing our client artifacts.  |
   |||| _ Patch Compile Tests _ |
   | +0 :ok: |  mvndep  |   0m 27s |  |  Maven dependency ordering for patch  |
   | +1 :green_heart: |  mvninstall  |   2m 25s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |  23m 11s |  |  the patch passed with JDK Ubuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04  |
   | +1 :green_heart: |  javac  |  23m 11s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |  19m 27s |  |  the patch passed with JDK Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10  |
   | -1 :x: |  javac  |  19m 27s | [/results-compile-javac-root-jdkPrivateBuild-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3209/3/artifact/out/results-compile-javac-root-jdkPrivateBuild-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10.txt) |  root-jdkPrivateBuild-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10 with JDK Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10 generated 45 new + 1744 unchanged - 0 fixed = 1789 total (was 1744)  |
   | +1 :green_heart: |  blanks  |   0m  0s |  |  The patch has no blanks issues.  |
   | -0 :warning: |  checkstyle  |   3m 58s | [/results-checkstyle-root.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3209/3/artifact/out/results-checkstyle-root.txt) |  root: The patch generated 156 new + 0 unchanged - 0 fixed = 156 total (was 0)  |
   | +1 :green_heart: |  mvnsite  |   4m 15s |  |  the patch passed  |
   | +1 :green_heart: |  xml  |   0m  2s |  |  The patch has no ill-formed XML file.  |
   | -1 :x: |  javadoc  |   1m 10s | [/results-javadoc-javadoc-hadoop-common-project_hadoop-common-jdkUbuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3209/3/artifact/out/results-javadoc-javadoc-hadoop-common-project_hadoop-common-jdkUbuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04.txt) |  hadoop-common-project_hadoop-common-jdkUbuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04 with JDK Ubuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04 generated 99 new + 0 unchanged - 0 fixed = 99 total (was 0)  |
   | -1 :x: |  javadoc  |   0m 39s | [/results-javadoc-javadoc-hadoop-hdfs-project_hadoop-hdfs-httpfs-jdkUbuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3209/3/artifact/out/results-javadoc-javadoc-hadoop-hdfs-project_hadoop-hdfs-httpfs-jdkUbuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04.txt) |  hadoop-hdfs-project_hadoop-hdfs-httpfs-jdkUbuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04 with JDK Ubuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04 generated 55 new + 0 unchanged - 0 fixed = 55 total (was 0)  |
   | -1 :x: |  javadoc  |   0m 38s | [/results-javadoc-javadoc-hadoop-hdfs-project_hadoop-hdfs-httpfs-jdkPrivateBuild-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3209/3/artifact/out/results-javadoc-javadoc-hadoop-hdfs-project_hadoop-hdfs-httpfs-jdkPrivateBuild-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10.txt) |  hadoop-hdfs-project_hadoop-hdfs-httpfs-jdkPrivateBuild-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10 with JDK Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10 generated 55 new + 0 unchanged - 0 fixed = 55 total (was 0)  |
   | +1 :green_heart: |  spotbugs  |   7m  2s |  |  the patch passed  |
   | +1 :green_heart: |  shadedclient  |  19m 25s |  |  patch has no errors when building and testing our client artifacts.  |
   |||| _ Other Tests _ |
   | +1 :green_heart: |  unit  |   3m 38s |  |  hadoop-auth in the patch passed.  |
   | -1 :x: |  unit  |  19m 30s | [/patch-unit-hadoop-common-project_hadoop-common.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3209/3/artifact/out/patch-unit-hadoop-common-project_hadoop-common.txt) |  hadoop-common in the patch failed.  |
   | -1 :x: |  unit  |  10m 58s | [/patch-unit-hadoop-common-project_hadoop-kms.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3209/3/artifact/out/patch-unit-hadoop-common-project_hadoop-kms.txt) |  hadoop-kms in the patch failed.  |
   | -1 :x: |  unit  |   0m 40s | [/patch-unit-hadoop-hdfs-project_hadoop-hdfs-httpfs.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3209/3/artifact/out/patch-unit-hadoop-hdfs-project_hadoop-hdfs-httpfs.txt) |  hadoop-hdfs-httpfs in the patch failed.  |
   | +0 :ok: |  asflicense  |   0m 41s |  |  ASF License check generated no output?  |
   |  |   | 245m 59s |  |  |
   
   
   | Reason | Tests |
   |-------:|:------|
   | Failed junit tests | hadoop.metrics2.source.TestJvmMetrics |
   |   | hadoop.ipc.TestCallQueueManager |
   |   | hadoop.crypto.key.kms.server.TestKMS |
   
   
   | Subsystem | Report/Notes |
   |----------:|:-------------|
   | Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3209/3/artifact/out/Dockerfile |
   | GITHUB PR | https://github.com/apache/hadoop/pull/3209 |
   | Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell xml |
   | uname | Linux 7f4dfdbefddd 4.15.0-58-generic #64-Ubuntu SMP Tue Aug 6 11:12:41 UTC 2019 x86_64 x86_64 x86_64 GNU/Linux |
   | Build tool | maven |
   | Personality | dev-support/bin/hadoop.sh |
   | git revision | trunk / 6a396cea44c28a82f504ba0255ac794787a7d720 |
   | Default Java | Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10 |
   | Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10 |
   |  Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3209/3/testReport/ |
   | Max. process+thread count | 833 (vs. ulimit of 5500) |
   | modules | C: hadoop-common-project/hadoop-auth hadoop-common-project/hadoop-common hadoop-common-project/hadoop-kms hadoop-hdfs-project/hadoop-hdfs-httpfs U: . |
   | Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3209/3/console |
   | versions | git=2.25.1 maven=3.6.3 |
   | Powered by | Apache Yetus 0.14.0-SNAPSHOT https://yetus.apache.org |
   
   
   This message was automatically generated.
   
   




[GitHub] [hadoop] szilard-nemeth commented on a change in pull request #3209: HDFS-16129. Fixing the signature secret file misusage in HttpFS.

Posted by GitBox <gi...@apache.org>.
szilard-nemeth commented on a change in pull request #3209:
URL: https://github.com/apache/hadoop/pull/3209#discussion_r682698137



##########
File path: hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/http/HttpServer2.java
##########
@@ -473,8 +482,10 @@ public HttpServer2 build() throws IOException {
       HttpServer2 server = new HttpServer2(this);
 
       if (this.securityEnabled &&
-          !this.conf.get(authFilterConfigurationPrefix + "type").
-          equals(PseudoAuthenticationHandler.TYPE)) {
+          authFilterConfigurationPrefixes.stream().noneMatch(
+              prefix -> (prefix + "type")
+                  .equals(PseudoAuthenticationHandler.TYPE))
+      ) {

Review comment:
       I see. You can file a follow-up JIRA for this if you want to :)
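
       For reference, judging by the pre-patch line in this hunk, the condition
       presumably means to compare the configured handler type (the value stored
       under "<prefix>type") rather than the key name itself. A sketch of the
       corrected predicate (illustrative only):

           // Look up the value configured under "<prefix>type" and compare it
           // to the pseudo handler type, instead of comparing the key name.
           authFilterConfigurationPrefixes.stream().noneMatch(
               prefix -> PseudoAuthenticationHandler.TYPE
                   .equals(this.conf.get(prefix + "type")))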






[GitHub] [hadoop] brumi1024 commented on a change in pull request #3209: HDFS-16129. Fixing the signature secret file misusage in HttpFS.

Posted by GitBox <gi...@apache.org>.
brumi1024 commented on a change in pull request #3209:
URL: https://github.com/apache/hadoop/pull/3209#discussion_r682480203



##########
File path: hadoop-hdfs-project/hadoop-hdfs-httpfs/src/main/java/org/apache/hadoop/fs/http/server/HttpFSAuthenticationFilter.java
##########
@@ -69,27 +72,11 @@
   @Override
   protected Properties getConfiguration(String configPrefix,
       FilterConfig filterConfig) throws ServletException{
-    Properties props = new Properties();
+    System.out.println("getConfiguration1");
     Configuration conf = HttpFSServerWebApp.get().getConfig();
-
-    props.setProperty(AuthenticationFilter.COOKIE_PATH, "/");
-    for (Map.Entry<String, String> entry : conf) {
-      String name = entry.getKey();
-      if (name.startsWith(HADOOP_HTTP_CONF_PREFIX)) {
-        name = name.substring(HADOOP_HTTP_CONF_PREFIX.length());
-        props.setProperty(name, entry.getValue());
-      }
-    }
-
-    // Replace Hadoop Http Authentication Configs with HttpFS specific Configs
-    for (Map.Entry<String, String> entry : conf) {
-      String name = entry.getKey();
-      if (name.startsWith(CONF_PREFIX)) {
-        String value = conf.get(name);
-        name = name.substring(CONF_PREFIX.length());
-        props.setProperty(name, value);
-      }
-    }
+    Properties props = HttpServer2.getFilterProperties(conf,
+        new ArrayList<>(Arrays.asList(CONF_PREFIXES)));
+    System.out.println("getConfiguration2");

Review comment:
       Nit: I think this could be removed.

##########
File path: hadoop-hdfs-project/hadoop-hdfs-httpfs/src/main/java/org/apache/hadoop/fs/http/server/HttpFSAuthenticationFilter.java
##########
@@ -98,6 +85,7 @@ protected Properties getConfiguration(String configPrefix,
     }
 
     if (!isRandomSecret(filterConfig)) {
+      System.out.println("FILE: " + signatureSecretFile);

Review comment:
       Nit: I think this could be removed.

##########
File path: hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/http/HttpServer2.java
##########
@@ -243,7 +243,9 @@
 
     private String hostName;
     private boolean disallowFallbackToRandomSignerSecretProvider;
-    private String authFilterConfigurationPrefix = "hadoop.http.authentication.";
+    private final ArrayList<String> authFilterConfigurationPrefixes =
+        new ArrayList<>(Collections.singletonList(
+            "hadoop.http.authentication."));

Review comment:
       Nit: can you use a shared final constant here as well?
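
       Something like this would do (the constant name is only illustrative):

           private static final String HTTP_AUTH_CONFIG_PREFIX =
               "hadoop.http.authentication.";

           private final ArrayList<String> authFilterConfigurationPrefixes =
               new ArrayList<>(Collections.singletonList(HTTP_AUTH_CONFIG_PREFIX));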

##########
File path: hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/http/HttpServer2.java
##########
@@ -811,18 +822,26 @@ private static SignerSecretProvider constructSecretProvider(final Builder b,
       throws Exception {
     final Configuration conf = b.conf;
     Properties config = getFilterProperties(conf,
-                                            b.authFilterConfigurationPrefix);
+        b.authFilterConfigurationPrefixes);
     return AuthenticationFilter.constructSecretProvider(
         ctx, config, b.disallowFallbackToRandomSignerSecretProvider);
   }
 
-  private static Properties getFilterProperties(Configuration conf, String
-      prefix) {
-    Properties prop = new Properties();
-    Map<String, String> filterConfig = AuthenticationFilterInitializer
-        .getFilterConfigMap(conf, prefix);
-    prop.putAll(filterConfig);
-    return prop;
+  public static Properties getFilterProperties(Configuration conf,
+                                                ArrayList<String> prefixes) {
+    Properties props = new Properties();
+    prefixes.forEach(prefix -> {
+      Map<String, String> filterConfigMap =
+          AuthenticationFilterInitializer.getFilterConfigMap(conf, prefix);
+      filterConfigMap.forEach((key, value) -> {
+        Object previous = props.setProperty(key, value);
+        if (previous != null && !previous.equals(value)) {
+          LOG.warn("Overwriting configuration for key='{}' with value='{}' " +
+              "previous_value='{}'", key, value, previous);
+        }
+      });
+    });

Review comment:
       I generally don't have anything against using lambdas, but this doesn't seem like an improvement over a simple for or for-each loop. Performance-wise (it doesn't matter much in this case) they are similar, but I think this version is a bit harder to read and to debug (because of the unusual stack trace that comes from the lambda), and if someone later needs to throw a checked exception while parsing the config, they will have to rewrite this as a for loop anyway.

##########
File path: hadoop-hdfs-project/hadoop-hdfs-httpfs/src/main/java/org/apache/hadoop/fs/http/server/HttpFSAuthenticationFilter.java
##########
@@ -69,27 +72,11 @@
   @Override
   protected Properties getConfiguration(String configPrefix,
       FilterConfig filterConfig) throws ServletException{
-    Properties props = new Properties();
+    System.out.println("getConfiguration1");

Review comment:
       Nit: I think this could be removed.






[GitHub] [hadoop] hadoop-yetus commented on pull request #3209: HDFS-16129. Fixing the signature secret file misusage in HttpFS.

Posted by GitBox <gi...@apache.org>.
hadoop-yetus commented on pull request #3209:
URL: https://github.com/apache/hadoop/pull/3209#issuecomment-893585180


   :confetti_ball: **+1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime |  Logfile | Comment |
   |:----:|----------:|--------:|:--------:|:-------:|
   | +0 :ok: |  reexec  |   0m 43s |  |  Docker mode activated.  |
   |||| _ Prechecks _ |
   | +1 :green_heart: |  dupname  |   0m  1s |  |  No case conflicting files found.  |
   | +0 :ok: |  codespell  |   0m  0s |  |  codespell was not available.  |
   | +1 :green_heart: |  @author  |   0m  0s |  |  The patch does not contain any @author tags.  |
   | +1 :green_heart: |  test4tests  |   0m  0s |  |  The patch appears to include 7 new or modified test files.  |
   |||| _ trunk Compile Tests _ |
   | +0 :ok: |  mvndep  |  12m 27s |  |  Maven dependency ordering for branch  |
   | +1 :green_heart: |  mvninstall  |  21m 25s |  |  trunk passed  |
   | +1 :green_heart: |  compile  |  23m 18s |  |  trunk passed with JDK Ubuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04  |
   | +1 :green_heart: |  compile  |  21m  4s |  |  trunk passed with JDK Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10  |
   | +1 :green_heart: |  checkstyle  |   4m  1s |  |  trunk passed  |
   | +1 :green_heart: |  mvnsite  |   3m 16s |  |  trunk passed  |
   | +1 :green_heart: |  javadoc  |   2m 31s |  |  trunk passed with JDK Ubuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04  |
   | +1 :green_heart: |  javadoc  |   2m 51s |  |  trunk passed with JDK Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10  |
   | +1 :green_heart: |  spotbugs  |   4m 23s |  |  trunk passed  |
   | +1 :green_heart: |  shadedclient  |  16m 34s |  |  branch has no errors when building and testing our client artifacts.  |
   |||| _ Patch Compile Tests _ |
   | +0 :ok: |  mvndep  |   0m 28s |  |  Maven dependency ordering for patch  |
   | +1 :green_heart: |  mvninstall  |   1m 52s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |  23m 27s |  |  the patch passed with JDK Ubuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04  |
   | +1 :green_heart: |  javac  |  23m 27s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |  20m 45s |  |  the patch passed with JDK Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10  |
   | +1 :green_heart: |  javac  |  20m 45s |  |  the patch passed  |
   | +1 :green_heart: |  blanks  |   0m  0s |  |  The patch has no blanks issues.  |
   | -0 :warning: |  checkstyle  |   4m  4s | [/results-checkstyle-root.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3209/12/artifact/out/results-checkstyle-root.txt) |  root: The patch generated 1 new + 95 unchanged - 0 fixed = 96 total (was 95)  |
   | +1 :green_heart: |  mvnsite  |   3m  9s |  |  the patch passed  |
   | +1 :green_heart: |  xml  |   0m  1s |  |  The patch has no ill-formed XML file.  |
   | +1 :green_heart: |  javadoc  |   2m 30s |  |  the patch passed with JDK Ubuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04  |
   | +1 :green_heart: |  javadoc  |   2m 59s |  |  the patch passed with JDK Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10  |
   | +1 :green_heart: |  spotbugs  |   5m  5s |  |  the patch passed  |
   | +1 :green_heart: |  shadedclient  |  16m 25s |  |  patch has no errors when building and testing our client artifacts.  |
   |||| _ Other Tests _ |
   | +1 :green_heart: |  unit  |  18m  3s |  |  hadoop-common in the patch passed.  |
   | +1 :green_heart: |  unit  |   3m 43s |  |  hadoop-kms in the patch passed.  |
   | +1 :green_heart: |  unit  |   6m 30s |  |  hadoop-hdfs-httpfs in the patch passed.  |
   | +1 :green_heart: |  asflicense  |   0m 56s |  |  The patch does not generate ASF License warnings.  |
   |  |   | 226m 55s |  |  |
   
   
   | Subsystem | Report/Notes |
   |----------:|:-------------|
   | Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3209/12/artifact/out/Dockerfile |
   | GITHUB PR | https://github.com/apache/hadoop/pull/3209 |
   | Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell xml |
   | uname | Linux b4349a829922 4.15.0-58-generic #64-Ubuntu SMP Tue Aug 6 11:12:41 UTC 2019 x86_64 x86_64 x86_64 GNU/Linux |
   | Build tool | maven |
   | Personality | dev-support/bin/hadoop.sh |
   | git revision | trunk / d2b5b76fe3bf242d8031489dbced91d7fa17307e |
   | Default Java | Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10 |
   | Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10 |
   |  Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3209/12/testReport/ |
   | Max. process+thread count | 1251 (vs. ulimit of 5500) |
   | modules | C: hadoop-common-project/hadoop-common hadoop-common-project/hadoop-kms hadoop-hdfs-project/hadoop-hdfs-httpfs U: . |
   | Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3209/12/console |
   | versions | git=2.25.1 maven=3.6.3 spotbugs=4.2.2 |
   | Powered by | Apache Yetus 0.14.0-SNAPSHOT https://yetus.apache.org |
   
   
   This message was automatically generated.
   
   




[GitHub] [hadoop] hadoop-yetus commented on pull request #3209: HDFS-16129. Fixing the signature secret file misusage in HttpFS.

Posted by GitBox <gi...@apache.org>.
hadoop-yetus commented on pull request #3209:
URL: https://github.com/apache/hadoop/pull/3209#issuecomment-892788124








[GitHub] [hadoop] tomicooler commented on a change in pull request #3209: HDFS-16129. Fixing the signature secret file misusage in HttpFS.

Posted by GitBox <gi...@apache.org>.
tomicooler commented on a change in pull request #3209:
URL: https://github.com/apache/hadoop/pull/3209#discussion_r682508533



##########
File path: hadoop-hdfs-project/hadoop-hdfs-httpfs/src/main/java/org/apache/hadoop/fs/http/server/HttpFSAuthenticationFilter.java
##########
@@ -69,27 +72,11 @@
   @Override
   protected Properties getConfiguration(String configPrefix,
       FilterConfig filterConfig) throws ServletException{
-    Properties props = new Properties();
+    System.out.println("getConfiguration1");

Review comment:
       Somehow I managed to commit the debug logs.. :) I'll remove them.

##########
File path: hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/http/HttpServer2.java
##########
@@ -811,18 +822,26 @@ private static SignerSecretProvider constructSecretProvider(final Builder b,
       throws Exception {
     final Configuration conf = b.conf;
     Properties config = getFilterProperties(conf,
-                                            b.authFilterConfigurationPrefix);
+        b.authFilterConfigurationPrefixes);
     return AuthenticationFilter.constructSecretProvider(
         ctx, config, b.disallowFallbackToRandomSignerSecretProvider);
   }
 
-  private static Properties getFilterProperties(Configuration conf, String
-      prefix) {
-    Properties prop = new Properties();
-    Map<String, String> filterConfig = AuthenticationFilterInitializer
-        .getFilterConfigMap(conf, prefix);
-    prop.putAll(filterConfig);
-    return prop;
+  public static Properties getFilterProperties(Configuration conf,
+                                                ArrayList<String> prefixes) {
+    Properties props = new Properties();
+    prefixes.forEach(prefix -> {
+      Map<String, String> filterConfigMap =
+          AuthenticationFilterInitializer.getFilterConfigMap(conf, prefix);
+      filterConfigMap.forEach((key, value) -> {
+        Object previous = props.setProperty(key, value);
+        if (previous != null && !previous.equals(value)) {
+          LOG.warn("Overwriting configuration for key='{}' with value='{}' " +
+              "previous_value='{}'", key, value, previous);
+        }
+      });
+    });

Review comment:
       Ok, I'll rewrite it, thanks for the explanation. It seemed a little shorter with the forEach, and more readable with the named key and value parameters (instead of entry.getKey() and entry.getValue()).

##########
File path: hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/http/HttpServer2.java
##########
@@ -243,7 +243,9 @@
 
     private String hostName;
     private boolean disallowFallbackToRandomSignerSecretProvider;
-    private String authFilterConfigurationPrefix = "hadoop.http.authentication.";
+    private final ArrayList<String> authFilterConfigurationPrefixes =
+        new ArrayList<>(Collections.singletonList(
+            "hadoop.http.authentication."));

Review comment:
       The list reference is final, but the elements inside it will be changed; checkstyle gives me a warning if I rename it to upper case.
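
   To illustrate (a standalone snippet, not from the patch): final only fixes the reference, not the contents.

   ```java
   private final ArrayList<String> authFilterConfigurationPrefixes =
       new ArrayList<>(Collections.singletonList("hadoop.http.authentication."));

   void addPrefix(String prefix) {
     // Compiles fine: final forbids reassignment, not mutation.
     authFilterConfigurationPrefixes.add(prefix);
     // Would not compile: the reference itself cannot be reassigned.
     // authFilterConfigurationPrefixes = new ArrayList<>();
   }
   ```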

##########
File path: hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/http/HttpServer2.java
##########
@@ -243,7 +243,9 @@
 
     private String hostName;
     private boolean disallowFallbackToRandomSignerSecretProvider;
-    private String authFilterConfigurationPrefix = "hadoop.http.authentication.";
+    private final ArrayList<String> authFilterConfigurationPrefixes =

Review comment:
       Thanks. Done.

##########
File path: hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/http/HttpServer2.java
##########
@@ -473,8 +482,10 @@ public HttpServer2 build() throws IOException {
       HttpServer2 server = new HttpServer2(this);
 
       if (this.securityEnabled &&
-          !this.conf.get(authFilterConfigurationPrefix + "type").
-          equals(PseudoAuthenticationHandler.TYPE)) {
+          authFilterConfigurationPrefixes.stream().noneMatch(
+              prefix -> (prefix + "type")
+                  .equals(PseudoAuthenticationHandler.TYPE))
+      ) {

Review comment:
       Good catch. It should have been this.conf.get(prefix + "type").equals(...).

##########
File path: hadoop-hdfs-project/hadoop-hdfs-httpfs/src/test/java/org/apache/hadoop/fs/http/server/TestHttpFSServerWebServer.java
##########
@@ -71,53 +82,195 @@ public static void beforeClass() throws Exception {
     System.setProperty("httpfs.home.dir", homeDir.getAbsolutePath());
     System.setProperty("httpfs.log.dir", logsDir.getAbsolutePath());
     System.setProperty("httpfs.config.dir", confDir.getAbsolutePath());
-    FileUtils.writeStringToFile(new File(confDir, "httpfs-signature.secret"),
-        "foo", StandardCharsets.UTF_8);
+    secretFile = new File(System.getProperty("httpfs.config.dir"),
+        "httpfs-signature-custom.secret");
   }
 
-  @Before
-  public void setUp() throws Exception {
-    Configuration conf = new Configuration();
-    conf.set(HttpFSServerWebServer.HTTP_HOSTNAME_KEY, "localhost");
-    conf.setInt(HttpFSServerWebServer.HTTP_PORT_KEY, 0);
-    conf.set(AuthenticationFilter.SIGNATURE_SECRET_FILE,
-        "httpfs-signature.secret");
-    Configuration sslConf = new Configuration();
-    webServer = new HttpFSServerWebServer(conf, sslConf);
+  @After
+  public void teardown() throws Exception {
+    if (webServer != null) {
+      webServer.stop();
+    }
   }
 
   @Test
   public void testStartStop() throws Exception {
+    webServer = createWebServer(createConfigurationWithRandomSecret());
     webServer.start();
-    String user = HadoopUsersConfTestHelper.getHadoopUsers()[0];
-    URL url = new URL(webServer.getUrl(), MessageFormat.format(
-        "/webhdfs/v1/?user.name={0}&op=liststatus", user));
-    HttpURLConnection conn = (HttpURLConnection) url.openConnection();
-    Assert.assertEquals(HttpURLConnection.HTTP_OK, conn.getResponseCode());
-    BufferedReader reader = new BufferedReader(
-        new InputStreamReader(conn.getInputStream()));
-    reader.readLine();
-    reader.close();

Review comment:
       The following tests were just testing whether the server could be started/stopped without an exception (my guess):
     - testStartStop
     - testJustStop
     - testDoubleStop
     - testDoubleStart
   I think this was the intention of the original author.
   
   
   I introduced some other tests with a secret file, an empty secret file, etc., with an additional assert that actually checks whether the secret provider is file-based and not random. That was not asserted before: the tests were green, but the secret file was not even used. These tests use assertServiceRespondsWithOK.
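
   (A hedged sketch of what assertServiceRespondsWithOK does, reconstructed from the removed testStartStop body above; the actual helper may differ in details.)

   ```java
   private void assertServiceRespondsWithOK(URL serverUrl) throws Exception {
     String user = HadoopUsersConfTestHelper.getHadoopUsers()[0];
     URL url = new URL(serverUrl, MessageFormat.format(
         "/webhdfs/v1/?user.name={0}&op=liststatus", user));
     HttpURLConnection conn = (HttpURLConnection) url.openConnection();
     Assert.assertEquals(HttpURLConnection.HTTP_OK, conn.getResponseCode());
     // Read a line to make sure a response body is actually served.
     try (BufferedReader reader = new BufferedReader(
         new InputStreamReader(conn.getInputStream()))) {
       reader.readLine();
     }
   }
   ```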

##########
File path: hadoop-hdfs-project/hadoop-hdfs-httpfs/src/test/java/org/apache/hadoop/fs/http/server/TestHttpFSServerWebServer.java
##########
@@ -71,53 +82,195 @@ public static void beforeClass() throws Exception {
     System.setProperty("httpfs.home.dir", homeDir.getAbsolutePath());
     System.setProperty("httpfs.log.dir", logsDir.getAbsolutePath());
     System.setProperty("httpfs.config.dir", confDir.getAbsolutePath());
-    FileUtils.writeStringToFile(new File(confDir, "httpfs-signature.secret"),
-        "foo", StandardCharsets.UTF_8);
+    secretFile = new File(System.getProperty("httpfs.config.dir"),
+        "httpfs-signature-custom.secret");
   }
 
-  @Before
-  public void setUp() throws Exception {
-    Configuration conf = new Configuration();
-    conf.set(HttpFSServerWebServer.HTTP_HOSTNAME_KEY, "localhost");
-    conf.setInt(HttpFSServerWebServer.HTTP_PORT_KEY, 0);
-    conf.set(AuthenticationFilter.SIGNATURE_SECRET_FILE,
-        "httpfs-signature.secret");
-    Configuration sslConf = new Configuration();
-    webServer = new HttpFSServerWebServer(conf, sslConf);
+  @After
+  public void teardown() throws Exception {
+    if (webServer != null) {
+      webServer.stop();
+    }
   }
 
   @Test
   public void testStartStop() throws Exception {
+    webServer = createWebServer(createConfigurationWithRandomSecret());
     webServer.start();
-    String user = HadoopUsersConfTestHelper.getHadoopUsers()[0];
-    URL url = new URL(webServer.getUrl(), MessageFormat.format(
-        "/webhdfs/v1/?user.name={0}&op=liststatus", user));
-    HttpURLConnection conn = (HttpURLConnection) url.openConnection();
-    Assert.assertEquals(HttpURLConnection.HTTP_OK, conn.getResponseCode());
-    BufferedReader reader = new BufferedReader(
-        new InputStreamReader(conn.getInputStream()));
-    reader.readLine();
-    reader.close();
     webServer.stop();
   }
 
   @Test
   public void testJustStop() throws Exception {
+    webServer = createWebServer(createConfigurationWithRandomSecret());
     webServer.stop();
   }
 
   @Test
   public void testDoubleStop() throws Exception {
+    webServer = createWebServer(createConfigurationWithRandomSecret());
     webServer.start();
     webServer.stop();
     webServer.stop();
   }
 
   @Test
   public void testDoubleStart() throws Exception {
+    webServer = createWebServer(createConfigurationWithRandomSecret());
+    webServer.start();
+    webServer.start();
+    webServer.stop();
+  }
+
+  @Test
+  public void testServiceWithSecretFile() throws Exception {
+    createSecretFile("foo");
+    webServer = createWebServer(createConfigurationWithSecretFile());
+    webServer.start();
+    assertServiceRespondsWithOK(webServer.getUrl());
+    assertSignerSecretProviderType(webServer.getHttpServer(),
+        FileSignerSecretProvider.class);
+    webServer.stop();
+  }
+
+  @Test
+  public void testServiceWithSecretFileWithDeprecatedConfigOnly()
+      throws Exception {
+    createSecretFile("foo");
+    Configuration conf = createConfiguration();
+    setDeprecatedSecretFile(conf, secretFile.getAbsolutePath());
+    webServer = createWebServer(conf);
+
+    // HttpFSAuthenticationFilter gets the configuration from the webapp, the defaults are loaded
+    // so the custom file location is overwritten with a non-existing default one.
+    Exception exception =
+        Assert.assertThrows(IOException.class, new ThrowingRunnable() {
+          @Override
+          public void run() throws Throwable {
+            webServer.start();
+          }
+        });
+    Assert.assertTrue(exception.getMessage().startsWith(
+        "Unable to initialize"));
+  }
+
+  @Test
+  public void testServiceWithSecretFileWithBothConfigOption() throws Exception {

Review comment:
       Done.

##########
File path: hadoop-hdfs-project/hadoop-hdfs-httpfs/src/test/java/org/apache/hadoop/fs/http/client/TestHttpFSFileSystemLocalFileSystem.java
##########
@@ -32,8 +32,6 @@
 import org.junit.runners.Parameterized;
 
 import java.io.File;
-import java.net.URI;

Review comment:
       Done.

##########
File path: hadoop-hdfs-project/hadoop-hdfs-httpfs/src/main/java/org/apache/hadoop/fs/http/server/HttpFSAuthenticationFilter.java
##########
@@ -69,27 +72,11 @@
   @Override
   protected Properties getConfiguration(String configPrefix,
       FilterConfig filterConfig) throws ServletException{
-    Properties props = new Properties();
+    System.out.println("getConfiguration1");

Review comment:
       Done.

##########
File path: hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/http/HttpServer2.java
##########
@@ -811,18 +822,26 @@ private static SignerSecretProvider constructSecretProvider(final Builder b,
       throws Exception {
     final Configuration conf = b.conf;
     Properties config = getFilterProperties(conf,
-                                            b.authFilterConfigurationPrefix);
+        b.authFilterConfigurationPrefixes);
     return AuthenticationFilter.constructSecretProvider(
         ctx, config, b.disallowFallbackToRandomSignerSecretProvider);
   }
 
-  private static Properties getFilterProperties(Configuration conf, String
-      prefix) {
-    Properties prop = new Properties();
-    Map<String, String> filterConfig = AuthenticationFilterInitializer
-        .getFilterConfigMap(conf, prefix);
-    prop.putAll(filterConfig);
-    return prop;
+  public static Properties getFilterProperties(Configuration conf,
+                                                ArrayList<String> prefixes) {
+    Properties props = new Properties();
+    prefixes.forEach(prefix -> {
+      Map<String, String> filterConfigMap =
+          AuthenticationFilterInitializer.getFilterConfigMap(conf, prefix);
+      filterConfigMap.forEach((key, value) -> {
+        Object previous = props.setProperty(key, value);
+        if (previous != null && !previous.equals(value)) {
+          LOG.warn("Overwriting configuration for key='{}' with value='{}' " +
+              "previous_value='{}'", key, value, previous);

Review comment:
       Done.

##########
File path: hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/http/HttpServer2.java
##########
@@ -811,18 +822,26 @@ private static SignerSecretProvider constructSecretProvider(final Builder b,
       throws Exception {
     final Configuration conf = b.conf;
     Properties config = getFilterProperties(conf,
-                                            b.authFilterConfigurationPrefix);
+        b.authFilterConfigurationPrefixes);
     return AuthenticationFilter.constructSecretProvider(
         ctx, config, b.disallowFallbackToRandomSignerSecretProvider);
   }
 
-  private static Properties getFilterProperties(Configuration conf, String
-      prefix) {
-    Properties prop = new Properties();
-    Map<String, String> filterConfig = AuthenticationFilterInitializer
-        .getFilterConfigMap(conf, prefix);
-    prop.putAll(filterConfig);
-    return prop;
+  public static Properties getFilterProperties(Configuration conf,
+                                                ArrayList<String> prefixes) {

Review comment:
       Done.

##########
File path: hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/http/HttpServer2.java
##########
@@ -473,8 +482,10 @@ public HttpServer2 build() throws IOException {
       HttpServer2 server = new HttpServer2(this);
 
       if (this.securityEnabled &&
-          !this.conf.get(authFilterConfigurationPrefix + "type").
-          equals(PseudoAuthenticationHandler.TYPE)) {
+          authFilterConfigurationPrefixes.stream().noneMatch(
+              prefix -> (prefix + "type")
+                  .equals(PseudoAuthenticationHandler.TYPE))
+      ) {

Review comment:
       Good catch. It should have been this.conf.get(prefix + "type").equals(...). Here I find stream().noneMatch useful with the lambda expression. I won't resolve the thread yet, so I can replace it with a helper function if needed.
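
   For reference, the corrected check would read roughly like this (a sketch; the if body is elided):

   ```java
   if (this.securityEnabled &&
       authFilterConfigurationPrefixes.stream().noneMatch(
           prefix -> PseudoAuthenticationHandler.TYPE.equals(
               this.conf.get(prefix + "type")))) {
     // ... same security setup as before ...
   }
   // Comparing with TYPE.equals(conf.get(...)) is also null-safe when a
   // prefix has no "type" key configured.
   ```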

##########
File path: hadoop-hdfs-project/hadoop-hdfs-httpfs/src/test/java/org/apache/hadoop/fs/http/server/TestHttpFSServerWebServer.java
##########
@@ -71,53 +82,195 @@ public static void beforeClass() throws Exception {
     System.setProperty("httpfs.home.dir", homeDir.getAbsolutePath());
     System.setProperty("httpfs.log.dir", logsDir.getAbsolutePath());
     System.setProperty("httpfs.config.dir", confDir.getAbsolutePath());
-    FileUtils.writeStringToFile(new File(confDir, "httpfs-signature.secret"),
-        "foo", StandardCharsets.UTF_8);
+    secretFile = new File(System.getProperty("httpfs.config.dir"),
+        "httpfs-signature-custom.secret");
   }
 
-  @Before
-  public void setUp() throws Exception {
-    Configuration conf = new Configuration();
-    conf.set(HttpFSServerWebServer.HTTP_HOSTNAME_KEY, "localhost");
-    conf.setInt(HttpFSServerWebServer.HTTP_PORT_KEY, 0);
-    conf.set(AuthenticationFilter.SIGNATURE_SECRET_FILE,
-        "httpfs-signature.secret");
-    Configuration sslConf = new Configuration();
-    webServer = new HttpFSServerWebServer(conf, sslConf);
+  @After
+  public void teardown() throws Exception {
+    if (webServer != null) {
+      webServer.stop();
+    }
   }
 
   @Test
   public void testStartStop() throws Exception {
+    webServer = createWebServer(createConfigurationWithRandomSecret());
     webServer.start();
-    String user = HadoopUsersConfTestHelper.getHadoopUsers()[0];
-    URL url = new URL(webServer.getUrl(), MessageFormat.format(
-        "/webhdfs/v1/?user.name={0}&op=liststatus", user));
-    HttpURLConnection conn = (HttpURLConnection) url.openConnection();
-    Assert.assertEquals(HttpURLConnection.HTTP_OK, conn.getResponseCode());
-    BufferedReader reader = new BufferedReader(
-        new InputStreamReader(conn.getInputStream()));
-    reader.readLine();
-    reader.close();
     webServer.stop();
   }
 
   @Test
   public void testJustStop() throws Exception {
+    webServer = createWebServer(createConfigurationWithRandomSecret());
     webServer.stop();
   }
 
   @Test
   public void testDoubleStop() throws Exception {
+    webServer = createWebServer(createConfigurationWithRandomSecret());
     webServer.start();
     webServer.stop();
     webServer.stop();
   }
 
   @Test
   public void testDoubleStart() throws Exception {
+    webServer = createWebServer(createConfigurationWithRandomSecret());
+    webServer.start();
+    webServer.start();
+    webServer.stop();
+  }
+
+  @Test
+  public void testServiceWithSecretFile() throws Exception {
+    createSecretFile("foo");
+    webServer = createWebServer(createConfigurationWithSecretFile());
+    webServer.start();
+    assertServiceRespondsWithOK(webServer.getUrl());
+    assertSignerSecretProviderType(webServer.getHttpServer(),
+        FileSignerSecretProvider.class);
+    webServer.stop();
+  }
+
+  @Test
+  public void testServiceWithSecretFileWithDeprecatedConfigOnly()
+      throws Exception {
+    createSecretFile("foo");
+    Configuration conf = createConfiguration();
+    setDeprecatedSecretFile(conf, secretFile.getAbsolutePath());
+    webServer = createWebServer(conf);
+
+    // HttpFSAuthenticationFilter gets the configuration from the webapp, the defaults are loaded
+    // so the custom file location is overwritten with a non-existing default one.
+    Exception exception =
+        Assert.assertThrows(IOException.class, new ThrowingRunnable() {
+          @Override
+          public void run() throws Throwable {
+            webServer.start();
+          }
+        });
+    Assert.assertTrue(exception.getMessage().startsWith(
+        "Unable to initialize"));
+  }
+
+  @Test
+  public void testServiceWithSecretFileWithBothConfigOption() throws Exception {
+    createSecretFile("foo");
+    Configuration conf = createConfigurationWithSecretFile();
+    setDeprecatedSecretFile(conf, secretFile.getAbsolutePath());
+    webServer = createWebServer(conf);
+    webServer.start();
+    assertServiceRespondsWithOK(webServer.getUrl());
+    assertSignerSecretProviderType(webServer.getHttpServer(),
+        FileSignerSecretProvider.class);
+    webServer.stop();
+  }
+
+  @Test
+  public void testServiceWithSecretFileWithBothConfigOptionOverWriteCheck()

Review comment:
       Done.

##########
File path: hadoop-hdfs-project/hadoop-hdfs-httpfs/src/main/java/org/apache/hadoop/fs/http/server/HttpFSAuthenticationFilter.java
##########
@@ -98,6 +85,7 @@ protected Properties getConfiguration(String configPrefix,
     }
 
     if (!isRandomSecret(filterConfig)) {
+      System.out.println("FILE: " + signatureSecretFile);

Review comment:
       Done.

##########
File path: hadoop-hdfs-project/hadoop-hdfs-httpfs/src/main/java/org/apache/hadoop/fs/http/server/HttpFSAuthenticationFilter.java
##########
@@ -69,27 +72,11 @@
   @Override
   protected Properties getConfiguration(String configPrefix,
       FilterConfig filterConfig) throws ServletException{
-    Properties props = new Properties();
+    System.out.println("getConfiguration1");
     Configuration conf = HttpFSServerWebApp.get().getConfig();
-
-    props.setProperty(AuthenticationFilter.COOKIE_PATH, "/");
-    for (Map.Entry<String, String> entry : conf) {
-      String name = entry.getKey();
-      if (name.startsWith(HADOOP_HTTP_CONF_PREFIX)) {
-        name = name.substring(HADOOP_HTTP_CONF_PREFIX.length());
-        props.setProperty(name, entry.getValue());
-      }
-    }
-
-    // Replace Hadoop Http Authentication Configs with HttpFS specific Configs
-    for (Map.Entry<String, String> entry : conf) {
-      String name = entry.getKey();
-      if (name.startsWith(CONF_PREFIX)) {
-        String value = conf.get(name);
-        name = name.substring(CONF_PREFIX.length());
-        props.setProperty(name, value);
-      }
-    }
+    Properties props = HttpServer2.getFilterProperties(conf,
+        new ArrayList<>(Arrays.asList(CONF_PREFIXES)));
+    System.out.println("getConfiguration2");

Review comment:
       Done.

##########
File path: hadoop-hdfs-project/hadoop-hdfs-httpfs/src/test/java/org/apache/hadoop/fs/http/server/TestHttpFSServerWebServer.java
##########
@@ -71,53 +82,195 @@ public static void beforeClass() throws Exception {
     System.setProperty("httpfs.home.dir", homeDir.getAbsolutePath());
     System.setProperty("httpfs.log.dir", logsDir.getAbsolutePath());
     System.setProperty("httpfs.config.dir", confDir.getAbsolutePath());
-    FileUtils.writeStringToFile(new File(confDir, "httpfs-signature.secret"),
-        "foo", StandardCharsets.UTF_8);
+    secretFile = new File(System.getProperty("httpfs.config.dir"),
+        "httpfs-signature-custom.secret");
   }
 
-  @Before
-  public void setUp() throws Exception {
-    Configuration conf = new Configuration();
-    conf.set(HttpFSServerWebServer.HTTP_HOSTNAME_KEY, "localhost");
-    conf.setInt(HttpFSServerWebServer.HTTP_PORT_KEY, 0);
-    conf.set(AuthenticationFilter.SIGNATURE_SECRET_FILE,
-        "httpfs-signature.secret");
-    Configuration sslConf = new Configuration();
-    webServer = new HttpFSServerWebServer(conf, sslConf);
+  @After
+  public void teardown() throws Exception {
+    if (webServer != null) {
+      webServer.stop();
+    }
   }
 
   @Test
   public void testStartStop() throws Exception {
+    webServer = createWebServer(createConfigurationWithRandomSecret());
     webServer.start();
-    String user = HadoopUsersConfTestHelper.getHadoopUsers()[0];
-    URL url = new URL(webServer.getUrl(), MessageFormat.format(
-        "/webhdfs/v1/?user.name={0}&op=liststatus", user));
-    HttpURLConnection conn = (HttpURLConnection) url.openConnection();
-    Assert.assertEquals(HttpURLConnection.HTTP_OK, conn.getResponseCode());
-    BufferedReader reader = new BufferedReader(
-        new InputStreamReader(conn.getInputStream()));
-    reader.readLine();
-    reader.close();
     webServer.stop();
   }
 
   @Test
   public void testJustStop() throws Exception {
+    webServer = createWebServer(createConfigurationWithRandomSecret());
     webServer.stop();
   }
 
   @Test
   public void testDoubleStop() throws Exception {
+    webServer = createWebServer(createConfigurationWithRandomSecret());
     webServer.start();
     webServer.stop();
     webServer.stop();
   }
 
   @Test
   public void testDoubleStart() throws Exception {
+    webServer = createWebServer(createConfigurationWithRandomSecret());
+    webServer.start();
+    webServer.start();
+    webServer.stop();
+  }
+
+  @Test
+  public void testServiceWithSecretFile() throws Exception {
+    createSecretFile("foo");
+    webServer = createWebServer(createConfigurationWithSecretFile());
+    webServer.start();
+    assertServiceRespondsWithOK(webServer.getUrl());
+    assertSignerSecretProviderType(webServer.getHttpServer(),
+        FileSignerSecretProvider.class);
+    webServer.stop();
+  }
+
+  @Test
+  public void testServiceWithSecretFileWithDeprecatedConfigOnly()
+      throws Exception {
+    createSecretFile("foo");
+    Configuration conf = createConfiguration();
+    setDeprecatedSecretFile(conf, secretFile.getAbsolutePath());
+    webServer = createWebServer(conf);
+
+    // HttpFSAuthenticationFilter gets the configuration from the webapp, the defaults are loaded
+    // so the custom file location is overwritten with a non-existing default one.
+    Exception exception =
+        Assert.assertThrows(IOException.class, new ThrowingRunnable() {
+          @Override
+          public void run() throws Throwable {
+            webServer.start();
+          }
+        });
+    Assert.assertTrue(exception.getMessage().startsWith(
+        "Unable to initialize"));
+  }
+
+  @Test
+  public void testServiceWithSecretFileWithBothConfigOption() throws Exception {
+    createSecretFile("foo");
+    Configuration conf = createConfigurationWithSecretFile();
+    setDeprecatedSecretFile(conf, secretFile.getAbsolutePath());
+    webServer = createWebServer(conf);
+    webServer.start();
+    assertServiceRespondsWithOK(webServer.getUrl());
+    assertSignerSecretProviderType(webServer.getHttpServer(),
+        FileSignerSecretProvider.class);
+    webServer.stop();
+  }
+
+  @Test
+  public void testServiceWithSecretFileWithBothConfigOptionOverWriteCheck()
+      throws Exception {
+    createSecretFile("foo");
+    Configuration conf = createConfigurationWithSecretFile();
+    setDeprecatedSecretFile(conf, "will-be-overwritten");
+    webServer = createWebServer(conf);
+    webServer.start();
+    assertServiceRespondsWithOK(webServer.getUrl());
+    assertSignerSecretProviderType(webServer.getHttpServer(),
+        FileSignerSecretProvider.class);
+    webServer.stop();
+  }
+
+  @Test
+  public void testServiceWithMissingSecretFile() throws Exception {
+    webServer = createWebServer(createConfigurationWithSecretFile());
     webServer.start();
+    assertServiceRespondsWithOK(webServer.getUrl());
+    assertSignerSecretProviderType(webServer.getHttpServer(),
+        RandomSignerSecretProvider.class);
+    webServer.stop();
+  }
+
+  @Test
+  public void testServiceWithEmptySecretFile() throws Exception {
+    // The AuthenticationFilter.constructSecretProvider will do the fallback
+    // to the random secrets not the HttpFSAuthenticationFilter.
+    createSecretFile("");
+    webServer = createWebServer(createConfigurationWithSecretFile());
     webServer.start();
+    assertServiceRespondsWithOK(webServer.getUrl());
+    assertSignerSecretProviderType(webServer.getHttpServer(),
+        RandomSignerSecretProvider.class);
     webServer.stop();
   }
 
+  private <T extends SignerSecretProvider> void assertSignerSecretProviderType(
+      HttpServer2 server, Class<T> expected) {
+    SignerSecretProvider secretProvider = (SignerSecretProvider)
+        server.getWebAppContext().getServletContext()
+            .getAttribute(SIGNER_SECRET_PROVIDER_ATTRIBUTE);
+    Assert.assertNotNull(secretProvider);
+    Assert.assertEquals(expected, secretProvider.getClass());

Review comment:
       Done.

##########
File path: hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/http/HttpServer2.java
##########
@@ -473,8 +482,10 @@ public HttpServer2 build() throws IOException {
       HttpServer2 server = new HttpServer2(this);
 
       if (this.securityEnabled &&
-          !this.conf.get(authFilterConfigurationPrefix + "type").
-          equals(PseudoAuthenticationHandler.TYPE)) {
+          authFilterConfigurationPrefixes.stream().noneMatch(
+              prefix -> (prefix + "type")
+                  .equals(PseudoAuthenticationHandler.TYPE))
+      ) {

Review comment:
       Probably there is no test for this :/

##########
File path: hadoop-hdfs-project/hadoop-hdfs-httpfs/src/main/java/org/apache/hadoop/fs/http/server/HttpFSAuthenticationFilter.java
##########
@@ -69,27 +72,11 @@
   @Override
   protected Properties getConfiguration(String configPrefix,
       FilterConfig filterConfig) throws ServletException{
-    Properties props = new Properties();
+    System.out.println("getConfiguration1");
     Configuration conf = HttpFSServerWebApp.get().getConfig();
-
-    props.setProperty(AuthenticationFilter.COOKIE_PATH, "/");

Review comment:
       Yes, the HttpServer2.getFilterProperties() will end up doing this.
   
   BTW, the order of the overwriting has changed, as I mentioned in the commit message. Also, line 80 here contained an error: calling getValue() did not substitute the variable, but line 88 (which uses conf.get) did.
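
   A small self-contained illustration of the difference (assuming standard Hadoop Configuration behavior: iterating the raw entries returns unexpanded values, while conf.get() performs ${var} substitution; the keys are made up for the example):

   ```java
   import java.util.Map;
   import org.apache.hadoop.conf.Configuration;

   public class SubstitutionDemo {
     public static void main(String[] args) {
       Configuration conf = new Configuration(false);
       conf.set("base.dir", "/var/secrets");
       conf.set("signature.secret.file", "${base.dir}/httpfs.secret");

       for (Map.Entry<String, String> entry : conf) {
         if ("signature.secret.file".equals(entry.getKey())) {
           // Prints the raw value: ${base.dir}/httpfs.secret
           System.out.println(entry.getValue());
         }
       }
       // Prints the substituted value: /var/secrets/httpfs.secret
       System.out.println(conf.get("signature.secret.file"));
     }
   }
   ```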

##########
File path: hadoop-hdfs-project/hadoop-hdfs-httpfs/src/main/java/org/apache/hadoop/fs/http/server/HttpFSServerWebServer.java
##########
@@ -178,6 +179,10 @@ public URL getUrl() {
     }
   }
 
+  public HttpServer2 getHttpServer() {

Review comment:
       Done.

##########
File path: hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/http/HttpServer2.java
##########
@@ -243,7 +243,9 @@
 
     private String hostName;
     private boolean disallowFallbackToRandomSignerSecretProvider;
-    private String authFilterConfigurationPrefix = "hadoop.http.authentication.";
+    private final ArrayList<String> authFilterConfigurationPrefixes =
+        new ArrayList<>(Collections.singletonList(
+            "hadoop.http.authentication."));

Review comment:
       HttpServer2 is in hadoop-common, while HttpFSAuthenticationFilter is in hadoop-hdfs-project/hadoop-hdfs-httpfs. The latter depends on hadoop-common :/.

##########
File path: hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/http/HttpServer2.java
##########
@@ -811,18 +822,25 @@ private static SignerSecretProvider constructSecretProvider(final Builder b,
       throws Exception {
     final Configuration conf = b.conf;
     Properties config = getFilterProperties(conf,
-                                            b.authFilterConfigurationPrefix);
+        b.authFilterConfigurationPrefixes);
     return AuthenticationFilter.constructSecretProvider(
         ctx, config, b.disallowFallbackToRandomSignerSecretProvider);
   }
 
-  private static Properties getFilterProperties(Configuration conf, String
-      prefix) {
-    Properties prop = new Properties();
-    Map<String, String> filterConfig = AuthenticationFilterInitializer
-        .getFilterConfigMap(conf, prefix);
-    prop.putAll(filterConfig);
-    return prop;
+  public static Properties getFilterProperties(final Configuration conf, final List<String> prefixes) {

Review comment:
       Ok, I'll change it back and remove the finals from the inner lines too; it is quite short anyway.
   
   I read some debates on this topic: if final is used consistently, then the ones without final catch your eye more easily (e.g. http://www.javapractices.com/topic/TopicAction.do?Id=23).






[GitHub] [hadoop] tomicooler commented on a change in pull request #3209: HDFS-16129. Fixing the signature secret file misusage in HttpFS.

Posted by GitBox <gi...@apache.org>.
tomicooler commented on a change in pull request #3209:
URL: https://github.com/apache/hadoop/pull/3209#discussion_r682568677



##########
File path: hadoop-hdfs-project/hadoop-hdfs-httpfs/src/test/java/org/apache/hadoop/fs/http/server/TestHttpFSServerWebServer.java
##########
@@ -71,53 +82,195 @@ public static void beforeClass() throws Exception {
     System.setProperty("httpfs.home.dir", homeDir.getAbsolutePath());
     System.setProperty("httpfs.log.dir", logsDir.getAbsolutePath());
     System.setProperty("httpfs.config.dir", confDir.getAbsolutePath());
-    FileUtils.writeStringToFile(new File(confDir, "httpfs-signature.secret"),
-        "foo", StandardCharsets.UTF_8);
+    secretFile = new File(System.getProperty("httpfs.config.dir"),
+        "httpfs-signature-custom.secret");
   }
 
-  @Before
-  public void setUp() throws Exception {
-    Configuration conf = new Configuration();
-    conf.set(HttpFSServerWebServer.HTTP_HOSTNAME_KEY, "localhost");
-    conf.setInt(HttpFSServerWebServer.HTTP_PORT_KEY, 0);
-    conf.set(AuthenticationFilter.SIGNATURE_SECRET_FILE,
-        "httpfs-signature.secret");
-    Configuration sslConf = new Configuration();
-    webServer = new HttpFSServerWebServer(conf, sslConf);
+  @After
+  public void teardown() throws Exception {
+    if (webServer != null) {
+      webServer.stop();
+    }
   }
 
   @Test
   public void testStartStop() throws Exception {
+    webServer = createWebServer(createConfigurationWithRandomSecret());
     webServer.start();
-    String user = HadoopUsersConfTestHelper.getHadoopUsers()[0];
-    URL url = new URL(webServer.getUrl(), MessageFormat.format(
-        "/webhdfs/v1/?user.name={0}&op=liststatus", user));
-    HttpURLConnection conn = (HttpURLConnection) url.openConnection();
-    Assert.assertEquals(HttpURLConnection.HTTP_OK, conn.getResponseCode());
-    BufferedReader reader = new BufferedReader(
-        new InputStreamReader(conn.getInputStream()));
-    reader.readLine();
-    reader.close();
     webServer.stop();
   }
 
   @Test
   public void testJustStop() throws Exception {
+    webServer = createWebServer(createConfigurationWithRandomSecret());
     webServer.stop();
   }
 
   @Test
   public void testDoubleStop() throws Exception {
+    webServer = createWebServer(createConfigurationWithRandomSecret());
     webServer.start();
     webServer.stop();
     webServer.stop();
   }
 
   @Test
   public void testDoubleStart() throws Exception {
+    webServer = createWebServer(createConfigurationWithRandomSecret());
+    webServer.start();
+    webServer.start();
+    webServer.stop();
+  }
+
+  @Test
+  public void testServiceWithSecretFile() throws Exception {
+    createSecretFile("foo");
+    webServer = createWebServer(createConfigurationWithSecretFile());
+    webServer.start();
+    assertServiceRespondsWithOK(webServer.getUrl());
+    assertSignerSecretProviderType(webServer.getHttpServer(),
+        FileSignerSecretProvider.class);
+    webServer.stop();
+  }
+
+  @Test
+  public void testServiceWithSecretFileWithDeprecatedConfigOnly()
+      throws Exception {
+    createSecretFile("foo");
+    Configuration conf = createConfiguration();
+    setDeprecatedSecretFile(conf, secretFile.getAbsolutePath());
+    webServer = createWebServer(conf);
+
+    // HttpFSAuthenticationFilter gets the configuration from the webapp, the defaults are loaded
+    // so the custom file location is overwritten with a non-existing default one.
+    Exception exception =
+        Assert.assertThrows(IOException.class, new ThrowingRunnable() {
+          @Override
+          public void run() throws Throwable {
+            webServer.start();
+          }
+        });
+    Assert.assertTrue(exception.getMessage().startsWith(
+        "Unable to initialize"));
+  }
+
+  @Test
+  public void testServiceWithSecretFileWithBothConfigOption() throws Exception {
+    createSecretFile("foo");
+    Configuration conf = createConfigurationWithSecretFile();
+    setDeprecatedSecretFile(conf, secretFile.getAbsolutePath());
+    webServer = createWebServer(conf);
+    webServer.start();
+    assertServiceRespondsWithOK(webServer.getUrl());
+    assertSignerSecretProviderType(webServer.getHttpServer(),
+        FileSignerSecretProvider.class);
+    webServer.stop();
+  }
+
+  @Test
+  public void testServiceWithSecretFileWithBothConfigOptionOverWriteCheck()

Review comment:
       Done.






[GitHub] [hadoop] hadoop-yetus commented on pull request #3209: HDFS-16129. Fixing the signature secret file misusage in HttpFS.

Posted by GitBox <gi...@apache.org>.
hadoop-yetus commented on pull request #3209:
URL: https://github.com/apache/hadoop/pull/3209#issuecomment-891134317


   :broken_heart: **-1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime |  Logfile | Comment |
   |:----:|----------:|--------:|:--------:|:-------:|
   | +0 :ok: |  reexec  |   1m 31s |  |  Docker mode activated.  |
   |||| _ Prechecks _ |
   | +1 :green_heart: |  dupname  |   0m  0s |  |  No case conflicting files found.  |
   | +0 :ok: |  codespell  |   0m  2s |  |  codespell was not available.  |
   | +1 :green_heart: |  @author  |   0m  0s |  |  The patch does not contain any @author tags.  |
   | +1 :green_heart: |  test4tests  |   0m  0s |  |  The patch appears to include 8 new or modified test files.  |
   |||| _ trunk Compile Tests _ |
   | +0 :ok: |  mvndep  |  13m 19s |  |  Maven dependency ordering for branch  |
   | +1 :green_heart: |  mvninstall  |  35m  6s |  |  trunk passed  |
   | -1 :x: |  compile  |   7m 31s | [/branch-compile-root-jdkUbuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3209/5/artifact/out/branch-compile-root-jdkUbuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04.txt) |  root in trunk failed with JDK Ubuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04.  |
   | -1 :x: |  compile  |   0m 32s | [/branch-compile-root-jdkPrivateBuild-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3209/5/artifact/out/branch-compile-root-jdkPrivateBuild-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10.txt) |  root in trunk failed with JDK Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10.  |
   | -0 :warning: |  checkstyle  |   0m 29s | [/buildtool-branch-checkstyle-root.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3209/5/artifact/out/buildtool-branch-checkstyle-root.txt) |  The patch fails to run checkstyle in root  |
   | -1 :x: |  mvnsite  |   0m 32s | [/branch-mvnsite-hadoop-common-project_hadoop-common.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3209/5/artifact/out/branch-mvnsite-hadoop-common-project_hadoop-common.txt) |  hadoop-common in trunk failed.  |
   | -1 :x: |  mvnsite  |   0m 28s | [/branch-mvnsite-hadoop-common-project_hadoop-kms.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3209/5/artifact/out/branch-mvnsite-hadoop-common-project_hadoop-kms.txt) |  hadoop-kms in trunk failed.  |
   | -1 :x: |  mvnsite  |   0m 28s | [/branch-mvnsite-hadoop-hdfs-project_hadoop-hdfs-httpfs.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3209/5/artifact/out/branch-mvnsite-hadoop-hdfs-project_hadoop-hdfs-httpfs.txt) |  hadoop-hdfs-httpfs in trunk failed.  |
   | -1 :x: |  javadoc  |   0m 28s | [/branch-javadoc-hadoop-common-project_hadoop-common-jdkUbuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3209/5/artifact/out/branch-javadoc-hadoop-common-project_hadoop-common-jdkUbuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04.txt) |  hadoop-common in trunk failed with JDK Ubuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04.  |
   | -1 :x: |  javadoc  |   0m 30s | [/branch-javadoc-hadoop-common-project_hadoop-kms-jdkUbuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3209/5/artifact/out/branch-javadoc-hadoop-common-project_hadoop-kms-jdkUbuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04.txt) |  hadoop-kms in trunk failed with JDK Ubuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04.  |
   | -1 :x: |  javadoc  |   0m 30s | [/branch-javadoc-hadoop-hdfs-project_hadoop-hdfs-httpfs-jdkUbuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3209/5/artifact/out/branch-javadoc-hadoop-hdfs-project_hadoop-hdfs-httpfs-jdkUbuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04.txt) |  hadoop-hdfs-httpfs in trunk failed with JDK Ubuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04.  |
   | -1 :x: |  javadoc  |   0m 34s | [/branch-javadoc-hadoop-common-project_hadoop-common-jdkPrivateBuild-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3209/5/artifact/out/branch-javadoc-hadoop-common-project_hadoop-common-jdkPrivateBuild-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10.txt) |  hadoop-common in trunk failed with JDK Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10.  |
   | -1 :x: |  javadoc  |   0m 33s | [/branch-javadoc-hadoop-common-project_hadoop-kms-jdkPrivateBuild-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3209/5/artifact/out/branch-javadoc-hadoop-common-project_hadoop-kms-jdkPrivateBuild-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10.txt) |  hadoop-kms in trunk failed with JDK Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10.  |
   | -1 :x: |  javadoc  |   0m 28s | [/branch-javadoc-hadoop-hdfs-project_hadoop-hdfs-httpfs-jdkPrivateBuild-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3209/5/artifact/out/branch-javadoc-hadoop-hdfs-project_hadoop-hdfs-httpfs-jdkPrivateBuild-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10.txt) |  hadoop-hdfs-httpfs in trunk failed with JDK Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10.  |
   | -1 :x: |  spotbugs  |   0m 30s | [/branch-spotbugs-hadoop-common-project_hadoop-common.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3209/5/artifact/out/branch-spotbugs-hadoop-common-project_hadoop-common.txt) |  hadoop-common in trunk failed.  |
   | -1 :x: |  spotbugs  |   0m 31s | [/branch-spotbugs-hadoop-common-project_hadoop-kms.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3209/5/artifact/out/branch-spotbugs-hadoop-common-project_hadoop-kms.txt) |  hadoop-kms in trunk failed.  |
   | -1 :x: |  spotbugs  |   0m 30s | [/branch-spotbugs-hadoop-hdfs-project_hadoop-hdfs-httpfs.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3209/5/artifact/out/branch-spotbugs-hadoop-hdfs-project_hadoop-hdfs-httpfs.txt) |  hadoop-hdfs-httpfs in trunk failed.  |
   | +1 :green_heart: |  shadedclient  |   7m 19s |  |  branch has no errors when building and testing our client artifacts.  |
   |||| _ Patch Compile Tests _ |
   | +0 :ok: |  mvndep  |   0m 20s |  |  Maven dependency ordering for patch  |
   | -1 :x: |  mvninstall  |   0m 24s | [/patch-mvninstall-hadoop-common-project_hadoop-common.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3209/5/artifact/out/patch-mvninstall-hadoop-common-project_hadoop-common.txt) |  hadoop-common in the patch failed.  |
   | -1 :x: |  mvninstall  |   0m 23s | [/patch-mvninstall-hadoop-common-project_hadoop-kms.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3209/5/artifact/out/patch-mvninstall-hadoop-common-project_hadoop-kms.txt) |  hadoop-kms in the patch failed.  |
   | -1 :x: |  mvninstall  |   0m 24s | [/patch-mvninstall-hadoop-hdfs-project_hadoop-hdfs-httpfs.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3209/5/artifact/out/patch-mvninstall-hadoop-hdfs-project_hadoop-hdfs-httpfs.txt) |  hadoop-hdfs-httpfs in the patch failed.  |
   | -1 :x: |  compile  |  18m  8s | [/patch-compile-root-jdkUbuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3209/5/artifact/out/patch-compile-root-jdkUbuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04.txt) |  root in the patch failed with JDK Ubuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04.  |
   | -1 :x: |  javac  |  18m  8s | [/patch-compile-root-jdkUbuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3209/5/artifact/out/patch-compile-root-jdkUbuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04.txt) |  root in the patch failed with JDK Ubuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04.  |
   | +1 :green_heart: |  compile  |  25m 48s |  |  the patch passed with JDK Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10  |
   | -1 :x: |  javac  |  25m 48s | [/results-compile-javac-root-jdkPrivateBuild-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3209/5/artifact/out/results-compile-javac-root-jdkPrivateBuild-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10.txt) |  root-jdkPrivateBuild-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10 with JDK Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10 generated 1788 new + 0 unchanged - 0 fixed = 1788 total (was 0)  |
   | +1 :green_heart: |  blanks  |   0m  0s |  |  The patch has no blanks issues.  |
   | -0 :warning: |  checkstyle  |   4m  2s | [/results-checkstyle-root.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3209/5/artifact/out/results-checkstyle-root.txt) |  root: The patch generated 97 new + 0 unchanged - 0 fixed = 97 total (was 0)  |
   | -1 :x: |  mvnsite  |   0m 30s | [/patch-mvnsite-hadoop-common-project_hadoop-kms.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3209/5/artifact/out/patch-mvnsite-hadoop-common-project_hadoop-kms.txt) |  hadoop-kms in the patch failed.  |
   | -1 :x: |  mvnsite  |   0m 30s | [/patch-mvnsite-hadoop-hdfs-project_hadoop-hdfs-httpfs.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3209/5/artifact/out/patch-mvnsite-hadoop-hdfs-project_hadoop-hdfs-httpfs.txt) |  hadoop-hdfs-httpfs in the patch failed.  |
   | +1 :green_heart: |  xml  |   0m  2s |  |  The patch has no ill-formed XML file.  |
   | -1 :x: |  javadoc  |   1m  3s | [/results-javadoc-javadoc-hadoop-common-project_hadoop-common-jdkUbuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3209/5/artifact/out/results-javadoc-javadoc-hadoop-common-project_hadoop-common-jdkUbuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04.txt) |  hadoop-common-project_hadoop-common-jdkUbuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04 with JDK Ubuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04 generated 99 new + 0 unchanged - 0 fixed = 99 total (was 0)  |
   | -1 :x: |  javadoc  |   0m 34s | [/results-javadoc-javadoc-hadoop-hdfs-project_hadoop-hdfs-httpfs-jdkUbuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3209/5/artifact/out/results-javadoc-javadoc-hadoop-hdfs-project_hadoop-hdfs-httpfs-jdkUbuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04.txt) |  hadoop-hdfs-project_hadoop-hdfs-httpfs-jdkUbuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04 with JDK Ubuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04 generated 55 new + 0 unchanged - 0 fixed = 55 total (was 0)  |
   | -1 :x: |  javadoc  |   0m 32s | [/results-javadoc-javadoc-hadoop-hdfs-project_hadoop-hdfs-httpfs-jdkPrivateBuild-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3209/5/artifact/out/results-javadoc-javadoc-hadoop-hdfs-project_hadoop-hdfs-httpfs-jdkPrivateBuild-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10.txt) |  hadoop-hdfs-project_hadoop-hdfs-httpfs-jdkPrivateBuild-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10 with JDK Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10 generated 55 new + 0 unchanged - 0 fixed = 55 total (was 0)  |
   | -1 :x: |  spotbugs  |   0m 28s | [/patch-spotbugs-hadoop-common-project_hadoop-kms.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3209/5/artifact/out/patch-spotbugs-hadoop-common-project_hadoop-kms.txt) |  hadoop-kms in the patch failed.  |
   | -1 :x: |  spotbugs  |   0m 30s | [/patch-spotbugs-hadoop-hdfs-project_hadoop-hdfs-httpfs.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3209/5/artifact/out/patch-spotbugs-hadoop-hdfs-project_hadoop-hdfs-httpfs.txt) |  hadoop-hdfs-httpfs in the patch failed.  |
   | +1 :green_heart: |  shadedclient  |  18m 51s |  |  patch has no errors when building and testing our client artifacts.  |
   |||| _ Other Tests _ |
   | +1 :green_heart: |  unit  |  17m 36s |  |  hadoop-common in the patch passed.  |
   | -1 :x: |  unit  |   0m 31s | [/patch-unit-hadoop-common-project_hadoop-kms.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3209/5/artifact/out/patch-unit-hadoop-common-project_hadoop-kms.txt) |  hadoop-kms in the patch failed.  |
   | -1 :x: |  unit  |   0m 33s | [/patch-unit-hadoop-hdfs-project_hadoop-hdfs-httpfs.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3209/5/artifact/out/patch-unit-hadoop-hdfs-project_hadoop-hdfs-httpfs.txt) |  hadoop-hdfs-httpfs in the patch failed.  |
   | +1 :green_heart: |  asflicense  |   0m 49s |  |  The patch does not generate ASF License warnings.  |
   |  |   | 166m 47s |  |  |
   
   
   | Subsystem | Report/Notes |
   |----------:|:-------------|
   | Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3209/5/artifact/out/Dockerfile |
   | GITHUB PR | https://github.com/apache/hadoop/pull/3209 |
   | Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell xml |
   | uname | Linux 519c683cfdd8 4.15.0-128-generic #131-Ubuntu SMP Wed Dec 9 06:57:35 UTC 2020 x86_64 x86_64 x86_64 GNU/Linux |
   | Build tool | maven |
   | Personality | dev-support/bin/hadoop.sh |
   | git revision | trunk / 47d9b69ccb9ef077d0da7ed94c225a9403deac97 |
   | Default Java | Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10 |
   | Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10 |
   |  Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3209/5/testReport/ |
   | Max. process+thread count | 1664 (vs. ulimit of 5500) |
   | modules | C: hadoop-common-project/hadoop-common hadoop-common-project/hadoop-kms hadoop-hdfs-project/hadoop-hdfs-httpfs U: . |
   | Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3209/5/console |
   | versions | git=2.25.1 maven=3.6.3 |
   | Powered by | Apache Yetus 0.14.0-SNAPSHOT https://yetus.apache.org |
   
   
   This message was automatically generated.
   
   




[GitHub] [hadoop] brumi1024 commented on a change in pull request #3209: HDFS-16129. Fixing the signature secret file misusage in HttpFS.

Posted by GitBox <gi...@apache.org>.
brumi1024 commented on a change in pull request #3209:
URL: https://github.com/apache/hadoop/pull/3209#discussion_r682480203



##########
File path: hadoop-hdfs-project/hadoop-hdfs-httpfs/src/main/java/org/apache/hadoop/fs/http/server/HttpFSAuthenticationFilter.java
##########
@@ -69,27 +72,11 @@
   @Override
   protected Properties getConfiguration(String configPrefix,
       FilterConfig filterConfig) throws ServletException{
-    Properties props = new Properties();
+    System.out.println("getConfiguration1");
     Configuration conf = HttpFSServerWebApp.get().getConfig();
-
-    props.setProperty(AuthenticationFilter.COOKIE_PATH, "/");
-    for (Map.Entry<String, String> entry : conf) {
-      String name = entry.getKey();
-      if (name.startsWith(HADOOP_HTTP_CONF_PREFIX)) {
-        name = name.substring(HADOOP_HTTP_CONF_PREFIX.length());
-        props.setProperty(name, entry.getValue());
-      }
-    }
-
-    // Replace Hadoop Http Authentication Configs with HttpFS specific Configs
-    for (Map.Entry<String, String> entry : conf) {
-      String name = entry.getKey();
-      if (name.startsWith(CONF_PREFIX)) {
-        String value = conf.get(name);
-        name = name.substring(CONF_PREFIX.length());
-        props.setProperty(name, value);
-      }
-    }
+    Properties props = HttpServer2.getFilterProperties(conf,
+        new ArrayList<>(Arrays.asList(CONF_PREFIXES)));
+    System.out.println("getConfiguration2");

Review comment:
       Nit: I think this could be removed.

##########
File path: hadoop-hdfs-project/hadoop-hdfs-httpfs/src/main/java/org/apache/hadoop/fs/http/server/HttpFSAuthenticationFilter.java
##########
@@ -98,6 +85,7 @@ protected Properties getConfiguration(String configPrefix,
     }
 
     if (!isRandomSecret(filterConfig)) {
+      System.out.println("FILE: " + signatureSecretFile);

Review comment:
       Nit: I think this could be removed.

##########
File path: hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/http/HttpServer2.java
##########
@@ -243,7 +243,9 @@
 
     private String hostName;
     private boolean disallowFallbackToRandomSignerSecretProvider;
-    private String authFilterConfigurationPrefix = "hadoop.http.authentication.";
+    private final ArrayList<String> authFilterConfigurationPrefixes =
+        new ArrayList<>(Collections.singletonList(
+            "hadoop.http.authentication."));

Review comment:
       Nit: can you use a common final variable here as well?

##########
File path: hadoop-hdfs-project/hadoop-hdfs-httpfs/src/main/java/org/apache/hadoop/fs/http/server/HttpFSAuthenticationFilter.java
##########
@@ -69,27 +72,11 @@
   @Override
   protected Properties getConfiguration(String configPrefix,
       FilterConfig filterConfig) throws ServletException{
-    Properties props = new Properties();
+    System.out.println("getConfiguration1");

Review comment:
       Nit: I think this could be removed.

##########
File path: hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/http/HttpServer2.java
##########
@@ -811,18 +822,26 @@ private static SignerSecretProvider constructSecretProvider(final Builder b,
       throws Exception {
     final Configuration conf = b.conf;
     Properties config = getFilterProperties(conf,
-                                            b.authFilterConfigurationPrefix);
+        b.authFilterConfigurationPrefixes);
     return AuthenticationFilter.constructSecretProvider(
         ctx, config, b.disallowFallbackToRandomSignerSecretProvider);
   }
 
-  private static Properties getFilterProperties(Configuration conf, String
-      prefix) {
-    Properties prop = new Properties();
-    Map<String, String> filterConfig = AuthenticationFilterInitializer
-        .getFilterConfigMap(conf, prefix);
-    prop.putAll(filterConfig);
-    return prop;
+  public static Properties getFilterProperties(Configuration conf,
+                                                ArrayList<String> prefixes) {
+    Properties props = new Properties();
+    prefixes.forEach(prefix -> {
+      Map<String, String> filterConfigMap =
+          AuthenticationFilterInitializer.getFilterConfigMap(conf, prefix);
+      filterConfigMap.forEach((key, value) -> {
+        Object previous = props.setProperty(key, value);
+        if (previous != null && !previous.equals(value)) {
+          LOG.warn("Overwriting configuration for key='{}' with value='{}' " +
+              "previous_value='{}'", key, value, previous);
+        }
+      });
+    });

Review comment:
    I generally don't have anything against using lambdas, but this doesn't seem like an improvement over a simple for or for-each loop. Performance-wise it's similar (and doesn't matter much in this case), but I think this version is a bit harder to read and to debug (because of the unusual stack trace that comes from the lambda), and if someone ever needs to throw a checked exception while parsing the config, they will have to rewrite it as a for loop anyway.
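
For reference, a minimal sketch of the loop-based variant being suggested (same signature and logging as the quoted patch; only the lambdas are replaced, so this slots into HttpServer2 as-is):

```java
public static Properties getFilterProperties(Configuration conf,
                                             List<String> prefixes) {
  Properties props = new Properties();
  for (String prefix : prefixes) {
    Map<String, String> filterConfigMap =
        AuthenticationFilterInitializer.getFilterConfigMap(conf, prefix);
    for (Map.Entry<String, String> entry : filterConfigMap.entrySet()) {
      // Later prefixes override earlier ones; warn when that happens.
      Object previous = props.setProperty(entry.getKey(), entry.getValue());
      if (previous != null && !previous.equals(entry.getValue())) {
        LOG.warn("Overwriting configuration for key='{}' with value='{}' "
            + "previous value='{}'", entry.getKey(), entry.getValue(),
            previous);
      }
    }
  }
  return props;
}
```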

##########
File path: hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/http/HttpServer2.java
##########
@@ -243,7 +243,9 @@
 
     private String hostName;
     private boolean disallowFallbackToRandomSignerSecretProvider;
-    private String authFilterConfigurationPrefix = "hadoop.http.authentication.";
+    private final ArrayList<String> authFilterConfigurationPrefixes =
+        new ArrayList<>(Collections.singletonList(
+            "hadoop.http.authentication."));

Review comment:
       I meant using the HttpFSAuthenticationFilter.HADOOP_HTTP_CONF_PREFIX instead of the literal :) 

##########
File path: hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/http/HttpServer2.java
##########
@@ -811,18 +822,25 @@ private static SignerSecretProvider constructSecretProvider(final Builder b,
       throws Exception {
     final Configuration conf = b.conf;
     Properties config = getFilterProperties(conf,
-                                            b.authFilterConfigurationPrefix);
+        b.authFilterConfigurationPrefixes);
     return AuthenticationFilter.constructSecretProvider(
         ctx, config, b.disallowFallbackToRandomSignerSecretProvider);
   }
 
-  private static Properties getFilterProperties(Configuration conf, String
-      prefix) {
-    Properties prop = new Properties();
-    Map<String, String> filterConfig = AuthenticationFilterInitializer
-        .getFilterConfigMap(conf, prefix);
-    prop.putAll(filterConfig);
-    return prop;
+  public static Properties getFilterProperties(final Configuration conf, final List<String> prefixes) {

Review comment:
       Another nit: I know there are two "schools" about using final, but since this class doesn't use it excessively I think it shouldn't be changed in this simple method. This many finals seems more like a noise to me, and - as you've said it in another comment - it won't state that the collections themselves won't be modified just that the reference won't be reassigned.








[GitHub] [hadoop] shuzirra merged pull request #3209: HDFS-16129. Fixing the signature secret file misusage in HttpFS.

Posted by GitBox <gi...@apache.org>.
shuzirra merged pull request #3209:
URL: https://github.com/apache/hadoop/pull/3209


   




[GitHub] [hadoop] tomicooler commented on a change in pull request #3209: HDFS-16129. Fixing the signature secret file misusage in HttpFS.

Posted by GitBox <gi...@apache.org>.
tomicooler commented on a change in pull request #3209:
URL: https://github.com/apache/hadoop/pull/3209#discussion_r682674656



##########
File path: hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/http/HttpServer2.java
##########
@@ -243,7 +243,9 @@
 
     private String hostName;
     private boolean disallowFallbackToRandomSignerSecretProvider;
-    private String authFilterConfigurationPrefix = "hadoop.http.authentication.";
+    private final ArrayList<String> authFilterConfigurationPrefixes =
+        new ArrayList<>(Collections.singletonList(
+            "hadoop.http.authentication."));

Review comment:
    HttpServer2 is in hadoop-common, while HttpFSAuthenticationFilter is in hadoop-hdfs-project/hadoop-hdfs-httpfs. The latter depends on hadoop-common, so HttpServer2 cannot reference that constant :/.






[GitHub] [hadoop] szilard-nemeth commented on a change in pull request #3209: HDFS-16129. Fixing the signature secret file misusage in HttpFS.

Posted by GitBox <gi...@apache.org>.
szilard-nemeth commented on a change in pull request #3209:
URL: https://github.com/apache/hadoop/pull/3209#discussion_r682637027



##########
File path: hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/http/HttpServer2.java
##########
@@ -473,8 +482,10 @@ public HttpServer2 build() throws IOException {
       HttpServer2 server = new HttpServer2(this);
 
       if (this.securityEnabled &&
-          !this.conf.get(authFilterConfigurationPrefix + "type").
-          equals(PseudoAuthenticationHandler.TYPE)) {
+          authFilterConfigurationPrefixes.stream().noneMatch(
+              prefix -> (prefix + "type")
+                  .equals(PseudoAuthenticationHandler.TYPE))
+      ) {

Review comment:
    Follow-up question, then: how come this mistake was not caught by any of the tests?






[GitHub] [hadoop] tomicooler commented on a change in pull request #3209: HDFS-16129. Fixing the signature secret file misusage in HttpFS.

Posted by GitBox <gi...@apache.org>.
tomicooler commented on a change in pull request #3209:
URL: https://github.com/apache/hadoop/pull/3209#discussion_r682661960



##########
File path: hadoop-hdfs-project/hadoop-hdfs-httpfs/src/main/java/org/apache/hadoop/fs/http/server/HttpFSAuthenticationFilter.java
##########
@@ -69,27 +72,11 @@
   @Override
   protected Properties getConfiguration(String configPrefix,
       FilterConfig filterConfig) throws ServletException{
-    Properties props = new Properties();
+    System.out.println("getConfiguration1");
     Configuration conf = HttpFSServerWebApp.get().getConfig();
-
-    props.setProperty(AuthenticationFilter.COOKIE_PATH, "/");

Review comment:
       Yes, the HttpServer2.getFilterProperties() will end up doing this.
   
    BTW the order of the overwriting has changed, as I mentioned in the commit message. Also, line 80 here contained an error: calling entry.getValue() did not substitute the variable, but the conf.get(name) call on line 88 did.
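
For context, a minimal, self-contained illustration of the substitution difference described above (the property names are made up for the example): iterating a Hadoop Configuration yields raw values, while conf.get() expands ${...} references.

```java
import java.util.Map;
import org.apache.hadoop.conf.Configuration;

public class SubstitutionDemo {
  public static void main(String[] args) {
    Configuration conf = new Configuration(false);
    conf.set("my.base", "/tmp");
    conf.set("my.path", "${my.base}/secret");

    for (Map.Entry<String, String> entry : conf) {
      if ("my.path".equals(entry.getKey())) {
        // The iterator exposes the raw value: "${my.base}/secret"
        System.out.println(entry.getValue());
      }
    }
    // get() expands the variable: "/tmp/secret"
    System.out.println(conf.get("my.path"));
  }
}
```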






[GitHub] [hadoop] tomicooler commented on a change in pull request #3209: HDFS-16129. Fixing the signature secret file misusage in HttpFS.

Posted by GitBox <gi...@apache.org>.
tomicooler commented on a change in pull request #3209:
URL: https://github.com/apache/hadoop/pull/3209#discussion_r682577222



##########
File path: hadoop-hdfs-project/hadoop-hdfs-httpfs/src/test/java/org/apache/hadoop/fs/http/server/TestHttpFSServerWebServer.java
##########
@@ -71,53 +82,195 @@ public static void beforeClass() throws Exception {
     System.setProperty("httpfs.home.dir", homeDir.getAbsolutePath());
     System.setProperty("httpfs.log.dir", logsDir.getAbsolutePath());
     System.setProperty("httpfs.config.dir", confDir.getAbsolutePath());
-    FileUtils.writeStringToFile(new File(confDir, "httpfs-signature.secret"),
-        "foo", StandardCharsets.UTF_8);
+    secretFile = new File(System.getProperty("httpfs.config.dir"),
+        "httpfs-signature-custom.secret");
   }
 
-  @Before
-  public void setUp() throws Exception {
-    Configuration conf = new Configuration();
-    conf.set(HttpFSServerWebServer.HTTP_HOSTNAME_KEY, "localhost");
-    conf.setInt(HttpFSServerWebServer.HTTP_PORT_KEY, 0);
-    conf.set(AuthenticationFilter.SIGNATURE_SECRET_FILE,
-        "httpfs-signature.secret");
-    Configuration sslConf = new Configuration();
-    webServer = new HttpFSServerWebServer(conf, sslConf);
+  @After
+  public void teardown() throws Exception {
+    if (webServer != null) {
+      webServer.stop();
+    }
   }
 
   @Test
   public void testStartStop() throws Exception {
+    webServer = createWebServer(createConfigurationWithRandomSecret());
     webServer.start();
-    String user = HadoopUsersConfTestHelper.getHadoopUsers()[0];
-    URL url = new URL(webServer.getUrl(), MessageFormat.format(
-        "/webhdfs/v1/?user.name={0}&op=liststatus", user));
-    HttpURLConnection conn = (HttpURLConnection) url.openConnection();
-    Assert.assertEquals(HttpURLConnection.HTTP_OK, conn.getResponseCode());
-    BufferedReader reader = new BufferedReader(
-        new InputStreamReader(conn.getInputStream()));
-    reader.readLine();
-    reader.close();
     webServer.stop();
   }
 
   @Test
   public void testJustStop() throws Exception {
+    webServer = createWebServer(createConfigurationWithRandomSecret());
     webServer.stop();
   }
 
   @Test
   public void testDoubleStop() throws Exception {
+    webServer = createWebServer(createConfigurationWithRandomSecret());
     webServer.start();
     webServer.stop();
     webServer.stop();
   }
 
   @Test
   public void testDoubleStart() throws Exception {
+    webServer = createWebServer(createConfigurationWithRandomSecret());
+    webServer.start();
+    webServer.start();
+    webServer.stop();
+  }
+
+  @Test
+  public void testServiceWithSecretFile() throws Exception {
+    createSecretFile("foo");
+    webServer = createWebServer(createConfigurationWithSecretFile());
+    webServer.start();
+    assertServiceRespondsWithOK(webServer.getUrl());
+    assertSignerSecretProviderType(webServer.getHttpServer(),
+        FileSignerSecretProvider.class);
+    webServer.stop();
+  }
+
+  @Test
+  public void testServiceWithSecretFileWithDeprecatedConfigOnly()
+      throws Exception {
+    createSecretFile("foo");
+    Configuration conf = createConfiguration();
+    setDeprecatedSecretFile(conf, secretFile.getAbsolutePath());
+    webServer = createWebServer(conf);
+
+    // HttpFSAuthenticationFilter gets the configuration from the webapp, the defaults are loaded
+    // so the custom file location is overwritten with a non-existing default one.
+    Exception exception =
+        Assert.assertThrows(IOException.class, new ThrowingRunnable() {
+          @Override
+          public void run() throws Throwable {
+            webServer.start();
+          }
+        });
+    Assert.assertTrue(exception.getMessage().startsWith(
+        "Unable to initialize"));
+  }
+
+  @Test
+  public void testServiceWithSecretFileWithBothConfigOption() throws Exception {
+    createSecretFile("foo");
+    Configuration conf = createConfigurationWithSecretFile();
+    setDeprecatedSecretFile(conf, secretFile.getAbsolutePath());
+    webServer = createWebServer(conf);
+    webServer.start();
+    assertServiceRespondsWithOK(webServer.getUrl());
+    assertSignerSecretProviderType(webServer.getHttpServer(),
+        FileSignerSecretProvider.class);
+    webServer.stop();
+  }
+
+  @Test
+  public void testServiceWithSecretFileWithBothConfigOptionOverWriteCheck()
+      throws Exception {
+    createSecretFile("foo");
+    Configuration conf = createConfigurationWithSecretFile();
+    setDeprecatedSecretFile(conf, "will-be-overwritten");
+    webServer = createWebServer(conf);
+    webServer.start();
+    assertServiceRespondsWithOK(webServer.getUrl());
+    assertSignerSecretProviderType(webServer.getHttpServer(),
+        FileSignerSecretProvider.class);
+    webServer.stop();
+  }
+
+  @Test
+  public void testServiceWithMissingSecretFile() throws Exception {
+    webServer = createWebServer(createConfigurationWithSecretFile());
     webServer.start();
+    assertServiceRespondsWithOK(webServer.getUrl());
+    assertSignerSecretProviderType(webServer.getHttpServer(),
+        RandomSignerSecretProvider.class);
+    webServer.stop();
+  }
+
+  @Test
+  public void testServiceWithEmptySecretFile() throws Exception {
+    // The AuthenticationFilter.constructSecretProvider will do the fallback
+    // to the random secrets not the HttpFSAuthenticationFilter.
+    createSecretFile("");
+    webServer = createWebServer(createConfigurationWithSecretFile());
     webServer.start();
+    assertServiceRespondsWithOK(webServer.getUrl());
+    assertSignerSecretProviderType(webServer.getHttpServer(),
+        RandomSignerSecretProvider.class);
     webServer.stop();
   }
 
+  private <T extends SignerSecretProvider> void assertSignerSecretProviderType(
+      HttpServer2 server, Class<T> expected) {
+    SignerSecretProvider secretProvider = (SignerSecretProvider)
+        server.getWebAppContext().getServletContext()
+            .getAttribute(SIGNER_SECRET_PROVIDER_ATTRIBUTE);
+    Assert.assertNotNull(secretProvider);
+    Assert.assertEquals(expected, secretProvider.getClass());

Review comment:
       Done.






[GitHub] [hadoop] szilard-nemeth commented on a change in pull request #3209: HDFS-16129. Fixing the signature secret file misusage in HttpFS.

Posted by GitBox <gi...@apache.org>.
szilard-nemeth commented on a change in pull request #3209:
URL: https://github.com/apache/hadoop/pull/3209#discussion_r682491876



##########
File path: hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/http/HttpServer2.java
##########
@@ -473,8 +482,10 @@ public HttpServer2 build() throws IOException {
       HttpServer2 server = new HttpServer2(this);
 
       if (this.securityEnabled &&
-          !this.conf.get(authFilterConfigurationPrefix + "type").
-          equals(PseudoAuthenticationHandler.TYPE)) {
+          authFilterConfigurationPrefixes.stream().noneMatch(
+              prefix -> (prefix + "type")
+                  .equals(PseudoAuthenticationHandler.TYPE))
+      ) {

Review comment:
    Maybe I'm missing something but this is not doing the same as the old code block.
    The old code looked up the value of the prefix + "type" key in the configuration and compared that value to PseudoAuthenticationHandler.TYPE.
    With the new code, you are comparing the key string itself (prefix + "type") against PseudoAuthenticationHandler.TYPE, so the configuration is never consulted.
    Is this intentional?
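
A sketch of a value-based check that would mirror the old behavior across multiple prefixes (one plausible generalization, reusing the fields from the quoted hunk; the body is elided):

```java
if (this.securityEnabled &&
    authFilterConfigurationPrefixes.stream()
        // Read the configured auth type for each prefix...
        .map(prefix -> this.conf.get(prefix + "type"))
        // ...and require that none of them is the pseudo handler.
        // (TYPE.equals(value) is also null-safe, unlike the old code.)
        .noneMatch(PseudoAuthenticationHandler.TYPE::equals)) {
  // ... existing security setup ...
}
```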

##########
File path: hadoop-hdfs-project/hadoop-hdfs-httpfs/src/test/java/org/apache/hadoop/fs/http/server/TestHttpFSServerWebServer.java
##########
@@ -71,53 +82,195 @@ public static void beforeClass() throws Exception {
     System.setProperty("httpfs.home.dir", homeDir.getAbsolutePath());
     System.setProperty("httpfs.log.dir", logsDir.getAbsolutePath());
     System.setProperty("httpfs.config.dir", confDir.getAbsolutePath());
-    FileUtils.writeStringToFile(new File(confDir, "httpfs-signature.secret"),
-        "foo", StandardCharsets.UTF_8);
+    secretFile = new File(System.getProperty("httpfs.config.dir"),
+        "httpfs-signature-custom.secret");
   }
 
-  @Before
-  public void setUp() throws Exception {
-    Configuration conf = new Configuration();
-    conf.set(HttpFSServerWebServer.HTTP_HOSTNAME_KEY, "localhost");
-    conf.setInt(HttpFSServerWebServer.HTTP_PORT_KEY, 0);
-    conf.set(AuthenticationFilter.SIGNATURE_SECRET_FILE,
-        "httpfs-signature.secret");
-    Configuration sslConf = new Configuration();
-    webServer = new HttpFSServerWebServer(conf, sslConf);
+  @After
+  public void teardown() throws Exception {
+    if (webServer != null) {
+      webServer.stop();
+    }
   }
 
   @Test
   public void testStartStop() throws Exception {
+    webServer = createWebServer(createConfigurationWithRandomSecret());
     webServer.start();
-    String user = HadoopUsersConfTestHelper.getHadoopUsers()[0];
-    URL url = new URL(webServer.getUrl(), MessageFormat.format(
-        "/webhdfs/v1/?user.name={0}&op=liststatus", user));
-    HttpURLConnection conn = (HttpURLConnection) url.openConnection();
-    Assert.assertEquals(HttpURLConnection.HTTP_OK, conn.getResponseCode());
-    BufferedReader reader = new BufferedReader(
-        new InputStreamReader(conn.getInputStream()));
-    reader.readLine();
-    reader.close();

Review comment:
    I don't get why this code block was deleted.
    Also, I don't see a call to assertServiceRespondsWithOK here either.
    Please elaborate.

##########
File path: hadoop-hdfs-project/hadoop-hdfs-httpfs/src/main/java/org/apache/hadoop/fs/http/server/HttpFSAuthenticationFilter.java
##########
@@ -69,27 +72,11 @@
   @Override
   protected Properties getConfiguration(String configPrefix,
       FilterConfig filterConfig) throws ServletException{
-    Properties props = new Properties();
+    System.out.println("getConfiguration1");
     Configuration conf = HttpFSServerWebApp.get().getConfig();
-
-    props.setProperty(AuthenticationFilter.COOKIE_PATH, "/");

Review comment:
       Have you intentionally deleted this line?
   ```
   props.setProperty(AuthenticationFilter.COOKIE_PATH, "/");
   ```

##########
File path: hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/http/HttpServer2.java
##########
@@ -811,18 +822,26 @@ private static SignerSecretProvider constructSecretProvider(final Builder b,
       throws Exception {
     final Configuration conf = b.conf;
     Properties config = getFilterProperties(conf,
-                                            b.authFilterConfigurationPrefix);
+        b.authFilterConfigurationPrefixes);
     return AuthenticationFilter.constructSecretProvider(
         ctx, config, b.disallowFallbackToRandomSignerSecretProvider);
   }
 
-  private static Properties getFilterProperties(Configuration conf, String
-      prefix) {
-    Properties prop = new Properties();
-    Map<String, String> filterConfig = AuthenticationFilterInitializer
-        .getFilterConfigMap(conf, prefix);
-    prop.putAll(filterConfig);
-    return prop;
+  public static Properties getFilterProperties(Configuration conf,
+                                                ArrayList<String> prefixes) {

Review comment:
    Same as above: please use List instead of ArrayList (interface types over concrete types).

##########
File path: hadoop-hdfs-project/hadoop-hdfs-httpfs/src/test/java/org/apache/hadoop/fs/http/client/TestHttpFSFileSystemLocalFileSystem.java
##########
@@ -32,8 +32,6 @@
 import org.junit.runners.Parameterized;
 
 import java.io.File;
-import java.net.URI;

Review comment:
    These removals seem unrelated to the patch; please revert them if they are not crucial.

##########
File path: hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/http/HttpServer2.java
##########
@@ -811,18 +822,26 @@ private static SignerSecretProvider constructSecretProvider(final Builder b,
       throws Exception {
     final Configuration conf = b.conf;
     Properties config = getFilterProperties(conf,
-                                            b.authFilterConfigurationPrefix);
+        b.authFilterConfigurationPrefixes);
     return AuthenticationFilter.constructSecretProvider(
         ctx, config, b.disallowFallbackToRandomSignerSecretProvider);
   }
 
-  private static Properties getFilterProperties(Configuration conf, String
-      prefix) {
-    Properties prop = new Properties();
-    Map<String, String> filterConfig = AuthenticationFilterInitializer
-        .getFilterConfigMap(conf, prefix);
-    prop.putAll(filterConfig);
-    return prop;
+  public static Properties getFilterProperties(Configuration conf,
+                                                ArrayList<String> prefixes) {
+    Properties props = new Properties();
+    prefixes.forEach(prefix -> {
+      Map<String, String> filterConfigMap =
+          AuthenticationFilterInitializer.getFilterConfigMap(conf, prefix);
+      filterConfigMap.forEach((key, value) -> {
+        Object previous = props.setProperty(key, value);
+        if (previous != null && !previous.equals(value)) {
+          LOG.warn("Overwriting configuration for key='{}' with value='{}' " +
+              "previous_value='{}'", key, value, previous);

Review comment:
       ```suggestion
                 "previous value='{}'", key, value, previous);
   ```

##########
File path: hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/http/HttpServer2.java
##########
@@ -243,7 +243,9 @@
 
     private String hostName;
     private boolean disallowFallbackToRandomSignerSecretProvider;
-    private String authFilterConfigurationPrefix = "hadoop.http.authentication.";
+    private final ArrayList<String> authFilterConfigurationPrefixes =

Review comment:
    Use List instead of ArrayList. It's better practice to declare fields / local variables with interface types, see: https://stackoverflow.com/questions/9852831/polymorphism-why-use-list-list-new-arraylist-instead-of-arraylist-list-n
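
The usual idiom, applied to the quoted field:

```java
// Declare against the interface; the concrete class stays an implementation detail.
private final List<String> authFilterConfigurationPrefixes =
    new ArrayList<>(Collections.singletonList("hadoop.http.authentication."));
```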

##########
File path: hadoop-hdfs-project/hadoop-hdfs-httpfs/src/main/java/org/apache/hadoop/fs/http/server/HttpFSServerWebServer.java
##########
@@ -178,6 +179,10 @@ public URL getUrl() {
     }
   }
 
+  public HttpServer2 getHttpServer() {

Review comment:
    Was this only added for testing? If so, please add a @VisibleForTesting annotation to the method.
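
For example (the httpServer field name is assumed for this sketch; at the time of this PR the annotation comes from Guava):

```java
// import com.google.common.annotations.VisibleForTesting;

@VisibleForTesting
public HttpServer2 getHttpServer() {
  return httpServer; // field name assumed for this sketch
}
```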

##########
File path: hadoop-hdfs-project/hadoop-hdfs-httpfs/src/main/java/org/apache/hadoop/fs/http/server/HttpFSAuthenticationFilter.java
##########
@@ -69,27 +72,11 @@
   @Override
   protected Properties getConfiguration(String configPrefix,
       FilterConfig filterConfig) throws ServletException{
-    Properties props = new Properties();
+    System.out.println("getConfiguration1");

Review comment:
       Please get rid of all System.out.println calls or alternatively, use the Logger.info method.
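
A minimal sketch of the logger-based alternative (standard SLF4J usage; the message text is illustrative):

```java
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

public class HttpFSAuthenticationFilter /* extends AuthenticationFilter */ {
  private static final Logger LOG =
      LoggerFactory.getLogger(HttpFSAuthenticationFilter.class);

  // Instead of System.out.println("getConfiguration1"), use e.g.:
  //   LOG.debug("Building HttpFS authentication filter configuration");
}
```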

##########
File path: hadoop-hdfs-project/hadoop-hdfs-httpfs/src/test/java/org/apache/hadoop/fs/http/server/TestHttpFSServerWebServer.java
##########
@@ -71,53 +82,195 @@ public static void beforeClass() throws Exception {
     System.setProperty("httpfs.home.dir", homeDir.getAbsolutePath());
     System.setProperty("httpfs.log.dir", logsDir.getAbsolutePath());
     System.setProperty("httpfs.config.dir", confDir.getAbsolutePath());
-    FileUtils.writeStringToFile(new File(confDir, "httpfs-signature.secret"),
-        "foo", StandardCharsets.UTF_8);
+    secretFile = new File(System.getProperty("httpfs.config.dir"),
+        "httpfs-signature-custom.secret");
   }
 
-  @Before
-  public void setUp() throws Exception {
-    Configuration conf = new Configuration();
-    conf.set(HttpFSServerWebServer.HTTP_HOSTNAME_KEY, "localhost");
-    conf.setInt(HttpFSServerWebServer.HTTP_PORT_KEY, 0);
-    conf.set(AuthenticationFilter.SIGNATURE_SECRET_FILE,
-        "httpfs-signature.secret");
-    Configuration sslConf = new Configuration();
-    webServer = new HttpFSServerWebServer(conf, sslConf);
+  @After
+  public void teardown() throws Exception {
+    if (webServer != null) {
+      webServer.stop();
+    }
   }
 
   @Test
   public void testStartStop() throws Exception {
+    webServer = createWebServer(createConfigurationWithRandomSecret());
     webServer.start();
-    String user = HadoopUsersConfTestHelper.getHadoopUsers()[0];
-    URL url = new URL(webServer.getUrl(), MessageFormat.format(
-        "/webhdfs/v1/?user.name={0}&op=liststatus", user));
-    HttpURLConnection conn = (HttpURLConnection) url.openConnection();
-    Assert.assertEquals(HttpURLConnection.HTTP_OK, conn.getResponseCode());
-    BufferedReader reader = new BufferedReader(
-        new InputStreamReader(conn.getInputStream()));
-    reader.readLine();
-    reader.close();
     webServer.stop();
   }
 
   @Test
   public void testJustStop() throws Exception {
+    webServer = createWebServer(createConfigurationWithRandomSecret());
     webServer.stop();
   }
 
   @Test
   public void testDoubleStop() throws Exception {
+    webServer = createWebServer(createConfigurationWithRandomSecret());
     webServer.start();
     webServer.stop();
     webServer.stop();
   }
 
   @Test
   public void testDoubleStart() throws Exception {
+    webServer = createWebServer(createConfigurationWithRandomSecret());
+    webServer.start();
+    webServer.start();
+    webServer.stop();
+  }
+
+  @Test
+  public void testServiceWithSecretFile() throws Exception {
+    createSecretFile("foo");
+    webServer = createWebServer(createConfigurationWithSecretFile());
+    webServer.start();
+    assertServiceRespondsWithOK(webServer.getUrl());
+    assertSignerSecretProviderType(webServer.getHttpServer(),
+        FileSignerSecretProvider.class);
+    webServer.stop();
+  }
+
+  @Test
+  public void testServiceWithSecretFileWithDeprecatedConfigOnly()
+      throws Exception {
+    createSecretFile("foo");
+    Configuration conf = createConfiguration();
+    setDeprecatedSecretFile(conf, secretFile.getAbsolutePath());
+    webServer = createWebServer(conf);
+
+    // HttpFSAuthenticationFilter gets the configuration from the webapp, the defaults are loaded
+    // so the custom file location is overwritten with a non-existing default one.
+    Exception exception =
+        Assert.assertThrows(IOException.class, new ThrowingRunnable() {
+          @Override
+          public void run() throws Throwable {
+            webServer.start();
+          }
+        });
+    Assert.assertTrue(exception.getMessage().startsWith(
+        "Unable to initialize"));
+  }
+
+  @Test
+  public void testServiceWithSecretFileWithBothConfigOption() throws Exception {
+    createSecretFile("foo");
+    Configuration conf = createConfigurationWithSecretFile();
+    setDeprecatedSecretFile(conf, secretFile.getAbsolutePath());
+    webServer = createWebServer(conf);
+    webServer.start();
+    assertServiceRespondsWithOK(webServer.getUrl());
+    assertSignerSecretProviderType(webServer.getHttpServer(),
+        FileSignerSecretProvider.class);
+    webServer.stop();
+  }
+
+  @Test
+  public void testServiceWithSecretFileWithBothConfigOptionOverWriteCheck()

Review comment:
       Nit: testServiceWithSecretFileWithBothConfigOptionsOverwriteCheck
   options: plural
   overwrite: one word

##########
File path: hadoop-hdfs-project/hadoop-hdfs-httpfs/src/test/java/org/apache/hadoop/fs/http/server/TestHttpFSServerWebServer.java
##########
@@ -71,53 +82,195 @@ public static void beforeClass() throws Exception {
     System.setProperty("httpfs.home.dir", homeDir.getAbsolutePath());
     System.setProperty("httpfs.log.dir", logsDir.getAbsolutePath());
     System.setProperty("httpfs.config.dir", confDir.getAbsolutePath());
-    FileUtils.writeStringToFile(new File(confDir, "httpfs-signature.secret"),
-        "foo", StandardCharsets.UTF_8);
+    secretFile = new File(System.getProperty("httpfs.config.dir"),
+        "httpfs-signature-custom.secret");
   }
 
-  @Before
-  public void setUp() throws Exception {
-    Configuration conf = new Configuration();
-    conf.set(HttpFSServerWebServer.HTTP_HOSTNAME_KEY, "localhost");
-    conf.setInt(HttpFSServerWebServer.HTTP_PORT_KEY, 0);
-    conf.set(AuthenticationFilter.SIGNATURE_SECRET_FILE,
-        "httpfs-signature.secret");
-    Configuration sslConf = new Configuration();
-    webServer = new HttpFSServerWebServer(conf, sslConf);
+  @After
+  public void teardown() throws Exception {
+    if (webServer != null) {
+      webServer.stop();
+    }
   }
 
   @Test
   public void testStartStop() throws Exception {
+    webServer = createWebServer(createConfigurationWithRandomSecret());
     webServer.start();
-    String user = HadoopUsersConfTestHelper.getHadoopUsers()[0];
-    URL url = new URL(webServer.getUrl(), MessageFormat.format(
-        "/webhdfs/v1/?user.name={0}&op=liststatus", user));
-    HttpURLConnection conn = (HttpURLConnection) url.openConnection();
-    Assert.assertEquals(HttpURLConnection.HTTP_OK, conn.getResponseCode());
-    BufferedReader reader = new BufferedReader(
-        new InputStreamReader(conn.getInputStream()));
-    reader.readLine();
-    reader.close();
     webServer.stop();
   }
 
   @Test
   public void testJustStop() throws Exception {
+    webServer = createWebServer(createConfigurationWithRandomSecret());
     webServer.stop();
   }
 
   @Test
   public void testDoubleStop() throws Exception {
+    webServer = createWebServer(createConfigurationWithRandomSecret());
     webServer.start();
     webServer.stop();
     webServer.stop();
   }
 
   @Test
   public void testDoubleStart() throws Exception {
+    webServer = createWebServer(createConfigurationWithRandomSecret());
+    webServer.start();
+    webServer.start();
+    webServer.stop();
+  }
+
+  @Test
+  public void testServiceWithSecretFile() throws Exception {
+    createSecretFile("foo");
+    webServer = createWebServer(createConfigurationWithSecretFile());
+    webServer.start();
+    assertServiceRespondsWithOK(webServer.getUrl());
+    assertSignerSecretProviderType(webServer.getHttpServer(),
+        FileSignerSecretProvider.class);
+    webServer.stop();
+  }
+
+  @Test
+  public void testServiceWithSecretFileWithDeprecatedConfigOnly()
+      throws Exception {
+    createSecretFile("foo");
+    Configuration conf = createConfiguration();
+    setDeprecatedSecretFile(conf, secretFile.getAbsolutePath());
+    webServer = createWebServer(conf);
+
+    // HttpFSAuthenticationFilter gets the configuration from the webapp, the defaults are loaded
+    // so the custom file location is overwritten with a non-existing default one.
+    Exception exception =
+        Assert.assertThrows(IOException.class, new ThrowingRunnable() {
+          @Override
+          public void run() throws Throwable {
+            webServer.start();
+          }
+        });
+    Assert.assertTrue(exception.getMessage().startsWith(
+        "Unable to initialize"));
+  }
+
+  @Test
+  public void testServiceWithSecretFileWithBothConfigOption() throws Exception {
+    createSecretFile("foo");
+    Configuration conf = createConfigurationWithSecretFile();
+    setDeprecatedSecretFile(conf, secretFile.getAbsolutePath());
+    webServer = createWebServer(conf);
+    webServer.start();
+    assertServiceRespondsWithOK(webServer.getUrl());
+    assertSignerSecretProviderType(webServer.getHttpServer(),
+        FileSignerSecretProvider.class);
+    webServer.stop();
+  }
+
+  @Test
+  public void testServiceWithSecretFileWithBothConfigOptionOverWriteCheck()
+      throws Exception {
+    createSecretFile("foo");
+    Configuration conf = createConfigurationWithSecretFile();
+    setDeprecatedSecretFile(conf, "will-be-overwritten");
+    webServer = createWebServer(conf);
+    webServer.start();
+    assertServiceRespondsWithOK(webServer.getUrl());
+    assertSignerSecretProviderType(webServer.getHttpServer(),
+        FileSignerSecretProvider.class);
+    webServer.stop();
+  }
+
+  @Test
+  public void testServiceWithMissingSecretFile() throws Exception {
+    webServer = createWebServer(createConfigurationWithSecretFile());
     webServer.start();
+    assertServiceRespondsWithOK(webServer.getUrl());
+    assertSignerSecretProviderType(webServer.getHttpServer(),
+        RandomSignerSecretProvider.class);
+    webServer.stop();
+  }
+
+  @Test
+  public void testServiceWithEmptySecretFile() throws Exception {
+    // The AuthenticationFilter.constructSecretProvider will do the fallback
+    // to the random secrets not the HttpFSAuthenticationFilter.
+    createSecretFile("");
+    webServer = createWebServer(createConfigurationWithSecretFile());
     webServer.start();
+    assertServiceRespondsWithOK(webServer.getUrl());
+    assertSignerSecretProviderType(webServer.getHttpServer(),
+        RandomSignerSecretProvider.class);
     webServer.stop();
   }
 
+  private <T extends SignerSecretProvider> void assertSignerSecretProviderType(
+      HttpServer2 server, Class<T> expected) {
+    SignerSecretProvider secretProvider = (SignerSecretProvider)
+        server.getWebAppContext().getServletContext()
+            .getAttribute(SIGNER_SECRET_PROVIDER_ATTRIBUTE);
+    Assert.assertNotNull(secretProvider);
+    Assert.assertEquals(expected, secretProvider.getClass());

Review comment:
       Please add assertion messages.
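
For example, using the message-first JUnit 4 overloads:

```java
Assert.assertNotNull("SignerSecretProvider was not initialized",
    secretProvider);
Assert.assertEquals("Unexpected SignerSecretProvider implementation",
    expected, secretProvider.getClass());
```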

##########
File path: hadoop-hdfs-project/hadoop-hdfs-httpfs/src/test/java/org/apache/hadoop/fs/http/server/TestHttpFSServerWebServer.java
##########
@@ -71,53 +82,195 @@ public static void beforeClass() throws Exception {
     System.setProperty("httpfs.home.dir", homeDir.getAbsolutePath());
     System.setProperty("httpfs.log.dir", logsDir.getAbsolutePath());
     System.setProperty("httpfs.config.dir", confDir.getAbsolutePath());
-    FileUtils.writeStringToFile(new File(confDir, "httpfs-signature.secret"),
-        "foo", StandardCharsets.UTF_8);
+    secretFile = new File(System.getProperty("httpfs.config.dir"),
+        "httpfs-signature-custom.secret");
   }
 
-  @Before
-  public void setUp() throws Exception {
-    Configuration conf = new Configuration();
-    conf.set(HttpFSServerWebServer.HTTP_HOSTNAME_KEY, "localhost");
-    conf.setInt(HttpFSServerWebServer.HTTP_PORT_KEY, 0);
-    conf.set(AuthenticationFilter.SIGNATURE_SECRET_FILE,
-        "httpfs-signature.secret");
-    Configuration sslConf = new Configuration();
-    webServer = new HttpFSServerWebServer(conf, sslConf);
+  @After
+  public void teardown() throws Exception {
+    if (webServer != null) {
+      webServer.stop();
+    }
   }
 
   @Test
   public void testStartStop() throws Exception {
+    webServer = createWebServer(createConfigurationWithRandomSecret());
     webServer.start();
-    String user = HadoopUsersConfTestHelper.getHadoopUsers()[0];
-    URL url = new URL(webServer.getUrl(), MessageFormat.format(
-        "/webhdfs/v1/?user.name={0}&op=liststatus", user));
-    HttpURLConnection conn = (HttpURLConnection) url.openConnection();
-    Assert.assertEquals(HttpURLConnection.HTTP_OK, conn.getResponseCode());
-    BufferedReader reader = new BufferedReader(
-        new InputStreamReader(conn.getInputStream()));
-    reader.readLine();
-    reader.close();
     webServer.stop();
   }
 
   @Test
   public void testJustStop() throws Exception {
+    webServer = createWebServer(createConfigurationWithRandomSecret());
     webServer.stop();
   }
 
   @Test
   public void testDoubleStop() throws Exception {
+    webServer = createWebServer(createConfigurationWithRandomSecret());
     webServer.start();
     webServer.stop();
     webServer.stop();
   }
 
   @Test
   public void testDoubleStart() throws Exception {
+    webServer = createWebServer(createConfigurationWithRandomSecret());
+    webServer.start();
+    webServer.start();
+    webServer.stop();
+  }
+
+  @Test
+  public void testServiceWithSecretFile() throws Exception {
+    createSecretFile("foo");
+    webServer = createWebServer(createConfigurationWithSecretFile());
+    webServer.start();
+    assertServiceRespondsWithOK(webServer.getUrl());
+    assertSignerSecretProviderType(webServer.getHttpServer(),
+        FileSignerSecretProvider.class);
+    webServer.stop();
+  }
+
+  @Test
+  public void testServiceWithSecretFileWithDeprecatedConfigOnly()
+      throws Exception {
+    createSecretFile("foo");
+    Configuration conf = createConfiguration();
+    setDeprecatedSecretFile(conf, secretFile.getAbsolutePath());
+    webServer = createWebServer(conf);
+
+    // HttpFSAuthenticationFilter gets the configuration from the webapp, the defaults are loaded
+    // so the custom file location is overwritten with a non-existing default one.
+    Exception exception =
+        Assert.assertThrows(IOException.class, new ThrowingRunnable() {
+          @Override
+          public void run() throws Throwable {
+            webServer.start();
+          }
+        });
+    Assert.assertTrue(exception.getMessage().startsWith(
+        "Unable to initialize"));
+  }
+
+  @Test
+  public void testServiceWithSecretFileWithBothConfigOption() throws Exception {

Review comment:
       Nit: testServiceWithSecretFileWithBothConfigOptions (plural)






[GitHub] [hadoop] brumi1024 commented on pull request #3209: HDFS-16129. Fixing the signature secret file misusage in HttpFS.

Posted by GitBox <gi...@apache.org>.
brumi1024 commented on pull request #3209:
URL: https://github.com/apache/hadoop/pull/3209#issuecomment-894222863


   @tomicooler thanks for the updates, it looks good to me, +1 (non-binding).




[GitHub] [hadoop] tomicooler commented on a change in pull request #3209: HDFS-16129. Fixing the signature secret file misusage in HttpFS.

Posted by GitBox <gi...@apache.org>.
tomicooler commented on a change in pull request #3209:
URL: https://github.com/apache/hadoop/pull/3209#discussion_r682567701



##########
File path: hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/http/HttpServer2.java
##########
@@ -811,18 +822,26 @@ private static SignerSecretProvider constructSecretProvider(final Builder b,
       throws Exception {
     final Configuration conf = b.conf;
     Properties config = getFilterProperties(conf,
-                                            b.authFilterConfigurationPrefix);
+        b.authFilterConfigurationPrefixes);
     return AuthenticationFilter.constructSecretProvider(
         ctx, config, b.disallowFallbackToRandomSignerSecretProvider);
   }
 
-  private static Properties getFilterProperties(Configuration conf, String
-      prefix) {
-    Properties prop = new Properties();
-    Map<String, String> filterConfig = AuthenticationFilterInitializer
-        .getFilterConfigMap(conf, prefix);
-    prop.putAll(filterConfig);
-    return prop;
+  public static Properties getFilterProperties(Configuration conf,
+                                                ArrayList<String> prefixes) {

Review comment:
       Done.






[GitHub] [hadoop] hadoop-yetus commented on pull request #3209: HDFS-16129. Fixing the signature secret file misusage in HttpFS.

Posted by GitBox <gi...@apache.org>.
hadoop-yetus commented on pull request #3209:
URL: https://github.com/apache/hadoop/pull/3209#issuecomment-892877938


   :confetti_ball: **+1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime |  Logfile | Comment |
   |:----:|----------:|--------:|:--------:|:-------:|
   | +0 :ok: |  reexec  |   0m 37s |  |  Docker mode activated.  |
   |||| _ Prechecks _ |
   | +1 :green_heart: |  dupname  |   0m  0s |  |  No case conflicting files found.  |
   | +0 :ok: |  codespell  |   0m  0s |  |  codespell was not available.  |
   | +1 :green_heart: |  @author  |   0m  1s |  |  The patch does not contain any @author tags.  |
   | +1 :green_heart: |  test4tests  |   0m  0s |  |  The patch appears to include 8 new or modified test files.  |
   |||| _ trunk Compile Tests _ |
   | +0 :ok: |  mvndep  |  13m 40s |  |  Maven dependency ordering for branch  |
   | +1 :green_heart: |  mvninstall  |  20m  7s |  |  trunk passed  |
   | +1 :green_heart: |  compile  |  21m  7s |  |  trunk passed with JDK Ubuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04  |
   | +1 :green_heart: |  compile  |  18m 30s |  |  trunk passed with JDK Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10  |
   | +1 :green_heart: |  checkstyle  |   3m 39s |  |  trunk passed  |
   | +1 :green_heart: |  mvnsite  |   3m 14s |  |  trunk passed  |
   | +1 :green_heart: |  javadoc  |   2m 34s |  |  trunk passed with JDK Ubuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04  |
   | +1 :green_heart: |  javadoc  |   3m  7s |  |  trunk passed with JDK Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10  |
   | +1 :green_heart: |  spotbugs  |   4m 21s |  |  trunk passed  |
   | +1 :green_heart: |  shadedclient  |  14m 37s |  |  branch has no errors when building and testing our client artifacts.  |
   |||| _ Patch Compile Tests _ |
   | +0 :ok: |  mvndep  |   0m 28s |  |  Maven dependency ordering for patch  |
   | +1 :green_heart: |  mvninstall  |   1m 41s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |  20m 22s |  |  the patch passed with JDK Ubuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04  |
   | +1 :green_heart: |  javac  |  20m 22s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |  18m 31s |  |  the patch passed with JDK Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10  |
   | +1 :green_heart: |  javac  |  18m 31s |  |  the patch passed  |
   | +1 :green_heart: |  blanks  |   0m  0s |  |  The patch has no blanks issues.  |
   | -0 :warning: |  checkstyle  |   3m 37s | [/results-checkstyle-root.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3209/11/artifact/out/results-checkstyle-root.txt) |  root: The patch generated 1 new + 100 unchanged - 0 fixed = 101 total (was 100)  |
   | +1 :green_heart: |  mvnsite  |   3m 13s |  |  the patch passed  |
   | +1 :green_heart: |  xml  |   0m  2s |  |  The patch has no ill-formed XML file.  |
   | +1 :green_heart: |  javadoc  |   2m 32s |  |  the patch passed with JDK Ubuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04  |
   | +1 :green_heart: |  javadoc  |   3m  3s |  |  the patch passed with JDK Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10  |
   | +1 :green_heart: |  spotbugs  |   4m 54s |  |  the patch passed  |
   | +1 :green_heart: |  shadedclient  |  14m 43s |  |  patch has no errors when building and testing our client artifacts.  |
   |||| _ Other Tests _ |
   | +1 :green_heart: |  unit  |  17m  6s |  |  hadoop-common in the patch passed.  |
   | +1 :green_heart: |  unit  |   3m 43s |  |  hadoop-kms in the patch passed.  |
   | +1 :green_heart: |  unit  |   6m 16s |  |  hadoop-hdfs-httpfs in the patch passed.  |
   | +1 :green_heart: |  asflicense  |   1m  0s |  |  The patch does not generate ASF License warnings.  |
   |  |   | 211m 10s |  |  |
   
   
   | Subsystem | Report/Notes |
   |----------:|:-------------|
   | Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3209/11/artifact/out/Dockerfile |
   | GITHUB PR | https://github.com/apache/hadoop/pull/3209 |
   | Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell xml |
   | uname | Linux 8ea886ae5ac7 4.15.0-58-generic #64-Ubuntu SMP Tue Aug 6 11:12:41 UTC 2019 x86_64 x86_64 x86_64 GNU/Linux |
   | Build tool | maven |
   | Personality | dev-support/bin/hadoop.sh |
   | git revision | trunk / 7c42bad8258997e3c07676dfd67b5faadd22f05a |
   | Default Java | Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10 |
   | Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10 |
   |  Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3209/11/testReport/ |
   | Max. process+thread count | 1295 (vs. ulimit of 5500) |
   | modules | C: hadoop-common-project/hadoop-common hadoop-common-project/hadoop-kms hadoop-hdfs-project/hadoop-hdfs-httpfs U: . |
   | Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3209/11/console |
   | versions | git=2.25.1 maven=3.6.3 spotbugs=4.2.2 |
   | Powered by | Apache Yetus 0.14.0-SNAPSHOT https://yetus.apache.org |
   
   
   This message was automatically generated.
   
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: common-issues-unsubscribe@hadoop.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



---------------------------------------------------------------------
To unsubscribe, e-mail: common-issues-unsubscribe@hadoop.apache.org
For additional commands, e-mail: common-issues-help@hadoop.apache.org


[GitHub] [hadoop] szilard-nemeth commented on pull request #3209: HDFS-16129. Fixing the signature secret file misusage in HttpFS.

Posted by GitBox <gi...@apache.org>.
szilard-nemeth commented on pull request #3209:
URL: https://github.com/apache/hadoop/pull/3209#issuecomment-890356871


   @tomicooler The latest build looks way better than before. Could you please check if any of the UT failures are related to your patch? 


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: common-issues-unsubscribe@hadoop.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



---------------------------------------------------------------------
To unsubscribe, e-mail: common-issues-unsubscribe@hadoop.apache.org
For additional commands, e-mail: common-issues-help@hadoop.apache.org


[GitHub] [hadoop] hadoop-yetus commented on pull request #3209: HDFS-16129. Fixing the signature secret file misusage in HttpFS.

Posted by GitBox <gi...@apache.org>.
hadoop-yetus commented on pull request #3209:
URL: https://github.com/apache/hadoop/pull/3209#issuecomment-904733944


   :confetti_ball: **+1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime |  Logfile | Comment |
   |:----:|----------:|--------:|:--------:|:-------:|
   | +0 :ok: |  reexec  |   1m  6s |  |  Docker mode activated.  |
   |||| _ Prechecks _ |
   | +1 :green_heart: |  dupname  |   0m  0s |  |  No case conflicting files found.  |
   | +0 :ok: |  codespell  |   0m  1s |  |  codespell was not available.  |
   | +1 :green_heart: |  @author  |   0m  0s |  |  The patch does not contain any @author tags.  |
   | +1 :green_heart: |  test4tests  |   0m  0s |  |  The patch appears to include 7 new or modified test files.  |
   |||| _ trunk Compile Tests _ |
   | +0 :ok: |  mvndep  |  12m 55s |  |  Maven dependency ordering for branch  |
   | +1 :green_heart: |  mvninstall  |  20m 29s |  |  trunk passed  |
   | +1 :green_heart: |  compile  |  30m 29s |  |  trunk passed with JDK Ubuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04  |
   | +1 :green_heart: |  compile  |  18m 34s |  |  trunk passed with JDK Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10  |
   | +1 :green_heart: |  checkstyle  |   3m 42s |  |  trunk passed  |
   | +1 :green_heart: |  mvnsite  |   3m 17s |  |  trunk passed  |
   | +1 :green_heart: |  javadoc  |   2m 35s |  |  trunk passed with JDK Ubuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04  |
   | +1 :green_heart: |  javadoc  |   3m  4s |  |  trunk passed with JDK Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10  |
   | +1 :green_heart: |  spotbugs  |   4m 23s |  |  trunk passed  |
   | +1 :green_heart: |  shadedclient  |  14m 57s |  |  branch has no errors when building and testing our client artifacts.  |
   |||| _ Patch Compile Tests _ |
   | +0 :ok: |  mvndep  |   0m 28s |  |  Maven dependency ordering for patch  |
   | +1 :green_heart: |  mvninstall  |   1m 42s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |  21m 29s |  |  the patch passed with JDK Ubuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04  |
   | +1 :green_heart: |  javac  |  21m 29s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |  18m 51s |  |  the patch passed with JDK Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10  |
   | +1 :green_heart: |  javac  |  18m 51s |  |  the patch passed  |
   | +1 :green_heart: |  blanks  |   0m  0s |  |  The patch has no blanks issues.  |
   | -0 :warning: |  checkstyle  |   3m 33s | [/results-checkstyle-root.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3209/13/artifact/out/results-checkstyle-root.txt) |  root: The patch generated 1 new + 95 unchanged - 0 fixed = 96 total (was 95)  |
   | +1 :green_heart: |  mvnsite  |   3m 13s |  |  the patch passed  |
   | +1 :green_heart: |  xml  |   0m  1s |  |  The patch has no ill-formed XML file.  |
   | +1 :green_heart: |  javadoc  |   2m 29s |  |  the patch passed with JDK Ubuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04  |
   | +1 :green_heart: |  javadoc  |   3m  2s |  |  the patch passed with JDK Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10  |
   | +1 :green_heart: |  spotbugs  |   5m  1s |  |  the patch passed  |
   | +1 :green_heart: |  shadedclient  |  14m 43s |  |  patch has no errors when building and testing our client artifacts.  |
   |||| _ Other Tests _ |
   | +1 :green_heart: |  unit  |  17m  5s |  |  hadoop-common in the patch passed.  |
   | +1 :green_heart: |  unit  |   3m 43s |  |  hadoop-kms in the patch passed.  |
   | +1 :green_heart: |  unit  |   6m 17s |  |  hadoop-hdfs-httpfs in the patch passed.  |
   | +1 :green_heart: |  asflicense  |   1m  0s |  |  The patch does not generate ASF License warnings.  |
   |  |   | 222m 51s |  |  |
   
   
   | Subsystem | Report/Notes |
   |----------:|:-------------|
   | Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3209/13/artifact/out/Dockerfile |
   | GITHUB PR | https://github.com/apache/hadoop/pull/3209 |
   | Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell xml |
   | uname | Linux ffbe3d1b7034 4.15.0-112-generic #113-Ubuntu SMP Thu Jul 9 23:41:39 UTC 2020 x86_64 x86_64 x86_64 GNU/Linux |
   | Build tool | maven |
   | Personality | dev-support/bin/hadoop.sh |
   | git revision | trunk / f4cffb51d01bd310ceee065c90b7fb7e2589edaf |
   | Default Java | Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10 |
   | Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10 |
   |  Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3209/13/testReport/ |
   | Max. process+thread count | 3104 (vs. ulimit of 5500) |
   | modules | C: hadoop-common-project/hadoop-common hadoop-common-project/hadoop-kms hadoop-hdfs-project/hadoop-hdfs-httpfs U: . |
   | Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3209/13/console |
   | versions | git=2.25.1 maven=3.6.3 spotbugs=4.2.2 |
   | Powered by | Apache Yetus 0.14.0-SNAPSHOT https://yetus.apache.org |
   
   
   This message was automatically generated.
   
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: common-issues-unsubscribe@hadoop.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



---------------------------------------------------------------------
To unsubscribe, e-mail: common-issues-unsubscribe@hadoop.apache.org
For additional commands, e-mail: common-issues-help@hadoop.apache.org


[GitHub] [hadoop] tomicooler commented on a change in pull request #3209: HDFS-16129. Fixing the signature secret file misusage in HttpFS.

Posted by GitBox <gi...@apache.org>.
tomicooler commented on a change in pull request #3209:
URL: https://github.com/apache/hadoop/pull/3209#discussion_r682557011



##########
File path: hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/http/HttpServer2.java
##########
@@ -473,8 +482,10 @@ public HttpServer2 build() throws IOException {
       HttpServer2 server = new HttpServer2(this);
 
       if (this.securityEnabled &&
-          !this.conf.get(authFilterConfigurationPrefix + "type").
-          equals(PseudoAuthenticationHandler.TYPE)) {
+          authFilterConfigurationPrefixes.stream().noneMatch(
+              prefix -> (prefix + "type")
+                  .equals(PseudoAuthenticationHandler.TYPE))
+      ) {

Review comment:
       Good catch. It should have been `this.conf.get(prefix + "type").equals(...)`. Here I find `stream().noneMatch` useful with the lambda expression. I won't resolve the thread yet, so I can replace it with a helper function if needed.
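   A minimal sketch of the corrected condition, using only names visible in the diff above; the body of the if block is elided in the diff, so it is elided here too:

   ```java
   // Compare the configured value, not the key string; putting the constant
   // first also avoids a NullPointerException when the key is unset.
   if (this.securityEnabled &&
       authFilterConfigurationPrefixes.stream().noneMatch(
           prefix -> PseudoAuthenticationHandler.TYPE
               .equals(this.conf.get(prefix + "type")))) {
     // ... original body of the if block (not shown in the diff)
   }
   ```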




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: common-issues-unsubscribe@hadoop.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



---------------------------------------------------------------------
To unsubscribe, e-mail: common-issues-unsubscribe@hadoop.apache.org
For additional commands, e-mail: common-issues-help@hadoop.apache.org


[GitHub] [hadoop] tomicooler commented on a change in pull request #3209: HDFS-16129. Fixing the signature secret file misusage in HttpFS.

Posted by GitBox <gi...@apache.org>.
tomicooler commented on a change in pull request #3209:
URL: https://github.com/apache/hadoop/pull/3209#discussion_r682509915



##########
File path: hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/http/HttpServer2.java
##########
@@ -243,7 +243,9 @@
 
     private String hostName;
     private boolean disallowFallbackToRandomSignerSecretProvider;
-    private String authFilterConfigurationPrefix = "hadoop.http.authentication.";
+    private final ArrayList<String> authFilterConfigurationPrefixes =
+        new ArrayList<>(Collections.singletonList(
+            "hadoop.http.authentication."));

Review comment:
       The list reference is final, but the elements inside it will be changed; checkstyle gives me a warning if I rename it to upper case.
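   For illustration (a sketch, not code from the patch; the helper method name is invented, and imports of java.util.List/ArrayList/Collections are assumed):

   ```java
   // 'final' only prevents reassignment of the reference; the list contents
   // stay mutable, so this is not a constant and camelCase naming is correct.
   private final List<String> authFilterConfigurationPrefixes =
       new ArrayList<>(Collections.singletonList(
           "hadoop.http.authentication."));

   // Illustrative helper (not from the patch):
   void addAuthFilterConfigurationPrefix(String prefix) {
     authFilterConfigurationPrefixes.add(prefix);          // allowed
     // authFilterConfigurationPrefixes = new ArrayList<>(); // compile error
   }
   ```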




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: common-issues-unsubscribe@hadoop.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



---------------------------------------------------------------------
To unsubscribe, e-mail: common-issues-unsubscribe@hadoop.apache.org
For additional commands, e-mail: common-issues-help@hadoop.apache.org


[GitHub] [hadoop] tomicooler commented on a change in pull request #3209: HDFS-16129. Fixing the signature secret file misusage in HttpFS.

Posted by GitBox <gi...@apache.org>.
tomicooler commented on a change in pull request #3209:
URL: https://github.com/apache/hadoop/pull/3209#discussion_r682695204



##########
File path: hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/http/HttpServer2.java
##########
@@ -811,18 +822,25 @@ private static SignerSecretProvider constructSecretProvider(final Builder b,
       throws Exception {
     final Configuration conf = b.conf;
     Properties config = getFilterProperties(conf,
-                                            b.authFilterConfigurationPrefix);
+        b.authFilterConfigurationPrefixes);
     return AuthenticationFilter.constructSecretProvider(
         ctx, config, b.disallowFallbackToRandomSignerSecretProvider);
   }
 
-  private static Properties getFilterProperties(Configuration conf, String
-      prefix) {
-    Properties prop = new Properties();
-    Map<String, String> filterConfig = AuthenticationFilterInitializer
-        .getFilterConfigMap(conf, prefix);
-    prop.putAll(filterConfig);
-    return prop;
+  public static Properties getFilterProperties(final Configuration conf, final List<String> prefixes) {

Review comment:
       OK, I'll change it back and remove the finals from the inner lines too; the method is quite short anyway.
   
   I have read some debates on this topic: if final is used consistently, then the variables without final catch your eye more easily. (e.g.: http://www.javapractices.com/topic/TopicAction.do?Id=23)




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: common-issues-unsubscribe@hadoop.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



---------------------------------------------------------------------
To unsubscribe, e-mail: common-issues-unsubscribe@hadoop.apache.org
For additional commands, e-mail: common-issues-help@hadoop.apache.org


[GitHub] [hadoop] brumi1024 commented on a change in pull request #3209: HDFS-16129. Fixing the signature secret file misusage in HttpFS.

Posted by GitBox <gi...@apache.org>.
brumi1024 commented on a change in pull request #3209:
URL: https://github.com/apache/hadoop/pull/3209#discussion_r682682887



##########
File path: hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/http/HttpServer2.java
##########
@@ -811,18 +822,25 @@ private static SignerSecretProvider constructSecretProvider(final Builder b,
       throws Exception {
     final Configuration conf = b.conf;
     Properties config = getFilterProperties(conf,
-                                            b.authFilterConfigurationPrefix);
+        b.authFilterConfigurationPrefixes);
     return AuthenticationFilter.constructSecretProvider(
         ctx, config, b.disallowFallbackToRandomSignerSecretProvider);
   }
 
-  private static Properties getFilterProperties(Configuration conf, String
-      prefix) {
-    Properties prop = new Properties();
-    Map<String, String> filterConfig = AuthenticationFilterInitializer
-        .getFilterConfigMap(conf, prefix);
-    prop.putAll(filterConfig);
-    return prop;
+  public static Properties getFilterProperties(final Configuration conf, final List<String> prefixes) {

Review comment:
       Another nit: I know there are two "schools" of thought about using final, but since this class doesn't use it extensively, I don't think the style should be changed in this simple method. This many finals reads more like noise to me, and - as you said in another comment - final doesn't guarantee that the collections themselves won't be modified, only that the reference won't be reassigned.




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: common-issues-unsubscribe@hadoop.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



---------------------------------------------------------------------
To unsubscribe, e-mail: common-issues-unsubscribe@hadoop.apache.org
For additional commands, e-mail: common-issues-help@hadoop.apache.org


[GitHub] [hadoop] hadoop-yetus commented on pull request #3209: HDFS-16129. Fixing the signature secret file misusage in HttpFS.

Posted by GitBox <gi...@apache.org>.
hadoop-yetus commented on pull request #3209:
URL: https://github.com/apache/hadoop/pull/3209#issuecomment-891795900


   :broken_heart: **-1 overall**
   
   
   
   
   
   
   | Vote | Subsystem | Runtime |  Logfile | Comment |
   |:----:|----------:|--------:|:--------:|:-------:|
   | +0 :ok: |  reexec  |   1m 21s |  |  Docker mode activated.  |
   |||| _ Prechecks _ |
   | +1 :green_heart: |  dupname  |   0m  0s |  |  No case conflicting files found.  |
   | +0 :ok: |  codespell  |   0m  0s |  |  codespell was not available.  |
   | +1 :green_heart: |  @author  |   0m  0s |  |  The patch does not contain any @author tags.  |
   | +1 :green_heart: |  test4tests  |   0m  0s |  |  The patch appears to include 8 new or modified test files.  |
   |||| _ trunk Compile Tests _ |
   | +0 :ok: |  mvndep  |  12m 42s |  |  Maven dependency ordering for branch  |
   | +1 :green_heart: |  mvninstall  |  22m  1s |  |  trunk passed  |
   | +1 :green_heart: |  compile  |  22m 18s |  |  trunk passed with JDK Ubuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04  |
   | +1 :green_heart: |  compile  |  19m 58s |  |  trunk passed with JDK Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10  |
   | +1 :green_heart: |  checkstyle  |   3m 44s |  |  trunk passed  |
   | +1 :green_heart: |  mvnsite  |   3m 18s |  |  trunk passed  |
   | +1 :green_heart: |  javadoc  |   2m 34s |  |  trunk passed with JDK Ubuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04  |
   | +1 :green_heart: |  javadoc  |   3m  5s |  |  trunk passed with JDK Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10  |
   | +1 :green_heart: |  spotbugs  |   4m 30s |  |  trunk passed  |
   | +1 :green_heart: |  shadedclient  |  14m 27s |  |  branch has no errors when building and testing our client artifacts.  |
   |||| _ Patch Compile Tests _ |
   | +0 :ok: |  mvndep  |   0m 28s |  |  Maven dependency ordering for patch  |
   | +1 :green_heart: |  mvninstall  |   1m 43s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |  20m 32s |  |  the patch passed with JDK Ubuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04  |
   | +1 :green_heart: |  javac  |  20m 32s |  |  the patch passed  |
   | +1 :green_heart: |  compile  |  18m 24s |  |  the patch passed with JDK Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10  |
   | +1 :green_heart: |  javac  |  18m 24s |  |  the patch passed  |
   | +1 :green_heart: |  blanks  |   0m  0s |  |  The patch has no blanks issues.  |
   | +1 :green_heart: |  checkstyle  |   3m 31s |  |  root: The patch generated 0 new + 97 unchanged - 2 fixed = 97 total (was 99)  |
   | +1 :green_heart: |  mvnsite  |   3m 17s |  |  the patch passed  |
   | +1 :green_heart: |  xml  |   0m  1s |  |  The patch has no ill-formed XML file.  |
   | +1 :green_heart: |  javadoc  |   2m 33s |  |  the patch passed with JDK Ubuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04  |
   | +1 :green_heart: |  javadoc  |   2m 52s |  |  the patch passed with JDK Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10  |
   | -1 :x: |  spotbugs  |   1m 16s | [/new-spotbugs-hadoop-hdfs-project_hadoop-hdfs-httpfs.html](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3209/7/artifact/out/new-spotbugs-hadoop-hdfs-project_hadoop-hdfs-httpfs.html) |  hadoop-hdfs-project/hadoop-hdfs-httpfs generated 1 new + 0 unchanged - 0 fixed = 1 total (was 0)  |
   | +1 :green_heart: |  shadedclient  |  15m  8s |  |  patch has no errors when building and testing our client artifacts.  |
   |||| _ Other Tests _ |
   | +1 :green_heart: |  unit  |  17m 19s |  |  hadoop-common in the patch passed.  |
   | +1 :green_heart: |  unit  |   3m 42s |  |  hadoop-kms in the patch passed.  |
   | +1 :green_heart: |  unit  |   9m 43s |  |  hadoop-hdfs-httpfs in the patch passed.  |
   | +1 :green_heart: |  asflicense  |   0m 51s |  |  The patch does not generate ASF License warnings.  |
   |  |   | 219m 39s |  |  |
   
   
   | Reason | Tests |
   |-------:|:------|
   | SpotBugs | module:hadoop-hdfs-project/hadoop-hdfs-httpfs |
   |  |  org.apache.hadoop.fs.http.server.HttpFSAuthenticationFilter.CONF_PREFIXES should be package protected  At HttpFSAuthenticationFilter.java: At HttpFSAuthenticationFilter.java:[line 54] |
   
   
   | Subsystem | Report/Notes |
   |----------:|:-------------|
   | Docker | ClientAPI=1.41 ServerAPI=1.41 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3209/7/artifact/out/Dockerfile |
   | GITHUB PR | https://github.com/apache/hadoop/pull/3209 |
   | Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell xml |
   | uname | Linux 69341df2e869 4.15.0-112-generic #113-Ubuntu SMP Thu Jul 9 23:41:39 UTC 2020 x86_64 x86_64 x86_64 GNU/Linux |
   | Build tool | maven |
   | Personality | dev-support/bin/hadoop.sh |
   | git revision | trunk / 5f58bbd332c6d259e971700979bc164adc8952f3 |
   | Default Java | Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10 |
   | Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.11+9-Ubuntu-0ubuntu2.20.04 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_292-8u292-b10-0ubuntu1~20.04-b10 |
   |  Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3209/7/testReport/ |
   | Max. process+thread count | 3152 (vs. ulimit of 5500) |
   | modules | C: hadoop-common-project/hadoop-common hadoop-common-project/hadoop-kms hadoop-hdfs-project/hadoop-hdfs-httpfs U: . |
   | Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-3209/7/console |
   | versions | git=2.25.1 maven=3.6.3 spotbugs=4.2.2 |
   | Powered by | Apache Yetus 0.14.0-SNAPSHOT https://yetus.apache.org |
   
   
   This message was automatically generated.
   
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: common-issues-unsubscribe@hadoop.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



---------------------------------------------------------------------
To unsubscribe, e-mail: common-issues-unsubscribe@hadoop.apache.org
For additional commands, e-mail: common-issues-help@hadoop.apache.org


[GitHub] [hadoop] tomicooler commented on a change in pull request #3209: HDFS-16129. Fixing the signature secret file misusage in HttpFS.

Posted by GitBox <gi...@apache.org>.
tomicooler commented on a change in pull request #3209:
URL: https://github.com/apache/hadoop/pull/3209#discussion_r682567269



##########
File path: hadoop-hdfs-project/hadoop-hdfs-httpfs/src/test/java/org/apache/hadoop/fs/http/server/TestHttpFSServerWebServer.java
##########
@@ -71,53 +82,195 @@ public static void beforeClass() throws Exception {
     System.setProperty("httpfs.home.dir", homeDir.getAbsolutePath());
     System.setProperty("httpfs.log.dir", logsDir.getAbsolutePath());
     System.setProperty("httpfs.config.dir", confDir.getAbsolutePath());
-    FileUtils.writeStringToFile(new File(confDir, "httpfs-signature.secret"),
-        "foo", StandardCharsets.UTF_8);
+    secretFile = new File(System.getProperty("httpfs.config.dir"),
+        "httpfs-signature-custom.secret");
   }
 
-  @Before
-  public void setUp() throws Exception {
-    Configuration conf = new Configuration();
-    conf.set(HttpFSServerWebServer.HTTP_HOSTNAME_KEY, "localhost");
-    conf.setInt(HttpFSServerWebServer.HTTP_PORT_KEY, 0);
-    conf.set(AuthenticationFilter.SIGNATURE_SECRET_FILE,
-        "httpfs-signature.secret");
-    Configuration sslConf = new Configuration();
-    webServer = new HttpFSServerWebServer(conf, sslConf);
+  @After
+  public void teardown() throws Exception {
+    if (webServer != null) {
+      webServer.stop();
+    }
   }
 
   @Test
   public void testStartStop() throws Exception {
+    webServer = createWebServer(createConfigurationWithRandomSecret());
     webServer.start();
-    String user = HadoopUsersConfTestHelper.getHadoopUsers()[0];
-    URL url = new URL(webServer.getUrl(), MessageFormat.format(
-        "/webhdfs/v1/?user.name={0}&op=liststatus", user));
-    HttpURLConnection conn = (HttpURLConnection) url.openConnection();
-    Assert.assertEquals(HttpURLConnection.HTTP_OK, conn.getResponseCode());
-    BufferedReader reader = new BufferedReader(
-        new InputStreamReader(conn.getInputStream()));
-    reader.readLine();
-    reader.close();

Review comment:
       The following tests were just testing whether the server could be started/stopped without an exception (my guess):
     - testStartStop
     - testJustStop
     - testDoubleStop
     - testDoubleStart
   I think this was the intention of the original author.
   
   
   I introduced some other tests with a secret file, an empty secret file, etc., with an additional assert that actually checks that the secret provider is file-based and not random. That was not asserted before: the tests were green, but the secret file was not even used. These tests use assertServiceRespondsWithOK; a sketch of the type assert follows below.
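   A hedged sketch of what such a type assert might look like; the accessor chain and the servlet-context attribute name are assumptions here, not taken from the patch:

   ```java
   // Hypothetical helper: verifies which SignerSecretProvider was actually
   // built, instead of only checking that requests succeed.
   private static void assertSignerSecretProviderType(
       HttpServer2 server, Class<? extends SignerSecretProvider> expected) {
     SignerSecretProvider provider = (SignerSecretProvider)
         server.getWebAppContext().getServletContext().getAttribute(
             AuthenticationFilter.SIGNER_SECRET_PROVIDER_ATTRIBUTE);
     Assert.assertNotNull("no signer secret provider was set", provider);
     Assert.assertEquals(expected, provider.getClass());
   }
   ```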

##########
File path: hadoop-hdfs-project/hadoop-hdfs-httpfs/src/test/java/org/apache/hadoop/fs/http/server/TestHttpFSServerWebServer.java
##########
@@ -71,53 +82,195 @@ public static void beforeClass() throws Exception {
     System.setProperty("httpfs.home.dir", homeDir.getAbsolutePath());
     System.setProperty("httpfs.log.dir", logsDir.getAbsolutePath());
     System.setProperty("httpfs.config.dir", confDir.getAbsolutePath());
-    FileUtils.writeStringToFile(new File(confDir, "httpfs-signature.secret"),
-        "foo", StandardCharsets.UTF_8);
+    secretFile = new File(System.getProperty("httpfs.config.dir"),
+        "httpfs-signature-custom.secret");
   }
 
-  @Before
-  public void setUp() throws Exception {
-    Configuration conf = new Configuration();
-    conf.set(HttpFSServerWebServer.HTTP_HOSTNAME_KEY, "localhost");
-    conf.setInt(HttpFSServerWebServer.HTTP_PORT_KEY, 0);
-    conf.set(AuthenticationFilter.SIGNATURE_SECRET_FILE,
-        "httpfs-signature.secret");
-    Configuration sslConf = new Configuration();
-    webServer = new HttpFSServerWebServer(conf, sslConf);
+  @After
+  public void teardown() throws Exception {
+    if (webServer != null) {
+      webServer.stop();
+    }
   }
 
   @Test
   public void testStartStop() throws Exception {
+    webServer = createWebServer(createConfigurationWithRandomSecret());
     webServer.start();
-    String user = HadoopUsersConfTestHelper.getHadoopUsers()[0];
-    URL url = new URL(webServer.getUrl(), MessageFormat.format(
-        "/webhdfs/v1/?user.name={0}&op=liststatus", user));
-    HttpURLConnection conn = (HttpURLConnection) url.openConnection();
-    Assert.assertEquals(HttpURLConnection.HTTP_OK, conn.getResponseCode());
-    BufferedReader reader = new BufferedReader(
-        new InputStreamReader(conn.getInputStream()));
-    reader.readLine();
-    reader.close();
     webServer.stop();
   }
 
   @Test
   public void testJustStop() throws Exception {
+    webServer = createWebServer(createConfigurationWithRandomSecret());
     webServer.stop();
   }
 
   @Test
   public void testDoubleStop() throws Exception {
+    webServer = createWebServer(createConfigurationWithRandomSecret());
     webServer.start();
     webServer.stop();
     webServer.stop();
   }
 
   @Test
   public void testDoubleStart() throws Exception {
+    webServer = createWebServer(createConfigurationWithRandomSecret());
+    webServer.start();
+    webServer.start();
+    webServer.stop();
+  }
+
+  @Test
+  public void testServiceWithSecretFile() throws Exception {
+    createSecretFile("foo");
+    webServer = createWebServer(createConfigurationWithSecretFile());
+    webServer.start();
+    assertServiceRespondsWithOK(webServer.getUrl());
+    assertSignerSecretProviderType(webServer.getHttpServer(),
+        FileSignerSecretProvider.class);
+    webServer.stop();
+  }
+
+  @Test
+  public void testServiceWithSecretFileWithDeprecatedConfigOnly()
+      throws Exception {
+    createSecretFile("foo");
+    Configuration conf = createConfiguration();
+    setDeprecatedSecretFile(conf, secretFile.getAbsolutePath());
+    webServer = createWebServer(conf);
+
+    // HttpFSAuthenticationFilter gets the configuration from the webapp, the defaults are loaded
+    // so the custom file location is overwritten with a non-existing default one.
+    Exception exception =
+        Assert.assertThrows(IOException.class, new ThrowingRunnable() {
+          @Override
+          public void run() throws Throwable {
+            webServer.start();
+          }
+        });
+    Assert.assertTrue(exception.getMessage().startsWith(
+        "Unable to initialize"));
+  }
+
+  @Test
+  public void testServiceWithSecretFileWithBothConfigOption() throws Exception {

Review comment:
       Done.

##########
File path: hadoop-hdfs-project/hadoop-hdfs-httpfs/src/test/java/org/apache/hadoop/fs/http/client/TestHttpFSFileSystemLocalFileSystem.java
##########
@@ -32,8 +32,6 @@
 import org.junit.runners.Parameterized;
 
 import java.io.File;
-import java.net.URI;

Review comment:
       Done.




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: common-issues-unsubscribe@hadoop.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



---------------------------------------------------------------------
To unsubscribe, e-mail: common-issues-unsubscribe@hadoop.apache.org
For additional commands, e-mail: common-issues-help@hadoop.apache.org


[GitHub] [hadoop] tomicooler commented on a change in pull request #3209: HDFS-16129. Fixing the signature secret file misusage in HttpFS.

Posted by GitBox <gi...@apache.org>.
tomicooler commented on a change in pull request #3209:
URL: https://github.com/apache/hadoop/pull/3209#discussion_r682510551



##########
File path: hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/http/HttpServer2.java
##########
@@ -243,7 +243,9 @@
 
     private String hostName;
     private boolean disallowFallbackToRandomSignerSecretProvider;
-    private String authFilterConfigurationPrefix = "hadoop.http.authentication.";
+    private final ArrayList<String> authFilterConfigurationPrefixes =

Review comment:
       Thanks. Done.




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: common-issues-unsubscribe@hadoop.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



---------------------------------------------------------------------
To unsubscribe, e-mail: common-issues-unsubscribe@hadoop.apache.org
For additional commands, e-mail: common-issues-help@hadoop.apache.org


[GitHub] [hadoop] tomicooler commented on a change in pull request #3209: HDFS-16129. Fixing the signature secret file misusage in HttpFS.

Posted by GitBox <gi...@apache.org>.
tomicooler commented on a change in pull request #3209:
URL: https://github.com/apache/hadoop/pull/3209#discussion_r682664079



##########
File path: hadoop-hdfs-project/hadoop-hdfs-httpfs/src/main/java/org/apache/hadoop/fs/http/server/HttpFSServerWebServer.java
##########
@@ -178,6 +179,10 @@ public URL getUrl() {
     }
   }
 
+  public HttpServer2 getHttpServer() {

Review comment:
       Done.




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: common-issues-unsubscribe@hadoop.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



---------------------------------------------------------------------
To unsubscribe, e-mail: common-issues-unsubscribe@hadoop.apache.org
For additional commands, e-mail: common-issues-help@hadoop.apache.org


[GitHub] [hadoop] tomicooler commented on a change in pull request #3209: HDFS-16129. Fixing the signature secret file misusage in HttpFS.

Posted by GitBox <gi...@apache.org>.
tomicooler commented on a change in pull request #3209:
URL: https://github.com/apache/hadoop/pull/3209#discussion_r682656885



##########
File path: hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/http/HttpServer2.java
##########
@@ -473,8 +482,10 @@ public HttpServer2 build() throws IOException {
       HttpServer2 server = new HttpServer2(this);
 
       if (this.securityEnabled &&
-          !this.conf.get(authFilterConfigurationPrefix + "type").
-          equals(PseudoAuthenticationHandler.TYPE)) {
+          authFilterConfigurationPrefixes.stream().noneMatch(
+              prefix -> (prefix + "type")
+                  .equals(PseudoAuthenticationHandler.TYPE))
+      ) {

Review comment:
       Probably there is no test for this :/




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: common-issues-unsubscribe@hadoop.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



---------------------------------------------------------------------
To unsubscribe, e-mail: common-issues-unsubscribe@hadoop.apache.org
For additional commands, e-mail: common-issues-help@hadoop.apache.org


[GitHub] [hadoop] szilard-nemeth commented on a change in pull request #3209: HDFS-16129. Fixing the signature secret file misusage in HttpFS.

Posted by GitBox <gi...@apache.org>.
szilard-nemeth commented on a change in pull request #3209:
URL: https://github.com/apache/hadoop/pull/3209#discussion_r682504151



##########
File path: hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/http/HttpServer2.java
##########
@@ -811,18 +822,26 @@ private static SignerSecretProvider constructSecretProvider(final Builder b,
       throws Exception {
     final Configuration conf = b.conf;
     Properties config = getFilterProperties(conf,
-                                            b.authFilterConfigurationPrefix);
+        b.authFilterConfigurationPrefixes);
     return AuthenticationFilter.constructSecretProvider(
         ctx, config, b.disallowFallbackToRandomSignerSecretProvider);
   }
 
-  private static Properties getFilterProperties(Configuration conf, String
-      prefix) {
-    Properties prop = new Properties();
-    Map<String, String> filterConfig = AuthenticationFilterInitializer
-        .getFilterConfigMap(conf, prefix);
-    prop.putAll(filterConfig);
-    return prop;
+  public static Properties getFilterProperties(Configuration conf,
+                                                ArrayList<String> prefixes) {
+    Properties props = new Properties();
+    prefixes.forEach(prefix -> {
+      Map<String, String> filterConfigMap =
+          AuthenticationFilterInitializer.getFilterConfigMap(conf, prefix);
+      filterConfigMap.forEach((key, value) -> {
+        Object previous = props.setProperty(key, value);
+        if (previous != null && !previous.equals(value)) {
+          LOG.warn("Overwriting configuration for key='{}' with value='{}' " +
+              "previous_value='{}'", key, value, previous);
+        }
+      });
+    });

Review comment:
       I do agree with this.




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: common-issues-unsubscribe@hadoop.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



---------------------------------------------------------------------
To unsubscribe, e-mail: common-issues-unsubscribe@hadoop.apache.org
For additional commands, e-mail: common-issues-help@hadoop.apache.org


[GitHub] [hadoop] tomicooler commented on a change in pull request #3209: HDFS-16129. Fixing the signature secret file misusage in HttpFS.

Posted by GitBox <gi...@apache.org>.
tomicooler commented on a change in pull request #3209:
URL: https://github.com/apache/hadoop/pull/3209#discussion_r682567551



##########
File path: hadoop-hdfs-project/hadoop-hdfs-httpfs/src/main/java/org/apache/hadoop/fs/http/server/HttpFSAuthenticationFilter.java
##########
@@ -69,27 +72,11 @@
   @Override
   protected Properties getConfiguration(String configPrefix,
       FilterConfig filterConfig) throws ServletException{
-    Properties props = new Properties();
+    System.out.println("getConfiguration1");

Review comment:
       Done.

##########
File path: hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/http/HttpServer2.java
##########
@@ -811,18 +822,26 @@ private static SignerSecretProvider constructSecretProvider(final Builder b,
       throws Exception {
     final Configuration conf = b.conf;
     Properties config = getFilterProperties(conf,
-                                            b.authFilterConfigurationPrefix);
+        b.authFilterConfigurationPrefixes);
     return AuthenticationFilter.constructSecretProvider(
         ctx, config, b.disallowFallbackToRandomSignerSecretProvider);
   }
 
-  private static Properties getFilterProperties(Configuration conf, String
-      prefix) {
-    Properties prop = new Properties();
-    Map<String, String> filterConfig = AuthenticationFilterInitializer
-        .getFilterConfigMap(conf, prefix);
-    prop.putAll(filterConfig);
-    return prop;
+  public static Properties getFilterProperties(Configuration conf,
+                                                ArrayList<String> prefixes) {
+    Properties props = new Properties();
+    prefixes.forEach(prefix -> {
+      Map<String, String> filterConfigMap =
+          AuthenticationFilterInitializer.getFilterConfigMap(conf, prefix);
+      filterConfigMap.forEach((key, value) -> {
+        Object previous = props.setProperty(key, value);
+        if (previous != null && !previous.equals(value)) {
+          LOG.warn("Overwriting configuration for key='{}' with value='{}' " +
+              "previous_value='{}'", key, value, previous);

Review comment:
       Done.




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: common-issues-unsubscribe@hadoop.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



---------------------------------------------------------------------
To unsubscribe, e-mail: common-issues-unsubscribe@hadoop.apache.org
For additional commands, e-mail: common-issues-help@hadoop.apache.org


[GitHub] [hadoop] shuzirra commented on pull request #3209: HDFS-16129. Fixing the signature secret file misusage in HttpFS.

Posted by GitBox <gi...@apache.org>.
shuzirra commented on pull request #3209:
URL: https://github.com/apache/hadoop/pull/3209#issuecomment-922883418


   Thank you @tomicooler for working on this and for updating it to use the deprecation mechanism of the Configuration class. LGTM +1.
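   For context, the mechanism being referred to is Configuration's key-deprecation support; a minimal sketch, where both key names are illustrative rather than the ones from the patch:

   ```java
   import org.apache.hadoop.conf.Configuration;

   // Reads and writes of the old key are transparently mapped to the new
   // key, and a deprecation warning is logged when the old key is used.
   Configuration.addDeprecation(
       "httpfs.authentication.signature.secret.file",
       "hadoop.http.authentication.signature.secret.file");
   ```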


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: common-issues-unsubscribe@hadoop.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



---------------------------------------------------------------------
To unsubscribe, e-mail: common-issues-unsubscribe@hadoop.apache.org
For additional commands, e-mail: common-issues-help@hadoop.apache.org


[GitHub] [hadoop] szilard-nemeth commented on a change in pull request #3209: HDFS-16129. Fixing the signature secret file misusage in HttpFS.

Posted by GitBox <gi...@apache.org>.
szilard-nemeth commented on a change in pull request #3209:
URL: https://github.com/apache/hadoop/pull/3209#discussion_r682491876



##########
File path: hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/http/HttpServer2.java
##########
@@ -473,8 +482,10 @@ public HttpServer2 build() throws IOException {
       HttpServer2 server = new HttpServer2(this);
 
       if (this.securityEnabled &&
-          !this.conf.get(authFilterConfigurationPrefix + "type").
-          equals(PseudoAuthenticationHandler.TYPE)) {
+          authFilterConfigurationPrefixes.stream().noneMatch(
+              prefix -> (prefix + "type")
+                  .equals(PseudoAuthenticationHandler.TYPE))
+      ) {

Review comment:
       Maybe I'm missing something, but this is not doing the same as the old code block.
   The old code compared the configuration value stored under prefix + "type" against PseudoAuthenticationHandler.TYPE.
   The new code only checks whether the key string prefix + "type" itself equals the type, without ever reading the configuration.
   Is this intentional?
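   Spelled out as a sketch, using the names from the diff:

   ```java
   // Old behaviour: reads the configured value and compares it.
   boolean wasPseudo = this.conf.get(authFilterConfigurationPrefix + "type")
       .equals(PseudoAuthenticationHandler.TYPE);

   // New code as written: compares the key string itself and never reads
   // conf, so it can never equal the handler type and noneMatch is always
   // true, regardless of what is configured.
   boolean isPseudo = (prefix + "type")
       .equals(PseudoAuthenticationHandler.TYPE);
   ```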

##########
File path: hadoop-hdfs-project/hadoop-hdfs-httpfs/src/test/java/org/apache/hadoop/fs/http/server/TestHttpFSServerWebServer.java
##########
@@ -71,53 +82,195 @@ public static void beforeClass() throws Exception {
     System.setProperty("httpfs.home.dir", homeDir.getAbsolutePath());
     System.setProperty("httpfs.log.dir", logsDir.getAbsolutePath());
     System.setProperty("httpfs.config.dir", confDir.getAbsolutePath());
-    FileUtils.writeStringToFile(new File(confDir, "httpfs-signature.secret"),
-        "foo", StandardCharsets.UTF_8);
+    secretFile = new File(System.getProperty("httpfs.config.dir"),
+        "httpfs-signature-custom.secret");
   }
 
-  @Before
-  public void setUp() throws Exception {
-    Configuration conf = new Configuration();
-    conf.set(HttpFSServerWebServer.HTTP_HOSTNAME_KEY, "localhost");
-    conf.setInt(HttpFSServerWebServer.HTTP_PORT_KEY, 0);
-    conf.set(AuthenticationFilter.SIGNATURE_SECRET_FILE,
-        "httpfs-signature.secret");
-    Configuration sslConf = new Configuration();
-    webServer = new HttpFSServerWebServer(conf, sslConf);
+  @After
+  public void teardown() throws Exception {
+    if (webServer != null) {
+      webServer.stop();
+    }
   }
 
   @Test
   public void testStartStop() throws Exception {
+    webServer = createWebServer(createConfigurationWithRandomSecret());
     webServer.start();
-    String user = HadoopUsersConfTestHelper.getHadoopUsers()[0];
-    URL url = new URL(webServer.getUrl(), MessageFormat.format(
-        "/webhdfs/v1/?user.name={0}&op=liststatus", user));
-    HttpURLConnection conn = (HttpURLConnection) url.openConnection();
-    Assert.assertEquals(HttpURLConnection.HTTP_OK, conn.getResponseCode());
-    BufferedReader reader = new BufferedReader(
-        new InputStreamReader(conn.getInputStream()));
-    reader.readLine();
-    reader.close();

Review comment:
       I don't get why this code block was deleted.
   I also don't see a call to assertServiceRespondsWithOK anywhere.
   Please elaborate.

##########
File path: hadoop-hdfs-project/hadoop-hdfs-httpfs/src/main/java/org/apache/hadoop/fs/http/server/HttpFSAuthenticationFilter.java
##########
@@ -69,27 +72,11 @@
   @Override
   protected Properties getConfiguration(String configPrefix,
       FilterConfig filterConfig) throws ServletException{
-    Properties props = new Properties();
+    System.out.println("getConfiguration1");
     Configuration conf = HttpFSServerWebApp.get().getConfig();
-
-    props.setProperty(AuthenticationFilter.COOKIE_PATH, "/");

Review comment:
       Have you intentionally deleted this line?
   ```
   props.setProperty(AuthenticationFilter.COOKIE_PATH, "/");
   ```

##########
File path: hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/http/HttpServer2.java
##########
@@ -811,18 +822,26 @@ private static SignerSecretProvider constructSecretProvider(final Builder b,
       throws Exception {
     final Configuration conf = b.conf;
     Properties config = getFilterProperties(conf,
-                                            b.authFilterConfigurationPrefix);
+        b.authFilterConfigurationPrefixes);
     return AuthenticationFilter.constructSecretProvider(
         ctx, config, b.disallowFallbackToRandomSignerSecretProvider);
   }
 
-  private static Properties getFilterProperties(Configuration conf, String
-      prefix) {
-    Properties prop = new Properties();
-    Map<String, String> filterConfig = AuthenticationFilterInitializer
-        .getFilterConfigMap(conf, prefix);
-    prop.putAll(filterConfig);
-    return prop;
+  public static Properties getFilterProperties(Configuration conf,
+                                                ArrayList<String> prefixes) {

Review comment:
       Same as above: please use List instead of ArrayList; prefer interface types over concrete types.
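   A simplified sketch with the parameter declared against the interface; note that the per-key overwrite warning from the patch is omitted here:

   ```java
   // Accepting List lets callers pass any implementation without casts.
   public static Properties getFilterProperties(Configuration conf,
       List<String> prefixes) {
     Properties props = new Properties();
     for (String prefix : prefixes) {
       props.putAll(
           AuthenticationFilterInitializer.getFilterConfigMap(conf, prefix));
     }
     return props;
   }
   ```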

##########
File path: hadoop-hdfs-project/hadoop-hdfs-httpfs/src/test/java/org/apache/hadoop/fs/http/client/TestHttpFSFileSystemLocalFileSystem.java
##########
@@ -32,8 +32,6 @@
 import org.junit.runners.Parameterized;
 
 import java.io.File;
-import java.net.URI;

Review comment:
       These removals seem unrelated to the patch; please revert them if they are not crucial to it.

##########
File path: hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/http/HttpServer2.java
##########
@@ -811,18 +822,26 @@ private static SignerSecretProvider constructSecretProvider(final Builder b,
       throws Exception {
     final Configuration conf = b.conf;
     Properties config = getFilterProperties(conf,
-                                            b.authFilterConfigurationPrefix);
+        b.authFilterConfigurationPrefixes);
     return AuthenticationFilter.constructSecretProvider(
         ctx, config, b.disallowFallbackToRandomSignerSecretProvider);
   }
 
-  private static Properties getFilterProperties(Configuration conf, String
-      prefix) {
-    Properties prop = new Properties();
-    Map<String, String> filterConfig = AuthenticationFilterInitializer
-        .getFilterConfigMap(conf, prefix);
-    prop.putAll(filterConfig);
-    return prop;
+  public static Properties getFilterProperties(Configuration conf,
+                                                ArrayList<String> prefixes) {
+    Properties props = new Properties();
+    prefixes.forEach(prefix -> {
+      Map<String, String> filterConfigMap =
+          AuthenticationFilterInitializer.getFilterConfigMap(conf, prefix);
+      filterConfigMap.forEach((key, value) -> {
+        Object previous = props.setProperty(key, value);
+        if (previous != null && !previous.equals(value)) {
+          LOG.warn("Overwriting configuration for key='{}' with value='{}' " +
+              "previous_value='{}'", key, value, previous);

Review comment:
       ```suggestion
                 "previous value='{}'", key, value, previous);
   ```

##########
File path: hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/http/HttpServer2.java
##########
@@ -243,7 +243,9 @@
 
     private String hostName;
     private boolean disallowFallbackToRandomSignerSecretProvider;
-    private String authFilterConfigurationPrefix = "hadoop.http.authentication.";
+    private final ArrayList<String> authFilterConfigurationPrefixes =

Review comment:
       Use List instead of ArrayList<>. It's better practice to declare fields and local variables with interface types; see: https://stackoverflow.com/questions/9852831/polymorphism-why-use-list-list-new-arraylist-instead-of-arraylist-list-n

##########
File path: hadoop-hdfs-project/hadoop-hdfs-httpfs/src/main/java/org/apache/hadoop/fs/http/server/HttpFSServerWebServer.java
##########
@@ -178,6 +179,10 @@ public URL getUrl() {
     }
   }
 
+  public HttpServer2 getHttpServer() {

Review comment:
       Was this only added for testing? If so, please add a @VisibleForTesting annotation to the method.

##########
File path: hadoop-hdfs-project/hadoop-hdfs-httpfs/src/main/java/org/apache/hadoop/fs/http/server/HttpFSAuthenticationFilter.java
##########
@@ -69,27 +72,11 @@
   @Override
   protected Properties getConfiguration(String configPrefix,
       FilterConfig filterConfig) throws ServletException{
-    Properties props = new Properties();
+    System.out.println("getConfiguration1");

Review comment:
       Please get rid of all the System.out.println calls or, alternatively, use the Logger.info method.
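   A sketch of the usual replacement (SLF4J, which is used broadly across Hadoop; the example log call is illustrative):

   ```java
   import org.slf4j.Logger;
   import org.slf4j.LoggerFactory;

   // Class-level SLF4J logger instead of System.out.println.
   private static final Logger LOG =
       LoggerFactory.getLogger(HttpFSAuthenticationFilter.class);

   // Example call at an appropriate level:
   // LOG.debug("getConfiguration called with prefix {}", configPrefix);
   ```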

##########
File path: hadoop-hdfs-project/hadoop-hdfs-httpfs/src/test/java/org/apache/hadoop/fs/http/server/TestHttpFSServerWebServer.java
##########
@@ -71,53 +82,195 @@ public static void beforeClass() throws Exception {
     System.setProperty("httpfs.home.dir", homeDir.getAbsolutePath());
     System.setProperty("httpfs.log.dir", logsDir.getAbsolutePath());
     System.setProperty("httpfs.config.dir", confDir.getAbsolutePath());
-    FileUtils.writeStringToFile(new File(confDir, "httpfs-signature.secret"),
-        "foo", StandardCharsets.UTF_8);
+    secretFile = new File(System.getProperty("httpfs.config.dir"),
+        "httpfs-signature-custom.secret");
   }
 
-  @Before
-  public void setUp() throws Exception {
-    Configuration conf = new Configuration();
-    conf.set(HttpFSServerWebServer.HTTP_HOSTNAME_KEY, "localhost");
-    conf.setInt(HttpFSServerWebServer.HTTP_PORT_KEY, 0);
-    conf.set(AuthenticationFilter.SIGNATURE_SECRET_FILE,
-        "httpfs-signature.secret");
-    Configuration sslConf = new Configuration();
-    webServer = new HttpFSServerWebServer(conf, sslConf);
+  @After
+  public void teardown() throws Exception {
+    if (webServer != null) {
+      webServer.stop();
+    }
   }
 
   @Test
   public void testStartStop() throws Exception {
+    webServer = createWebServer(createConfigurationWithRandomSecret());
     webServer.start();
-    String user = HadoopUsersConfTestHelper.getHadoopUsers()[0];
-    URL url = new URL(webServer.getUrl(), MessageFormat.format(
-        "/webhdfs/v1/?user.name={0}&op=liststatus", user));
-    HttpURLConnection conn = (HttpURLConnection) url.openConnection();
-    Assert.assertEquals(HttpURLConnection.HTTP_OK, conn.getResponseCode());
-    BufferedReader reader = new BufferedReader(
-        new InputStreamReader(conn.getInputStream()));
-    reader.readLine();
-    reader.close();
     webServer.stop();
   }
 
   @Test
   public void testJustStop() throws Exception {
+    webServer = createWebServer(createConfigurationWithRandomSecret());
     webServer.stop();
   }
 
   @Test
   public void testDoubleStop() throws Exception {
+    webServer = createWebServer(createConfigurationWithRandomSecret());
     webServer.start();
     webServer.stop();
     webServer.stop();
   }
 
   @Test
   public void testDoubleStart() throws Exception {
+    webServer = createWebServer(createConfigurationWithRandomSecret());
+    webServer.start();
+    webServer.start();
+    webServer.stop();
+  }
+
+  @Test
+  public void testServiceWithSecretFile() throws Exception {
+    createSecretFile("foo");
+    webServer = createWebServer(createConfigurationWithSecretFile());
+    webServer.start();
+    assertServiceRespondsWithOK(webServer.getUrl());
+    assertSignerSecretProviderType(webServer.getHttpServer(),
+        FileSignerSecretProvider.class);
+    webServer.stop();
+  }
+
+  @Test
+  public void testServiceWithSecretFileWithDeprecatedConfigOnly()
+      throws Exception {
+    createSecretFile("foo");
+    Configuration conf = createConfiguration();
+    setDeprecatedSecretFile(conf, secretFile.getAbsolutePath());
+    webServer = createWebServer(conf);
+
+    // HttpFSAuthenticationFilter gets the configuration from the webapp, the defaults are loaded
+    // so the custom file location is overwritten with a non-existing default one.
+    Exception exception =
+        Assert.assertThrows(IOException.class, new ThrowingRunnable() {
+          @Override
+          public void run() throws Throwable {
+            webServer.start();
+          }
+        });
+    Assert.assertTrue(exception.getMessage().startsWith(
+        "Unable to initialize"));
+  }
+
+  @Test
+  public void testServiceWithSecretFileWithBothConfigOption() throws Exception {
+    createSecretFile("foo");
+    Configuration conf = createConfigurationWithSecretFile();
+    setDeprecatedSecretFile(conf, secretFile.getAbsolutePath());
+    webServer = createWebServer(conf);
+    webServer.start();
+    assertServiceRespondsWithOK(webServer.getUrl());
+    assertSignerSecretProviderType(webServer.getHttpServer(),
+        FileSignerSecretProvider.class);
+    webServer.stop();
+  }
+
+  @Test
+  public void testServiceWithSecretFileWithBothConfigOptionOverWriteCheck()

Review comment:
       Nit: testServiceWithSecretFileWithBothConfigOptionsOverwriteCheck
   ("options" plural; "overwrite" as one word)

##########
File path: hadoop-hdfs-project/hadoop-hdfs-httpfs/src/test/java/org/apache/hadoop/fs/http/server/TestHttpFSServerWebServer.java
##########
@@ -71,53 +82,195 @@ public static void beforeClass() throws Exception {
     System.setProperty("httpfs.home.dir", homeDir.getAbsolutePath());
     System.setProperty("httpfs.log.dir", logsDir.getAbsolutePath());
     System.setProperty("httpfs.config.dir", confDir.getAbsolutePath());
-    FileUtils.writeStringToFile(new File(confDir, "httpfs-signature.secret"),
-        "foo", StandardCharsets.UTF_8);
+    secretFile = new File(System.getProperty("httpfs.config.dir"),
+        "httpfs-signature-custom.secret");
   }
 
-  @Before
-  public void setUp() throws Exception {
-    Configuration conf = new Configuration();
-    conf.set(HttpFSServerWebServer.HTTP_HOSTNAME_KEY, "localhost");
-    conf.setInt(HttpFSServerWebServer.HTTP_PORT_KEY, 0);
-    conf.set(AuthenticationFilter.SIGNATURE_SECRET_FILE,
-        "httpfs-signature.secret");
-    Configuration sslConf = new Configuration();
-    webServer = new HttpFSServerWebServer(conf, sslConf);
+  @After
+  public void teardown() throws Exception {
+    if (webServer != null) {
+      webServer.stop();
+    }
   }
 
   @Test
   public void testStartStop() throws Exception {
+    webServer = createWebServer(createConfigurationWithRandomSecret());
     webServer.start();
-    String user = HadoopUsersConfTestHelper.getHadoopUsers()[0];
-    URL url = new URL(webServer.getUrl(), MessageFormat.format(
-        "/webhdfs/v1/?user.name={0}&op=liststatus", user));
-    HttpURLConnection conn = (HttpURLConnection) url.openConnection();
-    Assert.assertEquals(HttpURLConnection.HTTP_OK, conn.getResponseCode());
-    BufferedReader reader = new BufferedReader(
-        new InputStreamReader(conn.getInputStream()));
-    reader.readLine();
-    reader.close();
     webServer.stop();
   }
 
   @Test
   public void testJustStop() throws Exception {
+    webServer = createWebServer(createConfigurationWithRandomSecret());
     webServer.stop();
   }
 
   @Test
   public void testDoubleStop() throws Exception {
+    webServer = createWebServer(createConfigurationWithRandomSecret());
     webServer.start();
     webServer.stop();
     webServer.stop();
   }
 
   @Test
   public void testDoubleStart() throws Exception {
+    webServer = createWebServer(createConfigurationWithRandomSecret());
+    webServer.start();
+    webServer.start();
+    webServer.stop();
+  }
+
+  @Test
+  public void testServiceWithSecretFile() throws Exception {
+    createSecretFile("foo");
+    webServer = createWebServer(createConfigurationWithSecretFile());
+    webServer.start();
+    assertServiceRespondsWithOK(webServer.getUrl());
+    assertSignerSecretProviderType(webServer.getHttpServer(),
+        FileSignerSecretProvider.class);
+    webServer.stop();
+  }
+
+  @Test
+  public void testServiceWithSecretFileWithDeprecatedConfigOnly()
+      throws Exception {
+    createSecretFile("foo");
+    Configuration conf = createConfiguration();
+    setDeprecatedSecretFile(conf, secretFile.getAbsolutePath());
+    webServer = createWebServer(conf);
+
+    // HttpFSAuthenticationFilter gets the configuration from the webapp; the defaults are loaded,
+    // so the custom file location is overwritten with a non-existent default one.
+    Exception exception =
+        Assert.assertThrows(IOException.class, new ThrowingRunnable() {
+          @Override
+          public void run() throws Throwable {
+            webServer.start();
+          }
+        });
+    Assert.assertTrue(exception.getMessage().startsWith(
+        "Unable to initialize"));
+  }
+
+  @Test
+  public void testServiceWithSecretFileWithBothConfigOption() throws Exception {
+    createSecretFile("foo");
+    Configuration conf = createConfigurationWithSecretFile();
+    setDeprecatedSecretFile(conf, secretFile.getAbsolutePath());
+    webServer = createWebServer(conf);
+    webServer.start();
+    assertServiceRespondsWithOK(webServer.getUrl());
+    assertSignerSecretProviderType(webServer.getHttpServer(),
+        FileSignerSecretProvider.class);
+    webServer.stop();
+  }
+
+  @Test
+  public void testServiceWithSecretFileWithBothConfigOptionOverWriteCheck()
+      throws Exception {
+    createSecretFile("foo");
+    Configuration conf = createConfigurationWithSecretFile();
+    setDeprecatedSecretFile(conf, "will-be-overwritten");
+    webServer = createWebServer(conf);
+    webServer.start();
+    assertServiceRespondsWithOK(webServer.getUrl());
+    assertSignerSecretProviderType(webServer.getHttpServer(),
+        FileSignerSecretProvider.class);
+    webServer.stop();
+  }
+
+  @Test
+  public void testServiceWithMissingSecretFile() throws Exception {
+    webServer = createWebServer(createConfigurationWithSecretFile());
     webServer.start();
+    assertServiceRespondsWithOK(webServer.getUrl());
+    assertSignerSecretProviderType(webServer.getHttpServer(),
+        RandomSignerSecretProvider.class);
+    webServer.stop();
+  }
+
+  @Test
+  public void testServiceWithEmptySecretFile() throws Exception {
+    // The AuthenticationFilter.constructSecretProvider will do the fallback
+    // to the random secrets not the HttpFSAuthenticationFilter.
+    createSecretFile("");
+    webServer = createWebServer(createConfigurationWithSecretFile());
     webServer.start();
+    assertServiceRespondsWithOK(webServer.getUrl());
+    assertSignerSecretProviderType(webServer.getHttpServer(),
+        RandomSignerSecretProvider.class);
     webServer.stop();
   }
 
+  private <T extends SignerSecretProvider> void assertSignerSecretProviderType(
+      HttpServer2 server, Class<T> expected) {
+    SignerSecretProvider secretProvider = (SignerSecretProvider)
+        server.getWebAppContext().getServletContext()
+            .getAttribute(SIGNER_SECRET_PROVIDER_ATTRIBUTE);
+    Assert.assertNotNull(secretProvider);
+    Assert.assertEquals(expected, secretProvider.getClass());

Review comment:
       Please add assertion messages.
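   For example (a sketch; the exact wording of the messages is up to the
   author):

       Assert.assertNotNull("No SignerSecretProvider found in the servlet"
           + " context", secretProvider);
       Assert.assertEquals("Unexpected SignerSecretProvider type",
           expected, secretProvider.getClass());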

##########
File path: hadoop-hdfs-project/hadoop-hdfs-httpfs/src/test/java/org/apache/hadoop/fs/http/server/TestHttpFSServerWebServer.java
##########
@@ -71,53 +82,195 @@ public static void beforeClass() throws Exception {
     System.setProperty("httpfs.home.dir", homeDir.getAbsolutePath());
     System.setProperty("httpfs.log.dir", logsDir.getAbsolutePath());
     System.setProperty("httpfs.config.dir", confDir.getAbsolutePath());
-    FileUtils.writeStringToFile(new File(confDir, "httpfs-signature.secret"),
-        "foo", StandardCharsets.UTF_8);
+    secretFile = new File(System.getProperty("httpfs.config.dir"),
+        "httpfs-signature-custom.secret");
   }
 
-  @Before
-  public void setUp() throws Exception {
-    Configuration conf = new Configuration();
-    conf.set(HttpFSServerWebServer.HTTP_HOSTNAME_KEY, "localhost");
-    conf.setInt(HttpFSServerWebServer.HTTP_PORT_KEY, 0);
-    conf.set(AuthenticationFilter.SIGNATURE_SECRET_FILE,
-        "httpfs-signature.secret");
-    Configuration sslConf = new Configuration();
-    webServer = new HttpFSServerWebServer(conf, sslConf);
+  @After
+  public void teardown() throws Exception {
+    if (webServer != null) {
+      webServer.stop();
+    }
   }
 
   @Test
   public void testStartStop() throws Exception {
+    webServer = createWebServer(createConfigurationWithRandomSecret());
     webServer.start();
-    String user = HadoopUsersConfTestHelper.getHadoopUsers()[0];
-    URL url = new URL(webServer.getUrl(), MessageFormat.format(
-        "/webhdfs/v1/?user.name={0}&op=liststatus", user));
-    HttpURLConnection conn = (HttpURLConnection) url.openConnection();
-    Assert.assertEquals(HttpURLConnection.HTTP_OK, conn.getResponseCode());
-    BufferedReader reader = new BufferedReader(
-        new InputStreamReader(conn.getInputStream()));
-    reader.readLine();
-    reader.close();
     webServer.stop();
   }
 
   @Test
   public void testJustStop() throws Exception {
+    webServer = createWebServer(createConfigurationWithRandomSecret());
     webServer.stop();
   }
 
   @Test
   public void testDoubleStop() throws Exception {
+    webServer = createWebServer(createConfigurationWithRandomSecret());
     webServer.start();
     webServer.stop();
     webServer.stop();
   }
 
   @Test
   public void testDoubleStart() throws Exception {
+    webServer = createWebServer(createConfigurationWithRandomSecret());
+    webServer.start();
+    webServer.start();
+    webServer.stop();
+  }
+
+  @Test
+  public void testServiceWithSecretFile() throws Exception {
+    createSecretFile("foo");
+    webServer = createWebServer(createConfigurationWithSecretFile());
+    webServer.start();
+    assertServiceRespondsWithOK(webServer.getUrl());
+    assertSignerSecretProviderType(webServer.getHttpServer(),
+        FileSignerSecretProvider.class);
+    webServer.stop();
+  }
+
+  @Test
+  public void testServiceWithSecretFileWithDeprecatedConfigOnly()
+      throws Exception {
+    createSecretFile("foo");
+    Configuration conf = createConfiguration();
+    setDeprecatedSecretFile(conf, secretFile.getAbsolutePath());
+    webServer = createWebServer(conf);
+
+    // HttpFSAuthenticationFilter gets the configuration from the webapp; the defaults are loaded,
+    // so the custom file location is overwritten with a non-existent default one.
+    Exception exception =
+        Assert.assertThrows(IOException.class, new ThrowingRunnable() {
+          @Override
+          public void run() throws Throwable {
+            webServer.start();
+          }
+        });
+    Assert.assertTrue(exception.getMessage().startsWith(
+        "Unable to initialize"));
+  }
+
+  @Test
+  public void testServiceWithSecretFileWithBothConfigOption() throws Exception {

Review comment:
       Nit: testServiceWithSecretFileWithBothConfigOptions (plural)

##########
File path: hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/http/HttpServer2.java
##########
@@ -811,18 +822,26 @@ private static SignerSecretProvider constructSecretProvider(final Builder b,
       throws Exception {
     final Configuration conf = b.conf;
     Properties config = getFilterProperties(conf,
-                                            b.authFilterConfigurationPrefix);
+        b.authFilterConfigurationPrefixes);
     return AuthenticationFilter.constructSecretProvider(
         ctx, config, b.disallowFallbackToRandomSignerSecretProvider);
   }
 
-  private static Properties getFilterProperties(Configuration conf, String
-      prefix) {
-    Properties prop = new Properties();
-    Map<String, String> filterConfig = AuthenticationFilterInitializer
-        .getFilterConfigMap(conf, prefix);
-    prop.putAll(filterConfig);
-    return prop;
+  public static Properties getFilterProperties(Configuration conf,
+                                                ArrayList<String> prefixes) {
+    Properties props = new Properties();
+    prefixes.forEach(prefix -> {
+      Map<String, String> filterConfigMap =
+          AuthenticationFilterInitializer.getFilterConfigMap(conf, prefix);
+      filterConfigMap.forEach((key, value) -> {
+        Object previous = props.setProperty(key, value);
+        if (previous != null && !previous.equals(value)) {
+          LOG.warn("Overwriting configuration for key='{}' with value='{}' " +
+              "previous_value='{}'", key, value, previous);
+        }
+      });
+    });

Review comment:
       I do agree with this.

##########
File path: hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/http/HttpServer2.java
##########
@@ -473,8 +482,10 @@ public HttpServer2 build() throws IOException {
       HttpServer2 server = new HttpServer2(this);
 
       if (this.securityEnabled &&
-          !this.conf.get(authFilterConfigurationPrefix + "type").
-          equals(PseudoAuthenticationHandler.TYPE)) {
+          authFilterConfigurationPrefixes.stream().noneMatch(
+              prefix -> (prefix + "type")
+                  .equals(PseudoAuthenticationHandler.TYPE))
+      ) {

Review comment:
       Follow-up question then: how come this mistake was not caught by any of the tests?

##########
File path: hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/http/HttpServer2.java
##########
@@ -473,8 +482,10 @@ public HttpServer2 build() throws IOException {
       HttpServer2 server = new HttpServer2(this);
 
       if (this.securityEnabled &&
-          !this.conf.get(authFilterConfigurationPrefix + "type").
-          equals(PseudoAuthenticationHandler.TYPE)) {
+          authFilterConfigurationPrefixes.stream().noneMatch(
+              prefix -> (prefix + "type")
+                  .equals(PseudoAuthenticationHandler.TYPE))
+      ) {

Review comment:
       I see. You can file a follow-up JIRA for this if you want to :)
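
   For context, the pre-existing single-prefix check compared the configured
   value against PseudoAuthenticationHandler.TYPE, so the multi-prefix version
   presumably needs a conf lookup rather than comparing the key name itself.
   A minimal sketch of the corrected condition (assuming the same Builder
   fields as in the patch):

       if (this.securityEnabled &&
           authFilterConfigurationPrefixes.stream().noneMatch(
               prefix -> PseudoAuthenticationHandler.TYPE.equals(
                   this.conf.get(prefix + "type")))) {
         // SPNEGO initialization as before
       }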




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: common-issues-unsubscribe@hadoop.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



---------------------------------------------------------------------
To unsubscribe, e-mail: common-issues-unsubscribe@hadoop.apache.org
For additional commands, e-mail: common-issues-help@hadoop.apache.org


[GitHub] [hadoop] tomicooler commented on a change in pull request #3209: HDFS-16129. Fixing the signature secret file misusage in HttpFS.

Posted by GitBox <gi...@apache.org>.
tomicooler commented on a change in pull request #3209:
URL: https://github.com/apache/hadoop/pull/3209#discussion_r682576769



##########
File path: hadoop-hdfs-project/hadoop-hdfs-httpfs/src/main/java/org/apache/hadoop/fs/http/server/HttpFSAuthenticationFilter.java
##########
@@ -69,27 +72,11 @@
   @Override
   protected Properties getConfiguration(String configPrefix,
       FilterConfig filterConfig) throws ServletException{
-    Properties props = new Properties();
+    System.out.println("getConfiguration1");
     Configuration conf = HttpFSServerWebApp.get().getConfig();
-
-    props.setProperty(AuthenticationFilter.COOKIE_PATH, "/");
-    for (Map.Entry<String, String> entry : conf) {
-      String name = entry.getKey();
-      if (name.startsWith(HADOOP_HTTP_CONF_PREFIX)) {
-        name = name.substring(HADOOP_HTTP_CONF_PREFIX.length());
-        props.setProperty(name, entry.getValue());
-      }
-    }
-
-    // Replace Hadoop Http Authentication Configs with HttpFS specific Configs
-    for (Map.Entry<String, String> entry : conf) {
-      String name = entry.getKey();
-      if (name.startsWith(CONF_PREFIX)) {
-        String value = conf.get(name);
-        name = name.substring(CONF_PREFIX.length());
-        props.setProperty(name, value);
-      }
-    }
+    Properties props = HttpServer2.getFilterProperties(conf,
+        new ArrayList<>(Arrays.asList(CONF_PREFIXES)));
+    System.out.println("getConfiguration2");

Review comment:
       Done.




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: common-issues-unsubscribe@hadoop.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



---------------------------------------------------------------------
To unsubscribe, e-mail: common-issues-unsubscribe@hadoop.apache.org
For additional commands, e-mail: common-issues-help@hadoop.apache.org


[GitHub] [hadoop] tomicooler commented on a change in pull request #3209: HDFS-16129. Fixing the signature secret file misusage in HttpFS.

Posted by GitBox <gi...@apache.org>.
tomicooler commented on a change in pull request #3209:
URL: https://github.com/apache/hadoop/pull/3209#discussion_r682576689



##########
File path: hadoop-hdfs-project/hadoop-hdfs-httpfs/src/main/java/org/apache/hadoop/fs/http/server/HttpFSAuthenticationFilter.java
##########
@@ -98,6 +85,7 @@ protected Properties getConfiguration(String configPrefix,
     }
 
     if (!isRandomSecret(filterConfig)) {
+      System.out.println("FILE: " + signatureSecretFile);

Review comment:
       Done.




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: common-issues-unsubscribe@hadoop.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



---------------------------------------------------------------------
To unsubscribe, e-mail: common-issues-unsubscribe@hadoop.apache.org
For additional commands, e-mail: common-issues-help@hadoop.apache.org


[GitHub] [hadoop] tomicooler commented on a change in pull request #3209: HDFS-16129. Fixing the signature secret file misusage in HttpFS.

Posted by GitBox <gi...@apache.org>.
tomicooler commented on a change in pull request #3209:
URL: https://github.com/apache/hadoop/pull/3209#discussion_r682509467



##########
File path: hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/http/HttpServer2.java
##########
@@ -811,18 +822,26 @@ private static SignerSecretProvider constructSecretProvider(final Builder b,
       throws Exception {
     final Configuration conf = b.conf;
     Properties config = getFilterProperties(conf,
-                                            b.authFilterConfigurationPrefix);
+        b.authFilterConfigurationPrefixes);
     return AuthenticationFilter.constructSecretProvider(
         ctx, config, b.disallowFallbackToRandomSignerSecretProvider);
   }
 
-  private static Properties getFilterProperties(Configuration conf, String
-      prefix) {
-    Properties prop = new Properties();
-    Map<String, String> filterConfig = AuthenticationFilterInitializer
-        .getFilterConfigMap(conf, prefix);
-    prop.putAll(filterConfig);
-    return prop;
+  public static Properties getFilterProperties(Configuration conf,
+                                                ArrayList<String> prefixes) {
+    Properties props = new Properties();
+    prefixes.forEach(prefix -> {
+      Map<String, String> filterConfigMap =
+          AuthenticationFilterInitializer.getFilterConfigMap(conf, prefix);
+      filterConfigMap.forEach((key, value) -> {
+        Object previous = props.setProperty(key, value);
+        if (previous != null && !previous.equals(value)) {
+          LOG.warn("Overwriting configuration for key='{}' with value='{}' " +
+              "previous_value='{}'", key, value, previous);
+        }
+      });
+    });

Review comment:
       Ok, I'll rewrite it, thanks for the explanation. It seemed a little bit shorter with the forEach, and more readable with the named key and value variables (instead of entry.getKey() and entry.getValue()).
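
   For reference, a plain-loop version of the same merge logic might look like
   this (a sketch, not the final patch; it assumes the class's existing LOG
   field and the usual java.util imports):

       public static Properties getFilterProperties(Configuration conf,
           List<String> prefixes) {
         Properties props = new Properties();
         for (String prefix : prefixes) {
           Map<String, String> filterConfigMap =
               AuthenticationFilterInitializer.getFilterConfigMap(conf, prefix);
           for (Map.Entry<String, String> entry : filterConfigMap.entrySet()) {
             // Later prefixes win; warn when an earlier value is replaced.
             Object previous =
                 props.setProperty(entry.getKey(), entry.getValue());
             if (previous != null && !previous.equals(entry.getValue())) {
               LOG.warn("Overwriting configuration for key='{}' with value='{}'"
                   + " previous_value='{}'", entry.getKey(), entry.getValue(),
                   previous);
             }
           }
         }
         return props;
       }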




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: common-issues-unsubscribe@hadoop.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



---------------------------------------------------------------------
To unsubscribe, e-mail: common-issues-unsubscribe@hadoop.apache.org
For additional commands, e-mail: common-issues-help@hadoop.apache.org