Posted to common-issues@hadoop.apache.org by "dannycjones (via GitHub)" <gi...@apache.org> on 2023/06/22 13:02:46 UTC

[GitHub] [hadoop] dannycjones commented on a diff in pull request #5763: HADOOP-18778. Fixes failing tests when CSE is enabled.

dannycjones commented on code in PR #5763:
URL: https://github.com/apache/hadoop/pull/5763#discussion_r1238487030


##########
hadoop-tools/hadoop-aws/src/test/java/org/apache/hadoop/fs/s3a/auth/ITestAssumeRole.java:
##########
@@ -448,7 +448,7 @@ public void testReadOnlyOperations() throws Throwable {
         policy(
             statement(false, S3_ALL_BUCKETS, S3_PATH_WRITE_OPERATIONS),
             STATEMENT_ALL_S3,
-            STATEMENT_ALLOW_SSE_KMS_READ));
+            STATEMENT_ALLOW_SSE_KMS_RW));

Review Comment:
   Why does this need changing? It's for Server Side Encryption. (Was it always broken?)
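
   For context, a hedged sketch of what a read-only versus read-write SSE-KMS grant typically looks like in IAM terms (the action lists here are assumptions based on standard SSE-KMS requirements, not copied from the Hadoop `RolePolicies` constants): reads under SSE-KMS need `kms:Decrypt`, while writes additionally need `kms:GenerateDataKey` to produce the per-object data key.

   ```json
   {
     "Version": "2012-10-17",
     "Statement": [
       {
         "Sid": "AllowSseKmsReadSketch",
         "Effect": "Allow",
         "Action": ["kms:Decrypt", "kms:DescribeKey"],
         "Resource": "*"
       },
       {
         "Sid": "AllowSseKmsWriteSketch",
         "Effect": "Allow",
         "Action": ["kms:GenerateDataKey"],
         "Resource": "*"
       }
     ]
   }
   ```

   If that reading is right, swapping `STATEMENT_ALLOW_SSE_KMS_READ` for `STATEMENT_ALLOW_SSE_KMS_RW` would matter once the test path issues KMS data-key generation calls, not just decrypts.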



##########
hadoop-tools/hadoop-aws/src/test/java/org/apache/hadoop/fs/s3a/ITestS3APrefetchingInputStream.java:
##########
@@ -118,6 +118,7 @@ private static int calculateNumBlocks(long largeFileSize, int blockSize) {
   @Test
   public void testReadLargeFileFully() throws Throwable {
     describe("read a large file fully, uses S3ACachingInputStream");
+    skipIfClientSideEncryption();

Review Comment:
   Shall we just move these into `openFS()` since we're assuming the FS it provides is not compatible with CSE for now?
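
   A minimal, self-contained sketch of that suggestion (hypothetical names throughout; the real `openFS()` and `skipIfClientSideEncryption()` live in the hadoop-aws test code and this standalone class only mirrors them): putting the guard in one shared setup step means each test body no longer needs its own skip call.

   ```java
   // Hypothetical illustration of moving a per-test skip into shared setup.
   public class SkipInSetupSketch {

       /** Stand-in for the S3A client-side-encryption switch. */
       static boolean cseEnabled = true;

       /** Stand-in for skipIfClientSideEncryption(): abort the test early. */
       static void skipIfClientSideEncryption() {
           if (cseEnabled) {
               throw new IllegalStateException("skipped: CSE enabled");
           }
       }

       /** Stand-in for openFS(): one central guard instead of one per test. */
       static void setup() {
           skipIfClientSideEncryption();
       }

       /** Runs a test body behind the shared setup guard. */
       static String runTest(Runnable body) {
           try {
               setup();
           } catch (IllegalStateException skipped) {
               return "skipped";
           }
           body.run();
           return "ran";
       }

       public static void main(String[] args) {
           // With CSE on, every test routed through setup() is skipped.
           System.out.println(runTest(() -> { /* testReadLargeFileFully body */ }));
           // With CSE off, the same body runs unchanged.
           cseEnabled = false;
           System.out.println(runTest(() -> { /* same body, CSE off */ }));
       }
   }
   ```

   The trade-off is the usual one for setup-level skips: the guard applies to the whole suite sharing that setup, which fits here only for as long as the provided FS is assumed incompatible with CSE across all of its tests.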



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: common-issues-unsubscribe@hadoop.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org

