Posted to issues@ozone.apache.org by GitBox <gi...@apache.org> on 2021/09/02 10:08:06 UTC
[GitHub] [ozone] dombizita opened a new pull request #2609: HDDS-5615 Add a simple test suite for HTTPFS GW
dombizita opened a new pull request #2609:
URL: https://github.com/apache/ozone/pull/2609
## What changes were proposed in this pull request?
Added robot tests for the HttpFS gateway. The tests use `curl` commands to exercise the operations and check the responses for the expected objects. The robot tests are wired into the compose/ozone test.sh script.
## What is the link to the Apache JIRA
https://issues.apache.org/jira/browse/HDDS-5615
## How was this patch tested?
The robot tests were run by the test.sh script in the compose/ozone directory and all the test cases passed.
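For reference, each test case boils down to a WebHDFS-style HTTP call against the gateway. A minimal sketch of how such a request URL could be assembled from the suite's ${URL} and ${USERNAME} variables (the `webhdfs_url` helper is hypothetical; the actual keyword lives in operations.robot and may differ):

```shell
# Hypothetical helper mirroring how a WebHDFS request URL is built from the
# suite variables; the real 'Execute curl command' keyword in operations.robot
# may differ in details.
URL="http://httpfs:14000/webhdfs/v1/"
USERNAME="hdfs"

webhdfs_url() {
  # $1: path under the root (may be empty), $2: operation plus any extra params
  echo "${URL}$1?user.name=${USERNAME}&op=$2"
}

# Build the URL for the "Create volume" test case.
webhdfs_url vol1 MKDIRS
```

A real invocation would then be along the lines of `curl -X PUT "$(webhdfs_url vol1 MKDIRS)"`.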
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: issues-unsubscribe@ozone.apache.org
For queries about this service, please contact Infrastructure at:
users@infra.apache.org
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@ozone.apache.org
For additional commands, e-mail: issues-help@ozone.apache.org
[GitHub] [ozone] fapifta commented on pull request #2609: HDDS-5615 Add a simple test suite for HTTPFS GW
Posted by GitBox <gi...@apache.org>.
fapifta commented on pull request #2609:
URL: https://github.com/apache/ozone/pull/2609#issuecomment-914483666
Hi @dombizita, I have taken a look at the changes and left a few inline comments requesting some improvements and clarification. Thank you for your work on this so far. I hope you don't mind implementing a few more requests to make the test more comprehensive; it would also be nice to at least clarify weird or unexpected behaviour in comments where appropriate. For details, please see the inline comments.
[GitHub] [ozone] dombizita commented on a change in pull request #2609: HDDS-5615 Add a simple test suite for HTTPFS GW
Posted by GitBox <gi...@apache.org>.
dombizita commented on a change in pull request #2609:
URL: https://github.com/apache/ozone/pull/2609#discussion_r708372222
##########
File path: hadoop-ozone/dist/src/main/smoketest/httpfs/operations_tests.robot
##########
@@ -0,0 +1,103 @@
+# Licensed to the Apache Software Foundation (ASF) under one or more
+# contributor license agreements. See the NOTICE file distributed with
+# this work for additional information regarding copyright ownership.
+# The ASF licenses this file to You under the Apache License, Version 2.0
+# (the "License"); you may not use this file except in compliance with
+# the License. You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+*** Settings ***
+Documentation HttpFS gateway test with curl commands
+Library Process
+Library String
+Library BuiltIn
+Resource operations.robot
+
+*** Variables ***
+${URL} http://httpfs:14000/webhdfs/v1/
+${USERNAME} hdfs
+
+*** Test Cases ***
+Create volume
+ ${volume} = Execute curl command vol1 MKDIRS -X PUT
+ Should contain ${volume} true
+
+Create first bucket
+ ${bucket} = Execute curl command vol1/buck1 MKDIRS -X PUT
+ Should contain ${bucket} true
+
+Create second bucket
+ ${bucket} = Execute curl command vol1/buck2 MKDIRS -X PUT
+ Should contain ${bucket} true
+
+Create local testfile
+ Create file testfile
+
+Create testfile
+ ${file} = Execute create file command vol1/buck1/testfile testfile
+ Should contain ${file} http://httpfs:14000/webhdfs/v1/vol1/buck1/testfile
+
+Read file
+ ${file} = Execute curl command vol1/buck1/testfile OPEN -L
+ Should contain ${file} Hello world!
+
+Delete bucket
+ ${bucket} = Execute curl command vol1/buck2 DELETE -X DELETE
+ Should contain ${bucket} true
+
+Get status of bucket
+ ${status} = Execute curl command vol1/buck1 GETFILESTATUS ${EMPTY}
+ Should contain ${status} FileStatus DIRECTORY
+
+Get status of file
+ ${status} = Execute curl command vol1/buck1/testfile GETFILESTATUS ${EMPTY}
+ Should contain ${status} FileStatus FILE 13
+
+List bucket
+ ${list} = Execute curl command vol1/buck1 LISTSTATUS ${EMPTY}
+ Should contain ${list} FileStatus testfile FILE 13
+
+List file
+ ${list} = Execute curl command vol1/buck1/testfile LISTSTATUS ${EMPTY}
+ Should contain ${list} FileStatus FILE 13
+
+List directory iteratively
+ ${list} = Execute curl command vol1 LISTSTATUS_BATCH&startAfter=buck1 ${EMPTY}
+ Should contain ${list} DirectoryListing buck1
+ Should not contain ${list} buck2
+
+Get content summary of directory
+ ${summary} = Execute curl command vol1 GETCONTENTSUMMARY ${EMPTY}
+ Should contain ${summary} ContentSummary "directoryCount":2 "fileCount":1
+
+Get quota usage of directory
+ ${usage} = Execute curl command vol1 GETQUOTAUSAGE ${EMPTY}
+ Should contain ${usage} QuotaUsage "fileAndDirectoryCount":3
+
+Get home directory
+ ${home} = Execute curl command ${EMPTY} GETHOMEDIRECTORY ${EMPTY}
+ Should contain ${home} "Path":"\\/user\\/hdfs"
+
+Get trash root
+ ${trash} = Execute curl command vol1/buck1/testfile GETTRASHROOT ${EMPTY}
+ Should contain ${trash} "Path":"\\/vol1\\/buck1\\/.Trash\\/hdfs"
+
+Set permission of bucket
+ Execute curl command vol1/buck1 SETPERMISSION&permission=7 -X PUT
Review comment:
Yes, it should be octal, thank you for pointing it out. I modified the keywords so I was able to check the resulting JSON object of the list status operation. Even though the return code was 0 and the JSON returned true, the change didn't happen. Thank you so much for the idea of testing the functionality another way. I commented out these test cases until we fix the operation.
[GitHub] [ozone] dombizita commented on a change in pull request #2609: HDDS-5615 Add a simple test suite for HTTPFS GW
Posted by GitBox <gi...@apache.org>.
dombizita commented on a change in pull request #2609:
URL: https://github.com/apache/ozone/pull/2609#discussion_r708373062
##########
File path: hadoop-ozone/dist/src/main/smoketest/httpfs/operations_tests.robot
##########
@@ -0,0 +1,103 @@
[...]
+
+Set replication factor of bucket
+ ${cmd} = Execute curl command vol1/buck1 SETREPLICATION&replication=2 -X PUT
Review comment:
Same as I mentioned before, the change didn't happen in this case either. I commented out the test case until we fix it.
##########
File path: hadoop-ozone/dist/src/main/smoketest/httpfs/operations_tests.robot
##########
@@ -0,0 +1,103 @@
[...]
+ Should contain ${cmd} true
+
+Set access and modification time of bucket
Review comment:
Same as I mentioned before, the change didn't happen in this case either. I commented out the test case until we fix it.
##########
File path: hadoop-ozone/dist/src/main/smoketest/httpfs/operations_tests.robot
##########
@@ -0,0 +1,103 @@
[...]
+Set access and modification time of bucket
+ Execute curl command vol1/buck1 SETTIMES&modificationtime=10&accesstime=10 -X PUT
+
+Set owner of bucket
Review comment:
Same as I mentioned before, the change didn't happen in this case either. I commented out the test case until we fix it.
[GitHub] [ozone] fapifta commented on pull request #2609: HDDS-5615 Add a simple test suite for HTTPFS GW
Posted by GitBox <gi...@apache.org>.
fapifta commented on pull request #2609:
URL: https://github.com/apache/ozone/pull/2609#issuecomment-931135838
Hi Zita, great work on this so far. It turned out that some of the functionality is not working properly; I am happy to see that the effort put into those tests is preserved, and just commenting them out is a good idea.
As a basic test set, I think it is good. We need to look further into the missing pieces and re-enable the tests, but we can do that in a separate effort.
If CI is fixed after reverting the curator dependency removal from the httpfs branch, I am +1 to commit this.
Can you please trigger a new CI run?
[GitHub] [ozone] dombizita commented on a change in pull request #2609: HDDS-5615 Add a simple test suite for HTTPFS GW
Posted by GitBox <gi...@apache.org>.
dombizita commented on a change in pull request #2609:
URL: https://github.com/apache/ozone/pull/2609#discussion_r708367878
##########
File path: hadoop-ozone/dist/src/main/smoketest/httpfs/operations_tests.robot
##########
@@ -0,0 +1,103 @@
[...]
+
+List directory iteratively
Review comment:
I moved this test case before the deletion of buck2 and kept startAfter=buck1; in that case it should list only buck2, because, as I understood from the documentation example (https://hadoop.apache.org/docs/stable/hadoop-project-dist/hadoop-hdfs/WebHDFS.html#Iteratively_List_a_Directory), the batch shouldn't include the startAfter entry itself. Unfortunately it doesn't work properly: it listed both buckets. Because of the missing functionality I commented out this test case, thank you for your comment!
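The startAfter behaviour described in the linked documentation can be modelled as follows; this is a sketch of the expected semantics, not the gateway's actual implementation: the batch resumes strictly after the marker entry in listing order.

```shell
# Model of the expected LISTSTATUS_BATCH 'startAfter' behaviour: entries are
# emitted only once the marker has been passed, so the marker itself is
# never listed again.
start_after() {
  marker="$1"; shift
  emit=""
  for entry in "$@"; do
    if [ -n "$emit" ]; then
      echo "$entry"
    fi
    if [ "$entry" = "$marker" ]; then
      emit="yes"
    fi
  done
}

# With both buckets present and startAfter=buck1, only buck2 should appear.
start_after buck1 buck1 buck2
```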
[GitHub] [ozone] fapifta merged pull request #2609: HDDS-5615 Add a simple test suite for HTTPFS GW
Posted by GitBox <gi...@apache.org>.
fapifta merged pull request #2609:
URL: https://github.com/apache/ozone/pull/2609
[GitHub] [ozone] fapifta commented on a change in pull request #2609: HDDS-5615 Add a simple test suite for HTTPFS GW
Posted by GitBox <gi...@apache.org>.
fapifta commented on a change in pull request #2609:
URL: https://github.com/apache/ozone/pull/2609#discussion_r703689669
##########
File path: hadoop-ozone/dist/src/main/smoketest/httpfs/operations_tests.robot
##########
@@ -0,0 +1,103 @@
[...]
+Set permission of bucket
+ Execute curl command vol1/buck1 SETPERMISSION&permission=7 -X PUT
Review comment:
I believe the permission is a triplet of octal digits here, so something like 755 or similar should be passed as the parameter. Also, we might want to check before the test that the current permissions returned are the defaults we expect, and we should verify in a separate curl call that the permission changed correctly after this test. What do you think?
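The set-then-verify flow suggested here could look roughly like the following. This is a sketch only: the base URL and user.name mirror the suite variables, and whether GETFILESTATUS reports the new permission as "755" is an assumption about the response shape.

```shell
# Build the two curl invocations for the suggested set-then-verify check.
BASE="http://httpfs:14000/webhdfs/v1"
USER_NAME="hdfs"

# First set the permission, then read the file status back in a separate call.
set_cmd="curl -X PUT \"${BASE}/vol1/buck1?user.name=${USER_NAME}&op=SETPERMISSION&permission=755\""
verify_cmd="curl \"${BASE}/vol1/buck1?user.name=${USER_NAME}&op=GETFILESTATUS\""

echo "$set_cmd"
echo "$verify_cmd"
```

The verification step would then assert on the "permission" field of the returned FileStatus JSON rather than only on the exit code of the SETPERMISSION call.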
[GitHub] [ozone] dombizita commented on a change in pull request #2609: HDDS-5615 Add a simple test suite for HTTPFS GW
Posted by GitBox <gi...@apache.org>.
dombizita commented on a change in pull request #2609:
URL: https://github.com/apache/ozone/pull/2609#discussion_r708363929
##########
File path: hadoop-ozone/dist/src/main/compose/ozone/test.sh
##########
@@ -28,6 +28,8 @@ source "$COMPOSE_DIR/../testlib.sh"
start_docker_env
+execute_robot_test scm httpfs
Review comment:
Good idea, I had left it there because of testing in my own environment. I moved the execution of the HttpFS tests after the CLI tests.
[GitHub] [ozone] fapifta commented on a change in pull request #2609: HDDS-5615 Add a simple test suite for HTTPFS GW
Posted by GitBox <gi...@apache.org>.
fapifta commented on a change in pull request #2609:
URL: https://github.com/apache/ozone/pull/2609#discussion_r703672624
##########
File path: hadoop-ozone/dist/src/main/compose/ozone/test.sh
##########
@@ -28,6 +28,8 @@ source "$COMPOSE_DIR/../testlib.sh"
start_docker_env
+execute_robot_test scm httpfs
Review comment:
Should we exercise the interface after the CLI tests at the end, when we already know that everything is working fine in Ozone, to avoid false positives in the httpfs gateway tests?
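With a stub standing in for the real execute_robot_test function (which comes from testlib.sh), the ordering suggested here for compose/ozone/test.sh can be sketched as:

```shell
# Stub in place of the real function sourced from testlib.sh; it only echoes
# what would be run.
execute_robot_test() { echo "robot: $2 (on $1)"; }

# Suggested ordering: CLI suites first, HttpFS gateway suite last, so a
# gateway failure cannot be mistaken for a core Ozone problem.
execute_robot_test scm basic/ozone-shell.robot   # illustrative CLI suite name
execute_robot_test scm httpfs
```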
[GitHub] [ozone] fapifta commented on a change in pull request #2609: HDDS-5615 Add a simple test suite for HTTPFS GW
Posted by GitBox <gi...@apache.org>.
fapifta commented on a change in pull request #2609:
URL: https://github.com/apache/ozone/pull/2609#discussion_r703679143
##########
File path: hadoop-ozone/dist/src/main/smoketest/httpfs/operations_tests.robot
##########
@@ -0,0 +1,103 @@
+# Licensed to the Apache Software Foundation (ASF) under one or more
+# contributor license agreements. See the NOTICE file distributed with
+# this work for additional information regarding copyright ownership.
+# The ASF licenses this file to You under the Apache License, Version 2.0
+# (the "License"); you may not use this file except in compliance with
+# the License. You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+*** Settings ***
+Documentation HttpFS gateway test with curl commands
+Library Process
+Library String
+Library BuiltIn
+Resource operations.robot
+
+*** Variables ***
+${URL} http://httpfs:14000/webhdfs/v1/
+${USERNAME} hdfs
+
+*** Test Cases ***
+Create volume
+ ${volume} = Execute curl command vol1 MKDIRS -X PUT
+ Should contain ${volume} true
+
+Create first bucket
+ ${bucket} = Execute curl command vol1/buck1 MKDIRS -X PUT
+ Should contain ${bucket} true
+
+Create second bucket
+ ${bucket} = Execute curl command vol1/buck2 MKDIRS -X PUT
+ Should contain ${bucket} true
+
+Create local testfile
+ Create file testfile
+
+Create testfile
+ ${file} = Execute create file command vol1/buck1/testfile testfile
+ Should contain ${file} http://httpfs:14000/webhdfs/v1/vol1/buck1/testfile
+
+Read file
+ ${file} = Execute curl command vol1/buck1/testfile OPEN -L
+ Should contain ${file} Hello world!
+
+Delete bucket
+ ${bucket} = Execute curl command vol1/buck2 DELETE -X DELETE
+ Should contain ${bucket} true
+
+Get status of bucket
+ ${status} = Execute curl command vol1/buck1 GETFILESTATUS ${EMPTY}
+ Should contain ${status} FileStatus DIRECTORY
+
+Get status of file
+ ${status} = Execute curl command vol1/buck1/testfile GETFILESTATUS ${EMPTY}
+ Should contain ${status} FileStatus FILE 13
+
+List bucket
+ ${list} = Execute curl command vol1/buck1 LISTSTATUS ${EMPTY}
+ Should contain ${list} FileStatus testfile FILE 13
+
+List file
+ ${list} = Execute curl command vol1/buck1/testfile LISTSTATUS ${EMPTY}
+ Should contain ${list} FileStatus FILE 13
+
+List directory iteratively
Review comment:
This one is a bit odd as written: I think this test should run before deleting buck2, and start at buck2, so we can verify that buck2 is listed while buck1 is left out of the listing.
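For readers unfamiliar with the WebHDFS REST form these robot keywords exercise, the requests presumably expand to something like the following. This is a sketch under the assumption that `Execute curl command` concatenates `${URL}`, the path, the operation, and `user.name`; the command is only echoed here, so no live gateway is needed. Note that the `&` in compound operations like `LISTSTATUS_BATCH&startAfter=buck1` is why the whole URL must be quoted in a shell.

```shell
URL="http://httpfs:14000/webhdfs/v1"
USERNAME="hdfs"

# Hypothetical expansion of the robot keyword: build the WebHDFS request and
# echo it instead of executing it, so this runs without a live gateway.
webhdfs_curl() {  # usage: webhdfs_curl <path> <operation-and-params> [curl flags...]
  local path="$1" op="$2"
  shift 2
  echo curl "$@" "${URL}/${path}?op=${op}&user.name=${USERNAME}"
}

webhdfs_curl vol1 MKDIRS -X PUT
webhdfs_curl vol1/buck1/testfile OPEN -L
webhdfs_curl vol1 'LISTSTATUS_BATCH&startAfter=buck1'
```

The `-L` on OPEN matters because the gateway may answer with a redirect to the data; without it, curl would stop at the 307 instead of returning the file contents.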
[GitHub] [ozone] fapifta commented on a change in pull request #2609: HDDS-5615 Add a simple test suite for HTTPFS GW
Posted by GitBox <gi...@apache.org>.
fapifta commented on a change in pull request #2609:
URL: https://github.com/apache/ozone/pull/2609#discussion_r703691962
##########
File path: hadoop-ozone/dist/src/main/smoketest/httpfs/operations_tests.robot
##########
@@ -0,0 +1,103 @@
+List directory iteratively
+ ${list} = Execute curl command vol1 LISTSTATUS_BATCH&startAfter=buck1 ${EMPTY}
+ Should contain ${list} DirectoryListing buck1
+ Should not contain ${list} buck2
+
+Get content summary of directory
+ ${summary} = Execute curl command vol1 GETCONTENTSUMMARY ${EMPTY}
+ Should contain ${summary} ContentSummary "directoryCount":2 "fileCount":1
+
+Get quota usage of directory
+ ${usage} = Execute curl command vol1 GETQUOTAUSAGE ${EMPTY}
+ Should contain ${usage} QuotaUsage "fileAndDirectoryCount":3
+
+Get home directory
+ ${home} = Execute curl command ${EMPTY} GETHOMEDIRECTORY ${EMPTY}
+ Should contain ${home} "Path":"\\/user\\/hdfs"
+
+Get trash root
+ ${trash} = Execute curl command vol1/buck1/testfile GETTRASHROOT ${EMPTY}
+ Should contain ${trash} "Path":"\\/vol1\\/buck1\\/.Trash\\/hdfs"
+
+Set permission of bucket
+ Execute curl command vol1/buck1 SETPERMISSION&permission=7 -X PUT
+
+Set replication factor of bucket
+ ${cmd} = Execute curl command vol1/buck1 SETREPLICATION&replication=2 -X PUT
Review comment:
A replication factor of 2 is unknown to Ozone, so we might expect a false here. If the code as it is returns true, which I would guess means success, then that is wrong, so at the very least we should note this in a TODO comment to fix later on.
Also, we should be careful about what to expect here: check how the CLI handles this, expect the behaviour to be the same, and document any incorrect/unimplemented behaviour, with the expectation formulated as a subsequent test that checks the result of this call.
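The subsequent check this comment asks for could look roughly like this. It is only a sketch: the JSON body below is a hand-written sample standing in for what `GETFILESTATUS` would return after the `SETREPLICATION` call; a real test would obtain the body via curl.

```shell
# Sample GETFILESTATUS body (assumption: Ozone keeps its own replication
# notion, so the requested factor of 2 may not stick).
response='{"FileStatus":{"type":"DIRECTORY","replication":0,"length":0}}'

# Pull the replication field out of the JSON and compare it to what was set.
replication=$(printf '%s' "$response" | sed -n 's/.*"replication":\([0-9]*\).*/\1/p')
if [ "$replication" -eq 2 ]; then
  echo "SETREPLICATION took effect"
else
  echo "SETREPLICATION had no effect (replication=$replication)"
fi
```

A follow-up robot test doing the equivalent would turn a silently ignored SETREPLICATION from a false positive into a documented, asserted behaviour.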
[GitHub] [ozone] fapifta commented on a change in pull request #2609: HDDS-5615 Add a simple test suite for HTTPFS GW
Posted by GitBox <gi...@apache.org>.
fapifta commented on a change in pull request #2609:
URL: https://github.com/apache/ozone/pull/2609#discussion_r703692974
##########
File path: hadoop-ozone/dist/src/main/smoketest/httpfs/operations_tests.robot
##########
@@ -0,0 +1,103 @@
+Set replication factor of bucket
+ ${cmd} = Execute curl command vol1/buck1 SETREPLICATION&replication=2 -X PUT
+ Should contain ${cmd} true
+
+Set access and modification time of bucket
+ Execute curl command vol1/buck1 SETTIMES&modificationtime=10&accesstime=10 -X PUT
+
+Set owner of bucket
Review comment:
Same goes here: it would be nice to test in a subsequent call whether the owner change actually happened.
[GitHub] [ozone] fapifta commented on a change in pull request #2609: HDDS-5615 Add a simple test suite for HTTPFS GW
Posted by GitBox <gi...@apache.org>.
fapifta commented on a change in pull request #2609:
URL: https://github.com/apache/ozone/pull/2609#discussion_r703692685
##########
File path: hadoop-ozone/dist/src/main/smoketest/httpfs/operations_tests.robot
##########
@@ -0,0 +1,103 @@
+Set access and modification time of bucket
Review comment:
Again, it would be nice to check in a subsequent test whether the modification really has an effect, and whether the changes are visible via HttpFS when we query for them.
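A minimal sketch of such a follow-up check, analogous to the replication case: the JSON body here is a hand-written sample of what `GETFILESTATUS` might return after `SETTIMES&modificationtime=10&accesstime=10`; a real test would fetch it via curl, and may well find the times unchanged if the operation is unimplemented.

```shell
# Sample GETFILESTATUS body (assumption: the gateway echoes back the times
# that were set; if SETTIMES is a no-op, the real body would differ).
status='{"FileStatus":{"modificationTime":10,"accessTime":10,"type":"DIRECTORY"}}'

# Extract the modification time and compare it to the value that was set.
mtime=$(printf '%s' "$status" | sed -n 's/.*"modificationTime":\([0-9]*\).*/\1/p')
if [ "$mtime" -eq 10 ]; then
  echo "SETTIMES visible via HttpFS (modificationTime=$mtime)"
else
  echo "SETTIMES silently ignored (modificationTime=$mtime)"
fi
```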
[GitHub] [ozone] fapifta commented on pull request #2609: HDDS-5615 Add a simple test suite for HTTPFS GW
Posted by GitBox <gi...@apache.org>.
fapifta commented on pull request #2609:
URL: https://github.com/apache/ozone/pull/2609#issuecomment-912805584
Hi @dombizita, thank you for your work on this one. I have merged master back into the HDDS-5447-httpfs branch, and also merged the changes into your PR's branch, in order to pick up the CI changes and reduce our testing footprint on branches and PRs whenever possible.
Please don't forget to update your local branches.
I owe you a review on this, and plan to do it next week.