Posted to common-issues@hadoop.apache.org by GitBox <gi...@apache.org> on 2021/02/23 15:31:31 UTC

[GitHub] [hadoop] vinaysbadami commented on a change in pull request #2698: HADOOP-17527. ABFS: Fix boundary conditions in InputStream seek and skip

vinaysbadami commented on a change in pull request #2698:
URL: https://github.com/apache/hadoop/pull/2698#discussion_r580787165



##########
File path: hadoop-tools/hadoop-azure/src/test/java/org/apache/hadoop/fs/azurebfs/ITestAbfsInputStreamStatistics.java
##########
@@ -100,28 +100,31 @@ public void testSeekStatistics() throws IOException {
     AbfsOutputStream out = null;
     AbfsInputStream in = null;
 
+    int readBufferSize = getConfiguration().getReadBufferSize();
+    byte[] buf = new byte[readBufferSize + 1];

Review comment:
       Why the +1 on the read buffer size?
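       If the intent is to make the test data one byte longer than a full read buffer, so that reads and seeks have to cross the buffer boundary, a comment along these lines would make that explicit (a sketch only, not part of the patch):

           int readBufferSize = getConfiguration().getReadBufferSize();
           // One byte more than a single read buffer, so the test exercises the buffer-boundary case.
           byte[] buf = new byte[readBufferSize + 1];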

##########
File path: hadoop-tools/hadoop-azure/src/test/java/org/apache/hadoop/fs/azurebfs/ITestAzureBlobFileSystemRandomRead.java
##########
@@ -405,6 +418,27 @@ public void testSkipAndAvailableAndPosition() throws Exception {
     }
   }
 
+  @Test
+  public void testZeroByteFile() throws Exception {
+    Path emptyFile = new Path("/emptyFile");
+    getFileSystem().create(emptyFile);
+    FSDataInputStream in = getFileSystem().open(emptyFile);
+    assertEquals("Initial position of inputstream in empty file is 0", 0,
+        in.getPos());
+    in.seek(0);
+    assertEquals("Seek to 0 should succeed", 0, in.getPos());
+    in.skip(0);

Review comment:
       skip() returns a long (the number of bytes actually skipped), so you should assert that the returned value is correct rather than ignoring it.
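       A minimal sketch of the suggested assertion (the message string is illustrative, not part of the patch):

           long skipped = in.skip(0);
           assertEquals("skip(0) on an empty file should skip 0 bytes", 0, skipped);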



