Posted to commits@hudi.apache.org by "Ethan Guo (Jira)" <ji...@apache.org> on 2022/02/24 23:00:00 UTC

[jira] [Updated] (HUDI-3498) S3AInputStream failure upon fetching metadata: org.apache.http.ConnectionClosedException: Premature end of Content-Length delimited message body

     [ https://issues.apache.org/jira/browse/HUDI-3498?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Ethan Guo updated HUDI-3498:
----------------------------
    Description: 
Environment: Spark 2.4.4 + aws-java-sdk-1.7.4 + hadoop-aws-2.7.4, Hudi 0.11.0-SNAPSHOT, metadata table enabled.

openjdk version "1.8.0_292"
OpenJDK Runtime Environment (AdoptOpenJDK)(build 1.8.0_292-b10)
OpenJDK 64-Bit Server VM (AdoptOpenJDK)(build 25.292-b10, mixed mode)

After applying the fix for HUDI-3341, when reading the metadata table from hudi-cli with "metadata list-partitions", the command intermittently fails with the following exception; a minimal sketch of the S3A seek pattern involved follows the stack trace:
{code:java}
Failed to retrieve list of partition from metadata
org.apache.hudi.exception.HoodieMetadataException: Failed to retrieve list of partition from metadata
    at org.apache.hudi.metadata.BaseTableMetadata.getAllPartitionPaths(BaseTableMetadata.java:110)
    at org.apache.hudi.cli.commands.MetadataCommand.listPartitions(MetadataCommand.java:208)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.springframework.util.ReflectionUtils.invokeMethod(ReflectionUtils.java:216)
    at org.springframework.shell.core.SimpleExecutionStrategy.invoke(SimpleExecutionStrategy.java:68)
    at org.springframework.shell.core.SimpleExecutionStrategy.execute(SimpleExecutionStrategy.java:59)
    at org.springframework.shell.core.AbstractShell.executeCommand(AbstractShell.java:134)
    at org.springframework.shell.core.JLineShell.promptLoop(JLineShell.java:533)
    at org.springframework.shell.core.JLineShell.run(JLineShell.java:179)
    at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.hudi.exception.HoodieException: Exception when reading log file 
    at org.apache.hudi.common.table.log.AbstractHoodieLogRecordReader.scan(AbstractHoodieLogRecordReader.java:335)
    at org.apache.hudi.common.table.log.AbstractHoodieLogRecordReader.scan(AbstractHoodieLogRecordReader.java:180)
    at org.apache.hudi.common.table.log.HoodieMergedLogRecordScanner.performScan(HoodieMergedLogRecordScanner.java:104)
    at org.apache.hudi.metadata.HoodieMetadataMergedLogRecordReader.<init>(HoodieMetadataMergedLogRecordReader.java:71)
    at org.apache.hudi.metadata.HoodieMetadataMergedLogRecordReader.<init>(HoodieMetadataMergedLogRecordReader.java:51)
    at org.apache.hudi.metadata.HoodieMetadataMergedLogRecordReader$Builder.build(HoodieMetadataMergedLogRecordReader.java:246)
    at org.apache.hudi.metadata.HoodieBackedTableMetadata.getLogRecordScanner(HoodieBackedTableMetadata.java:377)
    at org.apache.hudi.metadata.HoodieBackedTableMetadata.lambda$openReadersIfNeeded$4(HoodieBackedTableMetadata.java:293)
    at java.util.concurrent.ConcurrentHashMap.computeIfAbsent(ConcurrentHashMap.java:1660)
    at org.apache.hudi.metadata.HoodieBackedTableMetadata.openReadersIfNeeded(HoodieBackedTableMetadata.java:283)
    at org.apache.hudi.metadata.HoodieBackedTableMetadata.lambda$getRecordsByKeys$0(HoodieBackedTableMetadata.java:139)
    at java.util.HashMap.forEach(HashMap.java:1289)
    at org.apache.hudi.metadata.HoodieBackedTableMetadata.getRecordsByKeys(HoodieBackedTableMetadata.java:138)
    at org.apache.hudi.metadata.HoodieBackedTableMetadata.getRecordByKey(HoodieBackedTableMetadata.java:128)
    at org.apache.hudi.metadata.BaseTableMetadata.fetchAllPartitionPaths(BaseTableMetadata.java:275)
    at org.apache.hudi.metadata.BaseTableMetadata.getAllPartitionPaths(BaseTableMetadata.java:108)
    ... 12 more
Caused by: org.apache.hudi.exception.HoodieIOException: IOException when reading logblock from log file HoodieLogFile{pathStr='s3a://hudi-testing/metadata_test_table_2/.hoodie/metadata/files/.files-0000_00000000000000.log.1_0-0-0', fileLen=-1}
    at org.apache.hudi.common.table.log.HoodieLogFileReader.next(HoodieLogFileReader.java:376)
    at org.apache.hudi.common.table.log.HoodieLogFormatReader.next(HoodieLogFormatReader.java:120)
    at org.apache.hudi.common.table.log.AbstractHoodieLogRecordReader.scan(AbstractHoodieLogRecordReader.java:212)
    ... 27 more
Caused by: org.apache.http.ConnectionClosedException: Premature end of Content-Length delimited message body (expected: 1; received: 0
    at org.apache.http.impl.io.ContentLengthInputStream.read(ContentLengthInputStream.java:180)
    at org.apache.http.impl.io.ContentLengthInputStream.read(ContentLengthInputStream.java:200)
    at org.apache.http.impl.io.ContentLengthInputStream.close(ContentLengthInputStream.java:103)
    at org.apache.http.conn.BasicManagedEntity.streamClosed(BasicManagedEntity.java:164)
    at org.apache.http.conn.EofSensorInputStream.checkClose(EofSensorInputStream.java:228)
    at org.apache.http.conn.EofSensorInputStream.close(EofSensorInputStream.java:174)
    at java.io.FilterInputStream.close(FilterInputStream.java:181)
    at java.io.FilterInputStream.close(FilterInputStream.java:181)
    at java.io.FilterInputStream.close(FilterInputStream.java:181)
    at java.io.FilterInputStream.close(FilterInputStream.java:181)
    at com.amazonaws.services.s3.model.S3ObjectInputStream.abort(S3ObjectInputStream.java:90)
    at org.apache.hadoop.fs.s3a.S3AInputStream.reopen(S3AInputStream.java:72)
    at org.apache.hadoop.fs.s3a.S3AInputStream.seek(S3AInputStream.java:115)
    at org.apache.hadoop.fs.FSDataInputStream.seek(FSDataInputStream.java:62)
    at org.apache.hadoop.fs.FSDataInputStream.seek(FSDataInputStream.java:62)
    at org.apache.hudi.common.table.log.HoodieLogFileReader.isBlockCorrupted(HoodieLogFileReader.java:274)
    at org.apache.hudi.common.table.log.HoodieLogFileReader.readBlock(HoodieLogFileReader.java:154)
    at org.apache.hudi.common.table.log.HoodieLogFileReader.next(HoodieLogFileReader.java:374)
    ... 29 more  {code}
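
The innermost frames show HoodieLogFileReader.isBlockCorrupted seeking within the metadata log file (reported with fileLen=-1 in the trace); on hadoop-aws 2.7.x, seeking away from the current position causes S3AInputStream.reopen to abort the previous S3ObjectInputStream, and the ConnectionClosedException surfaces while that old HTTP stream is being closed and drained. The snippet below is only a minimal sketch of that seek-then-read pattern using the plain Hadoop FileSystem API; it is not the Hudi code path, and the path and offsets are placeholders, not values from the failing table.
{code:java}
// Minimal sketch of the seek pattern that exercises S3AInputStream.reopen.
// NOT the Hudi code path; the path and offsets below are placeholders.
import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataInputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class S3ASeekSketch {
  public static void main(String[] args) throws IOException {
    Configuration conf = new Configuration();
    // Placeholder object; in the report the failing file is the metadata table log
    // under .hoodie/metadata/files/ on the s3a://hudi-testing bucket.
    Path logFile = new Path("s3a://some-bucket/some-table/.hoodie/metadata/files/some-log-file");
    FileSystem fs = logFile.getFileSystem(conf);

    try (FSDataInputStream in = fs.open(logFile)) {
      byte[] header = new byte[16];
      in.readFully(header);   // first read opens the underlying HTTP stream
      // Seeking away from the current position makes hadoop-aws 2.7.x reopen the
      // connection (S3AInputStream.reopen), aborting the previous S3ObjectInputStream;
      // the ConnectionClosedException in the trace is thrown while that old stream
      // is being closed and drained.
      in.seek(1024L);
      in.read();
    }
  }
}
{code}
The reproduction described above goes through hudi-cli ("metadata list-partitions" after connecting to the table); the sketch only isolates the filesystem-level calls named in the innermost frames of the trace.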

> S3AInputStream failure upon fetching metadata: org.apache.http.ConnectionClosedException: Premature end of Content-Length delimited message body
> ------------------------------------------------------------------------------------------------------------------------------------------------
>
>                 Key: HUDI-3498
>                 URL: https://issues.apache.org/jira/browse/HUDI-3498
>             Project: Apache Hudi
>          Issue Type: Bug
>            Reporter: Ethan Guo
>            Assignee: Ethan Guo
>            Priority: Blocker
>             Fix For: 0.11.0
>
>



--
This message was sent by Atlassian Jira
(v8.20.1#820001)