Posted to issues@commons.apache.org by "Gary D. Gregory (Jira)" <ji...@apache.org> on 2022/05/06 13:20:00 UTC

[jira] [Commented] (COMPRESS-619) Large SevenZFile fails When Next Header Size is Greater than Max Int

    [ https://issues.apache.org/jira/browse/COMPRESS-619?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17532856#comment-17532856 ] 

Gary D. Gregory commented on COMPRESS-619:
------------------------------------------

Hi All,

In git master, I reimplemented the CRC-32 computation of the NextHeader in SevenZFile.initializeArchive(StartHeader, byte[], boolean) as a streaming operation; what remains to do is to implement streaming for the rest of the method, which is less simple.

All of that to say: this problem is still present, because the buffer is still read in full, just later in the method.
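
For context, here is a minimal sketch of the streaming idea (a hypothetical helper, not the actual SevenZFile code; the class name, method name, and chunk size are illustrative): the CRC-32 is updated incrementally over fixed-size chunks, so no buffer of the full NextHeader size is ever allocated.

{code:java}
import java.io.IOException;
import java.io.InputStream;
import java.util.zip.CRC32;

public final class StreamingCrcSketch {
    // Hypothetical helper: checksums `length` bytes from `in` in 64 KiB chunks,
    // so the full next header never has to fit into a single byte[] or ByteBuffer.
    static long streamingCrc32(final InputStream in, final long length) throws IOException {
        final CRC32 crc = new CRC32();
        final byte[] chunk = new byte[64 * 1024];
        long remaining = length;
        while (remaining > 0) {
            final int n = in.read(chunk, 0, (int) Math.min(chunk.length, remaining));
            if (n < 0) {
                throw new IOException("Premature end of stream while checksumming the next header");
            }
            crc.update(chunk, 0, n);
            remaining -= n;
        }
        return crc.getValue();
    }
}
{code}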

> Large SevenZFile fails When Next Header Size is Greater than Max Int
> --------------------------------------------------------------------
>
>                 Key: COMPRESS-619
>                 URL: https://issues.apache.org/jira/browse/COMPRESS-619
>             Project: Commons Compress
>          Issue Type: Bug
>          Components: Archivers
>    Affects Versions: 1.21
>            Reporter: Brian Miller
>            Priority: Minor
>
> When reading a large file (42GB) the following stack trace is produced:
>  
> {code:java}
> java.io.IOException: Cannot handle nextHeaderSize 4102590414
>     at org.apache.commons.compress.archivers.sevenz.SevenZFile.assertFitsIntoNonNegativeInt(SevenZFile.java:2076) ~[classes/:?]
>     at org.apache.commons.compress.archivers.sevenz.SevenZFile.initializeArchive(SevenZFile.java:528) ~[classes/:?]
>     at org.apache.commons.compress.archivers.sevenz.SevenZFile.readHeaders(SevenZFile.java:474) ~[classes/:?]
>     at org.apache.commons.compress.archivers.sevenz.SevenZFile.<init>(SevenZFile.java:343) ~[classes/:?]
>     at org.apache.commons.compress.archivers.sevenz.SevenZFile.<init>(SevenZFile.java:136) ~[classes/:?]
>     at org.apache.commons.compress.archivers.sevenz.SevenZFile.<init>(SevenZFile.java:376) ~[classes/:?]
>     at org.apache.commons.compress.archivers.sevenz.SevenZFile.<init>(SevenZFile.java:364) ~[classes/:?] {code}
>  
> The file was produced using the SevenZOutputFile class and contains a large number of very small files all inserted using copy compression. It passes the 7z tests and has the following statistics:
>  
> {code:java}
> Files:      40872560
> Size:       43708874326
> Compressed: 47811464772
> {code}
> It is failing because a ByteBuffer larger than Integer.MAX_VALUE bytes cannot be created to hold the next header for the CRC check.
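
As a hedged illustration of the failure mode described above (a hypothetical sketch, not the library's actual code): java.nio.ByteBuffer.allocate(int) only accepts sizes up to Integer.MAX_VALUE, so any nextHeaderSize above that limit has to be rejected before allocation, which is what surfaces as the IOException in the stack trace. A guard equivalent in spirit to assertFitsIntoNonNegativeInt looks roughly like this:

{code:java}
import java.io.IOException;
import java.nio.ByteBuffer;

public final class NextHeaderGuardSketch {
    // Hypothetical guard, analogous to (not copied from) SevenZFile's
    // assertFitsIntoNonNegativeInt: a single ByteBuffer holds at most
    // Integer.MAX_VALUE bytes, so larger next headers cannot be buffered
    // in one allocation and must be streamed instead.
    static ByteBuffer allocateNextHeaderBuffer(final long nextHeaderSize) throws IOException {
        if (nextHeaderSize < 0 || nextHeaderSize > Integer.MAX_VALUE) {
            throw new IOException("Cannot handle nextHeaderSize " + nextHeaderSize);
        }
        return ByteBuffer.allocate((int) nextHeaderSize);
    }
}
{code}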



--
This message was sent by Atlassian Jira
(v8.20.7#820007)