Posted to issues@commons.apache.org by "Stefan Bodewig (JIRA)" <ji...@apache.org> on 2010/02/18 16:24:28 UTC
[jira] Updated: (COMPRESS-16) unable to extract a TAR file that contains an entry which is 10 GB in size
[ https://issues.apache.org/jira/browse/COMPRESS-16?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Stefan Bodewig updated COMPRESS-16:
-----------------------------------
Fix Version/s: (was: 1.1)
> unable to extract a TAR file that contains an entry which is 10 GB in size
> --------------------------------------------------------------------------
>
> Key: COMPRESS-16
> URL: https://issues.apache.org/jira/browse/COMPRESS-16
> Project: Commons Compress
> Issue Type: Bug
> Environment: I am using win xp sp3, but this should be platform independent.
> Reporter: Sam Smith
> Attachments: ant-8GB-tar.patch, patch-for-compress.txt
>
>
> I made a TAR file which contains a file entry where the file is 10 GB in size.
> When I attempt to extract the file using TarInputStream, it fails with the following stack trace:
> java.io.IOException: unexpected EOF with 24064 bytes unread
> at org.apache.commons.compress.archivers.tar.TarInputStream.read(TarInputStream.java:348)
> at org.apache.commons.compress.archivers.tar.TarInputStream.copyEntryContents(TarInputStream.java:388)
> So, TarInputStream does not seem to support large (> 8 GB?) files.
> Here is something else to note: I created that TAR file using TarOutputStream, which did not complain when asked to write a 10 GB file into the archive. So does TarOutputStream have no file size limits, or does it silently create corrupted TAR files (which would be the worst situation of all)?
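The reported ~8 GB boundary matches the classic ustar header layout: the entry size is stored as an ASCII octal number in a 12-byte field (11 octal digits plus a terminator), so the largest representable size is 8^11 - 1 bytes, just under 8 GiB. A minimal sketch of that arithmetic (illustrative only, not Commons Compress code; the class and field names are made up for this example):

```java
// Sketch: why a 10 GB entry overflows the classic ustar size field.
// The header stores the size as 11 octal digits, so the maximum is
// 8^11 - 1 = 2^33 - 1 bytes (one byte short of 8 GiB).
public class TarSizeLimit {
    // Largest value expressible in 11 octal digits: 077777777777 (octal).
    static final long MAX_USTAR_SIZE = (1L << 33) - 1; // 8589934591

    public static void main(String[] args) {
        long tenGb = 10L * 1024 * 1024 * 1024;
        System.out.println("max ustar entry size: " + MAX_USTAR_SIZE + " bytes");
        System.out.println("10 GiB entry fits?    " + (tenGb <= MAX_USTAR_SIZE));
    }
}
```

Writers that want larger entries have to switch to an extended format (GNU numeric extensions or POSIX pax headers) rather than the plain octal field, which is presumably what the attached patches address.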
--
This message is automatically generated by JIRA.
You can reply to this email to add a comment to the issue online.