Posted to common-commits@hadoop.apache.org by cu...@apache.org on 2007/05/24 23:56:43 UTC
svn commit: r541443 - in /lucene/hadoop/branches/branch-0.13: CHANGES.txt
src/java/org/apache/hadoop/io/compress/GzipCodec.java
Author: cutting
Date: Thu May 24 14:56:42 2007
New Revision: 541443
URL: http://svn.apache.org/viewvc?view=rev&rev=541443
Log:
Merge -r 541441:541442 from trunk to 0.13 branch. Fixes: HADOOP-1427.
Modified:
lucene/hadoop/branches/branch-0.13/CHANGES.txt
lucene/hadoop/branches/branch-0.13/src/java/org/apache/hadoop/io/compress/GzipCodec.java
Modified: lucene/hadoop/branches/branch-0.13/CHANGES.txt
URL: http://svn.apache.org/viewvc/lucene/hadoop/branches/branch-0.13/CHANGES.txt?view=diff&rev=541443&r1=541442&r2=541443
==============================================================================
--- lucene/hadoop/branches/branch-0.13/CHANGES.txt (original)
+++ lucene/hadoop/branches/branch-0.13/CHANGES.txt Thu May 24 14:56:42 2007
@@ -419,6 +419,9 @@
meant failed tasks didn't cause the job to fail.
(Arun C Murthy via tomwhite)
+126. HADOOP-1427. Fix a typo that caused GzipCodec to incorrectly use
+ a very small input buffer. (Espen Amble Kolstad via cutting)
+
Release 0.12.3 - 2007-04-06
Modified: lucene/hadoop/branches/branch-0.13/src/java/org/apache/hadoop/io/compress/GzipCodec.java
URL: http://svn.apache.org/viewvc/lucene/hadoop/branches/branch-0.13/src/java/org/apache/hadoop/io/compress/GzipCodec.java?view=diff&rev=541443&r1=541442&r2=541443
==============================================================================
--- lucene/hadoop/branches/branch-0.13/src/java/org/apache/hadoop/io/compress/GzipCodec.java (original)
+++ lucene/hadoop/branches/branch-0.13/src/java/org/apache/hadoop/io/compress/GzipCodec.java Thu May 24 14:56:42 2007
@@ -172,7 +172,7 @@
if (ZlibFactory.isNativeZlibLoaded()) {
Decompressor decompressor =
new ZlibDecompressor(ZlibDecompressor.CompressionHeader.AUTODETECT_GZIP_ZLIB,
- 64*1-24);
+ 64*1024);
compInStream = new DecompressorStream(in, decompressor,
conf.getInt("io.file.buffer.size", 4*1024));