Posted to dev@hive.apache.org by "Paul Yang (JIRA)" <ji...@apache.org> on 2010/01/25 22:59:34 UTC

[jira] Updated: (HIVE-1097) groupby_bigdata.q sometimes throws out of memory exception

     [ https://issues.apache.org/jira/browse/HIVE-1097?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Paul Yang updated HIVE-1097:
----------------------------

    Attachment: HVIE-1097.1.patch

* Increased memory from 256MB to 300MB for groupby_bigdata.q
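
For context, assuming the 256MB cap comes from the session-level hive.mapred.local.mem override at the top of clientpositive/groupby_bigdata.q, the bump amounts to a one-line change like the sketch below; the attached patch may make the change in a different place.

{code}
-- sketch only: assuming groupby_bigdata.q caps the local-mode child JVM heap
-- via a session-level override, the existing line
set hive.mapred.local.mem = 256;
-- would simply be raised to
set hive.mapred.local.mem = 300;
{code}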

> groupby_bigdata.q sometimes throws out of memory exception
> ----------------------------------------------------------
>
>                 Key: HIVE-1097
>                 URL: https://issues.apache.org/jira/browse/HIVE-1097
>             Project: Hadoop Hive
>          Issue Type: Bug
>    Affects Versions: 0.5.0, 0.6.0
>            Reporter: Paul Yang
>            Assignee: Paul Yang
>         Attachments: HVIE-1097.1.patch
>
>
> I would intermittently get out-of-memory errors like the following when running groupby_bigdata.q.
> {code}
>   
>     [junit] plan = /data/users/pyang/task2/trunk/VENDOR.hive/trunk/build/ql/scratchdir/plan38413.xml
>     [junit] Exception in thread "Thread-15" java.lang.OutOfMemoryError: Java heap space
>     [junit]     at java.util.Arrays.copyOf(Arrays.java:2882)
>     [junit]     at java.lang.AbstractStringBuilder.expandCapacity(AbstractStringBuilder.java:100)
>     [junit]     at java.lang.AbstractStringBuilder.append(AbstractStringBuilder.java:390)
>     [junit]     at java.lang.StringBuffer.append(StringBuffer.java:224)
>     [junit]     at java.io.StringWriter.write(StringWriter.java:84)
>     [junit]     at java.io.PrintWriter.newLine(PrintWriter.java:436)
>     [junit]     at java.io.PrintWriter.println(PrintWriter.java:585)
>     [junit]     at java.io.PrintWriter.println(PrintWriter.java:696)
>     [junit]     at java.lang.Throwable.printStackTrace(Throwable.java:512)
>     [junit]     at org.apache.hadoop.util.StringUtils.stringifyException(StringUtils.java:60)
>     [junit]     at org.apache.hadoop.hive.ql.exec.ScriptOperator$StreamThread.run(ScriptOperator.java:561)
>     [junit] Exception in thread "main" java.lang.OutOfMemoryError: Java heap space
>     [junit]     at java.nio.HeapCharBuffer.<init>(HeapCharBuffer.java:39)
>     [junit]     at java.nio.CharBuffer.allocate(CharBuffer.java:312)
>     [junit]     at java.nio.charset.CharsetEncoder.isLegalReplacement(CharsetEncoder.java:319)
>     [junit]     at java.nio.charset.CharsetEncoder.replaceWith(CharsetEncoder.java:267)
>     [junit]     at java.nio.charset.CharsetEncoder.<init>(CharsetEncoder.java:186)
>     [junit]     at java.nio.charset.CharsetEncoder.<init>(CharsetEncoder.java:209)
>     [junit]     at sun.nio.cs.ISO_8859_1$Encoder.<init>(ISO_8859_1.java:116)
>     [junit]     at sun.nio.cs.ISO_8859_1$Encoder.<init>(ISO_8859_1.java:113)
>     [junit]     at sun.nio.cs.ISO_8859_1.newEncoder(ISO_8859_1.java:46)
>     [junit]     at java.lang.StringCoding$StringEncoder.<init>(StringCoding.java:215)
>     [junit]     at java.lang.StringCoding$StringEncoder.<init>(StringCoding.java:207)
>     [junit]     at java.lang.StringCoding.encode(StringCoding.java:266)
>     [junit]     at java.lang.String.getBytes(String.java:947)
>     [junit]     at java.io.UnixFileSystem.getLength(Native Method)
>     [junit]     at java.io.File.length(File.java:848)
>     [junit]     at org.apache.hadoop.fs.RawLocalFileSystem$RawLocalFileStatus.<init>(RawLocalFileSystem.java:375)
>     [junit]     at org.apache.hadoop.fs.RawLocalFileSystem.getFileStatus(RawLocalFileSystem.java:359)
>     [junit]     at org.apache.hadoop.fs.FilterFileSystem.getFileStatus(FilterFileSystem.java:245)
>     [junit]     at org.apache.hadoop.fs.FileSystem.exists(FileSystem.java:643)
>     [junit]     at org.apache.hadoop.hive.ql.exec.Utilities.clearMapRedWork(Utilities.java:114)
>     [junit]     at org.apache.hadoop.hive.ql.exec.ExecDriver.execute(ExecDriver.java:680)
>     [junit]     at org.apache.hadoop.hive.ql.exec.ExecDriver.main(ExecDriver.java:936)
>     [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>     [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>     [junit]     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>     [junit]     at java.lang.reflect.Method.invoke(Method.java:597)
>     [junit]     at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
>     [junit] Traceback (most recent call last):
>     [junit]   File "../data/scripts/dumpdata_script.py", line 6, in <module>
>     [junit]     print 20000 * i + k
> {code}

-- 
This message is automatically generated by JIRA.
-
You can reply to this email to add a comment to the issue online.