Posted to dev@hive.apache.org by "Zheng Shao (JIRA)" <ji...@apache.org> on 2010/01/24 23:49:34 UTC
[jira] Created: (HIVE-1089) Intermittent test failure in groupby_bigdata.q
Intermittent test failure in groupby_bigdata.q
----------------------------------------------
Key: HIVE-1089
URL: https://issues.apache.org/jira/browse/HIVE-1089
Project: Hadoop Hive
Issue Type: Bug
Reporter: Zheng Shao
{code}
ant test -Dtestcase=TestCliDriver -Dqfile=groupby_bigdata.q
{code}
This sometimes fails with an out-of-memory exception in Java.
We might want to tweak the default hash table size in map-side aggregation.
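One way to experiment with that (a hedged sketch, not a fix recorded in this issue) is to lower the memory budget of the map-side aggregation hash table via Hive settings before running the query; the values below are illustrative, not taken from this issue:
{code}
-- Cap the fraction of mapper heap the map-side aggregation hash table
-- may use (default is higher; 0.25 here is only an example value).
SET hive.map.aggr.hash.percentmemory=0.25;
-- Check hash table size / flush more frequently (example value).
SET hive.groupby.mapaggr.checkinterval=10000;
{code}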
--
This message is automatically generated by JIRA.
-
You can reply to this email to add a comment to the issue online.
[jira] Commented: (HIVE-1089) Intermittent test failure in groupby_bigdata.q
Posted by "Zheng Shao (JIRA)" <ji...@apache.org>.
[ https://issues.apache.org/jira/browse/HIVE-1089?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=12804324#action_12804324 ]
Zheng Shao commented on HIVE-1089:
----------------------------------
Stack trace:
{code}
[junit] Begin query: groupby_bigdata.q
[junit] plan = /data/users/zshao/hadoop_hive/trunk/build/ql/scratchdir/plan59991.xml
[junit] Exception in thread "Thread-10" java.lang.OutOfMemoryError: Java heap space
[junit] at java.lang.AbstractStringBuilder.<init>(AbstractStringBuilder.java:45)
[junit] at java.lang.StringBuilder.<init>(StringBuilder.java:68)
[junit] at java.lang.StackTraceElement.toString(StackTraceElement.java:157)
[junit] at java.lang.String.valueOf(String.java:2827)
[junit] at java.lang.StringBuilder.append(StringBuilder.java:115)
[junit] at java.lang.Throwable.printStackTrace(Throwable.java:512)
[junit] at org.apache.hadoop.util.StringUtils.stringifyException(StringUtils.java:51)
[junit] at org.apache.hadoop.hive.ql.exec.ScriptOperator$StreamThread.run(ScriptOperator.java:561)
[junit] java.lang.OutOfMemoryError: Java heap space
[junit] at java.nio.ByteBuffer.wrap(ByteBuffer.java:350)
[junit] at java.nio.ByteBuffer.wrap(ByteBuffer.java:373)
[junit] at java.lang.StringCoding$StringEncoder.encode(StringCoding.java:237)
[junit] at java.lang.StringCoding.encode(StringCoding.java:272)
[junit] at java.lang.String.getBytes(String.java:947)
[junit] at java.io.UnixFileSystem.getBooleanAttributes0(Native Method)
[junit] at java.io.UnixFileSystem.getBooleanAttributes(UnixFileSystem.java:228)
[junit] at java.io.File.exists(File.java:733)
[junit] at org.apache.hadoop.fs.RawLocalFileSystem.getFileStatus(RawLocalFileSystem.java:399)
[junit] at org.apache.hadoop.fs.FilterFileSystem.getFileStatus(FilterFileSystem.java:242)
[junit] at org.apache.hadoop.fs.FileSystem.exists(FileSystem.java:600)
[junit] at org.apache.hadoop.hive.ql.exec.Utilities.clearMapRedWork(Utilities.java:114)
[junit] at org.apache.hadoop.hive.ql.exec.ExecDriver.execute(ExecDriver.java:680)
[junit] at org.apache.hadoop.hive.ql.exec.ExecDriver.main(ExecDriver.java:936)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
[junit] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
[junit] at java.lang.reflect.Method.invoke(Method.java:597)
[junit] at org.apache.hadoop.util.RunJar.main(RunJar.java:155)
[junit] at org.apache.hadoop.mapred.JobShell.run(JobShell.java:194)
[junit] at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
[junit] at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:79)
[junit] at org.apache.hadoop.mapred.JobShell.main(JobShell.java:220)
[junit]
[junit] Traceback (most recent call last):
[junit] File "../data/scripts/dumpdata_script.py", line 6, in <module>
[junit] print 20000 * i + k
[junit] IOError: [Errno 32] Broken pipe
[junit] org.apache.hadoop.hive.ql.metadata.HiveException: Hit error while closing ..
[junit] at org.apache.hadoop.hive.ql.exec.ScriptOperator.close(ScriptOperator.java:444)
[junit] at org.apache.hadoop.hive.ql.exec.Operator.close(Operator.java:512)
[junit] at org.apache.hadoop.hive.ql.exec.Operator.close(Operator.java:512)
[junit] at org.apache.hadoop.hive.ql.exec.Operator.close(Operator.java:512)
[junit] at org.apache.hadoop.hive.ql.exec.Operator.close(Operator.java:512)
[junit] at org.apache.hadoop.hive.ql.exec.Operator.close(Operator.java:512)
[junit] at org.apache.hadoop.hive.ql.exec.ExecMapper.close(ExecMapper.java:228)
[junit] at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:50)
[junit] at org.apache.hadoop.mapred.MapTask.run(MapTask.java:219)
[junit] at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:157)
[junit] org.apache.hadoop.hive.ql.metadata.HiveException: Hit error while closing ..
[junit] at org.apache.hadoop.hive.ql.exec.ScriptOperator.close(ScriptOperator.java:444)
[junit] at org.apache.hadoop.hive.ql.exec.Operator.close(Operator.java:512)
[junit] at org.apache.hadoop.hive.ql.exec.Operator.close(Operator.java:512)
[junit] at org.apache.hadoop.hive.ql.exec.Operator.close(Operator.java:512)
[junit] at org.apache.hadoop.hive.ql.exec.Operator.close(Operator.java:512)
[junit] at org.apache.hadoop.hive.ql.exec.Operator.close(Operator.java:512)
[junit] at org.apache.hadoop.hive.ql.exec.ExecMapper.close(ExecMapper.java:228)
[junit] at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:50)
[junit] at org.apache.hadoop.mapred.MapTask.run(MapTask.java:219)
[junit] at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:157)
[junit] org.apache.hadoop.hive.ql.metadata.HiveException: Hit error while closing ..
[junit] at org.apache.hadoop.hive.ql.exec.ScriptOperator.close(ScriptOperator.java:444)
[junit] at org.apache.hadoop.hive.ql.exec.Operator.close(Operator.java:512)
[junit] at org.apache.hadoop.hive.ql.exec.Operator.close(Operator.java:512)
[junit] at org.apache.hadoop.hive.ql.exec.Operator.close(Operator.java:512)
[junit] at org.apache.hadoop.hive.ql.exec.Operator.close(Operator.java:512)
[junit] at org.apache.hadoop.hive.ql.exec.Operator.close(Operator.java:512)
[junit] at org.apache.hadoop.hive.ql.exec.ExecMapper.close(ExecMapper.java:228)
[junit] at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:50)
[junit] at org.apache.hadoop.mapred.MapTask.run(MapTask.java:219)
[junit] at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:157)
[junit] org.apache.hadoop.hive.ql.metadata.HiveException: Hit error while closing ..
[junit] at org.apache.hadoop.hive.ql.exec.ScriptOperator.close(ScriptOperator.java:444)
[junit] at org.apache.hadoop.hive.ql.exec.Operator.close(Operator.java:512)
[junit] at org.apache.hadoop.hive.ql.exec.Operator.close(Operator.java:512)
[junit] at org.apache.hadoop.hive.ql.exec.Operator.close(Operator.java:512)
[junit] at org.apache.hadoop.hive.ql.exec.Operator.close(Operator.java:512)
[junit] at org.apache.hadoop.hive.ql.exec.Operator.close(Operator.java:512)
[junit] at org.apache.hadoop.hive.ql.exec.ExecMapper.close(ExecMapper.java:228)
[junit] at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:50)
[junit] at org.apache.hadoop.mapred.MapTask.run(MapTask.java:219)
[junit] at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:157)
[junit] org.apache.hadoop.hive.ql.metadata.HiveException: Hit error while closing ..
[junit] at org.apache.hadoop.hive.ql.exec.ScriptOperator.close(ScriptOperator.java:444)
[junit] at org.apache.hadoop.hive.ql.exec.Operator.close(Operator.java:512)
[junit] at org.apache.hadoop.hive.ql.exec.Operator.close(Operator.java:512)
[junit] at org.apache.hadoop.hive.ql.exec.Operator.close(Operator.java:512)
[junit] at org.apache.hadoop.hive.ql.exec.Operator.close(Operator.java:512)
[junit] at org.apache.hadoop.hive.ql.exec.Operator.close(Operator.java:512)
[junit] at org.apache.hadoop.hive.ql.exec.ExecMapper.close(ExecMapper.java:228)
[junit] at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:50)
[junit] at org.apache.hadoop.mapred.MapTask.run(MapTask.java:219)
[junit] at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:157)
[junit] Exception: Client Execution failed with error code = 9
[junit] junit.framework.AssertionFailedError: Client Execution failed with error code = 9
[junit] at junit.framework.Assert.fail(Assert.java:47)
[junit] at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_groupby_bigdata(TestCliDriver.java:3660)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[junit] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
[junit] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
[junit] at java.lang.reflect.Method.invoke(Method.java:597)
[junit] at junit.framework.TestCase.runTest(TestCase.java:154)
[junit] at junit.framework.TestCase.runBare(TestCase.java:127)
[junit] at junit.framework.TestResult$1.protect(TestResult.java:106)
[junit] at junit.framework.TestResult.runProtected(TestResult.java:124)
[junit] at junit.framework.TestResult.run(TestResult.java:109)
[junit] at junit.framework.TestCase.run(TestCase.java:118)
[junit] at junit.framework.TestSuite.runTest(TestSuite.java:208)
[junit] at junit.framework.TestSuite.run(TestSuite.java:203)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:420)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:911)
[junit] at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:768)
{code}
[jira] Resolved: (HIVE-1089) Intermittent test failure in groupby_bigdata.q
Posted by "Zheng Shao (JIRA)" <ji...@apache.org>.
[ https://issues.apache.org/jira/browse/HIVE-1089?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Zheng Shao resolved HIVE-1089.
------------------------------
Resolution: Duplicate