Posted to user@hive.apache.org by sunww <sp...@outlook.com> on 2014/09/28 09:47:27 UTC

hive client OutOfMemoryError

Hi,

I'm using Hive 0.11. When I run a SQL query with two UNION ALLs, the Hive client throws:

Exception in thread "main" java.lang.OutOfMemoryError: Java heap space
    at java.lang.StringCoding$StringEncoder.encode(StringCoding.java:232)
    at java.lang.StringCoding.encode(StringCoding.java:272)
    at java.lang.String.getBytes(String.java:946)
    at org.apache.hadoop.hive.ql.optimizer.physical.CommonJoinResolver$CommonJoinTaskDispatcher.processCurrentTask(CommonJoinResolver.java:589)
    at org.apache.hadoop.hive.ql.optimizer.physical.CommonJoinResolver$CommonJoinTaskDispatcher.dispatch(CommonJoinResolver.java:743)
    at org.apache.hadoop.hive.ql.lib.TaskGraphWalker.dispatch(TaskGraphWalker.java:111)
    at org.apache.hadoop.hive.ql.lib.TaskGraphWalker.walk(TaskGraphWalker.java:194)
    at org.apache.hadoop.hive.ql.lib.TaskGraphWalker.startWalking(TaskGraphWalker.java:139)
    at org.apache.hadoop.hive.ql.optimizer.physical.CommonJoinResolver.resolve(CommonJoinResolver.java:112)
    at org.apache.hadoop.hive.ql.optimizer.physical.PhysicalOptimizer.optimize(PhysicalOptimizer.java:79)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genMapRedTasks(SemanticAnalyzer.java:8399)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:8741)
    at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:278)
    at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:433)
    at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:337)
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:902)
    at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:259)
    at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:216)
    at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:413)
    at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:756)
    at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:614)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:212)

I used "export HADOOP_CLIENT_OPTS=-Xmx1024m" to increase the Hive client heap size, and then the query ran fine. I looked at CommonJoinResolver line 589:

    InputStream in = new ByteArrayInputStream(xml.getBytes("UTF-8"));

Does the Hive client really need this much heap? Is this normal? And where can I find this large XML?
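For what it's worth, the top frames point at a plain String-to-bytes copy. Here is a small self-contained sketch of that allocation pattern (illustrative only; PlanCopyDemo and the sizes are made up, not Hive code, but the getBytes call mirrors the line quoted above):

```java
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.nio.charset.StandardCharsets;

// Sketch of the allocation pattern behind the stack trace above. The
// client serializes the whole query plan to an XML String; the getBytes
// call (as at CommonJoinResolver:589) then copies it into a second
// full-size buffer, so the heap briefly holds ~2n bytes for the UTF-16
// String plus ~n bytes for the UTF-8 copy of an n-character ASCII plan.
public class PlanCopyDemo {

    // Build a stand-in "plan" of n repeated XML elements and return
    // {characters in the String, bytes in the UTF-8 copy}.
    static long[] encodePlan(int n) throws IOException {
        StringBuilder sb = new StringBuilder();
        for (int i = 0; i < n; i++) {
            sb.append("<op/>");                    // 5 ASCII chars each
        }
        String xml = sb.toString();

        // Same pattern as the Hive source line quoted above: the encode
        // step allocates the second buffer, which is where the
        // OutOfMemoryError surfaced.
        byte[] encoded = xml.getBytes(StandardCharsets.UTF_8);
        InputStream in = new ByteArrayInputStream(encoded);

        return new long[] { xml.length(), in.available() };
    }

    public static void main(String[] args) throws IOException {
        long[] sizes = encodePlan(1_000_000);
        // For ASCII content the UTF-8 copy has as many bytes as the
        // String has chars, so encoding transiently doubles the footprint.
        System.out.println(sizes[0] + " chars, " + sizes[1] + " bytes");
    }
}
```

So an OOME here suggests the serialized plan itself is large, and the extra copy pushes the default client heap over the edge.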

Thanks

RE: hive client OutOfMemoryError

Posted by sunww <sp...@outlook.com>.
I always thought the Hive client consumed very little memory. Thanks.

Re: hive client OutOfMemoryError

Posted by Oliver Keyes <ok...@wikimedia.org>.
Bah. *similar error.




-- 
Oliver Keyes
Research Analyst
Wikimedia Foundation

Re: hive client OutOfMemoryError

Posted by Oliver Keyes <ok...@wikimedia.org>.
I run into that error, or a simpler error, rather a lot; I consider it
pretty normal (although experiences may differ).

A more direct way of increasing the heapsize would be export
HADOOP_HEAPSIZE=N, where N is...well, the heapsize. HADOOP_HEAPSIZE=1024
handles the queries I run, but I don't know how big your data store is and
how big your operations are.
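For reference, the two knobs mentioned in this thread can be set like this before launching the CLI (the 1024 values are just the examples used here; size to your own plans):

```shell
# Option the original poster used: pass raw JVM flags to client-side
# Hadoop/Hive processes only (does not affect daemons or MR tasks).
export HADOOP_CLIENT_OPTS="-Xmx1024m"

# Alternative suggested above: set the maximum heap, in MB, that the
# hadoop launcher scripts hand to the JVM.
export HADOOP_HEAPSIZE=1024
```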



-- 
Oliver Keyes
Research Analyst
Wikimedia Foundation