Posted to user@flume.apache.org by Mohit Anchlia <mo...@gmail.com> on 2013/01/16 19:52:59 UTC
OutOfMemory
I often get OutOfMemoryError even when there is no load on the system. I am
wondering what's the best way to debug this. I have the heap size set to 2G
and the memory channel capacity set to 10000
13/01/16 09:09:38 ERROR hdfs.HDFSEventSink: process failed
java.lang.OutOfMemoryError: Java heap space
    at java.util.Arrays.copyOf(Arrays.java:2786)
    at java.io.ByteArrayOutputStream.write(ByteArrayOutputStream.java:94)
    at java.io.DataOutputStream.write(DataOutputStream.java:90)
    at org.apache.hadoop.io.Text.write(Text.java:282)
    ... 11 lines omitted ...
    at java.lang.Thread.run(Thread.java:662)
Exception in thread "SinkRunner-PollingRunner-DefaultSinkProcessor" java.lang.OutOfMemoryError: Java heap space
    at java.util.Arrays.copyOf(Arrays.java:2786)
    at java.io.ByteArrayOutputStream.write(ByteArrayOutputStream.java:94)
Re: OutOfMemory
Posted by Juhani Connolly <ju...@cyberagent.co.jp>.
How big are your events? A capacity of 10000 doesn't seem like it should
run into any issues, but since the channel is held entirely in memory,
it's possible the channel is eating up all your heap.
Note: channel capacity is the number of events, not the physical size.
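As a rough back-of-envelope sketch (the 200 KB figure below is purely hypothetical), a full memory channel needs on the order of capacity times average event size of heap:

```shell
# Worst-case heap held by a full memory channel is roughly
# capacity * average event size (body + headers + JVM object overhead).
capacity=10000
avg_event_bytes=200000   # hypothetical: ~200 KB per event
echo $(( capacity * avg_event_bytes ))   # prints 2000000000, i.e. ~2 GB
```

So with a 2 GB heap, events averaging a few hundred kilobytes could exhaust memory on their own; event size matters as much as event count.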
You can verify what is going on by setting up Ganglia, or by using
something like jconsole to read counter data via JMX: you'll want to
pull channelFillPercentage.
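For example, one way to expose those counters is to enable remote JMX on the agent's JVM; the port below is an arbitrary example, not a Flume default:

```shell
# flume-env.sh -- pass JMX options to the agent's JVM so jconsole
# (or a Ganglia JMX collector) can read the channel counters,
# including channelFillPercentage.
export JAVA_OPTS="-Xmx2g \
  -Dcom.sun.management.jmxremote \
  -Dcom.sun.management.jmxremote.port=5445 \
  -Dcom.sun.management.jmxremote.authenticate=false \
  -Dcom.sun.management.jmxremote.ssl=false"
```

Then point jconsole at host:5445 and watch the channel MBean; a fill percentage pinned near 100 means the sink is not draining the channel fast enough.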
On 01/17/2013 07:15 AM, Mohit Anchlia wrote:
> channel transaction is 500 and I've not set any batchsize parameter.
>
> On Wed, Jan 16, 2013 at 1:49 PM, Bhaskar V. Karambelkar
> <bhaskarvk@gmail.com> wrote:
>
> What is the channel transaction capacity and HDFS batch size ?
>
>
> On Wed, Jan 16, 2013 at 1:52 PM, Mohit Anchlia
> <mohitanchlia@gmail.com> wrote:
>
> I often get out of memory even when there is no load on the
> system. I am wondering what's the best way to debug this. I
> have heap size set to 2G and memory capacity is 10000
>
> 13/01/16 09:09:38 ERROR hdfs.HDFSEventSink: process failed
> java.lang.OutOfMemoryError: Java heap space
>     at java.util.Arrays.copyOf(Arrays.java:2786)
>     at java.io.ByteArrayOutputStream.write(ByteArrayOutputStream.java:94)
>     at java.io.DataOutputStream.write(DataOutputStream.java:90)
>     at org.apache.hadoop.io.Text.write(Text.java:282)
>     ... 11 lines omitted ...
>     at java.lang.Thread.run(Thread.java:662)
>
> Exception in thread "SinkRunner-PollingRunner-DefaultSinkProcessor" java.lang.OutOfMemoryError: Java heap space
>     at java.util.Arrays.copyOf(Arrays.java:2786)
>     at java.io.ByteArrayOutputStream.write(ByteArrayOutputStream.java:94)
>
Re: OutOfMemory
Posted by "Bhaskar V. Karambelkar" <bh...@gmail.com>.
http://blogs.opcodesolutions.com/roller/java/entry/solve_java_lang_outofmemoryerror_java
https://blogs.oracle.com/alanb/entry/heap_dumps_are_back_with
Or use a profiler such as VisualVM. There are plenty of tools you can use
to debug memory problems.
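Following the heap-dump route from the links above, a minimal sketch (the dump path is an arbitrary example) is to have the JVM write a dump at the moment of the OOM and inspect it offline:

```shell
# Ask the agent's JVM to dump the heap when an OutOfMemoryError is
# thrown, then analyze the dump with VisualVM or jhat.
export JAVA_OPTS="$JAVA_OPTS -XX:+HeapDumpOnOutOfMemoryError \
  -XX:HeapDumpPath=/tmp/flume-oom.hprof"
# After the next OOM:
#   jhat /tmp/flume-oom.hprof   # then browse the object histogram
```

The histogram will show whether the retained bytes sit in Flume's channel (event bodies) or somewhere else entirely.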
On Wed, Jan 16, 2013 at 5:15 PM, Mohit Anchlia <mo...@gmail.com> wrote:
> channel transaction is 500 and I've not set any batchsize parameter.
>
>
> On Wed, Jan 16, 2013 at 1:49 PM, Bhaskar V. Karambelkar <
> bhaskarvk@gmail.com> wrote:
>
>> What is the channel transaction capacity and HDFS batch size ?
>>
>>
>> On Wed, Jan 16, 2013 at 1:52 PM, Mohit Anchlia <mo...@gmail.com> wrote:
>>
>>> I often get out of memory even when there is no load on the system. I am
>>> wondering what's the best way to debug this. I have heap size set to 2G and
>>> memory capacity is 10000
>>>
>>> 13/01/16 09:09:38 ERROR hdfs.HDFSEventSink: process failed
>>> java.lang.OutOfMemoryError: Java heap space
>>>     at java.util.Arrays.copyOf(Arrays.java:2786)
>>>     at java.io.ByteArrayOutputStream.write(ByteArrayOutputStream.java:94)
>>>     at java.io.DataOutputStream.write(DataOutputStream.java:90)
>>>     at org.apache.hadoop.io.Text.write(Text.java:282)
>>>     ... 11 lines omitted ...
>>>     at java.lang.Thread.run(Thread.java:662)
>>>
>>> Exception in thread "SinkRunner-PollingRunner-DefaultSinkProcessor" java.lang.OutOfMemoryError: Java heap space
>>>     at java.util.Arrays.copyOf(Arrays.java:2786)
>>>     at java.io.ByteArrayOutputStream.write(ByteArrayOutputStream.java:94)
>>>
>>
>
Re: OutOfMemory
Posted by Mohit Anchlia <mo...@gmail.com>.
The channel transaction capacity is 500, and I've not set any batch size parameter.
On Wed, Jan 16, 2013 at 1:49 PM, Bhaskar V. Karambelkar
<bhaskarvk@gmail.com> wrote:
> What is the channel transaction capacity and HDFS batch size ?
>
>
> On Wed, Jan 16, 2013 at 1:52 PM, Mohit Anchlia <mo...@gmail.com> wrote:
>
>> I often get out of memory even when there is no load on the system. I am
>> wondering what's the best way to debug this. I have heap size set to 2G and
>> memory capacity is 10000
>>
>> 13/01/16 09:09:38 ERROR hdfs.HDFSEventSink: process failed
>> java.lang.OutOfMemoryError: Java heap space
>>     at java.util.Arrays.copyOf(Arrays.java:2786)
>>     at java.io.ByteArrayOutputStream.write(ByteArrayOutputStream.java:94)
>>     at java.io.DataOutputStream.write(DataOutputStream.java:90)
>>     at org.apache.hadoop.io.Text.write(Text.java:282)
>>     ... 11 lines omitted ...
>>     at java.lang.Thread.run(Thread.java:662)
>>
>> Exception in thread "SinkRunner-PollingRunner-DefaultSinkProcessor" java.lang.OutOfMemoryError: Java heap space
>>     at java.util.Arrays.copyOf(Arrays.java:2786)
>>     at java.io.ByteArrayOutputStream.write(ByteArrayOutputStream.java:94)
>>
>
Re: OutOfMemory
Posted by "Bhaskar V. Karambelkar" <bh...@gmail.com>.
What is the channel transaction capacity and HDFS batch size?
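For reference, those two settings live in the agent's properties file; the agent/channel/sink names below are made up for illustration, only the keys are the real knobs:

```properties
agent.channels.ch1.type = memory
agent.channels.ch1.capacity = 10000
agent.channels.ch1.transactionCapacity = 500
agent.sinks.hdfsSink.hdfs.batchSize = 100
```

The transaction capacity bounds how many events a sink can take from the channel per transaction, and hdfs.batchSize is how many events the HDFS sink writes per flush, so the two should be sized together.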
On Wed, Jan 16, 2013 at 1:52 PM, Mohit Anchlia <mo...@gmail.com> wrote:
> I often get out of memory even when there is no load on the system. I am
> wondering what's the best way to debug this. I have heap size set to 2G and
> memory capacity is 10000
>
> 13/01/16 09:09:38 ERROR hdfs.HDFSEventSink: process failed
> java.lang.OutOfMemoryError: Java heap space
>     at java.util.Arrays.copyOf(Arrays.java:2786)
>     at java.io.ByteArrayOutputStream.write(ByteArrayOutputStream.java:94)
>     at java.io.DataOutputStream.write(DataOutputStream.java:90)
>     at org.apache.hadoop.io.Text.write(Text.java:282)
>     ... 11 lines omitted ...
>     at java.lang.Thread.run(Thread.java:662)
>
> Exception in thread "SinkRunner-PollingRunner-DefaultSinkProcessor" java.lang.OutOfMemoryError: Java heap space
>     at java.util.Arrays.copyOf(Arrays.java:2786)
>     at java.io.ByteArrayOutputStream.write(ByteArrayOutputStream.java:94)