Posted to user@avro.apache.org by David Kincaid <ki...@gmail.com> on 2016/01/28 16:51:02 UTC

OutOfMemoryError while writing a large map to a file

I am getting an OutOfMemoryError while writing a very large map to a file.
It appears that at some point it tries to grow its buffer, copying the data
already in the buffer as it does so, but I could be misinterpreting the
stack trace. I have the JVM heap set to 24 GB right now and can't really go
any higher.
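[Editor's note: the buffer-doubling reading of the trace below is consistent with the JDK sources: ByteArrayOutputStream.grow doubles capacity, and Arrays.copyOf allocates the new array while the old one is still reachable. The "Requested array size exceeds VM limit" message also means the requested single-array length exceeded what the VM allows (near Integer.MAX_VALUE), which is why a bigger heap doesn't help. A rough stdlib-only sketch of the transient memory this strategy needs; the 32-byte start matches ByteArrayOutputStream's default capacity, and the 1.5 GB target is purely illustrative:]

```java
public class GrowthDemo {
    // Model ByteArrayOutputStream's doubling strategy: each grow() doubles
    // capacity, and during Arrays.copyOf both the old and new buffers are
    // live, so peak transient memory is oldCapacity + newCapacity.
    static long peakTransientBytes(long targetSize) {
        long capacity = 32;               // ByteArrayOutputStream's default
        long peak = capacity;
        while (capacity < targetSize) {
            long next = capacity * 2;                  // grow() doubles
            peak = Math.max(peak, capacity + next);    // both alive in copyOf
            capacity = next;
        }
        return peak;
    }

    public static void main(String[] args) {
        // Buffering ~1.5 GB of encoded data transiently needs ~3 GB,
        // and the final single array is already ~2 GB -- at the VM's limit.
        System.out.println(peakTransientBytes(1_500_000_000L)); // 3221225472
    }
}
```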

I'm using the Clojure library "abracad" to do the writing. It uses a
DataFileWriter and a ClojureDatumWriter, which is a subclass of
GenericDatumWriter. I have a Clojure map that I am simply passing to the
append method of the DataFileWriter. Does anyone have a suggestion for
making the write not try to copy the data in memory as it's writing?
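[Editor's note: in plain Java the write path described above looks roughly like the sketch below. The map-of-longs schema and file name are hypothetical stand-ins (abracad derives the schema from the Clojure data), but the DataFileWriter call pattern is the same; this is a sketch, not the poster's actual code.]

```java
import java.io.File;
import java.util.HashMap;
import java.util.Map;

import org.apache.avro.Schema;
import org.apache.avro.file.DataFileWriter;
import org.apache.avro.generic.GenericDatumWriter;

public class AppendSketch {
    public static void main(String[] args) throws Exception {
        // Hypothetical schema standing in for the real one.
        Schema schema = new Schema.Parser().parse(
            "{\"type\": \"map\", \"values\": \"long\"}");

        Map<String, Long> bigMap = new HashMap<>();
        bigMap.put("example-key", 42L);

        try (DataFileWriter<Map<String, Long>> writer =
                new DataFileWriter<>(
                    new GenericDatumWriter<Map<String, Long>>(schema))) {
            writer.create(schema, new File("notes.avro"));
            // append() encodes the whole datum through a buffering encoder
            // before the block reaches disk, which is where one huge map hurts.
            writer.append(bigMap);
        }
    }
}
```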

Here is the relevant piece of the stack trace:

java.lang.OutOfMemoryError: Requested array size exceeds VM limit
        at java.util.Arrays.copyOf(Arrays.java:3236)
        at java.io.ByteArrayOutputStream.grow(ByteArrayOutputStream.java:113)
        at java.io.ByteArrayOutputStream.ensureCapacity(ByteArrayOutputStream.java:93)
        at java.io.ByteArrayOutputStream.write(ByteArrayOutputStream.java:140)
        at org.apache.avro.io.BufferedBinaryEncoder$OutputStreamSink.innerWrite(BufferedBinaryEncoder.java:216)
        at org.apache.avro.io.BufferedBinaryEncoder.flushBuffer(BufferedBinaryEncoder.java:93)
        at org.apache.avro.io.BufferedBinaryEncoder.ensureBounds(BufferedBinaryEncoder.java:108)
        at org.apache.avro.io.BufferedBinaryEncoder.writeLong(BufferedBinaryEncoder.java:129)
        at abracad.avro.ClojureDatumWriter.write(ClojureDatumWriter.java:47)
        at org.apache.avro.generic.GenericDatumWriter.writeMap(GenericDatumWriter.java:180)
        at org.apache.avro.generic.GenericDatumWriter.write(GenericDatumWriter.java:69)
        at abracad.avro.ClojureDatumWriter.write(ClojureDatumWriter.java:51)
        at org.apache.avro.generic.GenericDatumWriter.write(GenericDatumWriter.java:58)
        at org.apache.avro.file.DataFileWriter.append(DataFileWriter.java:290)
        at com.idexx.medical_notes.read_notes$map__GT_avro_file.invoke(read_notes.clj:46)
        at com.idexx.medical_notes.read_notes$listener$fn__5159$fn__5161.invoke(read_notes.clj:244)
        at com.idexx.medical_notes.read_notes.proxy$akka.actor.UntypedActor$ff19274a.onReceive(Unknown Source)
        at akka.actor.UntypedActor$$anonfun$receive$1.applyOrElse(UntypedActor.scala:167)
        at akka.actor.Actor$class.aroundReceive(Actor.scala:467)
        at akka.actor.UntypedActor.aroundReceive(UntypedActor.scala:97)
        at com.idexx.medical_notes.read_notes.proxy$akka.actor.UntypedActor$ff19274a.aroundReceive(Unknown Source)
        at akka.actor.ActorCell.receiveMessage(ActorCell.scala:516)
        at akka.actor.ActorCell.invoke(ActorCell.scala:487)
        at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:238)
        at akka.dispatch.Mailbox.run(Mailbox.scala:220)
        at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:397)
        at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
        at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
        at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
        at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)

Re: OutOfMemoryError while writing a large map to a file

Posted by David Kincaid <ki...@gmail.com>.
Thanks, Doug. I'll see if I can give that a try. It seems like there isn't
a way to use that with the DataFileWriter though, so I'm a bit unsure how
to go about writing data using a different encoder.

- Dave

On Thu, Jan 28, 2016 at 10:59 AM, Doug Cutting <cu...@apache.org> wrote:

> On Thu, Jan 28, 2016 at 7:51 AM, David Kincaid <ki...@gmail.com>
> wrote:
> > Does anyone have a suggestion for making the write not try to copy the
> data
> > in memory as it's writing?
>
> BlockingBinaryEncoder is meant to do this, but I don't know how much
> it's been used.
>
>
> https://avro.apache.org/docs/current/api/java/org/apache/avro/io/BlockingBinaryEncoder.html
>
> Doug
>
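[Editor's note: since DataFileWriter does not accept a caller-supplied encoder, one way to use the BlockingBinaryEncoder Doug suggests is to drive a GenericDatumWriter directly over a plain OutputStream. A minimal sketch under that assumption follows; the schema and file name are hypothetical, and note the output is raw Avro binary, not the object-container format, so readers must get the schema some other way.]

```java
import java.io.BufferedOutputStream;
import java.io.FileOutputStream;
import java.io.OutputStream;
import java.util.HashMap;
import java.util.Map;

import org.apache.avro.Schema;
import org.apache.avro.generic.GenericDatumWriter;
import org.apache.avro.io.BinaryEncoder;
import org.apache.avro.io.EncoderFactory;

public class BlockingEncoderSketch {
    public static void main(String[] args) throws Exception {
        // Hypothetical schema standing in for the large Clojure map.
        Schema schema = new Schema.Parser().parse(
            "{\"type\": \"map\", \"values\": \"long\"}");

        Map<String, Long> bigMap = new HashMap<>();
        bigMap.put("example-key", 42L);

        try (OutputStream out = new BufferedOutputStream(
                new FileOutputStream("large-map.bin"))) {
            // blockingBinaryEncoder writes maps and arrays in blocks, so it
            // never needs the whole map in memory to prefix its byte length.
            BinaryEncoder encoder =
                EncoderFactory.get().blockingBinaryEncoder(out, null);
            new GenericDatumWriter<Map<String, Long>>(schema)
                .write(bigMap, encoder);
            encoder.flush();
        }
    }
}
```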

Re: OutOfMemoryError while writing a large map to a file

Posted by Doug Cutting <cu...@apache.org>.
On Thu, Jan 28, 2016 at 7:51 AM, David Kincaid <ki...@gmail.com> wrote:
> Does anyone have a suggestion for making the write not try to copy the data
> in memory as it's writing?

BlockingBinaryEncoder is meant to do this, but I don't know how much
it's been used.

https://avro.apache.org/docs/current/api/java/org/apache/avro/io/BlockingBinaryEncoder.html

Doug