Posted to issues@spark.apache.org by "Sandy Ryza (JIRA)" <ji...@apache.org> on 2014/12/18 16:31:13 UTC

[jira] [Created] (SPARK-4885) Enable fetched blocks to exceed 2 GB by chaining buffers

Sandy Ryza created SPARK-4885:
---------------------------------

             Summary: Enable fetched blocks to exceed 2 GB by chaining buffers
                 Key: SPARK-4885
                 URL: https://issues.apache.org/jira/browse/SPARK-4885
             Project: Spark
          Issue Type: Improvement
          Components: Spark Core
    Affects Versions: 1.2.0
            Reporter: Sandy Ryza


{code}
14/12/18 09:53:13 ERROR executor.ExecutorUncaughtExceptionHandler: Uncaught exception in thread Thread[handle-message-executor-12,5,main]
java.lang.OutOfMemoryError: Requested array size exceeds VM limit
                at java.util.Arrays.copyOf(Arrays.java:2271)
                at java.io.ByteArrayOutputStream.grow(ByteArrayOutputStream.java:113)
                at java.io.ByteArrayOutputStream.ensureCapacity(ByteArrayOutputStream.java:93)
                at java.io.ByteArrayOutputStream.write(ByteArrayOutputStream.java:140)
                at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82)
                at java.io.BufferedOutputStream.write(BufferedOutputStream.java:126)
                at com.esotericsoftware.kryo.io.Output.flush(Output.java:155)
                at com.esotericsoftware.kryo.io.Output.require(Output.java:135)
                at com.esotericsoftware.kryo.io.Output.writeLong(Output.java:477)
                at com.esotericsoftware.kryo.io.Output.writeDouble(Output.java:596)
                at com.esotericsoftware.kryo.serializers.DefaultArraySerializers$DoubleArraySerializer.write(DefaultArraySerializers.java:212)
                at com.esotericsoftware.kryo.serializers.DefaultArraySerializers$DoubleArraySerializer.write(DefaultArraySerializers.java:200)
                at com.esotericsoftware.kryo.Kryo.writeObjectOrNull(Kryo.java:549)
                at com.esotericsoftware.kryo.serializers.FieldSerializer$ObjectField.write(FieldSerializer.java:570)
                at com.esotericsoftware.kryo.serializers.FieldSerializer.write(FieldSerializer.java:213)
                at com.esotericsoftware.kryo.Kryo.writeClassAndObject(Kryo.java:568)
                at org.apache.spark.serializer.KryoSerializationStream.writeObject(KryoSerializer.scala:119)
                at org.apache.spark.serializer.SerializationStream.writeAll(Serializer.scala:110)
                at org.apache.spark.storage.BlockManager.dataSerializeStream(BlockManager.scala:1054)
                at org.apache.spark.storage.BlockManager.dataSerialize(BlockManager.scala:1063)
                at org.apache.spark.storage.MemoryStore.getBytes(MemoryStore.scala:154)
                at org.apache.spark.storage.BlockManager.doGetLocal(BlockManager.scala:428)
                at org.apache.spark.storage.BlockManager.getLocalBytes(BlockManager.scala:394)
                at org.apache.spark.storage.BlockManagerWorker.getBlock(BlockManagerWorker.scala:100)
                at org.apache.spark.storage.BlockManagerWorker.processBlockMessage(BlockManagerWorker.scala:79)
                at org.apache.spark.storage.BlockManagerWorker$$anonfun$2.apply(BlockManagerWorker.scala:48)
                at org.apache.spark.storage.BlockManagerWorker$$anonfun$2.apply(BlockManagerWorker.scala:48)
                at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
                at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
                at scala.collection.Iterator$class.foreach(Iterator.scala:727)
                at scala.collection.AbstractIterator.foreach(Iterator.scala:1157)
                at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
{code}
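
The failure happens because java.io.ByteArrayOutputStream backs the serialized block with a single byte[], and the JVM caps array sizes just below Integer.MAX_VALUE elements, so any block whose serialized form approaches 2 GB dies in grow(). A minimal Scala sketch of the chained-buffer idea from the summary (the class name and chunk size are illustrative, not Spark's actual implementation):

{code}
import java.io.OutputStream
import scala.collection.mutable.ArrayBuffer

// Sketch: append to a chain of fixed-size chunks instead of one contiguous
// byte[], so total size is bounded by heap, not by the JVM array-size limit.
class ChainedByteArrayOutputStream(chunkSize: Int = 4 * 1024 * 1024)
    extends OutputStream {
  private val chunks = ArrayBuffer[Array[Byte]](new Array[Byte](chunkSize))
  private var posInChunk = 0  // next write offset within the last chunk

  override def write(b: Int): Unit = {
    if (posInChunk == chunkSize) {          // last chunk is full:
      chunks += new Array[Byte](chunkSize)  // chain a fresh one
      posInChunk = 0
    }
    chunks.last(posInChunk) = b.toByte
    posInChunk += 1
  }

  override def write(b: Array[Byte], off: Int, len: Int): Unit = {
    var written = 0
    while (written < len) {
      if (posInChunk == chunkSize) {
        chunks += new Array[Byte](chunkSize)
        posInChunk = 0
      }
      // Copy as much as fits into the current chunk, then loop.
      val n = math.min(len - written, chunkSize - posInChunk)
      System.arraycopy(b, off + written, chunks.last, posInChunk, n)
      posInChunk += n
      written += n
    }
  }

  // Total bytes written: a Long, since it can legitimately exceed 2 GB.
  def size: Long = (chunks.length - 1).toLong * chunkSize + posInChunk
}
{code}

Because the data never has to live in one contiguous array, the hard limit becomes available memory; the fetch path would then need to expose the chunks directly (e.g. as a sequence of ByteBuffers) rather than collapsing them into a single byte[], which would just reintroduce the 2 GB cap.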



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org