Posted to issues@spark.apache.org by "Apache Spark (JIRA)" <ji...@apache.org> on 2018/10/19 02:17:00 UTC
[jira] [Assigned] (SPARK-25776) The disk write buffer size must be greater than 12.
[ https://issues.apache.org/jira/browse/SPARK-25776?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Apache Spark reassigned SPARK-25776:
------------------------------------
Assignee: Apache Spark
> The disk write buffer size must be greater than 12.
> ---------------------------------------------------
>
> Key: SPARK-25776
> URL: https://issues.apache.org/jira/browse/SPARK-25776
> Project: Spark
> Issue Type: Bug
> Components: Spark Core
> Affects Versions: 3.0.0
> Reporter: liuxian
> Assignee: Apache Spark
> Priority: Minor
>
> In {{UnsafeSorterSpillWriter.java}}, when we write a record to a spill file with {{void write(Object baseObject, long baseOffset, int recordLength, long keyPrefix)}}, {{recordLength}} and {{keyPrefix}} are written to the disk write buffer first. Together they take 12 bytes (a 4-byte int plus an 8-byte long), so the disk write buffer size must be greater than 12.
> If {{diskWriteBufferSize}} is 10, it will print this exception info:
> java.lang.ArrayIndexOutOfBoundsException: 10
> at org.apache.spark.util.collection.unsafe.sort.UnsafeSorterSpillWriter.writeLongToBuffer(UnsafeSorterSpillWriter.java:91)
> at org.apache.spark.util.collection.unsafe.sort.UnsafeSorterSpillWriter.write(UnsafeSorterSpillWriter.java:123)
> at org.apache.spark.util.collection.unsafe.sort.UnsafeExternalSorter.spillIterator(UnsafeExternalSorter.java:498)
> at org.apache.spark.util.collection.unsafe.sort.UnsafeExternalSorter.spill(UnsafeExternalSorter.java:222)
> at org.apache.spark.memory.MemoryConsumer.spill(MemoryConsumer.java:65)
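> The failure mode can be reproduced in isolation. The sketch below is not the actual Spark source; {{SpillHeaderSketch}} and its helper methods are hypothetical stand-ins that mimic how {{UnsafeSorterSpillWriter}} stages the 4-byte {{recordLength}} and 8-byte {{keyPrefix}} in a byte buffer before any record data, showing why a 10-byte buffer overflows at index 10:

```java
// Minimal sketch (hypothetical, not the Spark source): the record header
// alone needs 12 bytes of buffer space -- a 4-byte int plus an 8-byte long.
public class SpillHeaderSketch {

    // Write a big-endian int into buf at pos (occupies pos..pos+3).
    static void writeIntToBuffer(byte[] buf, int pos, int v) {
        buf[pos]     = (byte) (v >>> 24);
        buf[pos + 1] = (byte) (v >>> 16);
        buf[pos + 2] = (byte) (v >>> 8);
        buf[pos + 3] = (byte) v;
    }

    // Write a big-endian long into buf at pos (occupies pos..pos+7).
    static void writeLongToBuffer(byte[] buf, int pos, long v) {
        writeIntToBuffer(buf, pos, (int) (v >>> 32));
        writeIntToBuffer(buf, pos + 4, (int) v);
    }

    public static void main(String[] args) {
        // A 12-byte buffer exactly fits the header: int at 0..3, long at 4..11.
        byte[] ok = new byte[12];
        writeIntToBuffer(ok, 0, 42);      // recordLength
        writeLongToBuffer(ok, 4, 7L);     // keyPrefix

        // A 10-byte buffer cannot hold the header: the long needs bytes 4..11,
        // so the write at index 10 goes out of bounds.
        byte[] tooSmall = new byte[10];
        writeIntToBuffer(tooSmall, 0, 42);
        try {
            writeLongToBuffer(tooSmall, 4, 7L);
        } catch (ArrayIndexOutOfBoundsException e) {
            System.out.println("caught expected ArrayIndexOutOfBoundsException");
        }
    }
}
```

> This is why any fix must validate {{spark.shuffle.spill.diskWriteBufferSize}} against the 12-byte header before the buffer is allocated.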
--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org