Posted to issues@spark.apache.org by "Reynold Xin (JIRA)" <ji...@apache.org> on 2015/09/18 09:44:04 UTC
[jira] [Resolved] (SPARK-10684) StructType.interpretedOrdering need not to be serialized
[ https://issues.apache.org/jira/browse/SPARK-10684?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Reynold Xin resolved SPARK-10684.
---------------------------------
Resolution: Fixed
Assignee: Navis
Fix Version/s: 1.6.0
> StructType.interpretedOrdering need not to be serialized
> --------------------------------------------------------
>
> Key: SPARK-10684
> URL: https://issues.apache.org/jira/browse/SPARK-10684
> Project: Spark
> Issue Type: Improvement
> Components: SQL
> Affects Versions: 1.5.0
> Reporter: Navis
> Assignee: Navis
> Priority: Minor
> Fix For: 1.6.0
>
>
> Kryo serialization fails with a buffer overflow even when spark.kryoserializer.buffer.max is set to its maximum value (2G), because serializing StructType drags in its interpretedOrdering and the expression tree behind it.
> {noformat}
> org.apache.spark.SparkException: Kryo serialization failed: Buffer overflow. Available: 0, required: 1
> Serialization trace:
> containsChild (org.apache.spark.sql.catalyst.expressions.BoundReference)
> child (org.apache.spark.sql.catalyst.expressions.SortOrder)
> array (scala.collection.mutable.ArraySeq)
> ordering (org.apache.spark.sql.catalyst.expressions.InterpretedOrdering)
> interpretedOrdering (org.apache.spark.sql.types.StructType)
> schema (org.apache.spark.sql.catalyst.expressions.GenericRowWithSchema). To avoid this, increase spark.kryoserializer.buffer.max value.
> at org.apache.spark.serializer.KryoSerializerInstance.serialize(KryoSerializer.scala:263)
> at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:240)
> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
> at java.lang.Thread.run(Thread.java:745)
> {noformat}
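The serialization trace above shows the chain StructType -> interpretedOrdering -> InterpretedOrdering -> SortOrder -> BoundReference being walked by Kryo. Since interpretedOrdering is derived entirely from the struct's fields, it can be recomputed after deserialization instead of being shipped. A minimal sketch of that idea in Scala (field name and forSchema call taken from the trace above; the surrounding class shape is simplified for illustration):

{noformat}
import org.apache.spark.sql.catalyst.expressions.InterpretedOrdering
import org.apache.spark.sql.types.StructField

case class StructType(fields: Array[StructField]) {
  // @transient keeps Kryo (and Java serialization) from traversing the
  // ordering and the expression tree it references; lazy val means the
  // ordering is simply rebuilt from `fields` on first access after
  // deserialization, so no information is lost.
  @transient private[sql] lazy val interpretedOrdering =
    InterpretedOrdering.forSchema(fields.map(_.dataType))
}
{noformat}

The trade-off is a one-time recomputation per deserialized StructType instance, which is cheap compared to serializing the full Catalyst expression tree on every task.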
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org