Posted to issues@spark.apache.org by "Yin Huai (JIRA)" <ji...@apache.org> on 2015/08/06 04:20:04 UTC

[jira] [Resolved] (SPARK-9611) UnsafeFixedWidthAggregationMap.destructAndCreateExternalSorter will add an empty entry if the map is empty.

     [ https://issues.apache.org/jira/browse/SPARK-9611?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Yin Huai resolved SPARK-9611.
-----------------------------
    Resolution: Fixed

Issue resolved by pull request 7948
[https://github.com/apache/spark/pull/7948]

> UnsafeFixedWidthAggregationMap.destructAndCreateExternalSorter will add an empty entry if the map is empty.
> --------------------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-9611
>                 URL: https://issues.apache.org/jira/browse/SPARK-9611
>             Project: Spark
>          Issue Type: Sub-task
>          Components: SQL
>            Reporter: Yin Huai
>            Assignee: Yin Huai
>            Priority: Blocker
>             Fix For: 1.5.0
>
>
> There are two corner cases related to the UnsafeKVExternalSorter returned by UnsafeFixedWidthAggregationMap.destructAndCreateExternalSorter.
> 1. The constructor of UnsafeKVExternalSorter first tries to create an UnsafeInMemorySorter based on the BytesToBytesMap of UnsafeFixedWidthAggregationMap. However, when there is no entry in the map, UnsafeInMemorySorter throws an AssertionError, because we use the size of the map (0 in this case) as the initialSize of UnsafeInMemorySorter, which is not allowed.
> 2. Once we fix the first problem, when we use UnsafeKVExternalSorter's KVSorterIterator to load the data back, there is one extra record, which is an empty record (a minimal sketch of both guards follows below).
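For readers skimming the archive, the following is a minimal, self-contained Scala sketch of the two guards described above. It is not the code from pull request 7948; ToySorter and CornerCasesSketch are hypothetical stand-ins for UnsafeInMemorySorter and the BytesToBytesMap-backed aggregation map, written only to illustrate the failure mode.

    import scala.collection.mutable.ArrayBuffer

    // Hypothetical stand-in for UnsafeInMemorySorter: like the real class, it
    // rejects an initial capacity of zero (corner case 1's AssertionError).
    class ToySorter(initialSize: Int) {
      assert(initialSize > 0, "initialSize must be positive")
      private val records = new ArrayBuffer[String](initialSize)
      def insert(r: String): Unit = records += r
      def iterator: Iterator[String] = records.iterator
    }

    object CornerCasesSketch {
      def main(args: Array[String]): Unit = {
        val mapEntries = Seq.empty[String]  // stands in for an empty aggregation map

        // Corner case 1: never pass the raw map size (0 here) as the
        // sorter's initial capacity.
        val sorter = new ToySorter(math.max(mapEntries.size, 1))

        // Corner case 2: only insert real entries, so reading the data back
        // yields no spurious empty record.
        mapEntries.foreach(sorter.insert)

        println(s"records read back: ${sorter.iterator.toList}")  // List()
      }
    }

The sketch only shows the shape of the problem: an empty map must neither drive the sorter's initial capacity to zero nor contribute a phantom record when the sorted data is iterated.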



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org