Posted to issues@spark.apache.org by "Wenchen Fan (JIRA)" <ji...@apache.org> on 2018/11/28 15:45:00 UTC

[jira] [Resolved] (SPARK-25824) Remove duplicated map entries in `showString`

     [ https://issues.apache.org/jira/browse/SPARK-25824?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Wenchen Fan resolved SPARK-25824.
---------------------------------
    Resolution: Fixed

> Remove duplicated map entries in `showString`
> ---------------------------------------------
>
>                 Key: SPARK-25824
>                 URL: https://issues.apache.org/jira/browse/SPARK-25824
>             Project: Spark
>          Issue Type: Sub-task
>          Components: SQL
>    Affects Versions: 3.0.0
>            Reporter: Dongjoon Hyun
>            Priority: Minor
>
> `showString` doesn't eliminate duplicate map keys, so its output differs from the result of `collect` and from selecting previously saved rows.
> *Spark 2.2.2*
> {code}
> spark-sql> select map(1,2,1,3);
> {1:3}
> scala> sql("SELECT map(1,2,1,3)").collect
> res0: Array[org.apache.spark.sql.Row] = Array([Map(1 -> 3)])
> scala> sql("SELECT map(1,2,1,3)").show
> +---------------+
> |map(1, 2, 1, 3)|
> +---------------+
> |    Map(1 -> 3)|
> +---------------+
> {code}
> *Spark 2.3.0 ~ 2.4.0-rc4*
> {code}
> spark-sql> select map(1,2,1,3);
> {1:3}
> scala> sql("SELECT map(1,2,1,3)").collect
> res1: Array[org.apache.spark.sql.Row] = Array([Map(1 -> 3)])
> scala> sql("CREATE TABLE m AS SELECT map(1,2,1,3) a")
> scala> sql("SELECT * FROM m").show
> +--------+
> |       a|
> +--------+
> |[1 -> 3]|
> +--------+
> scala> sql("SELECT map(1,2,1,3)").show
> +----------------+
> | map(1, 2, 1, 3)|
> +----------------+
> |[1 -> 2, 1 -> 3]|
> +----------------+
> {code}
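> For context, a minimal sketch of the deduplication that `collect` effectively applies: the last value for a duplicate key wins, matching the `Map(1 -> 3)` results above. This is plain Scala, not Spark's actual code; the helper name `dedupEntries` is hypothetical.
> {code}
> import scala.collection.mutable.LinkedHashMap
>
> // Hypothetical sketch: collapse duplicate map keys, keeping the last value,
> // which reproduces the Map(1 -> 3) result that collect returns above.
> def dedupEntries[K, V](entries: Seq[(K, V)]): Seq[(K, V)] = {
>   val seen = LinkedHashMap.empty[K, V]
>   entries.foreach { case (k, v) => seen(k) = v } // later values overwrite earlier ones
>   seen.toSeq
> }
>
> dedupEntries(Seq(1 -> 2, 1 -> 3)) // Seq((1, 3))
> {code}
> The fix tracked here would make `showString` render the same deduplicated entries, so `show`, `collect`, and reads from saved tables all agree.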



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org