Posted to issues@spark.apache.org by "David Allsopp (JIRA)" <ji...@apache.org> on 2017/07/18 15:51:00 UTC

[jira] [Created] (SPARK-21459) Some aggregation functions change the case of nested field names

David Allsopp created SPARK-21459:
-------------------------------------

             Summary: Some aggregation functions change the case of nested field names
                 Key: SPARK-21459
                 URL: https://issues.apache.org/jira/browse/SPARK-21459
             Project: Spark
          Issue Type: Bug
          Components: Spark Core
    Affects Versions: 1.6.0
            Reporter: David Allsopp
            Priority: Minor


When working with DataFrames with nested schemas, the behavior of the aggregation functions is inconsistent with respect to preserving the case of the nested field names.

For example, {{first()}} preserves the case of the field names, but {{collect_set()}} and {{collect_list()}} force the field names to lowercase.

Expected behavior: Field name case is preserved (or is at least consistent and documented)

Spark-shell session to reproduce:


{code:java}
scala> case class Inner(Key: String, Value: String)
scala> case class Outer(ID: Long, Pairs: Array[Inner])

scala> val rdd = sc.parallelize(Seq(Outer(1L, Array(Inner("foo", "bar")))))
scala> val df = sqlContext.createDataFrame(rdd)

scala> df
... = [ID: bigint, Pairs: array<struct<Key:string,Value:string>>]

scala> df.groupBy("ID").agg(first("Pairs"))
... = [ID: bigint, first(Pairs)(): array<struct<Key:string,Value:string>>]
// Note that Key and Value preserve their original case

scala> df.groupBy("ID").agg(collect_set("Pairs"))
... = [ID: bigint, collect_set(Pairs): array<struct<key:string,value:string>>]
// Note that key and value are now lowercased

{code}
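
A possible workaround (an untested sketch; it assumes {{cast}} between struct types adopts the target type's field names, which appears to hold but I have not verified on 1.6) is to cast the aggregated column back to the originally-declared element type:

{code:java}
import org.apache.spark.sql.functions.{col, collect_set}
import org.apache.spark.sql.types._

// The element type as originally declared, with mixed-case field names
val pairsType = ArrayType(StructType(Seq(
  StructField("Key", StringType),
  StructField("Value", StringType))))

// Cast the aggregated column back, restoring the original case
val fixed = df.groupBy("ID")
  .agg(collect_set("Pairs").alias("Pairs"))
  .withColumn("Pairs", col("Pairs").cast(pairsType))
{code}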

Additionally, the generated column name is inconsistent: {{first(Pairs)()}} versus {{collect_set(Pairs)}} - note the extra parentheses in the first name.
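
Forcing an explicit name with {{alias}} sidesteps the naming inconsistency (a minimal sketch using the standard {{Column.alias}} API; it does not fix the nested-field lowercasing):

{code:java}
// Explicit aliases give deterministic column names for either aggregate
df.groupBy("ID").agg(first("Pairs").alias("Pairs"))
df.groupBy("ID").agg(collect_set("Pairs").alias("Pairs"))
{code}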


