Posted to issues@spark.apache.org by "Hyukjin Kwon (JIRA)" <ji...@apache.org> on 2019/05/21 04:14:19 UTC

[jira] [Resolved] (SPARK-23439) Ambiguous reference when selecting column inside StructType with same name as outer column

     [ https://issues.apache.org/jira/browse/SPARK-23439?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Hyukjin Kwon resolved SPARK-23439.
----------------------------------
    Resolution: Incomplete

> Ambiguous reference when selecting column inside StructType with same name as outer column
> -------------------------------------------------------------------------------------------
>
>                 Key: SPARK-23439
>                 URL: https://issues.apache.org/jira/browse/SPARK-23439
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 2.2.0
>         Environment: Scala 2.11.8, Spark 2.2.0
>            Reporter: Alejandro Trujillo Caballero
>            Priority: Minor
>              Labels: bulk-closed
>
> Hi.
> I've seen that when working with nested struct fields in a DataFrame, a select operation flattens the nesting, which can result in collisions between column names.
> For example:
>  
> {code:java}
> case class Foo(a: Int, b: Bar)
> case class Bar(a: Int)
> val items = List(
>   Foo(1, Bar(1)),
>   Foo(2, Bar(2))
> )
> val df = spark.createDataFrame(items)
> df.select($"a", $"b.a").show()
> //+---+---+
> //|  a|  a|
> //+---+---+
> //|  1|  1|
> //|  2|  2|
> //+---+---+
> df.select($"a", $"b.a").printSchema
> //root
> //|-- a: integer (nullable = false)
> //|-- a: integer (nullable = true)
> df.select($"a", $"b.a").select($"a")
> //org.apache.spark.sql.AnalysisException: Reference 'a' is ambiguous, could be: a#9, a#
> {code}
>  
>  
> Shouldn't the second column be named "b.a"?
>  
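> A workaround is to alias the nested column explicitly so the flattened name stays unambiguous (the alias name "b.a" below is just an illustrative choice, and a dotted column name must then be referenced with backticks):
> {code:java}
> // Give the nested field a distinct top-level name via alias
> val flat = df.select($"a", $"b.a".alias("b.a"))
> // Reference the dotted column name with backticks to avoid re-parsing it as nested
> flat.select($"`b.a`").show()
> {code}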
> Thanks.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org