Posted to issues@spark.apache.org by "Ivan Sadikov (Jira)" <ji...@apache.org> on 2022/10/14 04:28:00 UTC

[jira] [Commented] (SPARK-39783) Wrong column backticks in UNRESOLVED_COLUMN error

    [ https://issues.apache.org/jira/browse/SPARK-39783?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17617460#comment-17617460 ] 

Ivan Sadikov commented on SPARK-39783:
--------------------------------------

This is by design, if I am not mistaken. Column names that contain dots need to be escaped with backticks; for example, this works just fine:
{code:scala}
Seq(0).toDF("the.id").select("`the.id`").show()
{code}
IMHO, this is not an issue.
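For context, the quoting mismatch the report describes can be sketched without Spark. This is a minimal illustration, not Spark's actual implementation; the object and method names below are hypothetical:

```scala
object QuotingDemo {
  // Quote a flat column name as one identifier: the.id -> `the.id`
  def quoteSingle(name: String): String = s"`$name`"

  // Quote a dotted name as a nested field path: the.id -> `the`.`id`
  def quoteAsPath(name: String): String =
    name.split('.').map(p => s"`$p`").mkString(".")

  def main(args: Array[String]): Unit = {
    // The unresolved reference "the.id" is rendered as a nested path:
    println(quoteAsPath("the.id"))   // `the`.`id`
    // But the existing flat column "the.id" should be quoted whole:
    println(quoteSingle("the.id"))   // `the.id`
  }
}
```

The report boils down to the suggestion list applying path-style quoting to a candidate that is actually a single top-level column whose name happens to contain a dot.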

> Wrong column backticks in UNRESOLVED_COLUMN error
> -------------------------------------------------
>
>                 Key: SPARK-39783
>                 URL: https://issues.apache.org/jira/browse/SPARK-39783
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 3.4.0
>            Reporter: Enrico Minack
>            Priority: Major
>
> The following code references a nested value {{{}`the`.`id`{}}} that does not exist. The suggested (existing) column names are not correctly wrapped in backticks:
> {code:scala}
> Seq((0)).toDF("the.id")
>   .select("the.id").show()
> org.apache.spark.sql.AnalysisException: [UNRESOLVED_COLUMN] A column or function parameter with name `the`.`id` cannot be resolved. Did you mean one of the following? [`the`.`id`];
> {code}
> Instead of {{{}[`the`.`id`]{}}}, you would expect {{{}[`the.id`]{}}}.
> Here is an example where the nested column exists:
> {code:scala}
> spark.range(1).select(struct("id").as("the"))
>   .select("the.id").show()
> +---+
> | id|
> +---+
> |  0|
> +---+
> {code}



--
This message was sent by Atlassian Jira
(v8.20.10#820010)
