Posted to issues@spark.apache.org by "Kousuke Saruta (Jira)" <ji...@apache.org> on 2021/03/16 15:18:00 UTC

[jira] [Created] (SPARK-34763) col(), $"" and df("name") should handle quoted column name properly.

Kousuke Saruta created SPARK-34763:
--------------------------------------

             Summary: col(), $"<name>" and df("name") should handle quoted column name properly.
                 Key: SPARK-34763
                 URL: https://issues.apache.org/jira/browse/SPARK-34763
             Project: Spark
          Issue Type: Bug
          Components: SQL
    Affects Versions: 3.1.1, 3.0.2, 2.4.7, 3.2.0
            Reporter: Kousuke Saruta
            Assignee: Kousuke Saruta


Quoted column names like `a``b.c` cannot be represented with col(), $"<name>", or df("<name>") because these APIs don't handle such column names properly.

For example, suppose we have the following DataFrame.
{code}
val df1 = spark.sql("SELECT 'col1' AS `a``b.c`")
{code}
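
Note that a doubled backtick inside a quoted identifier escapes a literal backtick, so the actual column name here is a`b.c, containing both a backtick and a dot. A minimal REPL sketch to confirm this, assuming the DataFrame above (output format approximate):
{code}
scala> df1.columns
res0: Array[String] = Array(a`b.c)
{code}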

For this DataFrame, the following query executes successfully.
{code}
scala> df1.selectExpr("`a``b.c`").show
+-----+
|a`b.c|
+-----+
| col1|
+-----+
{code}

But the following query fails because df1("`a``b.c`") throws an exception.
{code}
scala> df1.select(df1("`a``b.c`")).show
org.apache.spark.sql.AnalysisException: syntax error in attribute name: `a``b.c`;
  at org.apache.spark.sql.catalyst.analysis.UnresolvedAttribute$.e$1(unresolved.scala:152)
  at org.apache.spark.sql.catalyst.analysis.UnresolvedAttribute$.parseAttributeName(unresolved.scala:162)
  at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.resolveQuoted(LogicalPlan.scala:121)
  at org.apache.spark.sql.Dataset.resolve(Dataset.scala:221)
  at org.apache.spark.sql.Dataset.col(Dataset.scala:1274)
  at org.apache.spark.sql.Dataset.apply(Dataset.scala:1241)
  ... 49 elided
{code}
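
col() and $"<name>" are expected to fail with the same exception, since both build an UnresolvedAttribute through the same parseAttributeName path visible in the stack trace above (a sketch; the exact messages are assumed to match the df1() case):
{code}
// col() parses the attribute name when the Column is constructed
scala> org.apache.spark.sql.functions.col("`a``b.c`")
org.apache.spark.sql.AnalysisException: syntax error in attribute name: `a``b.c`;

// $-interpolation goes through the same Column constructor
scala> import spark.implicits._
scala> $"`a``b.c`"
org.apache.spark.sql.AnalysisException: syntax error in attribute name: `a``b.c`;
{code}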


