Posted to issues@spark.apache.org by "Wenchen Fan (Jira)" <ji...@apache.org> on 2022/01/13 13:32:00 UTC

[jira] [Commented] (SPARK-37895) Error while joining two tables with non-English field names

    [ https://issues.apache.org/jira/browse/SPARK-37895?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17475360#comment-17475360 ] 

Wenchen Fan commented on SPARK-37895:
-------------------------------------

This bug exists only in JDBC v2. In the v2 code path, we always enable nested columns in filter pushdown, and the column names in the predicates follow SQL style, which means they may carry quotes.
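To make the failure mode concrete, here is a minimal illustration assuming the v1 Filter API; the manual double-quoting line is a stand-in for what a JDBC dialect's quoteIdentifier does, not the actual code path:

import org.apache.spark.sql.sources.IsNotNull

// The attribute name arrives from the SQL layer already backtick-quoted.
val pushed = IsNotNull("`Имя4`")

// The JDBC dialect then adds its own identifier quotes on top, producing a
// doubly-quoted column reference that PostgreSQL cannot resolve.
val compiled = "\"" + pushed.attribute + "\" IS NOT NULL"
println(compiled) // prints: "`Имя4`" IS NOT NULL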

In the long term, this problem can be fixed by using v2 filters, which have native support for nested columns, so that we don't need to encode a nested column into a single string and introduce quotes. For now, I think we should fix the v1 filter pushdown code path in JDBC v2, which is `JDBCScanBuilder.pushFilters`; a sketch of the idea follows.
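A minimal sketch of the kind of normalization `pushFilters` could apply before the filters are compiled to a WHERE clause. The `unquote` and `normalize` helpers are hypothetical and for illustration only (just two Filter subclasses shown); this is not the actual patch:

import org.apache.spark.sql.sources.{EqualTo, Filter, IsNotNull}

// Hypothetical helper: strip one level of backtick quoting from a top-level
// column name and undo `` escaping; nested multi-part names need a real parser.
def unquote(name: String): String =
  if (name.length >= 2 && name.startsWith("`") && name.endsWith("`")) {
    name.substring(1, name.length - 1).replace("``", "`")
  } else name

// Rewrite the attribute names of pushed filters before SQL generation.
def normalize(filter: Filter): Filter = filter match {
  case IsNotNull(attr)  => IsNotNull(unquote(attr))
  case EqualTo(attr, v) => EqualTo(unquote(attr), v)
  case other            => other // remaining Filter subclasses handled analogously
}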

> Error while joining two tables with non-English field names
> -----------------------------------------------------------
>
>                 Key: SPARK-37895
>                 URL: https://issues.apache.org/jira/browse/SPARK-37895
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 3.2.0, 3.3.0
>            Reporter: Marina Krasilnikova
>            Priority: Minor
>
> While trying to join two tables with non-English field names in PostgreSQL with a query like
> "select view1.`Имя1`, view1.`Имя2`, view2.`Имя3` from view1 left join view2 on view1.`Имя2`=view2.`Имя4`"
> we get an error which says that there is no field "`Имя4`" (the field name is surrounded by backticks).
> It appears that, to fetch the data from the second table, Spark constructs a query like
> SELECT "Имя3","Имя4" FROM "public"."tab2" WHERE ("`Имя4`" IS NOT NULL)
> and these backticks are redundant in the WHERE clause. (A reproduction sketch follows below.)
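A hedged reproduction sketch through the JDBC v2 catalog; the catalog name "pg", the connection settings, and the table names are assumptions for illustration (the reporter used views over these tables):

import org.apache.spark.sql.SparkSession

// Register a JDBC v2 catalog named "pg" backed by JDBCTableCatalog;
// URL and driver values are placeholders.
val spark = SparkSession.builder()
  .master("local[*]")
  .config("spark.sql.catalog.pg",
    "org.apache.spark.sql.execution.datasources.v2.jdbc.JDBCTableCatalog")
  .config("spark.sql.catalog.pg.url", "jdbc:postgresql://localhost:5432/testdb")
  .config("spark.sql.catalog.pg.driver", "org.postgresql.Driver")
  .getOrCreate()

// The left join lets Spark infer an IS NOT NULL filter on the right-hand
// join key, which is where the stray backticks surface in the pushed query.
spark.sql(
  """SELECT v1.`Имя1`, v1.`Имя2`, v2.`Имя3`
    |FROM pg.public.tab1 v1
    |LEFT JOIN pg.public.tab2 v2 ON v1.`Имя2` = v2.`Имя4`""".stripMargin
).show()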



--
This message was sent by Atlassian Jira
(v8.20.1#820001)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org