Posted to issues@spark.apache.org by "Dongjoon Hyun (JIRA)" <ji...@apache.org> on 2016/10/01 18:05:20 UTC

[jira] [Comment Edited] (SPARK-17749) Unresolved columns when nesting SQL join clauses

    [ https://issues.apache.org/jira/browse/SPARK-17749?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15538938#comment-15538938 ] 

Dongjoon Hyun edited comment on SPARK-17749 at 10/1/16 6:04 PM:
----------------------------------------------------------------

Hi, [~andreasdamm].

Thank you for reporting this issue. I confirmed that Spark 2.0.0 raises the following error, as you mentioned.
{code}
org.apache.spark.sql.AnalysisException: cannot resolve '`sf_opportunity.currencyisocode`' given input columns: [isocode, id]; line 1 pos
{code}
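
For reference, here is a minimal spark-shell sketch that reproduces it. This is my own setup rather than part of the original report; it assumes an active SparkSession named {{spark}} and reuses the table and column names from the issue description below.
{code}
// Minimal reproduction sketch for spark-shell (assumes SparkSession `spark`).
import spark.implicits._

// Register empty temporary views with the schemas from the issue description.
Seq.empty[String].toDF("isocode").createOrReplaceTempView("sf_datedconversionrate2")
Seq.empty[(String, String)].toDF("currencyisocode", "accountid").createOrReplaceTempView("sf_opportunity2")
Seq.empty[String].toDF("id").createOrReplaceTempView("sf_account2")

// The original query with two trailing ON clauses fails as shown above.
spark.sql("""
  SELECT 0
  FROM `sf_datedconversionrate2` AS `sf_datedconversionrate`
  LEFT JOIN `sf_account2` AS `sf_account`
  LEFT JOIN `sf_opportunity2` AS `sf_opportunity`
  ON `sf_account`.`id` = `sf_opportunity`.`accountid`
  ON `sf_datedconversionrate`.`isocode` = `sf_opportunity`.`currencyisocode`
""").show()
{code}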

On the current master branch (Spark 2.1.0-SNAPSHOT), the error is slightly different.
{code}
org.apache.spark.sql.catalyst.parser.ParseException:
extraneous input 'ON' expecting
{code}

I haven't tested Spark 2.0.1-rc4, but I expect it raises a similar error.

Could you try the following query, which replaces the second 'ON' with 'AND'? It was tested on Spark 2.0.0 and 2.1.0-SNAPSHOT.
{code}
SELECT 0
FROM `sf_datedconversionrate2` AS `sf_datedconversionrate`
LEFT JOIN `sf_account2` AS `sf_account`
LEFT JOIN `sf_opportunity2` AS `sf_opportunity`
ON `sf_account`.`id` = `sf_opportunity`.`accountid` AND `sf_datedconversionrate`.`isocode` = `sf_opportunity`.`currencyisocode`
{code}
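
With the same temporary views registered as in the sketch above, the rewritten query can also be checked from spark-shell. This is only a convenience snippet, not an official test:
{code}
// The rewritten query: the second ON condition is folded into the first with AND.
spark.sql("""
  SELECT 0
  FROM `sf_datedconversionrate2` AS `sf_datedconversionrate`
  LEFT JOIN `sf_account2` AS `sf_account`
  LEFT JOIN `sf_opportunity2` AS `sf_opportunity`
  ON `sf_account`.`id` = `sf_opportunity`.`accountid`
     AND `sf_datedconversionrate`.`isocode` = `sf_opportunity`.`currencyisocode`
""").show()
{code}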


> Unresolved columns when nesting SQL join clauses
> ------------------------------------------------
>
>                 Key: SPARK-17749
>                 URL: https://issues.apache.org/jira/browse/SPARK-17749
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 2.0.0
>            Reporter: Andreas Damm
>
> Given tables
> CREATE TABLE `sf_datedconversionrate2`(`isocode` string)
> CREATE TABLE `sf_opportunity2`(`currencyisocode` string, `accountid` string)
> CREATE TABLE `sf_account2`(`id` string)
> the following SQL will cause an analysis exception (cannot resolve '`sf_opportunity.currencyisocode`' given input columns: [isocode, id])
> SELECT    0 
> FROM      `sf_datedconversionrate2` AS `sf_datedconversionrate` 
> LEFT JOIN `sf_account2`             AS `sf_account` 
> LEFT JOIN `sf_opportunity2`         AS `sf_opportunity` 
> ON        `sf_account`.`id` = `sf_opportunity`.`accountid` 
> ON        `sf_datedconversionrate`.`isocode` = `sf_opportunity`.`currencyisocode` 
> even though all columns referred to in the conditions should be in scope.
> Re-ordering the JOIN and ON clauses makes it work:
> SELECT    0 
> FROM      `sf_datedconversionrate2` AS `sf_datedconversionrate` 
> LEFT JOIN `sf_opportunity2`         AS `sf_opportunity` 
> LEFT JOIN `sf_account2`             AS `sf_account` 
> ON        `sf_account`.`id` = `sf_opportunity`.`accountid` 
> ON        `sf_datedconversionrate`.`isocode` = `sf_opportunity`.`currencyisocode` 
> but the original query should also work.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org