Posted to issues@spark.apache.org by "Andreas Damm (JIRA)" <ji...@apache.org> on 2016/09/30 21:22:22 UTC

[jira] [Created] (SPARK-17749) Unresolved columns when nesting SQL join clauses

Andreas Damm created SPARK-17749:
------------------------------------

             Summary: Unresolved columns when nesting SQL join clauses
                 Key: SPARK-17749
                 URL: https://issues.apache.org/jira/browse/SPARK-17749
             Project: Spark
          Issue Type: Bug
          Components: SQL
    Affects Versions: 2.0.0
            Reporter: Andreas Damm


Given tables

CREATE TABLE `sf_datedconversionrate2`(`isocode` string)
CREATE TABLE `sf_opportunity2`(`currencyisocode` string, `accountid` string)
CREATE TABLE `sf_account2`(`id` string)

the following SQL causes an analysis exception (cannot resolve '`sf_opportunity.currencyisocode`' given input columns: [isocode, id]):

SELECT    0 
FROM      `sf_datedconversionrate2` AS `sf_datedconversionrate` 
LEFT JOIN `sf_account2`             AS `sf_account` 
LEFT JOIN `sf_opportunity2`         AS `sf_opportunity` 
ON        `sf_account`.`id` = `sf_opportunity`.`accountid` 
ON        `sf_datedconversionrate`.`isocode` = `sf_opportunity`.`currencyisocode` 

even though all columns referred to in the ON conditions should be in scope.
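
For reference, per the SQL standard an ON clause pairs with the nearest unmatched JOIN, so the nested form above groups the two rightmost tables first. It is equivalent to the explicitly parenthesized query below (shown only to illustrate the intended grouping, not necessarily as text Spark's parser accepts verbatim). Both conditions reference only columns that are in scope for their respective joins:

SELECT    0 
FROM      `sf_datedconversionrate2` AS `sf_datedconversionrate` 
LEFT JOIN ( `sf_account2`           AS `sf_account` 
            LEFT JOIN `sf_opportunity2` AS `sf_opportunity` 
            ON `sf_account`.`id` = `sf_opportunity`.`accountid` ) 
ON        `sf_datedconversionrate`.`isocode` = `sf_opportunity`.`currencyisocode` 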

Re-ordering the JOIN and ON clauses makes the query work:

SELECT    0 
FROM      `sf_datedconversionrate2` AS `sf_datedconversionrate` 
LEFT JOIN `sf_opportunity2`         AS `sf_opportunity` 
LEFT JOIN `sf_account2`             AS `sf_account` 
ON        `sf_account`.`id` = `sf_opportunity`.`accountid` 
ON        `sf_datedconversionrate`.`isocode` = `sf_opportunity`.`currencyisocode` 

but the original query should be accepted as well, since nested join clauses of this form are valid SQL.
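
A self-contained reproduction for the spark-sql shell, assembled from the definitions above (a sketch; the tables can stay empty, since the failure occurs at analysis time, before any rows are read):

CREATE TABLE `sf_datedconversionrate2`(`isocode` string);
CREATE TABLE `sf_opportunity2`(`currencyisocode` string, `accountid` string);
CREATE TABLE `sf_account2`(`id` string);

-- fails on 2.0.0 with: cannot resolve '`sf_opportunity.currencyisocode`'
-- given input columns: [isocode, id]
SELECT    0 
FROM      `sf_datedconversionrate2` AS `sf_datedconversionrate` 
LEFT JOIN `sf_account2`             AS `sf_account` 
LEFT JOIN `sf_opportunity2`         AS `sf_opportunity` 
ON        `sf_account`.`id` = `sf_opportunity`.`accountid` 
ON        `sf_datedconversionrate`.`isocode` = `sf_opportunity`.`currencyisocode`;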


