Posted to issues@spark.apache.org by "Apache Spark (JIRA)" <ji...@apache.org> on 2016/10/07 20:40:20 UTC
[jira] [Assigned] (SPARK-17749) Unresolved columns when nesting SQL join clauses
[ https://issues.apache.org/jira/browse/SPARK-17749?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Apache Spark reassigned SPARK-17749:
------------------------------------
Assignee: (was: Apache Spark)
> Unresolved columns when nesting SQL join clauses
> ------------------------------------------------
>
> Key: SPARK-17749
> URL: https://issues.apache.org/jira/browse/SPARK-17749
> Project: Spark
> Issue Type: Bug
> Components: SQL
> Affects Versions: 2.0.0
> Reporter: Andreas Damm
>
> Given the tables
> CREATE TABLE `sf_datedconversionrate2`(`isocode` string)
> CREATE TABLE `sf_opportunity2`(`currencyisocode` string, `accountid` string)
> CREATE TABLE `sf_account2`(`id` string)
> the following SQL causes an analysis exception (cannot resolve '`sf_opportunity.currencyisocode`' given input columns: [isocode, id]):
> SELECT 0
> FROM `sf_datedconversionrate2` AS `sf_datedconversionrate`
> LEFT JOIN `sf_account2` AS `sf_account`
> LEFT JOIN `sf_opportunity2` AS `sf_opportunity`
> ON `sf_account`.`id` = `sf_opportunity`.`accountid`
> ON `sf_datedconversionrate`.`isocode` = `sf_opportunity`.`currencyisocode`
> even though all columns referred to in the conditions should be in scope: under standard SQL nesting, the first ON binds the innermost join (`sf_account` LEFT JOIN `sf_opportunity`), so the outer ON should see columns from all three tables. An equivalent rewrite that does resolve is sketched below.
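> For reference, here is the same right-deep join tree with the inner join wrapped in a derived table; this form resolves. It is a sketch of an equivalent rewrite, not the reporter's query, and the alias `acct_opp` is made up:
>
> SELECT 0
> FROM `sf_datedconversionrate2` AS `sf_datedconversionrate`
> LEFT JOIN (
>   -- the inner join from the original query; only the column the outer ON needs is projected
>   SELECT `sf_opportunity`.`currencyisocode`
>   FROM `sf_account2` AS `sf_account`
>   LEFT JOIN `sf_opportunity2` AS `sf_opportunity`
>   ON `sf_account`.`id` = `sf_opportunity`.`accountid`
> ) AS `acct_opp`
> ON `sf_datedconversionrate`.`isocode` = `acct_opp`.`currencyisocode`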
> Re-ordering the JOIN and ON clauses makes the query work:
> SELECT 0
> FROM `sf_datedconversionrate2` AS `sf_datedconversionrate`
> LEFT JOIN `sf_opportunity2` AS `sf_opportunity`
> LEFT JOIN `sf_account2` AS `sf_account`
> ON `sf_account`.`id` = `sf_opportunity`.`accountid`
> ON `sf_datedconversionrate`.`isocode` = `sf_opportunity`.`currencyisocode`
> but the original form should work as well. Presumably the re-ordered version succeeds because each condition then references only columns that are visible wherever the parser happens to attach it.
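> For anyone triaging this, running EXPLAIN EXTENDED on the re-ordered (working) query prints the parsed and analyzed plans, which shows how the parser paired each ON clause with its join. This is only a way to inspect the tree Spark builds, not part of the original report, and the exact output shape varies by version:
>
> EXPLAIN EXTENDED
> SELECT 0
> FROM `sf_datedconversionrate2` AS `sf_datedconversionrate`
> LEFT JOIN `sf_opportunity2` AS `sf_opportunity`
> LEFT JOIN `sf_account2` AS `sf_account`
> ON `sf_account`.`id` = `sf_opportunity`.`accountid`
> ON `sf_datedconversionrate`.`isocode` = `sf_opportunity`.`currencyisocode`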
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org