Posted to issues@spark.apache.org by "Pavan Kothamasu (Jira)" <ji...@apache.org> on 2020/06/26 06:44:00 UTC

[jira] [Created] (SPARK-32101) A WITH-clause (CTE) name that matches a table name shadows the table: later references resolve to the CTE instead of the table.

Pavan Kothamasu created SPARK-32101:
---------------------------------------

             Summary: A WITH-clause (CTE) name that matches a table name shadows the table: later references resolve to the CTE instead of the table.
                 Key: SPARK-32101
                 URL: https://issues.apache.org/jira/browse/SPARK-32101
             Project: Spark
          Issue Type: Bug
          Components: Spark Core, SQL
    Affects Versions: 2.2.1
            Reporter: Pavan Kothamasu


When a name in the WITH clause is the same as a table name, later references to that name resolve to the WITH-clause subquery (CTE) instead of the table. An example with an explanation is given below:

database1.sample structure:

columns: id, name

 

with sample as (
    select id, 1 as cnt from database1.sample
),
amp as (
    select name, id from database1.sample
)
select * from sample inner join amp on amp.id = sample.id;

 

In this example, the second CTE fails with a missing "name" column error, even though the database1.sample table has a "name" column. This is because "sample" in the second CTE resolves to the first CTE, which only exposes "id" and "cnt", rather than to the table. The error occurs because the CTE name and the table name are the same.

The bug: the reference should not resolve to the CTE when I have given the database name along with the table name. A fully qualified name should be resolved from the table's metadata, not from a CTE that happens to share the table's name.

The workaround is to avoid reusing the table name as a CTE name. For example, rename the first CTE to sample1 or something similar.
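As a sketch of that workaround (assuming the same database1.sample table with columns id and name as above), renaming the first CTE so it no longer collides with the table name makes both CTEs read from the table itself:

```sql
-- Workaround sketch: rename the CTE so it no longer shadows database1.sample.
with sample1 as (
    select id, 1 as cnt from database1.sample
),
amp as (
    select name, id from database1.sample
)
select *
from sample1
inner join amp on amp.id = sample1.id;
```

With the name collision removed, "database1.sample" inside the second CTE can only mean the catalog table, so the "name" column resolves as expected.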



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org