Posted to issues@spark.apache.org by "Wenchen Fan (Jira)" <ji...@apache.org> on 2019/11/14 15:14:00 UTC
[jira] [Created] (SPARK-29900) make relation lookup behavior consistent within Spark SQL
Wenchen Fan created SPARK-29900:
-----------------------------------
Summary: make relation lookup behavior consistent within Spark SQL
Key: SPARK-29900
URL: https://issues.apache.org/jira/browse/SPARK-29900
Project: Spark
Issue Type: Improvement
Components: SQL
Affects Versions: 3.0.0
Reporter: Wenchen Fan
Currently, Spark has two different relation resolution behaviors:
1. Try to look up the temp view first, then the table/persistent view.
2. Try to look up only the table/persistent view.
The first behavior is used in SELECT, INSERT, and a few commands that support views, like DESC TABLE.
The second behavior is used in most commands.
It's confusing to have inconsistent relation resolution behaviors, and the benefit is tiny: it only matters when a temp view and a table share the same name, and users can easily disambiguate with a qualified table name.
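The two behaviors can be sketched with plain dictionaries (an illustrative model only, not Spark's actual resolution code; the names `temp_views`, `catalog`, and the two lookup functions are hypothetical):

```python
# Hypothetical sketch of the two relation resolution behaviors described above.
# temp_views models session-local temp views; catalog models persistent tables/views.
temp_views = {"t": "temp view t"}
catalog = {"t": "table s1.t", "s1.t": "table s1.t"}

def lookup_view_aware(name):
    """Behavior 1 (SELECT, INSERT, DESC TABLE): try temp views first."""
    if "." not in name and name in temp_views:
        return temp_views[name]
    return catalog.get(name)

def lookup_table_only(name):
    """Behavior 2 (most commands): skip temp views entirely."""
    return catalog.get(name)

# The same single-part name resolves differently depending on the command:
print(lookup_view_aware("t"))    # temp view t
print(lookup_table_only("t"))    # table s1.t
# A qualified name disambiguates in both behaviors:
print(lookup_view_aware("s1.t"))  # table s1.t
```

The confusion the ticket describes is exactly that a single-part name like `t` maps to different relations depending on which lookup path the command happens to use.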
In Postgres, the relation resolution behavior is consistent:
{code}
cloud0fan=# create schema s1;
CREATE SCHEMA
cloud0fan=# SET search_path TO s1;
SET
cloud0fan=# create table s1.t (i int);
CREATE TABLE
cloud0fan=# insert into s1.t values (1);
INSERT 0 1
# access table with qualified name
cloud0fan=# select * from s1.t;
i
---
1
(1 row)
# access table with single name
cloud0fan=# select * from t;
i
---
1
(1 row)
# create a temp view with conflicting name
cloud0fan=# create temp view t as select 2 as i;
CREATE VIEW
# same as Spark, temp views have higher priority during resolution
cloud0fan=# select * from t;
i
---
2
(1 row)
# DROP TABLE also resolves temp view first
cloud0fan=# drop table t;
ERROR: "t" is not a table
# DELETE also resolves temp view first
cloud0fan=# delete from t where i = 0;
ERROR: cannot delete from view "t"
{code}
--
This message was sent by Atlassian Jira
(v8.3.4#803005)