Posted to issues@spark.apache.org by "Takeshi Yamamuro (Jira)" <ji...@apache.org> on 2019/11/01 13:29:00 UTC
[jira] [Created] (SPARK-29715) Support SELECT statements in VALUES of INSERT INTO
Takeshi Yamamuro created SPARK-29715:
----------------------------------------
Summary: Support SELECT statements in VALUES of INSERT INTO
Key: SPARK-29715
URL: https://issues.apache.org/jira/browse/SPARK-29715
Project: Spark
Issue Type: Sub-task
Components: SQL
Affects Versions: 3.0.0
Reporter: Takeshi Yamamuro
In PgSQL, we can use a SELECT statement (a scalar subquery) inside the VALUES clause of INSERT INTO:
{code}
postgres=# create table t (c0 int, c1 int);
CREATE TABLE
postgres=# insert into t values (3, (select 1));
INSERT 0 1
postgres=# select * from t;
 c0 | c1
----+----
  3 |  1
(1 row)
{code}
Spark, however, fails to analyze the same statement:
{code}
scala> sql("""create table t (c0 int, c1 int) using parquet""")
scala> sql("""insert into t values (3, (select 1))""")
org.apache.spark.sql.AnalysisException: unresolved operator 'Project [unresolvedalias(1, None)];;
'InsertIntoStatement 'UnresolvedRelation [t], false, false
+- 'UnresolvedInlineTable [col1, col2], [List(3, scalar-subquery#0 [])]
+- 'Project [unresolvedalias(1, None)]
+- OneRowRelation
at org.apache.spark.sql.catalyst.analysis.CheckAnalysis.failAnalysis(CheckAnalysis.scala:47)
at org.apache.spark.sql.catalyst.analysis.CheckAnalysis.failAnalysis$(CheckAnalysis.scala:46)
at org.apache.spark.sql.catalyst.analysis.Analyzer.failAnalysis(Analyzer.scala:122)
at org.apache.spark.sql.catalyst.analysis.CheckAnalysis.$anonfun$checkAnalysis$36(CheckAnalysis.scala:540)
at org.apache.spark.sql.catalyst.analysis.CheckAnalysis.$anonfun$checkAnalysis$36$adapted(CheckAnalysis.scala:538)
at org.apache.spark.sql.catalyst.trees.TreeNode.foreachUp(TreeNode.scala:154)
{code}
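As a possible workaround until VALUES supports subqueries, the same row can be inserted via INSERT INTO ... SELECT, since the analyzer already resolves uncorrelated scalar subqueries in a select list. This is a sketch against the table created above, not a fix for the reported parse path:

{code}
scala> sql("""create table t (c0 int, c1 int) using parquet""")
// Workaround sketch: put the scalar subquery in a SELECT list
// instead of the VALUES clause, which the analyzer rejects.
scala> sql("""insert into t select 3, (select 1)""")
scala> sql("""select * from t""").show()
{code}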
--
This message was sent by Atlassian Jira
(v8.3.4#803005)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org