Posted to issues@spark.apache.org by "Michael Armbrust (JIRA)" <ji...@apache.org> on 2014/12/12 07:53:14 UTC
[jira] [Resolved] (SPARK-4825) CTAS fails to resolve when created using saveAsTable
[ https://issues.apache.org/jira/browse/SPARK-4825?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Michael Armbrust resolved SPARK-4825.
-------------------------------------
Resolution: Fixed
Fix Version/s: 1.2.1
Target Version/s: 1.2.1 (was: 1.2.0)
> CTAS fails to resolve when created using saveAsTable
> ----------------------------------------------------
>
> Key: SPARK-4825
> URL: https://issues.apache.org/jira/browse/SPARK-4825
> Project: Spark
> Issue Type: Bug
> Components: SQL
> Affects Versions: 1.2.0
> Reporter: Michael Armbrust
> Assignee: Cheng Hao
> Priority: Critical
> Fix For: 1.2.1
>
>
> While writing a test for a different issue, I found that saveAsTable seems to be broken:
> {code}
> test("save join to table") {
> val testData = sparkContext.parallelize(1 to 10).map(i => TestData(i, i.toString))
> sql("CREATE TABLE test1 (key INT, value STRING)")
> testData.insertInto("test1")
> sql("CREATE TABLE test2 (key INT, value STRING)")
> testData.insertInto("test2")
> testData.insertInto("test2")
> sql("SELECT COUNT(a.value) FROM test1 a JOIN test2 b ON a.key = b.key").saveAsTable("test")
> checkAnswer(
> table("test"),
> sql("SELECT COUNT(a.value) FROM test1 a JOIN test2 b ON a.key = b.key").collect().toSeq)
> }
>
> sql("SELECT COUNT(a.value) FROM test1 a JOIN test2 b ON a.key = b.key").saveAsTable("test")
> org.apache.spark.sql.catalyst.errors.package$TreeNodeException: Unresolved plan found, tree:
> 'CreateTableAsSelect None, test, false, None
>  Aggregate [], [COUNT(value#336) AS _c0#334L]
>   Join Inner, Some((key#335 = key#339))
>    MetastoreRelation default, test1, Some(a)
>    MetastoreRelation default, test2, Some(b)
> {code}
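> For anyone reproducing this outside the Hive test harness, a minimal spark-shell sketch against 1.2.0 follows. The table names (src, src_copy) and session setup are illustrative, not taken from the report above:
> {code}
> import org.apache.spark.sql.hive.HiveContext
>
> // `sc` is the SparkContext that spark-shell provides.
> val hiveContext = new HiveContext(sc)
>
> // Create a simple Hive table to query against (name is hypothetical).
> hiveContext.sql("CREATE TABLE src (key INT, value STRING)")
>
> // Writing any query result back out with saveAsTable goes through the
> // CreateTableAsSelect path and should hit the unresolved-plan error above.
> hiveContext.sql("SELECT key, value FROM src").saveAsTable("src_copy")
> {code}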
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org