Posted to issues@spark.apache.org by "Cheng Lian (JIRA)" <ji...@apache.org> on 2014/10/06 12:02:34 UTC
[jira] [Commented] (SPARK-3810) Rule PreInsertionCasts doesn't handle partitioned table properly
[ https://issues.apache.org/jira/browse/SPARK-3810?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14160151#comment-14160151 ]
Cheng Lian commented on SPARK-3810:
-----------------------------------
This issue is marked as MINOR because it doesn't affect correctness. All the redundant {{Project}}s can be removed by the subsequent optimization phase.
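For context, collapsing a stack of identical {{Project}}s is exactly the kind of rewrite the optimizer performs. Below is a minimal, self-contained Scala sketch of that idea using toy plan classes; the names ({{Plan}}, {{Relation}}, {{collapse}}) are illustrative and are not Catalyst's actual API:
{code}
// Toy plan nodes standing in for Catalyst's LogicalPlan (illustrative only).
sealed trait Plan
case class Relation(output: Seq[String]) extends Plan
case class Project(output: Seq[String], child: Plan) extends Plan

def output(p: Plan): Seq[String] = p match {
  case Relation(out)   => out
  case Project(out, _) => out
}

// Drop a Project that selects exactly what its child already produces.
def collapse(plan: Plan): Plan = plan match {
  case Project(cols, child) =>
    val c = collapse(child)
    if (cols == output(c)) c else Project(cols, c)
  case other => other
}

// Ten stacked identical Projects collapse back to the bare relation.
val src = Relation(Seq("key", "value"))
val stacked = (1 to 10).foldLeft(src: Plan)((p, _) => Project(Seq("key", "value"), p))
assert(collapse(stacked) == src)
{code}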
> Rule PreInsertionCasts doesn't handle partitioned table properly
> ----------------------------------------------------------------
>
> Key: SPARK-3810
> URL: https://issues.apache.org/jira/browse/SPARK-3810
> Project: Spark
> Issue Type: Bug
> Components: SQL
> Affects Versions: 1.1.0
> Reporter: Cheng Lian
> Priority: Minor
>
> This issue can be reproduced by the following {{sbt/sbt hive/console}} session:
> {code}
> scala> loadTestTable("src")
> ...
> scala> loadTestTable("srcpart")
> ...
> scala> sql("INSERT INTO TABLE srcpart PARTITION (ds='1', hr='2') SELECT key, value FROM src").queryExecution
> ...
> == Parsed Logical Plan ==
> InsertIntoTable (UnresolvedRelation None, srcpart, None), Map(ds -> Some(hello), hr -> Some(world)), false
>  Project ['key,'value]
>   UnresolvedRelation None, src, None
> == Analyzed Logical Plan ==
> InsertIntoTable (MetastoreRelation default, srcpart, None), Map(ds -> Some(hello), hr -> Some(world)), false
>  Project [key#50,value#51]
>   Project [key#50,value#51]
>    Project [key#50,value#51]
>     Project [key#50,value#51]
>      Project [key#50,value#51]
>       Project [key#50,value#51]
>        Project [key#50,value#51]
>         Project [key#50,value#51]
>          Project [key#50,value#51]
>           Project [key#50,value#51]
>            Project [key#50,value#51]
>             Project [key...
> {code}
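A stack of identical {{Project}}s like the one above is the signature of an analyzer rule that keeps firing on every fixed-point iteration. One plausible mechanism, sketched below with toy classes (this is not Spark's actual {{PreInsertionCasts}} implementation): if the rule compares the child's output against all of the table's columns, partition keys included, the comparison can never succeed for a partitioned table, so each run wraps the child in one more {{Project}}.
{code}
// Toy illustration of a rule that never reaches a fixed point (not Spark's actual code).
sealed trait Plan { def output: Seq[String] }
case class Table(columns: Seq[String], partitionKeys: Seq[String]) extends Plan {
  def output: Seq[String] = columns ++ partitionKeys
}
case class Project(output: Seq[String], child: Plan) extends Plan
case class InsertInto(table: Table, child: Plan) extends Plan {
  def output: Seq[String] = child.output
}

// Compares the child's output against *all* table columns, partition keys
// included, so for a partitioned table it adds a new Project on every run.
def preInsertionCasts(plan: Plan): Plan = plan match {
  case InsertInto(table, child) if child.output != table.output =>
    InsertInto(table, Project(child.output, child))
  case other => other
}

val src     = Table(columns = Seq("key", "value"), partitionKeys = Nil)
val srcpart = Table(columns = Seq("key", "value"), partitionKeys = Seq("ds", "hr"))
val query   = InsertInto(srcpart, Project(Seq("key", "value"), src))

// Each analyzer iteration stacks one more redundant Project, as in the plan above.
val afterThreeRuns = Iterator.iterate(query: Plan)(preInsertionCasts).drop(3).next()
{code}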
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org