Posted to issues@spark.apache.org by "Apache Spark (JIRA)" <ji...@apache.org> on 2017/02/01 14:49:51 UTC

[jira] [Commented] (SPARK-18841) PushProjectionThroughUnion exception when there are duplicate columns

    [ https://issues.apache.org/jira/browse/SPARK-18841?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15848444#comment-15848444 ] 

Apache Spark commented on SPARK-18841:
--------------------------------------

User 'hvanhovell' has created a pull request for this issue:
https://github.com/apache/spark/pull/16757

> PushProjectionThroughUnion exception when there are duplicate columns
> ---------------------------------------------------------------------
>
>                 Key: SPARK-18841
>                 URL: https://issues.apache.org/jira/browse/SPARK-18841
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 2.0.2, 2.1.0
>            Reporter: Song Jun
>
> {noformat}
> DROP TABLE IF EXISTS p1 ;
> DROP TABLE IF EXISTS p2 ;
> DROP TABLE IF EXISTS p3 ;
> CREATE TABLE p1 (col STRING) ;
> CREATE TABLE p2 (col STRING) ;
> CREATE TABLE p3 (col STRING) ;
> set spark.sql.crossJoin.enabled = true;
> SELECT
>   1 as cste,
>   col
> FROM (
>   SELECT
>     col as col
>   FROM (
>     SELECT
>       p1.col as col
>     FROM p1
>     LEFT JOIN p2 
>     UNION ALL
>     SELECT
>       col
>     FROM p3
>   ) T1
> ) T2
> ;
> {noformat}
> It throws the following exception:
> {noformat}
> key not found: col#16
> java.util.NoSuchElementException: key not found: col#16
>         at scala.collection.MapLike$class.default(MapLike.scala:228)
>         at org.apache.spark.sql.catalyst.expressions.AttributeMap.default(AttributeMap.scala:31)
>         at scala.collection.MapLike$class.apply(MapLike.scala:141)
>         at org.apache.spark.sql.catalyst.expressions.AttributeMap.apply(AttributeMap.scala:31)
>         at org.apache.spark.sql.catalyst.optimizer.PushProjectionThroughUnion$$anonfun$2.applyOrElse(Optimizer.scala:346)
>         at org.apache.spark.sql.catalyst.optimizer.PushProjectionThroughUnion$$anonfun$2.applyOrElse(Optimizer.scala:345)
>         at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$3.apply(TreeNode.scala:292)
>         at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$3.apply(TreeNode.scala:292)
>         at org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(TreeNode.scala:74)
>         at org.apache.spark.sql.catalyst.trees.TreeNode.transformDown(TreeNode.scala:291)
>         at org.apache.spark.sql.catalyst.trees.TreeNode.transform(TreeNode.scala:281)
>         at org.apache.spark.sql.catalyst.optimizer.PushProjectionThroughUnion$.org$apache$spark$sql$catalyst$optimizer$PushProjectionThroughUnion$$pushToRight(Optimizer.scala:345)
>         at org.apache.spark.sql.catalyst.optimizer.PushProjectionThroughUnion$$anonfun$apply$4$$anonfun$8$$anonfun$apply$31.apply(Optimizer.scala:378)
>         at org.apache.spark.sql.catalyst.optimizer.PushProjectionThroughUnion$$anonfun$apply$4$$anonfun$8$$anonfun$apply$31.apply(Optimizer.scala:378)
>         at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
>         at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
>         at scala.collection.immutable.List.foreach(List.scala:381)
>         at scala.collection.TraversableLike$class.map(TraversableLike.scala:234)
>         at scala.collection.immutable.List.map(List.scala:285)
>         at org.apache.spark.sql.catalyst.optimizer.PushProjectionThroughUnion$$anonfun$apply$4$$anonfun$8.apply(Optimizer.scala:378)
>         at org.apache.spark.sql.catalyst.optimizer.PushProjectionThroughUnion$$anonfun$apply$4$$anonfun$8.apply(Optimizer.scala:376)
> {noformat}
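
For context on the stack trace above: PushProjectionThroughUnion rewrites the pushed-down projection once per Union child, substituting each attribute of the first child's output with the corresponding attribute of that child. Below is a simplified sketch of that rewrite step (paraphrased from the Spark 2.1 Catalyst optimizer frames in the trace, not the verbatim source); it shows where the NoSuchElementException originates when the projection references an attribute id, such as the re-aliased duplicate column here, that is missing from the rewrite map:

{noformat}
import org.apache.spark.sql.catalyst.expressions.{Attribute, AttributeMap, Expression}
import org.apache.spark.sql.catalyst.plans.logical.LogicalPlan

// One rewrite entry per output position:
// first child's attribute -> the corresponding attribute of another child.
def buildRewrites(left: LogicalPlan, right: LogicalPlan): AttributeMap[Attribute] = {
  assert(left.output.size == right.output.size)
  AttributeMap(left.output.zip(right.output))
}

// Substitute attributes in the expression being pushed to a Union child.
// AttributeMap.apply has no fallback, so an attribute that is not a key of
// `rewrites` falls through to MapLike.default and throws
// java.util.NoSuchElementException: key not found: col#16 -- the failure
// reported in this issue.
def pushToRight[A <: Expression](e: A, rewrites: AttributeMap[Attribute]): A = {
  e.transform { case a: Attribute => rewrites(a) }.asInstanceOf[A]
}
{noformat}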



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org